Sony's Augmented Reality
By Niall McKay in Tokyo, Wed Jun 19, 2002

Technology companies are often great at coming up with large strategic visions and terrible at implementing them. Not Sony.


As soon as Sony threw its weight behind ubiquitous computing, connected digital devices such as cameras, video cameras, MP3 players and PCs started to roll off its production line. First they used USB and removable media such as the Memory Stick, then FireWire; now they use 802.11 and Bluetooth (or indeed all of the above). Take a trip to any electronics store and you'll see that the company understands gizmos.

So, hoping to snag a sneak preview of the future, I took a trip to Sony Computer Science Laboratories (CSL). Unlike many corporate research laboratories, which are placed high on a hill somewhere, away from the grubby hands of product development people, CSL is based near the old Sony Walkman factory, just a stone's throw from the 30,000-worker campus in Shinagawa in central Tokyo. You can see why: Sony, like many Japanese companies, leaves the big thinking to the US and European research institutions and focuses on more practical applications and getting products out the door. There are no ivory towers here.

Simple Simon


Certainly, designing products that excite the public is easier said than done. Now that Bluetooth, 802.11 and other wireless technologies are being implemented across the company's product line, the next phase is to design new user interface technology that makes these products easier to use. In short, the mission here is to bring computing devices to the next level, where they interact both with each other and with the real world in a more integrated fashion.

"You really can not ask users to type in DNS and Internet Protocol or even email address when all they are trying to do is move a photo from one computer to another," says Dr. Jun Rekimoto, Director of Sony's Interaction Laboratory. We are standing the shared area in the Lab. There are dozens of small Sony sub notebooks, large screen displays, projectors, web cameras and various projects dotted around the lobby. Around the sides are offices with white Perspex walls. So virtually every bit of all space can and is used as a either a whiteboard or projection screen. "In order to design the future computer environment, you've got to live in it," he explains.

"What we are interested in is designing new human computer interaction interfaces for highly portable computers, that are aware of their surroundings and that can assist the user automatically rather than having to wait from a specific command," he says. "Before the end of the decade, we expect that such computing devices will be as commonplace as today's Walkmans, electronic hearing aids, eyeglasses, and wristwatches."

Dr. Rekimoto then points a chopstick- or pen-like device at one handheld computer, grabs an object (in this case, an MPEG movie), points at a screen on the wall, and the movie appears. What he has actually done is transfer the movie from one computer to another. The technology is called Pick-and-Drop because it goes a stage further than drag-and-drop.

"You see, no need to mess around with DNS addresses and the rest. What we are trying to do is remove the physical boundaries between various types of computing devices. In the future our current graphical user interface will just not work because it is too limited in is application," he says. "I am Asian so I like using the chopsticks metaphor, but you could use anything including your finger."

Treating Information like a Physical Object


He then drops the movie on the screen and proceeds to write some text with the pen (or chopstick). Then it's time for another object from another computer; this time it's a photograph, which is dropped onto the screen. Then a graphic, and so on. In fact, he can even drop an animation onto the screen and grab another object, such as a Java routine, that influences the behavior of that object. By the time he is finished, the screen is filled with animations, movies and text.

Collaborative project work, teaching, or just sharing movies, MP3 songs or photographs are the applications that Dr. Rekimoto cites for his creation. "Programs such as PowerPoint are very good if you know exactly what you are going to say, but they are no good in an interactive environment where the presentation is constantly changing."

Furthermore, Dr. Rekimoto points out that it would be a great way for a schoolteacher to create an animated presentation for students by simply picking and dropping software objects from a laptop or PDA onto a presentation screen.

He then moves on to his Tile computer and explains that we often define computers in much too limited a fashion; using a computer currently excludes people who are not comfortable with the graphical user interface. The Tile computer is a flat surface that takes 12 four-inch-by-four-inch tiles, each with a different but specific function. For example, one tile provides weather information, another address book information, and so on. To operate this computer, the user (it's actually designed for children or elderly people) places a tile on the surface and the tile lights up with the relevant information.
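A rough sketch of how the Tile computer's logic might look in software follows; the tile IDs and the functions attached to them are invented for illustration, not taken from Sony's prototype.

```python
# A rough sketch of the Tile computer: the surface reads the ID of each
# physical tile placed on it and lights the tile up with that function's
# output. Tile IDs and functions below are invented for illustration.

TILE_FUNCTIONS = {
    "weather": lambda: "Tokyo: 24C, light rain",
    "addresses": lambda: "J. Rekimoto - Sony CSL, Shinagawa",
    "clock": lambda: "14:32",
}

class TileSurface:
    def __init__(self):
        self.active = {}               # slot -> tile_id

    def place_tile(self, slot, tile_id):
        """Placing a tile is the only 'command' the user ever issues."""
        self.active[slot] = tile_id
        return self.render(slot)

    def render(self, slot):
        tile_id = self.active[slot]
        return TILE_FUNCTIONS[tile_id]()   # the tile lights up with its content

surface = TileSurface()
print(surface.place_tile(slot=0, tile_id="weather"))
print(surface.place_tile(slot=1, tile_id="addresses"))
```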

Both of these technologies contribute to Sony CSL's "Augmented Surfaces" concept. This allows users to transfer digital information between disparate computing environments such as laptops or PDAs, but it also uses video cameras to enable the computer system to recognize physical objects such as videotapes. Using projectors, whiteboards and active displays, it is possible to attach digital notes to physical objects.

For example, a videotape lying on a surface might have a digital note saying that it has had one rough edit and that the final cut must be ready the following day. So not only does Dr. Rekimoto want to remove the digital boundaries between different computers, he also wants to remove the boundaries between physical objects and digital objects.
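Here is a hedged sketch of how such digital notes could be tracked in software once the camera has recognized an object; all the identifiers and fields are illustrative assumptions rather than details of the CSL system.

```python
# A sketch of the digital-note idea: a camera identifies a physical object
# (say, a videotape) by a visual tag, and the system keeps notes keyed to
# that tag so a projector can display them beside the object.
# Identifiers and fields are illustrative only.

from dataclasses import dataclass, field

@dataclass
class PhysicalObject:
    tag_id: str                        # e.g. a printed marker the camera reads
    description: str
    notes: list = field(default_factory=list)

class AugmentedSurface:
    def __init__(self):
        self.objects = {}              # tag_id -> PhysicalObject

    def recognize(self, tag_id, description):
        return self.objects.setdefault(tag_id, PhysicalObject(tag_id, description))

    def attach_note(self, tag_id, text):
        self.objects[tag_id].notes.append(text)

    def project_notes(self, tag_id):
        obj = self.objects[tag_id]
        return f"{obj.description}: " + "; ".join(obj.notes)

surface = AugmentedSurface()
surface.recognize("tape-042", "videotape")
surface.attach_note("tape-042", "one rough edit done")
surface.attach_note("tape-042", "final cut due tomorrow")
print(surface.project_notes("tape-042"))
```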

A Trend Emerges


Indeed, Sony is not the only company working on this concept. Far from it: many large technology companies, universities and research institutions, including Microsoft, IBM, Philips, Intel, HP, Stanford Research Institute, the EU, Sun, Stanford, Berkeley, MIT and the University of Washington, to mention but a few, have similar ubiquitous computing and user interface projects on the go. (Please see The Incredible Shrinking Computer Pt. 1 and 2.)

The next stop on the tour is a wearable key, or personal ID tag. This combines technology such as the active badge (which carries the user's personal information in a radio-based ID tag) with a concept called BodyNet, which uses the static electricity in human skin to create a network. That way, different devices such as a wristwatch and a PDA could communicate via the human body and be relayed, if necessary, to the outside network using Bluetooth. "Currently, we have BodyNet running at 9.6 kbps, but we can get the speed up to about one megabit per second with a little work," says Rekimoto.
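Those figures are worth a quick back-of-the-envelope check. The short Python snippet below works out how long a couple of example payloads would take at today's 9.6 kbps versus the targeted one megabit per second; the payload sizes are arbitrary examples, not numbers from the lab.

```python
# Back-of-the-envelope check on the quoted BodyNet speeds: how long a small
# payload takes at 9.6 kbps today versus the hoped-for 1 Mbps.
# The payload sizes are arbitrary examples, not figures from the article.

def transfer_seconds(payload_bytes, bits_per_second):
    return payload_bytes * 8 / bits_per_second

payloads = {
    "ID badge record (1 KB)": 1_024,
    "small photo (100 KB)": 100 * 1_024,
}

for label, size in payloads.items():
    slow = transfer_seconds(size, 9_600)       # BodyNet today
    fast = transfer_seconds(size, 1_000_000)   # BodyNet as targeted
    print(f"{label}: {slow:.1f} s at 9.6 kbps, {fast:.3f} s at 1 Mbps")
```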

We then retire to his office for a chat and some more demonstrations. His desk is littered with Sony notebooks and PDAs; I counted six, but there were probably others lurking under the stacks of clutter. He booted up his personal computer and explained that current graphical user interface technology is pretty inadequate even for today's computing needs.

"So one way to make current computers more powerful would be to make them three dimensional. This means adding the third dimension of a time line to the current 2D interface."

His desktop sports a Windows XP interface with a difference: it has a timeline. Move the timeline back to last week and last week's desktop appears. The user interface simply stores an image of each file on the desktop and keeps track of where the file is moved. "Flat file folders are often a bad way of storing information because a file may belong to several different categories," says Rekimoto.
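To illustrate the timeline idea, here is a minimal sketch of a desktop that stores dated snapshots of where items sit and replays the nearest older snapshot when the timeline is dragged back. The data layout is an assumption for illustration, not the design of Rekimoto's prototype.

```python
# A minimal sketch of the timeline desktop: the system keeps dated snapshots
# of where each item sat, so dragging the timeline back replays the nearest
# older snapshot. The data layout is an assumption, not the prototype's design.

from datetime import date

class TimelineDesktop:
    def __init__(self):
        self.snapshots = {}            # date -> {item: (x, y) position}

    def save_snapshot(self, day, layout):
        self.snapshots[day] = dict(layout)

    def view(self, day):
        """Return the most recent snapshot at or before the requested day."""
        candidates = [d for d in self.snapshots if d <= day]
        return self.snapshots[max(candidates)] if candidates else {}

desktop = TimelineDesktop()
desktop.save_snapshot(date(2002, 6, 10), {"report.doc": (40, 80)})
desktop.save_snapshot(date(2002, 6, 17), {"report.doc": (200, 60), "photo.jpg": (10, 10)})

print(desktop.view(date(2002, 6, 12)))   # last week's desktop
print(desktop.view(date(2002, 6, 18)))   # this week's desktop
```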

Certainly, concepts such as augmented reality and Pick-and-Drop have been around for a while, but it is only now that we have wireless technologies that support high-speed data transfer that they can become a reality. The computing device is no longer an ivory-colored box tied to one spot by its size, weight and a spaghetti-like mess of wires. In those days the GUI served us well, but we are no longer constrained by the technology, so the sooner we move to new user interfaces like the ones Dr. Rekimoto and his team are developing, the sooner we'll be able to remove the boundaries created by computers.

And because Sony is working on these concepts, I would be very surprised if one or more of these technologies didn't find their way into PDAs, sub-notebook PCs and other devices in the next 12 to 24 months. But that was one question Dr. Rekimoto could not answer. Time will tell.


Niall McKay is a freelance journalist based in Tokyo, Japan. He can be reached at www.niall.org.