By Niall McKay in Silicon Valley, Tue Sep 26 00:00:00 GMT 2000
Last year, scientists at IBM sat down to figure out what sort of a world they could dream up if they took today's emerging technologies -- the Internet, cell-phones, electronic organizers, MP3 players, laptop computers, global positioning satellite receivers, Bluetooth networking devices, voice recognition technology, and so on -- and put them all together. They couldn't come up with a reasonable answer.
IBM chairman and CEO Lou Gerstner took $185 million of the annual research budget and gave it to 45 of the company's top behavioral and computer scientists, telling them to go and find out just what kind of world that might be.
The result is an ongoing, four-year worldwide effort called Planet Blue, a project aimed not so much at developing new technologies as at finding new ways to work with the myriad of existing ones.
"We are trying to remove the technology from the consciousness of the user," says Michael Karasick, CTO of IBM's pervasive computing division. "We want computing to disappear into the background, leaving only the power of computation and global access."
So the company is developing a number of new environments in which this can be achieved, giving users access to personalized data anytime, anywhere, whether they are in a car, at home, in a foreign city, or at a social function, using the wireless infrastructure as the glue that holds all this computational power together.
An old dream, but a modern work-in-progress
Not that this is a new idea. Indeed, the late Mark Weiser of Xerox's PARC coined the term "ubiquitous computing" in 1988 and worked on the Defense Advanced Research Projects Agency-funded project until his death in 1999. The project was then absorbed into other ventures, including research on Bluetooth and WAP-enabled devices.
IBM's scientists have picked up where Weiser left off. For example, Dan Russell, director of the user sciences and experience group at IBM's Almaden Research Center, is developing large wall displays known as whiteboard technology. The displays are equipped with radio-frequency receivers that can read electronic identification tags. (The current prototypes use conventional radio technology; IBM's Tokyo research facility is developing a similar system using i-mode cellular phones, and in the near future Bluetooth will be used.) When a person wearing a tag approaches, the display can read their identity, access their calendar, find out who they are meeting at Almaden, and then automatically present personalized information. The closer the person gets to the display, the more detailed the information becomes.
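The behavior Russell describes -- tag identifies the visitor, the calendar supplies context, and detail grows with proximity -- can be sketched in a few lines. This is purely illustrative: the names (render_for_visitor, the directory and calendar lookups, the 5-meter threshold) are invented for this sketch, not IBM's actual APIs or parameters.

```python
# Hypothetical sketch of the tag-driven wall display described above.
# All names and the distance threshold are illustrative assumptions.

def render_for_visitor(tag_id, distance_m, directory, calendar):
    """Choose what the wall display shows, based on who is near and how near."""
    visitor = directory.get(tag_id)           # tag -> identity
    if visitor is None:
        return "Welcome to Almaden"           # unknown tag: generic greeting
    meeting = calendar.get(visitor)           # identity -> today's appointment
    if distance_m > 5:                        # still far away: name only
        return f"Welcome, {visitor}"
    # close up: progressively more detail, as the article describes
    return f"Welcome, {visitor} - your meeting: {meeting}"

directory = {"tag-42": "Ada Lovelace"}
calendar = {"Ada Lovelace": "10:00 with Dan Russell, Bldg. 28"}
print(render_for_visitor("tag-42", 8.0, directory, calendar))
print(render_for_visitor("tag-42", 1.5, directory, calendar))
```

The real system would of course layer radio hardware, authentication, and calendar integration under each of these lookups; the sketch only captures the proximity-graded disclosure idea.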
As the visitor proceeds, they will be able to use the same tag (be it in a cell phone or perhaps a Bluetooth-enabled watch) to access their personal computer data on a local terminal. Furthermore, as the meeting progresses, they will be able to drag and drop items of interest gleaned from their visit into a virtual briefcase, from which the items can be e-mailed or transferred to their personal computer when they leave.
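The briefcase workflow -- collect items during the visit, deliver them all on departure -- amounts to a simple queue with a flush step. The class and method names below are invented for illustration; the delivery callback stands in for whatever e-mail or file-transfer channel the real system would use.

```python
# An invented sketch of the "virtual briefcase": items gathered during a
# visit are queued locally, then flushed to the visitor's own machine when
# they leave. The delivery mechanism is abstracted into a callback.

class VirtualBriefcase:
    def __init__(self, owner_email):
        self.owner_email = owner_email
        self.items = []

    def drop(self, item):
        self.items.append(item)               # "drag and drop" during the meeting

    def flush(self, send):
        """On departure, hand everything to a delivery callback and empty out."""
        for item in self.items:
            send(self.owner_email, item)
        count = len(self.items)
        self.items.clear()
        return count

sent = []
bag = VirtualBriefcase("visitor@example.com")
bag.drop("slide 12: roadmap")
bag.drop("whiteboard photo")
delivered = bag.flush(lambda addr, item: sent.append((addr, item)))
```

Deferring delivery until departure is what lets the tag technology handle the firewall tunneling Russell mentions next in a single, auditable hand-off rather than item by item.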
"We need to be able to provide people with an easy way to achieve this, so that when they move away from their personal computer, the tag technology can automatically do the tunneling through whatever firewalls and security systems," says Russell.
The concept is remarkably similar to Larry Ellison's network computer idea. However, Russell and his cohorts don't care where the data resides as long as it is accessible over the wireless infrastructure from any location. "One vision is that you can carry the hard disk in your pocket, or in some device in your pocket, but the important thing is that you can access it and record the information as you go," he says. Indeed, the micro-drive could also be equipped with a Bluetooth chip so that it could be seamlessly accessed from a local PC.
Russell is also working on other whiteboard applications such as a "room in two places," which enables a boardroom in Silicon Valley to connect with a boardroom in New York using video, audio, and software conferencing. The whiteboard will not only simultaneously display data the two groups are working with, but also everything that is presented and said during the conference.
Even a simple conversation touches on several kinds of information -- contacts, calendars, visual data, and so on. So Big Blue plans to make the creation of shared tasks transparent, allowing many types of information to be reached from many kinds of devices -- and to do it in such a way that people don't even know they are using a computer system. "We are building small desktop displays that might, for example, show a picture of your spouse and then automatically switch over to a calendar when you have an appointment," says Russell.
And just in case you nodded off during the last twenty minutes of that important meeting, IBM wants to provide you with a virtual time buffer. Using extremely high-density storage devices, the company plans to record everything said or presented in, say, the last 60 minutes. That way you could rewind and store information that you might need in the future.
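The time-buffer idea is, at bottom, a rolling window: keep only the most recent hour of captured material, discard anything older, and let the listener replay a slice of it. A minimal sketch, with an invented TimeBuffer class standing in for IBM's high-density recording hardware:

```python
# Minimal sketch of a rolling "virtual time buffer": retain only the last
# window_seconds of captured snippets; older material is evicted as new
# material arrives. The API is invented for illustration.
from collections import deque

class TimeBuffer:
    def __init__(self, window_seconds=3600):
        self.window = window_seconds
        self.items = deque()                  # (timestamp, snippet) pairs

    def record(self, timestamp, snippet):
        self.items.append((timestamp, snippet))
        # evict anything that has fallen out of the rolling window
        while self.items and self.items[0][0] < timestamp - self.window:
            self.items.popleft()

    def rewind(self, since, until):
        """Replay everything captured between since and until (inclusive)."""
        return [s for t, s in self.items if since <= t <= until]

buf = TimeBuffer(window_seconds=3600)
buf.record(0, "agenda review")
buf.record(1800, "budget decision")
buf.record(4000, "action items")              # pushes "agenda review" out
print(buf.rewind(1000, 4000))                 # ['budget decision', 'action items']
```

Bounding the buffer is the whole point: the storage cost stays fixed no matter how long the meeting runs, which is what would make an always-on recorder practical.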
However, IBM is still missing some key pieces of the puzzle. For one thing, superior voice recognition systems must be developed for computing to become truly transparent. That way users can bark commands at will and devices will understand and carry out the given tasks. And of course these activities demand high-capacity broadband networks.
But these developments are coming, and IBM's Karasick sees the market demanding advances: "In five to ten years' time, we believe that most people will be knowledge workers, so we need to provide them with new ways to access, manage and store data."
And it'll be amazing to see how they do just that.
Niall McKay is a San Francisco-based writer and journalist who covers science and technology. He is a contributing editor at Red Herring magazine and a columnist for the Irish Times, and he writes for the Financial Times, Wired, Salon, and the New York Times.