The Incredible Shrinking Computer Pt. 1: Universities
By Niall McKay in Silicon Valley, Thu Mar 29 00:00:00 GMT 2001
Researchers across the world are attempting to come up with a computing architecture that is pervasive, invisible, uses new interfaces, supports dozens of device types and is completely reliable. The result?
Michael Dertouzos, director of the Massachusetts Institute of Technology's Laboratory for Computer Science, believes that today's computing architecture creates human servitude. "We have been taken hostage by $500 PCs that force us to feed them data, nurse them, and obey their stupid commands."
The futurist and author of a new book, The Unfinished Revolution, believes it's high time that machines became "human-centric."
Call it human-centric, call it invisible, call it ubiquitous - the basic idea is the same: computers are too difficult to use and too invasive, requiring us to deal with file systems, application programming interfaces and user interfaces. They are too unreliable. We spend an inordinate amount of time inputting data - something computers should be able to do themselves - and too much time telling them our preferences, which they should learn on the fly.
The solution, put forward in 1988 by Xerox PARC's Mark Weiser, is ubiquitous computing. The idea is that we'll no longer interact with specific computing devices; instead, computing capabilities will be all around us, in everything we use. We will no longer interact with a desktop computer - our desks will become computers. The walls of our offices and homes will come with embedded displays, cameras and sensors, so that we no longer need to tell computers what we need; they will learn it for themselves.
The network that ties all this computing capability together will become smart enough to deliver up the information we need, when we need it, without being asked.
It seems that over at the US Defense Department's Advanced Research Projects Agency in Washington, the guys and gals responsible for funding research into everything from anti-biological-warfare tools to the Internet to the graphical user interface have been wrestling with the problem as well. In 1999, they sent out a request for proposals to the research and development community, asking it to come up with an architecture that was pervasive and invisible, that used new user interfaces such as speech and vision, that supported dozens of different types of devices, and that was completely reliable.
DARPA's request for proposals sparked off a bevy of research projects both in the universities and corporate research institutions and refocused some existing research.
The sum of money DARPA offered was a mere $10 million - small potatoes by today's R&D budget standards: in 2000, IBM and Microsoft spent $5.8 billion and $4.3 billion on R&D respectively. However, this was simply for the discovery process. Indeed, any day now, DARPA is due to announce a second round, which insiders expect will involve a much greater amount.
The initiative is called the Ubiquitous Computing Project, and five US universities - the Massachusetts Institute of Technology, Carnegie Mellon University, the University of California at Berkeley, the University of Washington, and the Oregon Graduate Institute - stepped up to the challenge.
Meanwhile, in the commercial world, literally dozens of computer and communications companies, such as Microsoft, HP, SRI, Intel, Philips, Sun and IBM, are working on similar projects.
The biggest and perhaps the most ambitious academic research project is headed by Dertouzos at MIT and is called Project Oxygen.
Over the next four years, MIT, Nokia, Philips Electronics, Hewlett-Packard, Acer and Delta Electronics will invest $50 million and the man-hours of 250 researchers in an attempt to create a system of devices that hear, see and respond to users' needs.
By augmenting today's input device, the keyboard, with speech and vision technology, the problem of inputting data will be a thing of the past. Indeed, all we'll need to do is tell the computer to record a transaction, and its eyes (cameras) or ears (microphones) will kick in and capture the data.
Several months ago I visited Dertouzos at his office in MIT. As I entered the room the Greek-born scientist swept his arm around the room and said "See if you can spot the technology."
Indeed, I could see no evidence of a computer or technology in the room. "That's what technology should look like," he said as he pressed a remote control to reveal a computer screen. "It should be all around you but you shouldn't have to see it."
According to Dertouzos, in the future, computation will be available everywhere like the air that we breathe. Hence the project name Oxygen. We will not need to carry personalized devices. In fact, any device will have the ability to personalize itself for us by downloading our data from the network.
MIT's Oxygen currently has three prototypes that work together as a single system: the Handy 21, a smartphone-like device; the Enviro 21, a PC-like computer; and a networking architecture to tie the two together, called Network 21.
The Handy 21 will be the user's handheld computer (thus "Handy") and will not have a keyboard. Instead it will have voice-recognition software and a small screen. The Handy 21 will also have a camera, a microphone, speakers, GPS for tracking, wireless networking technology such as Bluetooth, and a videophone.
In the home it will act as a node for the Enviro 21 and as a remote control for household audio-visual systems. When you leave the house, it will act as a personal digital assistant. Pop it in your pocket, connect a wireless headset and chatter away with your buddies. Its GPS or tracking devices will guide you to your destination.
Indeed, it's much like the device that GSM vendors are telling us we will have when UMTS arrives. However, what's interesting about the Handy 21 is that all its circuitry will be digital, so each time it changes function, new software will be loaded to carry out that task. Furthermore, it will have two antennas - one for broadband access and another for narrowband access. That way, in the home it can use Bluetooth to deliver video-telephone conversations to the user, while outside it can use its narrowband connection to communicate with the rest of the world.
Keep in mind that MIT isn't claiming that it's inventing the device of the century. What it is doing is building a prototype for a ubiquitous computing environment.
The Enviro 21 will be the central control for the home or office. It will be linked to a network of sensors and actuators that will collect data, such as the room temperature, and control the heating. It will also be linked to the security system, and so on. The Network 21 will tie the whole lot together. Sounds dandy. But despite the fact that MIT's war chest is full of interesting technologies, there is still a great deal of work to be done before Oxygen becomes a reality.
Meanwhile, Carnegie Mellon University's Human Computer Interaction Institute in Pittsburgh is working on a similar initiative called Project Aura.
Aura works on the principle that every person should have a personal digital information aura, spanning desktop, personal and infrastructure computers.
But Aura is focused more on tasks than on hardware. It uses speech-recognition software, video cameras, language translators - any currently available technology that will make one's life easier. Its goal is to remove distraction from using computers. So what's new about Aura?
Well it's a hell of a system integration task for a start.
It uses computers, cameras, actuators and sensors embedded in the environment to allow the user to focus on tasks and ideas rather than having to boot up Windows and start typing.
The Endeavour Expedition
At the University of California at Berkeley Dr. Randy Katz is heading up the school’s Endeavour (British spelling intentional) Expedition.
Sounds like some bizarre Star Trek episode but it's the school's invisible computing initiative. The focus is to design and prototype a self-organizing adaptive information system. This "fluid software" will allow data and software systems to automatically distribute themselves among a variety of information devices.
That way, Endeavour will enable the creation of applications that use not only mobile code but also nomadic data. Like Aura and Oxygen, the system will also be designed to interact with sensors and actuators (typically microelectromechanical systems, or MEMS). However, UC Berkeley has gone a stage further in realizing this vision with its Sensornet, which is part of Endeavour.
"The idea is to create context aware computing using software that can learn about its environment," says Jason Hong, a computer scientist at UC Berkeley.
Hong's particular interest is creating a set of low-level APIs that will enable devices to talk to each other. For example, a Bluetooth-enabled personal digital assistant may not have the ability to connect to the Internet by itself, but it may be able to talk to a device that can. So it would send out a request that would say something like, "Does any device here have the ability to connect to the Internet?" A PC in the area might reply, and the email would be sent from the personal digital assistant to the PC via Bluetooth and then forwarded to the Internet.
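The handshake Hong describes can be sketched in a few lines. Everything here - the Bus and Device classes, the capability names, the return strings - is invented for illustration; the article does not show Endeavour's actual APIs, and a real implementation would ride on a radio protocol such as Bluetooth rather than an in-memory list.

```python
class Bus:
    """A stand-in for a local broadcast medium such as a Bluetooth piconet."""
    def __init__(self):
        self.devices = []

    def join(self, device):
        self.devices.append(device)

    def broadcast(self, sender, capability):
        # Ask every other device in range whether it offers the capability.
        return [d for d in self.devices
                if d is not sender and capability in d.capabilities]


class Device:
    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = set(capabilities)

    def send_email(self, bus, message):
        # "Does any device here have the ability to connect to the Internet?"
        gateways = bus.broadcast(self, "internet")
        if not gateways:
            return "queued until a gateway appears"
        # Relay the mail via the first responder, e.g. a nearby PC.
        return f"relayed via {gateways[0].name}"


bus = Bus()
pda = Device("PDA", {"bluetooth"})          # no direct Internet access
pc = Device("PC", {"bluetooth", "internet"})  # can act as a gateway
bus.join(pda)
bus.join(pc)
print(pda.send_email(bus, "hello"))  # relayed via PC
```

The point of the pattern is that the PDA never needs to know in advance which device will answer - any gateway that happens to be in range will do.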
At the University of Washington, the computer science department's Portolano Project takes a slightly different approach.
The computer science department is taking advantage of the university's intention to build a new cell biology laboratory.
Professor Gaetano Borriello says that the computer science department decided that there were already too many invisible computing projects focused on the consumer market.
"However, the problem with most cell biology laboratories is that there is really no way to accurately record the experiment process," he says. "The great majority of what is done is in some biologists head and at best in a notebook somewhere."
So Professor Borriello and his team are outfitting every device in the new cell biology laboratory with radio frequency tags that can capture and record data.
For example, each sample for an experiment will have an active badge so when a biologist places it under a microscope the computer control system will know what material is being examined and at what settings. Pipettes containing chemicals will also be able to relay how much liquid is in them and how much was used in a particular experiment. The system will be augmented with speech and vision technology that will record the biologists' notes as they are working.
"Everything will be stored by a central database so that experiment can be accessed and viewed at later date," he says.
Meanwhile, the Georgia Tech project's aim is to develop concepts, techniques and tools for the next generation of system software in pervasive computing environments. The university is concentrating on two main areas of research.
First, the creation of information-driven applications, such as micro-region weather forecasting and virtual-presence software. Second, Infopipe, system software that supports information flows through a number of environments to a number of devices. Infopipe will support and control the information flow from sensors to applications.
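A sensor-to-application flow of the kind Infopipe is meant to control can be sketched as a chain of composable stages. The pipe-of-generators structure below is an assumption made for illustration, not Georgia Tech's actual design; the readings and the outlier threshold are likewise invented.

```python
def sensor():
    # Pretend these are raw temperature readings (Celsius) streaming
    # in from a micro-region weather sensor.
    for reading in [21.5, 21.7, 35.0, 21.6]:
        yield reading

def filter_outliers(flow, limit=30.0):
    # An intermediate stage: drop implausible readings before they
    # reach the application.
    for value in flow:
        if value <= limit:
            yield value

def to_fahrenheit(flow):
    # A final conversion stage for an application expecting Fahrenheit.
    for value in flow:
        yield value * 9 / 5 + 32

# Compose the pipe: sensor -> filter -> converter -> application.
pipe = to_fahrenheit(filter_outliers(sensor()))
print([round(v, 1) for v in pipe])  # [70.7, 71.1, 70.9]
```

Each stage only knows about the flow it consumes and the flow it produces, so stages can be rearranged or retargeted to different devices - the property that makes a pipe abstraction attractive for pervasive environments.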
In the next installment, TheFeature delves into the corporate world. Microsoft, HP, SRI, Intel, Philips, Sun and IBM are all working on projects with similar goals - and from interesting perspectives. Check back soon...
Niall McKay is a contributing editor for Red Herring magazine. He can be reached at www.niall.org.