Animals are equipped with six senses - taste, touch, smell, hearing, vision, and kinesthesia (the sense of body position and movement) - which allow them to perceive changes in their environment and do something about it. Mobile devices don't have any senses. They have no idea what's going on in the world around them. That's a shame, because if a cell phone could sense where it was and what its owner was doing, it would be more useful and easier to operate.
Today's user interfaces require you to give explicit commands in order to access different functions on mobile devices. You have to press a series of buttons, spin a thumbwheel, or utter a voice command to get them to do anything. All these actions require conscious effort and attention on the part of the user. If you're sitting at your desk, that's not a problem, because you've got both hands free and there usually isn't anything going on to distract you (unless you work at home with a five-year-old like I do). But if you're driving your car on a noisy, crowded freeway at night, it's a bad idea to be staring at your mobile's display while punching numbers in the keypad.
What if your mobile could tell the difference between day and night? Between being on a desk and being stowed away in a briefcase? Between being held sideways and being held right-side-up? To do this, your mobile would need to be less like a machine and more like a living creature. It would need to be able to sense its surroundings.
In laboratories around the world, researchers are developing sense-enabled mobiles that appear to be as aware as animals. No, you won't have to housebreak your sense-enabled mobile or teach it not to chew the furniture, but you may have to train it before it learns how to get along with the way you live.
Sensors Go Mobile
People behave in very predictable ways when they use mobile devices. Think of how you use your mobile. You probably pull it out of your pocket or case several times a day. You hold it up to your face when you want to read a message, and you hold it to your ear to answer a call. You set it on tables, dock it to your PC, and hold it in your hand. You repeat these same gestures over and over again. Wouldn't it be nice if the device could sense these frequently repeated gestures and anticipate what you wanted to do? With interactive sensing, your mobile can do just that.
The thinking behind interactive sensing goes like this: Since you can sense your environment, your device ought to be able to do the same, and adapt to those conditions in a way that makes the device easier to use.
Thanks to the advent of microelectromechanical systems (MEMS for short), sophisticated sensors are becoming smaller, cheaper, and less power-hungry. MEMS are tiny silicon-chip components fabricated in the same way microprocessors are. Your car's airbag has a MEMS accelerometer in it that senses a collision and triggers the bag to deploy. There are many different kinds of sensors, not all of them MEMS-based, and they're being tested for use in mobile devices in research labs around the world. In the next few years, your mobile device will come loaded with a host of sensors that can perceive your gestures and its surroundings.
Here's a list of some of the most interesting micro-sensors, and how they're being used in mobiles:
Proximity sensor. This simple infrared device can detect how far away it is from a target. A proximity sensor could be used to turn a phone on when you hold it up to your head.
Touch sensor. Like the touchpad on a laptop, a touch sensor can tell when it is in contact with human skin. A mobile device with touch sensors can power on when you grip it. It can tell when a left-handed person is using it, and do things like adjust a vertical navigation bar to the appropriate side of the display screen. Also, if you are holding a phone with touch sensors, it can be set to vibrate instead of ring when there's an incoming call.
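One way a device might infer handedness from its touch sensors is by counting contact points along each edge: a gripping hand puts several fingers on one side and a single thumb on the other. The function below is purely illustrative; the sensor names and thresholds are assumptions, not a real device API.

```python
def nav_bar_side(left_contacts, right_contacts):
    """Guess which hand is gripping the device from edge contact
    counts, and return the side where the navigation bar should go
    (the thumb's side, where it is easiest to reach).

    A right-handed grip typically puts one contact (the thumb) on the
    right edge and several fingers on the left, and vice versa."""
    if right_contacts == 1 and left_contacts > 1:
        return "right"   # right hand gripping: put the bar on the right
    if left_contacts == 1 and right_contacts > 1:
        return "left"    # left hand gripping: put the bar on the left
    return "default"     # ambiguous grip: leave the layout alone
```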
Tilt sensor. Instead of using a scroll wheel or buttons to navigate a web page on a mobile, tilt sensors could allow you to scroll a page simply by tilting the device in your hand so that the information "slides" into view. And if your device takes digital pictures, it'll be able to tell if you are taking a snapshot in landscape or portrait mode and then adjust the orientation so the image doesn't appear sideways when you download it to your PC.
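A tilt-to-scroll interface boils down to a mapping from tilt angle to scroll speed. The sketch below shows one plausible version, with a "dead zone" so the page holds still while the device is roughly level; all the constants are illustrative assumptions, not values from any real product.

```python
def scroll_offset(tilt_degrees, dead_zone=5.0, gain=2.0, max_speed=40.0):
    """Map a tilt angle (degrees from level) to a scroll speed in
    pixels per frame. Angles inside the dead zone produce no movement;
    beyond it, speed grows linearly with tilt, capped at max_speed.
    The sign of the angle sets the scroll direction."""
    magnitude = abs(tilt_degrees)
    if magnitude <= dead_zone:
        return 0.0  # roughly level: hold the page still
    speed = min((magnitude - dead_zone) * gain, max_speed)
    return speed if tilt_degrees > 0 else -speed
```

The dead zone matters in practice: without it, normal hand tremor would make the page drift constantly.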
Acceleration sensor. Accelerometers could give a mobile device the ability to sense whether it was inside a moving car, in the pocket of a walking person, or sitting on a desk. If your phone knew it was in a car, it could go into a mode that would let you control it by your voice, minimizing the amount of visual and manual attention you'd need to devote to the phone. Once the car stopped, it could go back into its standard energy-saving mode.
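A simple way to tell these three situations apart is to look at how much recent accelerometer readings vary: a device at rest barely moves, a car produces moderate vibration, and a walker's stride produces large rhythmic swings. This is a minimal sketch of that idea; the thresholds are illustrative assumptions.

```python
import statistics

def classify_context(accel_samples, walk_threshold=1.5, drive_threshold=0.3):
    """Guess the device's situation from the spread (standard
    deviation) of recent accelerometer magnitudes, in m/s^2.

    Low spread means the device is sitting still, moderate spread
    suggests vehicle vibration, and high spread suggests the bounce
    of a walking person's stride."""
    spread = statistics.stdev(accel_samples)
    if spread >= walk_threshold:
        return "walking"
    if spread >= drive_threshold:
        return "driving"
    return "stationary"
```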
Video camera. The components needed to make a video camera are becoming dirt-cheap. If your phone can start to see things, it won't be long before it can start recognizing them, too. There are lots of biometric possibilities here. One example: you could set your vision-equipped phone to lock up if it doesn't recognize the shape of the ear it's being held up against.
Light sensor. An inexpensive light meter would be able to sense the level of ambient light and adjust the display's brightness and contrast accordingly.
Microphone. Talking in a club? A noise sensor will know, and pump up the volume so you can hear the person on the other end.
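Both the light-sensor and microphone behaviors above are simple mappings from an ambient reading to an output setting. The sketch below combines them; every constant (the lux ceiling, the noise floor, the scaling) is an illustrative assumption.

```python
def adjust_outputs(lux, noise_db,
                   min_brightness=0.2, max_lux=1000.0,
                   base_volume=0.5, noise_floor_db=50.0, max_noise_db=90.0):
    """Scale display brightness (0-1) with ambient light and call
    volume (0-1) with ambient noise.

    Brightness ramps linearly from min_brightness in darkness to full
    at max_lux. Volume stays at base_volume in quiet rooms and ramps
    up between the noise floor and max_noise_db."""
    brightness = min_brightness + (1.0 - min_brightness) * min(lux / max_lux, 1.0)
    if noise_db <= noise_floor_db:
        volume = base_volume
    else:
        boost = (min(noise_db, max_noise_db) - noise_floor_db) / (max_noise_db - noise_floor_db)
        volume = base_volume + (1.0 - base_volume) * boost
    return round(brightness, 2), round(volume, 2)
```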
The Power of Mix and Match
A single sensor can add a lot of functionality to a mobile, but the real magic happens when you begin to combine sensors. For example, if you equip a mobile phone with a digital thermometer and a light sensor, the phone would be able to figure out that it was in a pocket, a purse, or out in the open. And it would be able to adjust its ring volume accordingly.
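The pocket/purse/open inference works because the two sensors disambiguate each other: darkness alone could mean a pocket or a bag, but body heat separates the two. A minimal sketch of that fusion logic, with illustrative thresholds:

```python
def infer_location(temp_c, lux, body_temp_c=30.0, dark_lux=5.0):
    """Combine a thermometer and a light sensor to guess where the
    phone is, and return (location, ring_volume).

    Warm and dark suggests a pocket against the body; cool and dark
    suggests a purse or bag; anything well-lit is out in the open.
    A muffled phone rings at full volume."""
    if lux >= dark_lux:
        return "open", 0.6      # in plain view: normal ring volume
    if temp_c >= body_temp_c:
        return "pocket", 1.0    # muffled by clothing: ring loud
    return "purse", 1.0         # muffled by the bag: ring loud
```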
Endowing a mobile with the senses of touch and tilt lets it turn itself on when a user is both touching the device and holding it at an angle that suggests the user is looking at the display. This combo could also prevent a device from powering off after its usual idle timeout while the user is still holding it and looking at it.
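The power-management rule described above reduces to a small predicate: stay awake whenever touch and tilt together indicate active viewing, and otherwise fall back to the ordinary idle timer. The angles and timeout below are illustrative assumptions.

```python
def should_stay_awake(is_touched, tilt_degrees, idle_seconds,
                      viewing_range=(20.0, 80.0), idle_timeout=60.0):
    """Decide whether the display should remain on.

    If the user is gripping the device and holding it at a typical
    viewing angle, keep it awake even past the idle timeout; otherwise
    apply the normal timeout."""
    at_viewing_angle = viewing_range[0] <= tilt_degrees <= viewing_range[1]
    if is_touched and at_viewing_angle:
        return True  # being held and looked at: never time out
    return idle_seconds < idle_timeout  # fall back to the idle timer
```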
At Microsoft Research, Ken Hinckley, Mike Sinclair, and Eric Horvitz worked with Jeff Pierce at Carnegie Mellon University to make a prototype handheld computer that used different sensors to enhance its functionality. For this experiment, the researchers wanted a device that could perform several sense-enabled tasks: 1. Figure out when a user wanted to record a voice memo; 2. Switch the display from portrait to landscape mode when the device was tilted sideways; 3. Power up when the user picked it up; and 4. Scroll the display when the user tilted the device one way or another. The researchers took an off-the-shelf Cassiopeia E-105 palm-sized PC and equipped it with a tilt sensor, touch sensors, and a proximity range sensor. The sensors were all selected for their low cost, low power consumption, and ability to capture the kind of natural gestures people make when they use mobile devices.
When the researchers tested their prototype on seven different users, six of them said they preferred using the sensed gesture to using a button-activated voice memo feature. Why? Because it requires less conscious effort to record a voice memo on a sense-enabled device. Recording a voice memo on a standard device requires eight distinct steps: 1. Picking up the device; 2. Finding the correct button; 3. Positioning the finger over the button; 4. Pressing and holding the button; 5. Listening for the beep that signals that the device is in recording mode; 6. Recording the message; 7. Releasing the button; and 8. Listening for the double beep that signals that the recording has ended. With the sense-enabled device, users can record a voice memo in five steps: 1. Picking up the device; 2. Listening for the beep; 3. Recording the message; 4. Moving the device away from your face; and 5. Listening for the double beep. Not only are there fewer steps to complete, they are also easier to perform.
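The sensed gesture works by fusing the prototype's three sensors into one test: the device starts recording when it is being gripped, held upright, and held near the face. This is a sketch of that combined check, not the researchers' actual code; the thresholds are illustrative assumptions.

```python
def detect_memo_gesture(is_held, tilt_degrees, proximity_cm,
                        upright_min=30.0, near_face_cm=8.0):
    """Return True when touch, tilt, and proximity readings together
    match the 'speaking into the device' pose: the user is gripping
    the device, holding it tilted upright, and holding it close to
    the face. Only when all three agree should recording start, which
    helps avoid triggering in a pocket or on a desk."""
    return (is_held
            and tilt_degrees >= upright_min
            and proximity_cm <= near_face_cm)
```

Requiring all three conditions at once is what keeps the false-positive rate down: any single sensor can be fooled, but the full pose is hard to produce by accident.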
The testers similarly praised the other features that the researchers incorporated into the prototype. Nevertheless, there were a number of problems with the prototype. The device exhibited both false positives (activating a function by mistake) and false negatives (not activating a desired function on command). Adding more kinds of sensors could help, just as combining senses does in animals. For example, the effect of hearing, seeing, and feeling a train rumble by is much greater than just seeing it, hearing it, or feeling it alone. The researchers also concluded that they'd need to run long-term tests to make sure that some of these nifty features don't become annoying time-wasters once the novelty wears off.
One thing's for sure. Sense-enabled devices are coming, and they'll become a permanent element of the mobile Internet. As one Compaq researcher wrote in the title of his paper about a tilt interface, "Rock 'n' Scroll is Here to Stay."
Mark Frauenfelder is a writer and illustrator from Los Angeles.