How the iPhone Works
[Editor’s Note: This is one of our favorite articles to update. While we parse Apple’s announcements, here’s the not-so-basic technology behind the iPhone.]
Electronic devices can use lots of different methods to detect a person’s input on a touch screen. Most of them use sensors and circuitry to monitor changes in a particular state. Many, including the iPhone, monitor changes in electrical current. Others monitor changes in the reflection of waves. These can be sound waves or beams of near-infrared light. A few systems use transducers to measure changes in vibration caused when your finger hits the screen’s surface or cameras to monitor changes in light and shadow.
The basic idea is pretty simple — when you place your finger or a stylus on the screen, it changes the state that the device is monitoring. In screens that rely on sound or light waves, your finger physically blocks or reflects some of the waves. Capacitive touch screens use a layer of capacitive material to hold an electrical charge; touching the screen changes the amount of charge at a specific point of contact. In resistive screens, the pressure from your finger causes conductive and resistive layers of circuitry to touch each other, changing the circuits’ resistance.
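The capacitive case can be sketched in a few lines. This is a toy model, not Apple’s actual firmware: the electrode grid, resting value, and threshold are all hypothetical, but the logic — compare each sensor reading to its resting state and flag points that change enough — is the idea described above.

```python
# Toy model of capacitive touch detection (hypothetical values, not
# Apple's implementation): a finger near an electrode draws charge away,
# so its reading drops noticeably below the resting baseline.

RESTING = 100.0    # hypothetical resting capacitance per electrode
THRESHOLD = 15.0   # hypothetical minimum change that counts as a touch

def detect_touches(readings):
    """readings: dict mapping (row, col) electrode -> current reading."""
    touches = []
    for point, value in readings.items():
        if abs(value - RESTING) > THRESHOLD:  # state changed at this point
            touches.append(point)
    return touches

# A finger near electrode (2, 3) shifts its reading; the others only
# wobble within normal noise.
frame = {(2, 3): 70.0, (2, 4): 96.0, (5, 1): 101.0}
print(detect_touches(frame))  # [(2, 3)]
```

A resistive screen would follow the same pattern, except the monitored quantity would be circuit resistance at the point where the layers press together rather than capacitance.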
Most of the time, these systems are good at detecting the location of exactly one touch. If you try to touch the screen in several places at once, the results can be erratic. Some screens simply disregard all touches after the first one. Others can detect simultaneous touches, but their software can’t calculate the location of each one accurately. There are several reasons for this, including the following:
- Many systems detect changes along an axis or in a specific direction instead of at each point on the screen.
- Some screens rely on system-wide averages to determine touch locations.
- Some systems take measurements by first establishing a baseline. When you touch the screen, you create a new baseline. Adding another touch causes the system to take a measurement using the wrong baseline as a starting point.
The Apple iPhone is different — many of the elements of its multi-touch user interface require you to touch multiple points on the screen simultaneously. For example, you can zoom in on Web pages or pictures by placing your thumb and finger on the screen and spreading them apart. To zoom back out, you can pinch your thumb and finger together. The iPhone’s touch screen can track both touch points and their movements simultaneously.