Guide to Innovation: Smart Phones
As cell phones get smarter, they reach deeper into our lives
When Steve Jobs introduced the iPhone 4 this past June, it already was obsolete. It had been eclipsed by HTC’s EVO 4G, which runs on Sprint’s nascent 4G WiMAX network. The EVO’s big leap forward centers on that network, which can deliver data such as Web pages, e-mail attachments, and music and video streams and downloads up to 10 times faster than 3G networks. The EVO also includes a mobile hotspot capability, which turns the 4G cellular connection into WiFi, delivering an Internet connection to as many as eight devices—laptops, other phones, even Apple’s iPad—simultaneously.
The future of cell phones is not the iPhone 4 or any specific piece of hardware but 4G, the next generation of broadband wireless networks. 4G’s faster data delivery speeds will enable some truly behavior-changing technologies, from the continual monitoring of a person’s vital signs to eliminating the need to carry money or credit cards. Ever faster wireless networks, combined with increasingly sophisticated hardware and mobile operating system software such as Apple’s iOS and Android, have evolved to the point that talking on the phone—the primary purpose behind the invention of the cell phone in the first place—has become secondary to other uses.
The last three years have brought extraordinary advancements in network, hardware, and software technology, which have fueled vigorous adoption of cell phones. Ownership has jumped from 34 million American users in 1995 to 285 million last year. Today 91 percent of the U.S. population has a cell phone. Cell phone consultant Chetan Sharma estimates that total mobile subscriptions worldwide will surpass 5 billion by the end of the year.
“The cell phone has clearly been the most adopted technology in the world,” says Sharma. “As it has moved from being a luxury to a utility to a necessity, both the cellular network technology and the devices have evolved to meet the growing demand and need for communications.”
The growth of cell phones is a story of the search for capacity—how to let more users onto the wireless networks and let them do more once there. The solution to today’s capacity problems—the creation of 4G networks and beyond—also brought us high-definition TV, killing two technological birds with one stone.
The First Cell Phone
While cell phone technology seems new, the idea is old. Bell Labs researcher Douglas H. Ring first broached the notion of permanent “cells” in 1947. For the next 20 years, “mobile phones” meant car phones powered by huge trunk batteries and connected to each other and the landline system via a single citywide antenna. But one antenna restricted the number of users: only 20 mobile phone conversations could be carried at one time. AT&T and Motorola lobbied the Federal Communications Commission (FCC) for additional bandwidth in the 806–960 MHz frequency band, a range occupied by local UHF television stations. In May 1970, after a fierce battle with the broadcast industry, the FCC assigned the frequencies to mobile phones.
At the same time, an AT&T team led by Joel Engel and Richard Frenkiel proposed that a metropolitan area should be carved up into adjoining hexagonal “cells,” each with its own low-power transmitter. As a user drove from one cell to another, the system would “hand off” the call transmission from a transmitter in one cell to a new transmitter in the next. This cellular scheme allowed different users to use the same channels in different cells, vastly expanding the number of possible channels available and the number of users a system could support.
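The arithmetic behind frequency reuse is simple enough to sketch in a few lines of Python. The channel counts and cluster size below are illustrative assumptions, not AT&T’s actual figures, but they show why capacity grows with the number of cells instead of being capped by a single citywide antenna.

```python
# A back-of-the-envelope sketch of cellular frequency reuse.
# All numbers are hypothetical, chosen only for illustration.

TOTAL_CHANNELS = 420   # channels licensed to the whole system (assumed)
CLUSTER_SIZE = 7       # cells per cluster; each cell gets its own channel set

channels_per_cell = TOTAL_CHANNELS // CLUSTER_SIZE   # 60 channels per cell

def system_capacity(num_cells: int) -> int:
    """Simultaneous calls the system can carry.

    Because every cluster reuses the same channel sets, capacity grows
    linearly with the number of cells rather than stopping at the old
    single-antenna ceiling of about 20 calls.
    """
    return num_cells * channels_per_cell

print(system_capacity(1))    # 60: one cell already triples the old citywide limit
print(system_capacity(100))  # 6000 simultaneous calls across a metropolitan area
```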
AT&T angled for a monopoly on this new telephone network, which would give it the same kind of dominance it had exercised over the landline system. Motorola, the primary equipment supplier to the car phone market, countered AT&T’s proposal, arguing that car phones weren’t the only possible users of such frequencies and that a monopoly would inhibit technological innovation. To prove its point, Motorola embarked on a five-month program to create a handheld portable phone. (This story is related in “Hold the Phone” in the Winter 2007 issue of Invention & Technology.)
All the legal rigmarole took a decade to sort out. The first commercial cell phone call was made on October 13, 1983, from a Chrysler convertible in the parking lot of Chicago’s Soldier Field, placed by Bob Barnett, president of Ameritech, Chicago’s first cell phone carrier, to Alexander Graham Bell’s great-grandson Edwin Grosvenor (today the publisher of Invention & Technology), then in Germany.
Cell phones soon became more popular than anyone had anticipated, and more spectrum was needed to meet the demand. In 1988 the FCC approved development of digital “advanced cellular” technology, which led to a confusing evolution of cell systems. It began with time division multiple access (TDMA), which morphed in 1992 into the global system for mobile communications (GSM), currently used by AT&T and T-Mobile; code division multiple access (CDMA), now used by Verizon and Sprint, followed a few years later. In 1993 Congress authorized the FCC to auction off digital spectrum in the 1900 MHz range, and the first digital networks launched within two years. Faster 2.5G data networks followed in 2001, and 3G debuted in 2003.
But demand continued to outstrip capacity: speedier networks invited more usage, requiring carriers to seek more spectrum. Even as carriers demanded 1900 MHz spectrum, they were eyeing television spectrum in the 700 MHz range. In an effort to hold off additional spectrum grabs by the wireless industry, television broadcasters bluffed, telling the FCC that they needed the spectrum for new high-definition TV. The FCC called the broadcasters’ bluff. In late 1987 it created the Advisory Committee on Advanced Television Service (ACATS) and declared an open competition for the creation of an American HDTV system. Digital TV would occupy a different set of frequencies, however, opening up enormous swaths of current analog TV spectrum for wireless communication.
Although broadcasters had received their spectrum for free to serve the public interest, Congress realized that wireless carriers would pay big money for this analog TV spectrum. Once the FCC adopted the ACATS digital TV standards on Christmas Eve 1996, that spectrum could be sold. Even so, it took another decade to move broadcasters off the analog channels. On January 24, 2008, 62 MHz of spectrum in the 700 MHz band went under the government’s gavel.
After the smoke cleared 38 days later, the government had sold $19.6 billion of air, $9.63 billion to Verizon, mostly for the coveted “C” block, which would enable the creation of a seamless nationwide network for the carrier’s LTE (Long Term Evolution) 4G network. AT&T spent $6.64 billion for the “B” block for its LTE 4G network.
Sprint had already procured a 120–150 MHz swath in the 2.5–2.7 GHz band for its WiMAX 4G network. Sprint launched its WiMAX service, run by Clearwire, in October 2008, and it will be available in 44 markets by the end of this year. Sprint’s bandwidth is four to five times wider than the 700 MHz bands bought by Verizon and AT&T, and therefore it can accommodate more users.
By the time the analog TV signals were shut off on June 12, 2009, Verizon and AT&T had begun developing their LTE networks. Verizon plans to roll out LTE in 25 to 30 markets this year, covering about 100 million people. AT&T will unveil its LTE network in 2011.
iPhone Easy
On the hardware side, the iPhone’s original launch on June 29, 2007, shook up the cell phone world. Apple rejected the industry convention in which the carrier retained control over its phones’ hardware and operating system. Not only did Apple take control of both, it also allowed third-party developers to create nearly 200,000 applications (“apps”), available from Apple rather than the carrier.
The resulting Mac-like iPhone operating system, now called iOS, was easier to use than a toaster. iPhone users were enticed with an ever-evolving menu of first-ever technologies: a capacitive touchscreen operated by simultaneous finger gestures, such as pinch-and-zoom to shrink or enlarge images, text, or Web pages; visual voice mail, which lets the user listen to the message of choice instead of having to play through the entire mailbox in sequence; a three-axis accelerometer, which enables the screen to reorient itself according to how the phone is held; a proximity sensor, which shuts off the screen when the device is brought up to the user’s ear during a call; and a melding of the iPod and iPhone, which lets a user carry one gadget instead of two.
The Cell Phone Redefined
The iPhone’s extraordinary array of innovative features suddenly became the new normal. All touchscreen phones now feature most if not all of these attributes. And subsequent iPhone-like phones have raised expectations about cell phone performance and capabilities.
In July 2005 Google bought Android, a small mobile software company. On November 5, 2007, the giant search-engine company announced the Android mobile phone operating system. The first Android-powered phone, the HTC G1 from T-Mobile, appeared a year later and posed the first serious challenge to the iPhone.
Google engineers designed Android to be a truly open operating system, thus initiating a Mac vs. Windows–like battle. Critics claim that iOS, which also powers the iPod Touch and the iPad, is not completely open because Apple strictly controls developer access to its App Store and is the only iOS hardware maker. In contrast, any handset maker can create an Android cell phone—various Android phones from Samsung, HTC, and Motorola are available from Verizon, Sprint, T-Mobile, and even AT&T—and there are now more than 70,000 Android apps available.
Apple’s and Android’s data-centric advancements have helped make using a cell phone for conversation alone passé. Sharma reports that iPhone owners now spend less than 32 percent of their time on the device talking. Last year, for the first time, mobile data traffic exceeded voice traffic.
The Era of Multi-Touch
Apple’s successful implementation of multigesture touch is the iPhone’s most obvious hardware influence. But touchscreens are not a new technology: original patents date back to the late 1940s. The first practical solution—an “electrographic apparatus” describing a resistive-sheet, electrostatic/capacitive-coupling digitizer tablet—was patented in 1986 by Robert G. Kable, chief engineer of Scriptel, a Columbus, Ohio–based company.
In 1994 the IBM Simon Personal Communicator—a mobile phone featuring a large touchscreen display along with a pager, personal digital assistant (PDA) functions, and a fax machine—became the first device to merge a wireless phone with handheld personal computer capabilities. The $900 Simon was available only to BellSouth customers in 15 states. In 1999 Qualcomm introduced the pdQ 800, the first modern and widely available touchscreen smart phone, which offered wireless phone service, Web access, and PDA capabilities in a single pocket-sized device. In 2001 Samsung unveiled the SPH-i300, the first cell phone to incorporate a color Palm PDA.
But all these devices used a single-touch resistive touchscreen, usually requiring an oft-misplaced stylus or plastic pen. Resistive screens use two thin electrically charged layers. The layers make contact when pressed, creating a voltage change that signals the location of the touch to the controlling software. Fingers can be used on a resistive touchscreen, but there is no swiping, dragging, or multifinger use.
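That voltage-to-location step can be sketched in a few lines of Python. The 12-bit controller readings and screen dimensions here are assumptions for illustration, not any particular device’s specifications.

```python
# A minimal sketch of how a controller might turn a resistive screen's
# voltage readings into pixel coordinates. Values are hypothetical.

ADC_MAX = 4095                 # full-scale reading of an assumed 12-bit converter
SCREEN_W, SCREEN_H = 320, 480  # assumed display size in pixels

def locate_touch(adc_x: int, adc_y: int) -> tuple[int, int]:
    """Map voltage-divider readings to a screen position.

    Pressing the screen shorts the two charged layers together; the
    voltage measured along each axis is proportional to where along
    that axis the contact occurred.
    """
    x = adc_x * SCREEN_W // ADC_MAX
    y = adc_y * SCREEN_H // ADC_MAX
    return x, y

print(locate_touch(2048, 1024))  # (160, 120): mid-screen horizontally, upper half vertically
```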
Technically, capacitive touchscreens don’t respond to touch but to proximity. These screens emit a weak electrostatic field that is disrupted when a finger, which carries its own electric charge, comes near. The operating software then interprets the location of the distortion and initiates a series of commands.
Theoretically a touchscreen could register multiple touches, but the mechanism eluded engineers until the spring of 1999, when University of Delaware graduate student Wayne Westerman published a doctoral dissertation entitled “Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface.” While finishing his dissertation, Westerman and his faculty advisor, John Elias, formed FingerWorks, which soon began to make and market multigesture PC control products, including several iGesture pads. In early 2005 Apple bought FingerWorks and hired Westerman. In September 2007 Apple applied for a patent that describes the entire iPhone touchscreen user experience, with Westerman’s name listed among the inventors.
In March 2006 Apple applied for a patent for a “force imaging input device and system,” which describes a method for detecting the strength of the pressure of a touch to add a level of user control. The force exerted on a screen determines the action the software takes. This past January, Apple filed another touch patent application and was granted another. The former would allow a thinner, lighter, and brighter touchscreen to be manufactured with fewer parts. The latter describes a touchscreen with an integrated proximity sensor, which would activate control without actually touching the screen. A user could wake up a device, for instance, merely with the wave of a hand over its screen. Or a hovering finger could bring up a menu in a music application to enable the user to then choose play, pause, or search.
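The force-imaging idea reduces to a threshold scheme: how hard the press is determines which action runs. A minimal sketch follows; the cutoff values and action names are invented for illustration, since the patent filings described here do not specify them.

```python
# A hedged sketch of force-dependent input: the measured strength of a
# touch selects among actions. All thresholds and actions are hypothetical.

def action_for_force(force: float) -> str:
    """Choose an action from a normalized touch force (0.0 to 1.0)."""
    if force < 0.05:
        return "hover: show a pop-up menu"       # proximity only, no contact
    elif force < 0.5:
        return "tap: select the item"
    else:
        return "hard press: open extra controls"

for f in (0.02, 0.3, 0.8):
    print(f, "->", action_for_force(f))
```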
Sensory Input
As with the multi-touch touchscreen, the technology behind the iPhone’s ability to orient its screen according to how the device is held was not original with Apple. The hardware consists of microelectromechanical systems (MEMS), essentially electrical machines shrunk down to microscopic size to fit on a microchip. Legendary physicist Richard Feynman was the first to theorize, in 1959, that size was not a barrier to advanced technology. It took more than 30 years—and the development of powerful microprocessors etched in silicon—for MEMS engineering to catch up to Feynman’s theory. The first commercial MEMS accelerometers appeared in cars in the early 1990s to control notoriously finicky airbag deployment. MEMS accelerometers accurately sensed the g-forces of a car’s sudden deceleration, then determined whether to deploy the airbag. Accidental—and often dangerous—deployments were significantly reduced.
Early accelerometers could detect movement along only a limited number of directions, or axes. In 2003 STMicroelectronics perfected three-axis (up/down, left/right, forward/backward) accelerometers, which were used to spectacular effect in Nintendo’s Wii videogame system, first released in November 2006. Seven months later, STMicroelectronics’ three-axis accelerometer chip found its way into the first iPhone. The new iPhone 4 adds a gyroscope to the accelerometer, so the phone now also reads pitch, yaw, and roll. This six-axis motion sensing will enable the development of near 3-D game movement and interaction.
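How the accelerometer drives screen rotation can be sketched by comparing the pull of gravity along the screen’s two axes. The axis conventions below are assumed for illustration; real devices define their own.

```python
# A minimal sketch of accelerometer-based screen rotation. gx and gy are
# gravity components in g along the screen's short and long axes, positive
# toward the right edge and bottom edge (a hypothetical convention).

def orientation(gx: float, gy: float) -> str:
    """Pick a rotation by comparing gravity along the two screen axes."""
    if abs(gy) >= abs(gx):
        return "portrait" if gy > 0 else "portrait-upside-down"
    return "landscape-right" if gx > 0 else "landscape-left"

print(orientation(0.0, 1.0))    # held upright -> portrait
print(orientation(-0.98, 0.1))  # left edge tipped down -> landscape-left
```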
But not every recent cell phone advance comes from the iPhone. Two new mobile technologies still in their infancy—near-field communications and telehealth—hold the potential to further change cell phone use and daily communications.
iWallet
Many Americans and Europeans today don’t leave home without their keys, cell phone, music player, and wallet or purse containing money and credit cards. Those owning an iPhone can leave their music players behind. Implementation of near-field communications (NFC) technology may soon enable a person to leave the house without a wallet and keys, too.
NFC is an open-standard contactless chip technology that enables two-way communication, similar to radio frequency identification (RFID), in the 13.56 MHz frequency band. When two NFC devices come within a few centimeters of each other, they communicate and perform predefined tasks. Some credit cards, such as the Blink from Chase, now use NFC.
Unlike magnetic stripes, however, NFC chips can be programmed to perform any number of tricks, such as trading contact information, locking and unlocking doors, redeeming coupons, and, most important, completing financial transactions. “NFC builds on [magnetic stripes],” explains Jonathan Collins, principal NFC analyst for ABI Research. “A cell phone has a screen and processing to enhance NFC’s functionality.”
With a cell phone’s processing power and storage memory, an NFC phone can store credit and loyalty cards along with banking information. An NFC phone conceivably could be used to check into a hotel, become the room key, and check out and pay for the stay. Trials in France suggest that many people will eventually use their NFC phones to replace subway cards or tokens on mass transit. A person simply waves his phone while entering and exiting the system, and the fare is paid.
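The fare-gate scenario boils down to a tiny predefined task that runs when the two devices come within range. The sketch below simulates the exchange in Python; it is not a real NFC protocol stack, and the fare amount and class names are invented for illustration.

```python
# An illustrative simulation of the NFC fare-gate exchange described above.
# No real NFC library is used; the tap() call stands in for the moment the
# phone comes within a few centimeters of the reader.

FARE = 2.25  # hypothetical flat fare, in dollars

class NfcPhone:
    """A phone storing a transit card's remaining value."""
    def __init__(self, balance: float):
        self.balance = balance

class FareGate:
    def tap(self, phone: NfcPhone) -> bool:
        """Run the predefined task: check the balance and deduct the fare."""
        if phone.balance < FARE:
            return False          # insufficient funds; the gate stays closed
        phone.balance -= FARE     # deduct the fare from the stored card
        return True               # the gate opens

rider = NfcPhone(balance=10.00)
print(FareGate().tap(rider), rider.balance)  # True 7.75
```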
In 2004 Nokia, Sony, and Philips teamed up to create the Near Field Communication Forum. Five months later, the first four NFC specifications were released. It has proven difficult to work out certain critical security issues, however, which has considerably slowed implementation.
Keeping the Doctor Away
Telehealth may represent the cell phone technology with the largest potential impact on society. In a nutshell, biosensors built into glasses, belt buckles, watches, bras, jewelry, and other wearable objects can regularly measure a person’s temperature, heart rate, blood pressure, and brain waves—and then send that data wirelessly to a 4G cell phone and on to a doctor or monitoring system. Such remote monitoring could mean fewer office visits.
AT&T researchers are collaborating with hospital and university partners to develop “smart slippers,” which would monitor acceleration and pressure in a patient’s gait. The wireless sensors would not only alert caregivers when a patient falls but could also identify unsteady locomotion that may precede a fall. “If grandpa isn’t walking too well because he took pills he wasn’t supposed to take, his family can be notified as to why he is so wobbly,” explains Bob Miller, director of AT&T’s Communications Technology Research Group.
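One common way such sensors flag a fall is to look for a stretch of near free fall followed by a sharp impact in the accelerometer signal. The sketch below takes that approach with illustrative thresholds; AT&T has not published the algorithm its slippers would use.

```python
# A hedged sketch of accelerometer-based fall detection. Thresholds are
# illustrative only, not values from AT&T's research.

import math

FREE_FALL_G = 0.4  # total acceleration well below 1 g: the wearer is falling
IMPACT_G = 2.5     # a sharp spike afterward: the wearer hit the ground

def detect_fall(samples: list[tuple[float, float, float]]) -> bool:
    """samples: (x, y, z) accelerometer readings in g, in time order."""
    falling = False
    for x, y, z in samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        if magnitude < FREE_FALL_G:
            falling = True                  # free-fall phase observed
        elif falling and magnitude > IMPACT_G:
            return True                     # impact following free fall
    return False

readings = [(0.0, 0.0, 1.0), (0.1, 0.0, 0.2), (1.8, 1.5, 2.0)]
print(detect_fall(readings))  # True: near free fall, then a ~3 g impact
```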
“The implications of this research are far-reaching,” writes AT&T chief technology officer John Donovan. “If we can enable advanced health care services remotely while allowing patients to stay more frequently in the comfort of their own homes, we have the potential to improve quality of care and quality of life for patients and reduce costs for providers. It’s one health care reform that’s hard to debate.”
Finite Resources
But transmitting all these new data will chew into finite capacity, and moving to 4G is unlikely to solve network capacity issues for all carriers. In a never-ending feedback loop, speedier networks enable more data-hungry applications such as HTML Web browsing, photo and video sharing, peer-to-peer game playing, GPS navigation, video downloading, and video chatting, all of which necessitate the creation of more network capacity. According to Sharma, each American on average used 20 megabytes of network capacity per month in 2007; this year it’s 370 megabytes per month. He predicts that each person will use almost three gigabytes a month by 2014. This year the cumulative volume of U.S. data traffic is likely to exceed one exabyte, which is 1 billion gigabytes.
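Sharma’s figures are easy to sanity-check against the subscriber count cited earlier. Multiplying roughly 285 million users by 370 megabytes a month for a year does indeed land just past an exabyte:

```python
# Back-of-the-envelope check of the exabyte claim, using the article's
# own figures (285 million U.S. users, 370 MB per person per month).

SUBSCRIBERS = 285e6   # U.S. cell phone users, per the ownership figure above
MB_PER_MONTH = 370    # average monthly usage this year, per Sharma
EXABYTE = 1e18        # bytes: 1 billion gigabytes

annual_bytes = SUBSCRIBERS * MB_PER_MONTH * 1e6 * 12
print(annual_bytes / EXABYTE)  # ~1.27 exabytes per year
```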
“Spectrum is a finite resource, and the demands being placed on the national spectrum are enormous, if not unrealistic,” Sharma warns. “One doesn’t need to be a spectrum expert to determine that the majority of the spectrum has been taken. What’s left is just reallocation of the spectrum. While the mobile data tsunami is occurring right now, even the identification of available spectrum, its reallocation and assignment, is often a long, multiyear process.”