The widespread adoption of personal computers in the early 1980s fundamentally changed the way people access data, information, and knowledge. A decade later, the Internet ushered in a new era of connectivity and interaction. However, it wasn't until Apple introduced the iPhone in 2007 that the dream of ubiquitous and pervasive mobile connectivity began to take shape; suddenly, it was possible to handle an array of sophisticated and increasingly complex tasks.
Today, consumers deposit checks via mobile banking apps that snap a photo and convert paper to digital data. They scan barcodes on food packages to obtain nutritional information or snap a photo of a wine label to view information and ratings--and even save the wine in a ‘virtual cellar’ for future reference. They increasingly use smartphones to store tickets to concerts, shows, and sporting events, and securely pay for goods via a mobile wallet. The list of possibilities grows daily.
"In many respects, the future has arrived," says James Landay, a professor in the computer science department at Stanford University. "Although mobile computing is nothing new--many researchers have been working on it for more than two decades--the idea of ubiquitous computing and digital functionality is beginning to take shape. Mobile apps are fundamentally changing the way we approach computing and how we go about daily tasks."
Yet, it is not just sheer processing power driving change. Today's smartphones and tablets are packed with a growing array of chips and sensors--accelerometers, barometers, compasses, geolocation sensors, cameras, microphones, and much more--and they increasingly connect to powerful cloud-based processing tools. This makes it possible to tap into remote databases and sophisticated algorithms that handle speech recognition, image processing, and other computationally demanding tasks that a device cannot accomplish on its own.
Moreover, these mobile devices and apps deliver a level of contextual information that a desktop computer or laptop cannot. "The ability to deliver things when and where people need and want them completely transforms the experience," says Satya Ramaswamy, vice president and global head of the digital enterprise unit at Tata Consultancy Services. Adds Jason Underwood, an instructional designer at Northern Illinois University, "The lower cost of mobile and handheld devices along with advanced functionality and constant connectivity fundamentally redraws the computing landscape."
An app such as Fooducate is a good example of this new order of computing. It provides instant insight and feedback about the quality and healthfulness of foods scanned right at the supermarket shelf--and assists with meal planning. This helps drive better decisions about food purchases. Meanwhile, the app Shazam tags music playing in a store, in a restaurant, or on the radio so an individual can identify the song and purchase it through an online music store such as iTunes. Other apps such as Curb and Uber make it possible to tap an icon and order a taxi or car from a specific location; the driver shows up within a few minutes and, after the ride has ended, it is also possible to pay through the app and receive an e-mail receipt.
Groups of apps that work in conjunction with various personal devices and the Web are further transforming the computing landscape--and introducing the Internet of Things. In some cases, these mobile platforms rely on application programming interfaces (APIs) to tie together an entire ecosystem of functions and capabilities that break through the boundaries and limitations of a single device or technology tool. The data may travel across multiple servers en route to an app.
All of this makes it possible for a Fitbit or Jawbone activity wristband not only to stream data to its own app, but also to connect via APIs to other apps, such as MyFitnessPal and RunKeeper, in order to capture activity data and food consumption more comprehensively. In addition, the system can pull data from Internet-enabled treadmills, exercise bikes, and scales to provide a far more comprehensive and accurate snapshot of activity level, calories burned, and calories consumed over the course of a day.
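The aggregation described above can be sketched in a few lines of code. The example below is purely illustrative: the data classes, field names, and readings are hypothetical stand-ins for what an app might pull from wristbands, treadmills, scales, and food-logging apps over their real (authenticated, REST-based) APIs.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One batch of data from a single connected source.

    'source' is a hypothetical label for the device or app that
    produced the data (wristband, treadmill, food-logging app, etc.).
    """
    source: str
    steps: int = 0
    calories_burned: float = 0.0
    calories_consumed: float = 0.0

def daily_snapshot(readings):
    """Combine readings from multiple sources into one daily summary."""
    return {
        "steps": sum(r.steps for r in readings),
        "calories_burned": sum(r.calories_burned for r in readings),
        "calories_consumed": sum(r.calories_consumed for r in readings),
    }

# Illustrative readings from three hypothetical sources.
readings = [
    Reading(source="wristband", steps=8500, calories_burned=320.0),
    Reading(source="treadmill", steps=2400, calories_burned=180.0),
    Reading(source="food_app", calories_consumed=2100.0),
]
snapshot = daily_snapshot(readings)
print(snapshot)
```

The point of the sketch is the design, not the arithmetic: each device or app remains the authority for its own data, and the aggregator simply merges whatever readings the APIs expose into a single daily picture.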
We have only begun to explore the capabilities and possibilities of mobile apps, Landay says. As processing power increases, new and better sensors appear, more sophisticated connectivity takes shape (including mesh networks), cloud computing expands, and more advanced software and algorithms emerge, entirely new applications will appear. Future mobile apps will incorporate facial and voice recognition, situational awareness, augmented reality, and far more contextual data obtained from sensors embedded in our clothing and bodies--as well as from the world around us.
Says Landay, "Mobile platforms and apps are driving enormous changes in business, education, healthcare and daily life. They are redefining how we think about computing and the digital world."
Samuel Greengard is an author and journalist based in West Linn, OR.