We live in a global society where technology, especially information and communication technology, is changing the way businesses create and capture value, how and where we work, and how we interact and communicate. In her seminal 1988 book, In the Age of the Smart Machine: The Future of Work and Power,45 Shoshana Zuboff was among the first scholars to weave together the technological, sociological, and psychological processes that have converged to shape the modern workplace. Her insights concerned the nature of information and its significance in restructuring and redefining the patterns and meanings of work, even though at the time of her study the worldwide diffusion of the Internet had not yet occurred. Academic literature, not only in business9,32,42 but also in medicine,15,38 engineering,23,40 physical sciences,30 and social sciences,21,37 echoes these observations in more recent times. To illustrate the effects of these changes on organizations, we consider their implications for the management of human talent.
Key Insights
- As technology keeps advancing, we need to think beyond augmenting or automating jobs to how to manage the messy process of the creative destruction of jobs as we create new ones.
- What enables or constrains people in the workplace is the way they use and manage technology, not technology itself.
- Technology-driven changes demand an understanding of technology in relation to the entire work system, including the relational and non-relational roles and interactions of human participants and machines.
The new wave of technological innovation features the emerging general paradigm known as “ubiquitous computing,”a an environment where computational technology permeates almost everything, enabling new ways of connecting people, computers, and objects. The ever-falling cost of computation has resulted in the proliferation of computing devices, including personal computers, embedded (enabled by micro-miniaturization) and networked industrial sensors and processors, speech-recognition and eye-tracking devices, mobile devices, radio-frequency-identification and near-field-communication tags and labels, global-positioning-system-enabled devices, smart televisions, car navigation systems, drones, wearable sensors, robots, and 3D virtual reality. The ubiquitous computing infrastructure also enables collection of enormous quantities of structured and unstructured data, requiring the adjective “big” to distinguish this new paradigm of development. Ubiquitous computing also blurs the boundaries between industries, nations, companies, providers, partners, competitors, employees, freelancers, outsourcers, volunteers, and customers. It also yields opportunities to unify the physical space, which has always used information to try to make an inherently inefficient system more efficient, and the electronic space, which enables information accessibility to overcome the limitations of the physical space. Merging the physical and the electronic also has implications for privacy and security, as well as for how companies are organized and manage human talent.
Given these rapid advances and our increased reliance on technology, the question of how to manage technology-enabled change in work and employment is highly salient for companies and their executives. General predictions anticipate significant changes in knowledge acquisition, sharing, and distribution, as well as related ripple effects in the workplace.b Work is defined here as the application of human, informational, physical, and other resources to produce products and services.5 If one accepts that work does not exist without people, and that executives are inherently concerned with the management of people within organizations, then executives bear some responsibility for understanding the effects of technology on work and employment. This article thus aims to interpret the progress, direction, and managerial implications of current research in work and employment. We begin with three lessons for executives based on our review of relevant literature. We then examine how technology affects six key areas of talent management as organizations move from traditional to ubiquitous computing. We conclude with a series of questions for managers in the six areas.
Methodology
This article is part of a larger project aimed at examining how technology is changing work and organizations.12 Our conclusions are based on a comprehensive review of the literature in management, industrial/organizational psychology, labor economics, human-factors design, and information and computer technology.
Lesson 1. The effect of ubiquitous computing on jobs is a process of creative destruction. Ubiquitous computing is not the first technology to affect jobs. From steam engines to robotic welders to ATMs, technology has long displaced human workers, often creating new and higher-skilled jobs in its wake. Mass production of the automobile threw many blacksmiths out of work but also created far more jobs building and selling cars. Over the past 30 years, the digital revolution, coupled with global business markets, has displaced many of the middle-skill jobs that underpinned 20th-century middle-class life in Western industrial countries. The number of typists, cashiers, travel agents, bank tellers, and production-line workers has fallen dramatically, particularly in the U.S. and Europe, but there are more computer programmers and web designers than ever before. Displaced workers with obsolete skills are always hurt, but the total number of jobs has never declined over time.2
Paradoxically, although productivity, a key indicator of growth and wealth creation, is at record levels and technological innovation has never been greater, median wages in the U.S. have not risen over the past several decades.19 This pattern is inconsistent with economic theory, which holds that when productivity increases, any automation that reduces the need for labor will increase business revenue and personal income, which will, in turn, generate demand for new products and services and thereby create new jobs for displaced workers. One explanation for this pattern is that advances in information and communications technology are destroying more jobs in developed economies than they are creating. Technological progress is thus eliminating the need for many types of jobs, leaving the typical worker worse off than before.10 According to one 2017 study,18 approximately 47% of total U.S. employment is at risk of automation.
Not all researchers concur with this conclusion, however. Although labor economists generally agree that the digital revolution is opening a great divide between a skilled and wealthy few and everyone else, hollowing out the middle class,7 it is not clear that all of this divide can be attributed to the effects of technology. The data is far from conclusive. One result of the change is the simultaneous increase in both job openings and unemployment relative to the early 2000s,17 suggesting the types of skills in demand by employers today do not match those of the existing labor force. Other plausible explanations, including events related to global trade and the financial crises of the early and late 2000s, could account for the relatively slow pace of job creation since the turn of the century. The problem for researchers and executives is that it is difficult to separate the effects of technology from other macroeconomic effects.39
To be sure, the advent of machine learning, where computers teach themselves tasks and rules by analyzing large datasets, will lead to large-scale worker dislocation, as automated capabilities (such as speech recognition, pattern recognition, and image classification) eliminate large numbers of white-collar jobs.18 We agree that many jobs performed by humans today, notably those of bookkeepers, auditing clerks, financial analysts, graphic designers, and medical transcribers, will be substantially taken over by robots or digital agents by 2025. Other jobs will disappear as a result of structural changes in the economy (such as the long-term decline in demand for coal as cleaner sources of energy become cheaper and more readily available).
Unlike effective managers, however, machines have not yet learned to tolerate high levels of ambiguity or to inspire people at every level in organizations. Consider ambiguity. The bigger and broader the question to be addressed, the more likely human synthesis will be required to address it because, although machines can provide many pieces of the solution, they cannot assemble the “big picture.” The process of assembly entails discerning why a company is doing what it is doing, where it is trying to go, and how it proposes to get there. Success depends on the ability of executives to tolerate ambiguity and synthesize and integrate a variety of types and forms of information. The big picture represents the glue that holds a company together. Moreover, when it comes to engaging and inspiring people to move in the same direction, empathizing with customers, and developing talent, humans will continue to enjoy a strong comparative advantage over machines.
Even if today’s information and communication technologies limit the potential growth of employment, history suggests the effect is a temporary, though painful, shock. As workers adjust their skills and entrepreneurs create opportunities based on the new technologies, the number of jobs will rebound. At the same time, human ingenuity will create new jobs, industries, and ways to make a living, just as it has since the dawn of the Industrial Revolution,41 following Joseph Schumpeter’s “gale of creative destruction.”
Lesson 2. Ubiquitous computing can be used to enable or constrain people at work. To illustrate how, consider electronic monitoring systems, robots, and wearable computing devices. Each embodies computer science’s vision of ubiquitous computing: interweaving technology into everyday life, making technology pervasive, and facilitating physical and virtual interactions.
Electronic monitoring systems. Monitoring refers to systems, people, and processes used to collect, store, analyze, and report the actions or performance of individuals or groups on the job.3,8 Our focus here is on electronic monitoring and surveillance systems. Monitoring today may assume a variety of forms (such as telephone, video, Internet, and global positioning systems). In the past, U.S. courts generally sided with employers who chose to monitor their employees, reasoning that because monitoring takes place during work hours through organizational assets (such as corporate computer networks and email systems), it is acceptable.22
Many organizations equip machinery, shipments, infrastructure, devices, and even employees with networked sensors and actuators that enable them to monitor their environment, report their status, receive instructions, and take action based on the information they receive. By monitoring these resources in real time, companies can better control the flow of operations and avoid disruptions by taking immediate action as problems arise. Organizations are also developing policies on using blogs and social networks (such as Facebook) outside of work, potentially affecting employees’ perceptions of trust and loss of personal control.26 Monitoring per se is neither good nor bad; its effects depend on how it is implemented. To be sure, monitoring can be beneficial, as self-initiated systems demonstrate. Systems that enable employees to track their own activities at work, as sketched below, have led to increased productivity by helping them understand better how they allocate their time.33 Such understanding allows workers to reallocate their time, tasks, and activities to accomplish work goals more effectively.
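To make the idea of self-initiated tracking concrete, the following minimal sketch shows what such a tool might look like. The class, categories, and summary format are hypothetical illustrations, not any particular product’s design.

```python
# Minimal sketch of a self-initiated work-activity tracker: the worker,
# not the employer, decides what to record and when. All names and
# categories here are hypothetical.
from collections import defaultdict
from datetime import datetime, timedelta


class ActivityLog:
    """Records how a worker's time is spent, by self-chosen category."""

    def __init__(self):
        self._totals = defaultdict(timedelta)  # category -> accumulated time
        self._current = None                   # (category, start_time) or None

    def start(self, category: str) -> None:
        """Begin timing a new activity, closing out any open one first."""
        self.stop()
        self._current = (category, datetime.now())

    def stop(self) -> None:
        """Close the open activity and add its duration to the totals."""
        if self._current is not None:
            category, started = self._current
            self._totals[category] += datetime.now() - started
            self._current = None

    def summary(self) -> dict:
        """Report each category's share of total logged time."""
        total = sum(self._totals.values(), timedelta())
        if not total:
            return {}
        return {c: t / total for c, t in self._totals.items()}


# A worker reviews where the morning went and reallocates accordingly.
log = ActivityLog()
log.start("email")
log.start("design review")  # implicitly stops "email"
log.stop()
print(log.summary())        # prints each category's share of logged time
```

Because the worker owns both the data and the categories, a tool like this supports the developmental, rather than control-based, use of monitoring discussed below.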
A comprehensive review of research in this area concluded that attitudes in general, and attitudes toward monitoring in particular, will be more positive when organizations monitor their employees within supportive organizational cultures.4 Supportive cultures welcome employee input into the monitoring system’s design, focus on groups of employees rather than singling out individuals, and monitor only performance-related activities. Theoretical and empirical researchers have identified three additional features of monitoring systems that contribute to employee perceptions of fairness or invasiveness:6 consistency in how data is collected and used; freedom from bias (such as selective monitoring); and the accuracy of the data being collected. Conversely, when monitoring systems are viewed as invasive or unfair, organizations run the risk that employees may not comply with rules and procedures, slack off on the job, or engage in deviant behavior.4
An additional factor may be associated with electronic monitoring systems: when organizations impose control, they reduce autonomy and increase perceived job demands, both of which contribute to employee burnout.31 Evidence from a variety of manufacturing contexts indicates that close supervision is associated with increased stress.24 With electronic monitoring, a supervisor or higher-level manager need not even be present to do the monitoring. As a result, the potential for constant monitoring creates a type of control employees often regard as particularly stressful. As a general conclusion, when electronic monitoring is seen as control-based rather than developmental, employees are likely to experience more negative outcomes.13
Robots. Robotsc have been on factory floors for decades. Years ago, they were mostly big, expensive machines that had to be surrounded by cages to keep them from smashing into humans. They could typically perform only a single task (such as spot welding) over and over, albeit extremely quickly and precisely. They were neither affordable nor practical for small businesses. Today, however, so-called collaborative machines are designed to work alongside people in close settings. They cost as little as $20,000 and give small businesses an incentive to automate in order to increase overall productivity and lower labor costs.1 Moreover, advances in artificial intelligence, combined with improved sensors, are making it possible for robots to make more complex judgments and to learn to execute tasks on their own, enabling them to manage well in uncertain and fluid situations, many involving humans.
Not only are robots being embedded into organizational social systems, they are becoming social actors within those systems. Historically, the terms “co-worker” and “teammate” implied fellow humans, but this may no longer be the case, as co-worker robots, or “co-bots,” enter the workplace as team members.14 As they evolve, robots are likely to become more adaptable to the work environment, with multimodal interfaces enabling them to communicate more efficiently and effectively with human teammates, receiving, as well as transmitting, information.36
A key challenge for human-factors specialists is how to design human-robot control interfaces that are simple and easy to use yet robust, because the connections that allow remote robots to take action without a human operator could be subject to hacking. Social acceptance is critical. If robots are truly to be team members, humans must accept them, communicate effectively with them, develop shared mental models with them, and, perhaps most important, trust them. As robots perform more and more autonomous tasks, operators’ workloads should, in theory, decrease, freeing them to perform other tasks. Yet the allocation of functions between humans and robots is an area that needs considerable attention because automation has been shown to create its own set of problems, including decreased situational awareness; distrust, misuse, abuse, and disuse of automation; complacency; decrements in vigilance; and negative effects on other facets of human performance.36 Research and theory in work analysis, teams, selection, training, motivation, and performance management can aid successful design and integration of robots into work teams and organizations.14,28
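One common pattern for allocating functions is to let the robot act autonomously only when its self-assessed confidence in a decision is high, and to keep the human operator in the loop otherwise. The sketch below illustrates the idea; the threshold, names, and confidence scale are hypothetical, not drawn from any cited system.

```python
# Sketch of confidence-based function allocation between a robot and a
# human operator: high-confidence decisions execute autonomously, while
# low-confidence decisions are deferred to the operator. The 0.9
# threshold and all names are hypothetical.
from dataclasses import dataclass


@dataclass
class Decision:
    action: str
    confidence: float  # robot's self-assessed confidence, 0.0 to 1.0


def allocate(decision: Decision, threshold: float = 0.9) -> str:
    """Decide whether the robot proceeds or hands control to a human."""
    if decision.confidence >= threshold:
        return f"robot executes: {decision.action}"
    # Low confidence: keep the human in the loop rather than risk the
    # misuse/disuse problems automation research has documented.
    return f"operator review requested for: {decision.action}"


print(allocate(Decision("place weld at joint 12", 0.97)))
print(allocate(Decision("hand part to teammate", 0.55)))
```

Tuning such a threshold is itself a human-factors decision: set it too low and it invites complacency; set it too high and it erases the workload savings automation promises.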
Managers must also address an additional concern: that workers view robots as competitors for jobs and resist their installation. For the workers who remain, robots can indeed augment their capabilities, but the fear of job loss is real. At Fanuc Corporation’s 86,000-square-foot factory in Oshino, Japan, which makes industrial robots, only four people staff the entire factory. In another factory, robots can assemble an industrial motor in just 40 seconds.34 Robots threaten the jobs of white-collar workers as well. As an example, consider that robots now perform work in corporate finance departments that used to require teams of people, as software automates many corporate bookkeeping and accounting tasks. Between 2004 and 2015, the median number of full-time employees in the finance departments of big companies declined 40%, from 119 to approximately 71 people for every $1 billion of revenue.29 Jobs most in jeopardy include accounts-payable clerks, inventory-control analysts, and accounts-receivable clerks who send invoices to customers, track payments, and forecast customer default rates.29
Not all robots or robot makers will displace humans, however. For example, Kiva robots, owned and manufactured by Amazon Robotics, are designed to scurry across large warehouses, fetching racks of ordered goods and delivering them to humans, who package the orders. A warehouse equipped with Kiva robots can handle up to four times as many orders as a similar unautomated warehouse, where human workers might spend as much as 70% of their time walking to retrieve ordered goods. Most of Kiva’s customers are e-commerce retailers, some growing so quickly they cannot hire people fast enough. By making distribution operations cheaper and more efficient, robotic technology has helped many of these retailers survive and even expand. Such advances illustrate that while some aspects of work can be automated, humans still excel at certain tasks (such as packaging various items together). Kiva robots are designed and built to work with people, taking over tasks humans do not want to do or are not very good at. While such robots can enhance the productivity of the workers they support, clerical and some professional jobs could be more vulnerable, as the marriage of artificial intelligence and big data gives machines more human-like abilities to reason and solve new types of problems.39
Wearable computing devices, or “wearables.” Wearablesd generally comprise three broad categories:44 “quantified self” products that allow people to measure their activities (such as physical activity and sleep, as with Fitbit and Jawbone); enhancement technologies (such as Google Glass, prosthetic devices, and exoskeletons that help elderly people or those with disabilities); and virtual reality devices (such as headsets and telepresence systems), as when architects use them to see what their designs will look like in practice. Telepresence systems enable executives to experience the feeling of “being there,” attending meetings without having to travel. These devices are now possible thanks to four developments: improved computing power, increased speed of broadband access, the spread of sensors, and cloud computing.44 Smart vending machines are another example of how the nature of work is changing. Embedded sensors, combined with broadband access and cloud computing, make it possible to monitor them remotely for items out of stock, temperature changes, and pilferage (see the sketch following this paragraph). While the promise of wearable computing devices is obvious, there are potential drawbacks as well. The first is distraction, as people are cognitively half present and half absent, checking their smartphones an average of 3.1 hours a day, according to one study by Meeker,27 as they walk along or stand in line. This can wreak havoc on work/life integration, as there is no boundary of time or geography as to when or where people work.43 Another drawback is that digital devices make human interaction more difficult as the devices compete constantly for people’s attention.
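As a concrete illustration of the remote vending-machine monitoring just mentioned, the sketch below shows how a cloud-side check might flag telemetry readings that need attention. The field names, thresholds, and pilferage heuristic are hypothetical, chosen only to make the idea tangible.

```python
# Sketch of remote vending-machine monitoring: each machine periodically
# reports sensor readings, and a cloud-side check flags conditions that
# need attention. Field names and thresholds are hypothetical.
from typing import List


def check_machine(reading: dict,
                  max_temp_c: float = 8.0,
                  min_stock: int = 5) -> List[str]:
    """Return a list of alerts for one telemetry reading."""
    alerts = []
    if reading["temperature_c"] > max_temp_c:
        alerts.append(f"machine {reading['id']}: temperature high")
    if reading["stock_count"] < min_stock:
        alerts.append(f"machine {reading['id']}: restock needed")
    # A stock drop larger than recorded sales suggests pilferage.
    if reading["stock_delta"] < -reading["sales_since_last_report"]:
        alerts.append(f"machine {reading['id']}: possible pilferage")
    return alerts


reading = {"id": "VM-042", "temperature_c": 9.5, "stock_count": 3,
           "stock_delta": -4, "sales_since_last_report": 2}
for alert in check_machine(reading):
    print(alert)
```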
Despite the drawbacks, many uses of wearable technology are emerging beyond consumer applications, becoming popular in industries as varied as construction, building maintenance, medicine, manufacturing, energy, and oilfield services. As an example, consider how a company in building maintenance might use wearables to preserve and transmit institutional memory. Workers nearing retirement are not always well suited to climbing ladders or scaffolding to the significant heights where mechanical equipment might be located. They can leave that task to younger workers wearing special safety glasses equipped with cameras, microphones, speakers, detachable flash drives, and wireless antennas. Through Bluetooth connections to their phones, the younger workers can then transmit live video feeds of their actions back to a ground-based command center, where veteran older workers monitor the videos and offer further guidance.20
Lesson 3. Ubiquitous computing is changing the nature of competition, work, and employment in ways that are profound and that need active management. Before it was possible to access inexpensive computational support, hoarding information was a source of power, and information moved in one direction only: up the corporate hierarchy. In today’s business environment of ubiquitous computing, the contrast could not be starker. While the changes made possible by today’s technology might be impressive, and digital innovation will continue for the foreseeable future, technology by itself does not ensure profitable business performance.
A comprehensive 2014 review of research at the intersection of leadership and technology concluded that researchers tend to treat technology either as a contextual aspect of business performance relevant to the leadership process or as a set of tools that leaders and followers can use to communicate with each other.35 The complex, pliable, changing, and ever-expanding portfolio of Internet tools, information, and media is altering how consumers and businesses act. Before the Internet, it was impossible to, say, communicate instantaneously or asynchronously across time and space or access vast bodies of information without visiting a library or other physical repository.
With the Internet, people have easy access to information they previously could not have found. Indeed, technologies trigger change by altering workers’ non-relational roles, that is, the business-related tasks they perform and how they perform them. These changes may then lead to changes in workers’ relational roles, the interactions they have with members of their role set, whether fellow workers with whom they interact while doing their work or, increasingly, machines (such as co-bots). If role relations change in either way, then the social network is likely to change as well, and if it does, one can say technology has altered the work system. Changes in role relations are thus key to a broad range of effects in work systems. To be sure, technology is altering role relations in profound ways.
Technology and Talent Management
The way technology is altering work settings and the work people do, particularly in the new era of ubiquitous computing, affects the way organizations manage their human talent and raises compelling questions for managers. Consider pre-employment testing. Traditionally, candidates would take tests at an employer’s site, in a quiet, distraction-free, comfortable place, where the employer could prevent breaches of security by checking candidate identification, eliminating opportunities for collusion, and controlling test materials at all times. Now consider unproctored Internet testing, where candidates, not employers, decide what conditions are best. Technology can deliver simulations or pre-employment assessments to any location at any time, raising a number of security and trust issues that might influence outcomes of interest, including the reliability and validity of the measures, adverse impact, size of the applicant pool, differences in means and standard deviations, applicant reactions, and perceptions of procedural justice.
There is certainly great potential for deepening management’s understanding of and ability to predict behavior in the domain of technology and talent management. Figure 1 outlines how the shift from traditional to ubiquitous computing technologies affects six conventional areas of talent management:11 work design, workforce planning, recruitment and staffing, training and development, performance management and compensation management, and the management of careers. Figure 2 outlines key questions for managers when moving from traditional to ubiquitous computing technology in these areas. Note the relevance of the lessons mentioned earlier, particularly lesson 2 (that ubiquitous computing can be used to enable or to constrain people at work) as managers seek to address the questions in Figure 2.
Figure 1. Six areas of talent management supported by traditional and ubiquitous computing technologies.
Figure 2. Questions for managers when moving from traditional to ubiquitous computing in six areas of talent management.
Conclusion
Research on technology and organizations provides valuable insight into what managers should know about the effects of technology. Based on a review of this research, we identified three main conclusions about how ubiquitous computing affects work and organizations: its effect on jobs reflects a process of creative destruction; it can be used to enable or constrain people at work; and it is changing the nature of competition, work, and employment in ways that are profound and that need to be actively managed. We explored the effects of ubiquitous computing on six key areas of talent management, identifying a series of questions to help guide decision making as managers transition from traditional to ubiquitous computing in these areas.
Ultimately, the critical issue for managers to consider is not technology itself but that technology is fundamentally social, grounded in specific historical and cultural contexts. As it becomes embedded in everyday activities and social relations, technology affects all manner of human and organizational elements (such as governance structures, work routines, information flow, decision making, human interactions, and social actions). Fulfilling the potential of technology in work and employment will thus require recreating the way organizations operate in a world of digital ubiquity to maximize positive consequences for individuals and organizations and minimize the negative. Managing in a manner that inspires human performance includes framing the right questions, responding to exceptional circumstances highlighted by intelligent algorithms, and letting humans do things machines cannot.16 Each organization’s leaders, along with other stakeholders, must decide what technologies are adopted, how they are implemented, and the extent to which they augment or detract from worker autonomy, personal competence and control, and interpersonal connections with other human workers. At a broader level, there is a strong need for responsible public policies across institutions, not only to enhance competition, maximize economic surplus, and optimize its allocation across stakeholders, but also to minimize social and human risks and abuses. Establishing such policies will be an ongoing challenge for years to come.