Understanding customers and rewarding the best has always been at the center of running a successful business. However, thanks to big data and analytics, the concept is changing. A growing number of organizations are turning to algorithms to "score" customers and determine what price they pay, how long they must wait on hold, and whether they can return items they have purchased.
"Digital technology has enabled consumers to be more knowledgeable about prices, and online shopping has made it easier for them to hunt for bargains," explains Sunil Gupta, Edward W. Carter Professor of Business Administration and chair of the General Management Program at Harvard University. "To mitigate this price shopping behavior and build loyalty, firms are increasingly providing better service to their loyal and high value customers."
From retailers to airlines, wireless carriers to auto dealers, the concept is gaining traction. In addition, universities, criminal justice systems, and other government entities are adopting proprietary scoring methods to determine who is accepted, who goes to prison, and what benefits or penalties they receive. In China, for example, people who score low on the country's social credit system are unable to buy train tickets, book flights, obtain desired housing, and buy certain goods.
By the Numbers
"The idea of scoring customers isn't new," says Peter Fader, a professor of marketing at the Wharton School of the University of Pennsylvania. Marketers and retailers have used the concept in one form or another for nearly half a century, he says. Of course, banks routinely use scores to determine credit limits and lending practices. However, he says, "Advances in data science have transformed the field. A variety of different kinds of scores are being developed and used in more creative and expansive ways."
At the heart of customer scoring is a concept called customer lifetime value (CLV). A proprietary algorithm typically determines which customers are most profitable and which do not contribute to the bottom line (or may even cost a company money through excessive support or returns). Fader, who co-founded the scoring firm Zodiac Inc. (which Nike Inc. purchased in 2018), says that the basic CLV scoring algorithm revolves around three factors: recency, frequency, and monetary value.
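The recency-frequency-monetary (RFM) formulation Fader describes can be sketched in a few lines of code. The weighting scheme, churn rate, and planning horizon below are illustrative assumptions for a minimal example, not any firm's actual model:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Customer:
    last_purchase: date      # recency
    purchases_per_year: int  # frequency
    avg_order_value: float   # monetary value

def clv_score(c: Customer, today: date, horizon_years: int = 3,
              annual_churn: float = 0.25) -> float:
    """Crude RFM-style estimate of a customer's future value."""
    days_since = (today - c.last_purchase).days
    # Recency: customers who bought recently are more likely to buy again;
    # the weight decays linearly to zero over a year of inactivity.
    recency_weight = max(0.0, 1.0 - days_since / 365.0)
    # Frequency x monetary value gives expected yearly spend.
    yearly_spend = c.purchases_per_year * c.avg_order_value
    # Discount each future year by a flat churn rate.
    retention = 1.0 - annual_churn
    value = sum(yearly_spend * retention ** year
                for year in range(horizon_years))
    return recency_weight * value
```

A production model would replace these heuristics with fitted probabilistic models and, as the article notes, potentially thousands of additional data points, but the three inputs remain the backbone.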
Yet a CLV algorithm might incorporate hundreds or thousands of data points. Today, almost every consumer has a CLV score associated with every major company he or she does business with, and the score is used to make key decisions, often with no human intervention. "It helps companies determine which customers to acquire and retain, where to focus their resources for better service, and which marketing channels to reach for acquiring high-value customers," Gupta says.
What's more, the use of algorithmic scoring is spreading beyond the business world. In the U.S., criminal justice sentencing sometimes takes place on the basis of automated scoring methods. The Chinese social credit system, scheduled to be fully operational across that country in 2020, strives to make it "difficult to move" for those deemed "untrustworthy" due to breaking laws or a demonstrated unwillingness to engage in state-sponsored activities. This reportedly includes factors such as traffic fines, smoking in unauthorized areas, buying too many video games, posting false news, or an unwillingness to give blood or engage in volunteer work.
Algorithms Gone Wrong?
Not surprisingly, there are concerns about these scoring methods and the repercussions that surround them. For starters, "Organizations using these methods walk a fine line," says Tom Meyvis, a professor of marketing at the Stern School of Business, New York University. "You can wind up with customers who, at one end of the spectrum, are angry about being mistreated, and, at the other end of the spectrum, customers who feel a deep sense of entitlement and are easily disappointed."
Another issue revolves around the lack of transparency in scoring. "Consumers do not know how the models work or how organizations develop policies and make decisions," observes Marc Rotenberg, executive director of the Electronic Privacy Information Center (EPIC). Not surprisingly, this lack of transparency also opens the door to potential misuse and abuse, particularly when zip codes or other demographic information are plugged into an algorithm. "This creates concerns about discrimination and privacy," Rotenberg says.
Because these systems are secretive, and it is not clear what criteria are being used, EPIC is now proposing a set of universal guidelines for the artificial intelligence (AI) used in these scoring methods. The EPIC criteria would address everything from transparency and data quality to cybersecurity.
Rotenberg believes government use of these methods is particularly alarming. "One of the reasons to regulate the use of data and AI is that once a government establishes a scoring system, there's no form of appeal."
Nevertheless, it's clear CLV algorithms aren't going away. Gupta says that in the years to come, they will become increasingly sophisticated, incorporating word of mouth, referrals, and other factors not directly related to purchases. Machine learning also will make the technology more precise. "As a firm gets more data about consumers over time, it is able to predict their future value more accurately."
Concludes Fader, "It's critical for organizations to maintain high standards and get the science right. If the technology is used effectively, it can benefit everyone. If it isn't used right, it becomes creepy, offensive, and ineffective."
Samuel Greengard is an author and journalist based in West Linn, OR, USA.