Digital games and educational apps for children can be a boon. They keep youngsters engaged in interactive play and learning, and can give parents a break.
Unfortunately, though, a large percentage of those games' characters and features are designed not to altruistically enlighten children, but to make them spend more time on the platform, and to get their parents to spend more money on extra features.
"We assume adults are better at recognizing persuasion pressure and are hopefully less magically engaged with their parasocial relationships with characters," said Jenny Radesky, M.D., principal investigator of the Radesky Lab at the University of Michigan Medical School. "Kids' relationships with Elmo or Daniel Tiger or Strawberry Shortcake are very important to them, and they are more likely to follow those characters' instructions."
In her lab's most recent research on children's mobile apps, Radesky found concerning evidence that game developers are putting their own interests ahead of those of their young audience in designing their products: only 20% of 133 mobile apps played by 160 children aged 3 to 5 had no manipulative design features intended to better monetize the child's experience.
The manipulative features Radesky and her colleagues found included parasocial relationship pressure, fabricated time pressure, navigation constraints, and the use of "attractive lures" to encourage longer game play or more in-app purchases. These features are usually tied to data collection mechanisms that exploit a child's inherent trust.
The study, published in the American Medical Association's JAMA Network Open, seemed to confirm concerning results published elsewhere in recent months:
- An analysis by Pixalate, a fraud, privacy, and compliance data analytics firm, of privacy policies for products in the Google and Apple app stores found that 11% of child-directed apps in the Google Play store, and 21% of those in the Apple App Store, had potential access to users' personal information but no detectable privacy policy; almost 250,000 apps had no discernible country of origin, a nightmare for enforcement agencies.
- An examination by Human Rights Watch of how well (or poorly) educational technology deployed for remote schooling during the Covid-19 pandemic protected children's privacy found that 145 of 169 educational applications "appeared to engage in data practices that put children's rights at risk, contributed to undermining them, or actively infringed on these rights."
Radesky said that while this body of research makes it evident that app and game designers give children's privacy and priorities short shrift, it is encouraging that children's needs are now receiving more widespread attention. What is not so evident is any consensus about how best to address these shortcomings.
Numerous existing laws, such as the U.S. federal government's Children's Online Privacy Protection Act (COPPA), in place since 1998 and updated in 2013, and the European Union's General Data Protection Regulation (GDPR), in place since 2018, offer some level of privacy protection. However, the increasing complexity of the digital ecosystem has revealed loopholes in some of these policies that allow app developers to skirt the boundaries of what is ethical, and sometimes to cross the line into what is outright illegal.
For example, COPPA's primary goal is to place parents in control of the information online games and services collect from their children under age 13. According to the Federal Trade Commission (FTC), it applies without question to developers whose products collect, use, or disclose personal information from those children, or on whose behalf such information is collected or maintained (such as when personal information is collected by an ad network to serve targeted advertising).
However, the law opens a significant loophole: it applies to operators of general-audience websites or online services only when they have actual knowledge that they are collecting, using, or disclosing personal information from children under 13, or actual knowledge that they are collecting personal information directly from users of another child-directed website or online service. Privacy advocates say this narrow standard can give developers of games that children are likely to play plausible deniability: by marketing an app to "general audiences," a developer can disclaim such knowledge, even though children may play the app more often than adults do.
Moreover, among the plethora of policy solutions, some technology policy experts say the cure could be as problematic as the ailment.
Conflating issues hurts proposed legislation
"One problem with a lot of the good ideas about updating COPPA are that bills don't separate out 'data privacy' issues and 'speech harms' issues," said Jason Kelley, associate director of digital strategy at the Electronic Frontier Foundation, a digital rights advocacy group.
For instance, Kelley drew attention to the Kids Online Safety Act (KOSA), currently wending its way through the U.S. Senate. Kelley characterized KOSA as well-intentioned but broad-brush and heavy-handed, a bill that would simplistically deny adolescents struggling with issues such as eating disorders or gender identity access to the very platforms that could help them address their situations. When Tumblr banned discussions of anorexia, it discovered that the keywords used in pro-anorexia content were the same ones used to discourage anorexia, Kelley wrote in an EFF blog post outlining objections to the bill. Additionally, he argued, the political philosophy of an attorney general in a conservative state such as Texas could result in platforms being compelled to remove content globally: "KOSA would empower the Texas attorney general to define material that is harmful to children, and the current position of the state would include resources for trans youth," he wrote. "This would allow the state to force online services to remove and block access to that material everywhere—not only Texas."
Kelley said the proposed legislation that best improves COPPA focuses tightly on strengthening children's privacy and banning targeted advertising aimed at children online, rather than adding the more amorphous categories of safety and digital equity into the mix. He said one example of a focused bill is the "Protecting the Information of our Vulnerable Children and Youth Act," aka the Kids PRIVCY Act, introduced by Rep. Kathy Castor of Florida in the House of Representatives. Kelley said that bill recognizes the developmental differences between children under 13 and teenagers under 18, and gives those older children more say in how their information is collected. The bill also tightens COPPA's "actual knowledge" standard so that it applies to services "likely" to be accessed by children or teenagers, meaning "the possibility of more than a de minimis number of children or teenagers accessing the digital service is more probable than not."
Added Kelley, "The next few months seem to be pivotal for data privacy nationally—because of the midterms, it's incumbent on Congress to get some wins, and we're carefully working with them to make sure those wins are real, and not legislation that on balance, is two steps forward and two steps back."
Private sector gains tools and regulators
Encouraging as some proposed legislation in the U.S. may be, the reality is that many of the apps with the most egregious privacy practices are also practically out of the reach of U.S.-based enforcement agencies.
"COPPA does have an international scope, so if you created an app and you are directing it to children in the U.S., you are subject to COPPA," said Allison Lefrak, senior vice president at Pixalate, a fraud protection, privacy, and compliance analytics platform. However, she added, as a practical matter, for the FTC to go after some random offshore app developer is not a high-percentage action.
Lefrak herself may be an example of how the industry is addressing children's privacy through self-regulation, rather than relying on a patchwork of regulation and legislation that may or may not be compatible from jurisdiction to jurisdiction. Prior to joining Pixalate in October 2021, she led several high-stakes investigations at the FTC, including a 2019 investigation that led to a settlement with TikTok, and a 2014 investigation that led to a settlement with Snapchat.
Among her keystone efforts at Pixalate is overseeing the company's new automated COPPA compliance toolkit. Lefrak said the tool uses input data such as content ratings from the Entertainment Software Rating Board (ESRB), the category under which an app is listed in an app store, and tell-tale keywords such as "unicorn" and "princess," or the names of popular children's characters such as Dora the Explorer.
"Customers were relying on manual review previously to do this, and there are 5 million apps available between the two stores, so it's impossible to review all of them manually,' Lefrak said. Consumers can take advantage of a free version of the COPPA compliance software that allows look-up of any app the tool categorizes. The business-side model also allows customers in the advertising technology industry to create lists to verify publishers are correctly classifying apps as likely to be child-directed ,and are following best practices such as complying with real-time bidding guidance around COPPA stipulations.
Another FTC veteran, Stacy Feuer, who was the commission's assistant director for international consumer protection, was hired by the ESRB in January 2022 to oversee its Privacy Certified (EPC) program. The program offers ESRB members guidance in complying with COPPA, GDPR, and the U.K.'s Age Appropriate Design Code, which was enacted in 2020 and mandates protections for apps "likely" to be accessed by children, a broader trigger than COPPA's "actual knowledge" standard.
Though both Feuer and Lefrak had significant experience at the FTC, Feuer said one shouldn't extrapolate too much from her hiring, citing the ESRB's long tenure as a self-regulating member organization. "EPC's members, many of whom have been with the program for almost two decades, have long understood the importance of protecting children's data and keeping them safe online," Feuer said. "They commit significant resources to privacy compliance."
The emergence of best practices and standards
Ultimately, Feuer said, finding a way to strengthen children's privacy safeguards will require a combination of legislation, responsible development by reputable vendors, vigilant monitoring of child-oriented apps by the Google and Apple app stores, and industry cooperation through organizations such as ESRB and technical standards bodies.
"It's extremely difficult to hold fly-by-night foreign operators with no discernible country of origin responsible for compliance," she said. "It takes a lot of time and effort just to track these types of entities down, and often it's impossible to do so. Plus, they can just spring up again under a new name if they are shut down. It's a difficult problem that requires international cooperation and creative solutions."
As an example, she cited a 2014 warning letter the FTC sent a Chinese vendor named BabyBus regarding possible COPPA violations, including collecting precise geolocation data. Shortly thereafter, she said, the app stores suspended the company's apps.
A new entry into this patchwork, though, is perhaps the most comprehensive single global effort to address children's privacy online to date: IEEE 2089-2021, the organization's standard for an age-appropriate digital services framework. Published in November 2021, the standard is based on the principles for defending children's safety and autonomy on the Internet developed by the London-based 5Rights Foundation.
"While there are localized efforts to address children's rights and safety in digital products and services for children, there has never been consensus-driven guidance applicable on a global scale," Konstantinos Karachalios, the managing director of IEEE SA, said in announcing the standard's publication.
Radesky said she hoped to see widespread adoption of the standard. "The 5Rights Foundation has really brought the issues surrounding this much more into our discourse," she said. "It has been much clearer to people that the mechanics of the Internet were designed by adults for adults.
"Kids are different, they understand things in unexpected ways. They are more magical, less logical, and more easily taken advantage of. They are more reliant on the adults in their world to help them recognize what is in their best interest and what is not. I don't want to be heavy handed – that's very individual based on the child and the family. But I'm glad to see a confluence of thinking about how we can redesign at least part of the digital ecosystem to be more aligned with kids' needs in mind."
Gregory Goth is an Oakville, CT-based writer who specializes in science and technology.