Pruthvirajsinh Zala was a 19-year-old first-year law student when he walked into the high court of Gujarat to argue his inaugural case. “It’s a big enormous building,” he says. “All black stone. There’s a giant statue of Gandhi. My petition says Pruthvirajsinh Zala v. State of Gujarat, so I am panicking—there are two government lawyers there against me.”
The subject of Zala’s case was PlayerUnknown’s Battlegrounds, or PUBG, a battle royale–style video game released two years ago by the South Korean company Bluehole. In the game, you’re dropped on a virtual island with 100 other players, and the winner is the last one standing. It’s similar to Fortnite: so similar, in fact, that the company behind PUBG sued Epic Games, the makers of Fortnite, claiming its formula had been stolen. (The case was eventually dropped.)
Arguably, PUBG is even more global than its rival, and since it is available on mobile phones and free to play, it caught on particularly in lower-income countries. India has perhaps as many as 50 million players, with regular tournaments held around the country and millions of people watching celebrity streamers in Hinglish, Tamil, and Telugu.
But while kids went wild for it, parents were freaking out.
There were concerns about bullying within the game, and reports of violence when parents tried to limit children’s play. PUBG was blamed for several deaths, including that of a 16-year-old boy who killed himself after his parents took the game away, and those of two people reportedly so absorbed in playing that they were hit by a train.
PUBG responded by instituting age restrictions, face recognition, and parental controls. It even added a “health warning” that pops up after you’ve been playing for six hours straight. But that wasn’t enough for Gujarat. In March 2019, the state announced an outright ban on the game, supposedly a temporary block to help students concentrate on preparing for exams.
It’s not unusual for the authorities in India to interfere with technology use in this way. Apar Gupta, the head of India’s Internet Freedom Foundation, says the country has one of the highest rates of internet shutdowns in the world. But while these usually put the burden of enforcement on service providers, the PUBG ban targeted players. Anyone caught playing faced fines and even potential jail time.
After the ban was enacted, 21 people were arrested in Gujarat. Mostly, they were Muslim teenagers and young men who gathered to play PUBG in tea shops in lower-middle-class neighborhoods. This is one of the first times in the world that arrests have been made merely for playing a game, according to Gregory Boyd, a partner at the law firm Frankfurt Kurnit Klein & Selz who specializes in gaming.
That’s when Zala—who was studying for his law degree at Nirma University in Gujarat’s largest city, Ahmedabad—stepped in. He’s never played PUBG. “I don’t have time for mobile games,” he says. “I spend more time studying.”
But he saw a miscarriage of justice. “This is arbitrary,” he told me. “This is completely unconstitutional. I’m not saying the game is all good, but if you’re banning it, you have to justify it.”
The right to be alone and make choices
A fundamental human right to play video games? It might seem like a joke. But the concept that children have intrinsic digital rights is emerging around the world—and if everything goes the way its proponents would like, it could have a direct impact on how major technology companies like Google and Facebook do business.
The movement traces its roots to 1989, during the infancy of the commercial internet. That’s when the United Nations introduced the Convention on the Rights of the Child. It covers life-and-death matters, like the right of children not to be separated from their parents unless it is in their best interests. But it also has a lot to say about the rights of people under 18 to access media, to be consulted on matters that concern them, to express themselves, to seek out useful information, and to have their privacy protected. And it enshrines their freedom of association and assembly, much of which now happens online.
These were the very rights that were violated by the PUBG ban, argued Zala. The government’s lawyers said the ban was meant to ensure public safety. The game was addictive, and players were disturbing the peace.
“When a citizen plays PUBG in her own house or on her balcony, it is their choice,” Zala countered in the petition. “It is respecting their right to privacy, their right to be alone, and their right to make choices.”
This is a somewhat novel idea. Until now, protecting children on the web has primarily meant keeping them off it, just as the Indian government tried to do with PUBG. And even that’s not something we do very well.
In the US, the Children’s Online Privacy Protection Act, or COPPA, says that children under 13 shouldn’t be profiled or tracked for the purposes of targeted advertising, and shouldn’t have their data traded. This means they can’t use many services, such as social media, without lying about their age—but it’s an easy lie that’s rarely checked. 5Rights, a UK-based foundation that advocates for children, estimates that companies have assembled around 70,000 separate data points about any given child by the age of 18.
The trouble with this approach to protecting children is exemplified by YouTube. Last September the US Federal Trade Commission fined Google and YouTube a record $170 million for violations of COPPA. The complaint pointed out that while YouTube officially said it was not for anyone under 13, it was simultaneously touting itself to advertisers as “the new Saturday morning cartoons.” Rohit Chopra, a dissenting member of the bipartisan FTC who thought the fine was far too small, pointed out that YouTube almost certainly earned far more by “illegally spying on children” than it paid out.
Sonia Livingstone, who directs the Preparing for a Digital Future initiative at the London School of Economics, argues that having a binary toggle at age 13 neither keeps children reliably safe nor allows them freedom to explore. “They should have access to all the resources that are going to help them,” she says. “When we get very kind of risk-focused, that’s when we curb kids, narrow down their range of options, and then they don’t get the opportunities to develop and express themselves and engage as agents in the world.”
Livingstone is one of the people thinking about how to apply a “child rights framework” to digital media instead. Along with 5Rights—which takes its name from a list of liberties that emerged partly in “deliberative consultation” with juries of children aged 11 to 14—she’s been helping the UK government come up with an “Age Appropriate Design Code” for the web. The goal, says 5Rights policy lead Jay Harman, is to make the internet a less predatory, less booby-trapped place to roam.
Those who adhere to the code cannot share data belonging to users under 18, must do away with persuasive nudges meant to keep users on their site (such as autoplay or infinite scroll), must shield young users from unsavory content suggestions, and must refrain from exposing their location. All of this must be explained in child-friendly language, with safety warnings if a user tries to change the settings.
The code says that sites can either offer this level of protection to all users, verify age through reliable means such as driver’s licenses, or allow children to self-declare. But, it adds, if sites are caught mishandling young users, there will be penalties.
Complying with the code could require a fundamental redesign of many services, user experiences, and revenue models, foremost among them behavioral advertising. And it’s likely to spread beyond the UK market too.