We had a detailed discussion with Natalia Vozyan, VP of Legal & Operations at Xsolla, about how countries around the world are starting to tighten user verification rules in games.
Natalia Vozyan
Alexander Semenov, App2Top: We’re increasingly reading news reports about tightened internet verification tools. Let’s first try to understand: when did this topic even start?
Natalia Vozyan, Xsolla: The idea of user verification is rooted in the discussions about protecting children in the digital space. This agenda started gaining momentum around 2020, when every reputable legal journal was discussing the risks for children online.
Who first seriously took on verification? China?
Natalia: No, China is also one of the pioneers, but the first serious step was the Children’s Online Privacy Protection Act (COPPA) — the U.S. law that was enacted on April 21, 2000. Then came China, and only relatively recently, in 2018, the General Data Protection Regulation (GDPR) came into force in the European Union.
Did it mark a milestone?
Natalia: Absolutely. GDPR effectively introduced the concept of “know your customer” (KYC) related to personal data, including that of children. After that, many countries adopted the European experience as a basis and created local regulations modeled after it.
Where is verification being implemented particularly actively today?
Natalia: Most recently, in 2025, the UK began enforcing the Online Safety Act, which regulates content accessible to children and requires verification when selling digital content.
The most radical measures were introduced in Brazil just this year. It’s no longer enough to simply declare “I confirm I am over 18”; verification through real documents or biometrics is now required. Self-declaration is completely banned.
You mentioned self-declaration. Before we move on, let’s stop here and clarify what you meant.
Natalia: Self-declaration is one of the verification tools. There are three types overall. The first is self-declaration, where a user simply checks a box saying they are over 18. It’s quick and convenient but absolutely unreliable. The second is verification through data like a credit card, phone number, or ID, which is slightly more reliable. The third is verification through documents or biometrics, like passport uploads, Face ID, or checks with government databases, which are highly reliable.
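To make these three tiers concrete, here is a minimal TypeScript sketch of how a service might model them and enforce a minimum tier per market. Everything in it, from the tier names to the market table, is illustrative, not any real platform’s or regulator’s API.

```typescript
// Illustrative sketch only: the tier names and the market table are
// hypothetical, not any real platform's or regulator's API.

enum VerificationTier {
  SelfDeclaration = 1,   // "I am over 18" checkbox: quick, unreliable
  DataBased = 2,         // credit card, phone number, ID number
  DocumentBiometric = 3, // passport upload, Face ID, government DB check
}

// Hypothetical mapping of markets to the minimum tier a regulator accepts.
const minimumTierByMarket: Record<string, VerificationTier> = {
  BR: VerificationTier.DocumentBiometric, // self-declaration banned outright
  GB: VerificationTier.DocumentBiometric, // "highly effective" threshold
  DEFAULT: VerificationTier.SelfDeclaration,
};

function meetsRequirement(market: string, achieved: VerificationTier): boolean {
  const required = minimumTierByMarket[market] ?? minimumTierByMarket.DEFAULT;
  return achieved >= required;
}

// A Brazilian user who only ticked a checkbox does not pass; a document check does.
console.log(meetsRequirement("BR", VerificationTier.SelfDeclaration));   // false
console.log(meetsRequirement("BR", VerificationTier.DocumentBiometric)); // true
```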
And now we are moving away from self-declaration, right?
Natalia: And it makes sense: modern children start using phones before they can talk and know they can just answer “yes” to being over 18 to access content. Regulators have realized this and are moving towards more accurate tools. Complaints from parents about unauthorized spending by children — in-game purchases using parental credit cards for significant amounts — have acted as an additional trigger, creating public pressure and accelerating the transition to stricter identity verification methods.
Including in Russia?
Natalia: The situation is different here: user verification remains conditional, and the protection of certain groups is mainly regulated by the law “On Advertising.” However, initiatives for mandatory identification for internet access are increasingly mentioned in the public domain.
You noted earlier that tightening verification is logical. From the state’s perspective, yes. But I’m curious: what’s your personal take on the current situation?
Natalia: Speaking personally, the balance of interests is clearly off here. There are industries where the same KYC is entirely justified, but the gaming industry is not the financial sector. A game developer doesn’t need a user’s passport data to provide its services. Nonetheless, in Brazil, developers are now required to store biometrics, which is both an operational burden and a massive regulatory risk.
Regulators are essentially shifting the responsibility of nurturing and monitoring children from parents to businesses. Monitoring what content a child consumes and how much time they spend in games is primarily a parental function. Businesses should certainly observe reasonable restrictions, but obligating a game developer to collect biometrics and verify the age of every user is a disproportionate burden that doesn’t match the nature of this business.
I’m not questioning the importance of compliance and protecting vulnerable groups — these are fundamental values. The question is about proportionality: regulation should take industry specifics into account and not create barriers where lighter tools are sufficient.
You rightly pointed out that all this is a disproportionate burden for businesses. But do I understand correctly that verification primarily concerns not the game developers themselves but gaming platforms?
Natalia: Look, the current wave of verification restrictions has primarily impacted two large categories.
The first is financial services: banks, payment systems, and crypto exchanges. Here, user verification has long been the norm: KYC and similar procedures are industry standards, so no one is particularly surprised.
The second category is entertainment services. This is complex. It includes streaming platforms, social networks, online gambling, and, of course, video games. These are under the most regulatory pressure because regulators perceive them as the main risks for children and adolescents — whether it’s access to inappropriate content, excessive online time, or in-game purchases.
The logic of regulators is generally understandable: if a platform allows minors access to content or sells them anything, it should control that. The question is whether the proposed tools are proportionate to this task.
What exactly are regulators demanding from them, and what are they trying to achieve?
Natalia: In terms of declared objectives, they are quite noble: protecting minors from inappropriate content, combating fraud, and ensuring transparency of the digital environment. On paper, verification looks like a tool for public good.
But in practice, the picture is more complicated. By requiring businesses to know their users, regulators effectively gain a ready-made infrastructure for collecting data about citizens without incurring their own costs. Upon request from a government body, any company must provide the accumulated information. This is convenient.
There’s also the punitive aspect: non-compliance with verification requirements entails serious penalties, which have become a tangible source of budget revenue.
I wouldn’t say the stated legal goals are mere window dressing. But for regulation to truly work in society’s interest, and not just on paper, dialogue between the regulator and businesses is essential. At the moment, that balance is clearly lacking.
Speaking of verification changes specifically in the gaming sector, it’s essential to focus on China, where restrictions were implemented earlier than anywhere else (I mean in practice, not just in legislative activity). Tell us about the experience of Chinese game developers, what challenges they faced, and how they tackled them.
Natalia: China can indeed be called a true pioneer in this area. As early as 2007, several ministries jointly proposed restrictions for minors in video games to protect their physical and mental health. They introduced a time grading system: how many hours a day different age groups could spend in a game.
As is often the case, developers were left to handle the task alone without specific tools. For several years, the major players in the market experimented on their own: collecting data from identification documents, implementing biometrics, all at the cost of conversions and massive operational expenses.
Only in 2021 did the government finally address the problem systemically. The NPPA (National Press and Publication Administration) launched a centralized national verification infrastructure, fundamentally changing the model. Developers now send requests to a government database, which returns the user’s age in real time so the necessary restrictions can be applied automatically. This lifted the burden of storing sensitive data from businesses.
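To give a sense of what this model means for a developer, below is a minimal TypeScript sketch of the request flow, assuming a hypothetical government verification endpoint; the real NPPA interface, field names, and playtime limits are not reproduced here.

```typescript
// Minimal sketch of the centralized model: the developer forwards the player's
// declared identity to a government verification service and receives back
// only an age bracket, so no identity documents are stored on the game side.
// The endpoint URL and response shape are hypothetical, not the real NPPA API.

interface VerificationResult {
  verified: boolean;
  ageBracket: "under8" | "8to15" | "16to17" | "adult";
}

async function verifyPlayer(realName: string, idNumber: string): Promise<VerificationResult> {
  const res = await fetch("https://verification.example.gov/check", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ realName, idNumber }),
  });
  if (!res.ok) throw new Error(`Verification service error: ${res.status}`);
  return res.json();
}

// The game then applies restrictions based on the bracket alone, e.g. a daily
// playtime cap for minors (the numbers here are placeholders, not the law).
function dailyPlaytimeLimitMinutes(result: VerificationResult): number {
  return result.ageBracket === "adult" ? Infinity : 60;
}
```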
However, the problem isn’t entirely resolved. Workarounds still exist: children use their parents’ documents, and some parents knowingly help their children circumvent verification. Developers catch such users only after the fact, for example when they contact support or request refunds, recognizing minors by their manner of communication and phrasing.
I also heard that South Korea attempted something similar but eventually chose to abandon it?
Natalia: Korea is a prime example of how strict regulation can evolve into a more balanced approach. In 2007, the country introduced an internet real-name verification system requiring users of major sites to verify their identity through a resident registration number. Later, in 2011, the so-called Shutdown Law came into effect, prohibiting children under 16 from playing online games between midnight and 6 AM.
However, in 2012, the Constitutional Court annulled the internet verification system, recognizing it as an excessive restriction on freedom of speech and citizens’ rights to anonymity.
The Shutdown Law lasted longer, but in 2021 the government abandoned it as well. Compulsory restrictions were replaced with a parental choice system: it’s now up to parents to decide when and for how long their child can play.
So, Korea transitioned from government control to parental responsibility, which, in my opinion, is a logical direction for the gaming industry.
We started our conversation with the European Union. Now I’d like to return to it in the context of the measures taken in Asia. Compared to China or the recent situation in South Korea, it seems the gaming sector in Europe hasn’t felt much of the shift in verification approaches yet. Is that so?
Natalia: Overall, yes, that’s correct. I think European businesses are still digesting the consequences of GDPR implementation. The regulation imposed a colossal burden: the documentation alone, which must describe every data processing operation, requires significant time and financial resources. So far, the industry hasn’t experienced any new major upheavals.
In the gaming sphere, PEGI — the pan-European age rating system — plays the role of a protective buffer. Together with data protection and refund rules, it currently maintains reasonable order without requiring strict verification.
However, GDPR already includes an important provision: processing the data of a child under 16 is not allowed without explicit parental consent (member states may lower this threshold to 13). This formally entails age verification requirements. But since they haven’t yet been cemented in binding acts specific to the gaming industry, businesses navigate these norms without checking every user’s passport.
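As a rough illustration, a compliance gate along these lines might look like the sketch below. The per-country thresholds follow GDPR Article 8 (16 by default; member states may lower it to 13), while the function and field names are invented for the example.

```typescript
// Sketch of a GDPR Article 8 gate: processing a child's data below the
// national age of digital consent requires verifiable parental consent.
// Function and field names are invented; thresholds follow Article 8.

const ageOfDigitalConsent: Record<string, number> = {
  DE: 16,      // Germany kept the default
  FR: 15,      // France lowered it to 15
  GB: 13,      // UK GDPR uses 13
  DEFAULT: 16, // GDPR default where no national rule is known
};

interface User {
  country: string;
  age: number;
  hasParentalConsent: boolean;
}

function mayProcessPersonalData(user: User): boolean {
  const threshold = ageOfDigitalConsent[user.country] ?? ageOfDigitalConsent.DEFAULT;
  return user.age >= threshold || user.hasParentalConsent;
}
```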
You also mentioned the British Online Safety Act. It seems verification is stricter in the UK than in mainland Europe. I know even Steam had to take it into account.
Natalia: The UK is one of the most serious examples of a regulator truly intending to enforce what’s written on paper.
The new Online Safety Act prohibits self-declaration, verification through payment methods that don’t guarantee majority, and restrictions through user agreements: none of these meets the regulator’s “highly effective verification” threshold.
And yes, Valve responded to the law by updating Steam’s policy: British users must now verify their age to access games with an 18+ rating. Valve opted for an approach based on an attached credit card, an elegant solution with minimal impact on conversion, although many experts doubt whether it will qualify as “highly effective.”
Additionally, the British regulator recently fined AVS Group, managing 18 adult websites, £1 million for lacking proper age verification. The issue wasn’t the absence of verification per se — AVS’s system accepted photo uploads without a “liveness” check, allowing a child to just present someone else’s photo to the camera.
This law is already changing large players’ behavior: YouTube, Spotify, and several other platforms have implemented ID-based verification.
Strict. What’s happening in the States? You mentioned they were the first to start verification; what’s the situation now?
Natalia: Discussing the U.S. isn’t easy because the situation is evolving simultaneously at state and federal levels — moving in different directions.
At the state level, the process is progressing rapidly. In 2025, mandatory age verification laws took effect in nine states, including Florida, Georgia, and Missouri. For now, the regulators’ main targets are adult content platforms and social networks. They haven’t yet reached the gaming industry per se, but the trend is clear: the boundaries are continually expanding.
At the federal level, there are attempts to create a single standard. In November 2025, for instance, a congressman introduced the Safer GAMING Act, which would require online game developers to embed parental control tools, in particular the ability to disable chat between minors and other users. A national standard would be a sensible solution, as tracking each state’s requirements and adjusting to each individually is practically impossible.
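To illustrate what such a requirement could mean in practice, here is a hypothetical TypeScript sketch of a chat gate driven by parental settings; the bill does not prescribe an implementation, so the data model and rules below are assumptions.

```typescript
// Hypothetical parental-control gate for in-game chat. The Safer GAMING Act
// (as described above) would require tools like this; the data model and
// rules are invented for illustration, not taken from the bill text.

interface Account {
  isMinor: boolean;
  parentalSettings?: {
    chatDisabled: boolean;         // parent switched chat off entirely
    friendsOnlyChat: boolean;      // restrict chat to an approved list
    approvedContacts: Set<string>; // account IDs approved by the parent
  };
}

function canChatWith(sender: Account, recipientId: string): boolean {
  // Adults, and minors whose parents set no restrictions, chat freely here.
  if (!sender.isMinor || !sender.parentalSettings) return true;
  const settings = sender.parentalSettings;
  if (settings.chatDisabled) return false;
  if (settings.friendsOnlyChat) return settings.approvedContacts.has(recipientId);
  return true;
}
```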
As for future prospects, the picture is mixed. The First Amendment guaranteeing free speech already acts as a real barrier: courts have blocked several state laws on this basis. Research also indicates that verification doesn’t achieve its goals, as users simply migrate to other platforms or use VPNs. After Florida enacted its law, VPN demand increased by over a thousand percent.
Amid tightening practices, should we expect full-fledged passport verification for games with high age ratings?
Natalia: Full-fledged passport verification for games with high age ratings is no longer a question of “if,” but “when” and “where.” Brazil has essentially already reached this stage. The UK is heading in the same direction. China has implemented a centralized model where the state itself acts as the verification operator.
Technically, this is feasible — the question is at what cost. A platform with a long-standing user base, where users have spent years building accounts and investing real money, will suddenly have to ask all these people to show their passports. The audience’s reaction is predictable, and conversion losses are inevitable. Additionally, developers will become operators of sensitive personal data, facing all related challenges — storage systems, leak protection, compliance with local data processing requirements.
The question isn’t whether this will happen — but how ready the industry will be when it does.
Oh, challenging times ahead. Thank you for the conversation!