Last Sunday, I watched my teenage cousin create three different Discord accounts with three different ages in under two minutes. She's 14.
Discord thinks she's 18, 25, and somehow also 13 with "parental consent." The scariest part? She learned this from a 90-second TikTok that has 4.2 million views.
What I discovered next sent me down a rabbit hole that exposed how Discord, Twitch, and Snapchat have built age verification systems that are essentially digital theater — security features designed to satisfy regulators while being trivially easy for any motivated kid to bypass.
Here's what the platforms don't want you to know: their entire business model depends on these systems being breakable.
Discord has 200 million monthly active users. Industry estimates suggest 30-40% are under 18. Twitch's audience skews even younger; internal documents showed 41% of viewers are between 13 and 17.
Snapchat? They stopped publishing age demographics after 2023, which tells you everything.
These companies face an impossible choice. Build real age verification that works, and watch their user base (and $4 billion in annual revenue) crater overnight.
Or maintain the current charade where a checkbox asking "Are you 13 or older?" counts as "verification."
The platforms chose option two. And they've gotten very, very good at it.
The conventional wisdom says these companies are trying their best with limited technology. "Age verification is hard," the experts claim. "There's no perfect solution."
That's bullshit, and the platforms know it.
South Korea has had functional age verification since 2011. Their system requires linking accounts to national ID numbers. Gaming addiction among minors dropped 40% in three years.
China's real-name registration system for games is even stricter — and it works.
The technology exists. Banks verify age every day when you open an account. Cryptocurrency exchanges implemented KYC (Know Your Customer) requirements that verify both age and identity within minutes.
The IRS can verify your identity for tax returns online.
But Discord can't figure out if you're actually 13?
They can. They choose not to.
After analyzing how kids bypass these systems, I've identified what I call the "Three-Layer Deception Framework" that platforms deliberately maintain:
Layer one is the self-report checkbox: the "Are you 13 or older?" prompt. It's not verification; it's legal ass-covering.
The Children's Online Privacy Protection Act (COPPA) requires platforms to not knowingly collect data from users under 13. That word "knowingly" is doing massive legal work here.
By asking users to self-report age, platforms can claim they didn't "knowingly" allow underage users.
It's the digital equivalent of a bar that checks IDs by asking "You're 21, right?" and accepting a nod as proof.
My cousin's first bypass method? She just... clicked yes. Revolutionary.
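Strip away the UI and the entire first layer reduces to a single trusting function. This is a caricature, but it is functionally what the checkbox check amounts to (the function name is mine, not any platform's actual code):

```python
def verify_age(claimed_over_13: bool) -> bool:
    """The whole 'verification': return whatever the user claimed.

    No document check, no cross-reference, no rate limit on retries.
    A wrong answer costs the user one extra click to redo.
    """
    return claimed_over_13

print(verify_age(True))  # True: welcome aboard, definitely-13-year-old
```

That single `return` statement is the load-bearing wall of COPPA compliance.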
Layer two is email verification. Most platforms require a verified email address. Sounds secure, right? Except creating an email account requires, you guessed it, just claiming you're 13 or older.
Here's the beautiful circular logic: Discord verifies age by requiring an email. Email providers verify age by asking for a birthdate. Users provide fake birthdates.
Everyone pretends this means something.
My cousin's second method used a temporary email service. Total time: 22 seconds. She didn't even close TikTok while doing it.
Layer three is payment verification. Some platforms now require credit cards to unlock certain features. Finally, real verification!
Except... no.
Privacy.com lets you generate virtual credit cards with any name. Revolut gives cards to 12-year-olds with parental consent (which means clicking a box). Prepaid cards from any grocery store work fine.
PayPal accepts users at 13 in some countries, 18 in others, and doesn't verify which country you're actually in.
The "credit card verification" is security theater performed for regulators. The platforms know every workaround. They could block virtual cards, prepaid cards, and PayPal if they wanted. They don't.
Snapchat deserves special mention for turning age verification into a growth hack.
Their "Family Center" feature, launched in 2022, lets parents monitor teen accounts. Sounds responsible, right? Here's the catch: it only works if teens link their accounts voluntarily.
Guess what teens don't do?
Instead, they create second accounts — "finstas" for fake Instagram became "snaps" for Snapchat. One account for parents to monitor, one for actual use. Snapchat's response?
In late 2025, they rolled out a "Quick Switch" feature that makes it easier to flip between multiple accounts.
They're not preventing duplicate accounts. They're facilitating them.
Let me be crystal clear about what's happening here: we're watching the largest uncontrolled psychological experiment in human history.
These platforms know they're serving content to millions of underage users. They have internal data showing exactly how many 10-year-olds are claiming to be 18.
They track every metric — time spent, content viewed, messages sent.
A former Twitch employee told me (on condition of anonymity) that they have dashboards showing "age anomaly accounts" — users whose behavior patterns suggest they're younger than claimed.
These dashboards aren't used for enforcement. They're used for product development. For "improving engagement with younger demographics."
Think about that. They know who the kids are. They use that knowledge to make the platform more addictive for kids. Then they claim they can't verify age because "the technology doesn't exist."
Here's what changes in the next 12 months:
The EU's Digital Services Act gets real teeth in March 2026. Platforms face fines up to 6% of global revenue for age verification failures. That's $240 million for Discord, $420 million for Snapchat.
The UK's Online Safety Act starts enforcement in April. Platforms must implement "highly effective age verification" or face being blocked entirely in the UK market.
California's Age-Appropriate Design Code survived its constitutional challenge. By September 2026, platforms operating in California must implement privacy protections that actually verify user age.
These platforms have a choice: implement real verification and lose users, or maintain the charade and face massive fines.
Want to know what they'll choose? They've already decided.
Discord, Twitch, and Snapchat have created a generation that learned to lie before they learned algebra.
Every kid knows how to bypass age verification. It's not even considered dishonest anymore — it's just what you do.
Like using ad blockers or sharing Netflix passwords, it's become a normalized part of digital literacy.
We've built an internet that teaches children that rules are suggestions, privacy is a joke, and lying about your identity is the price of admission.
Then we wonder why online discourse is toxic, why misinformation spreads, why nobody trusts anything anymore.
The platforms could fix this tomorrow. Government ID verification through encrypted systems. Biometric checks that don't store personal data.
Zero-knowledge proofs that verify age without revealing identity. The technology exists, has existed for years.
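To make the attestation idea concrete, here's a toy sketch of the pattern behind privacy-preserving age checks. This is not real zero-knowledge cryptography, and every name in it is hypothetical: a trusted issuer (say, a government ID service) signs a minimal claim like "holder is over 18," and the platform verifies the signature without ever seeing a name or a birthdate. A production system would use an asymmetric signature scheme (Ed25519 or similar); HMAC with a shared key stands in here so the example runs on the standard library alone.

```python
import hmac
import hashlib
import json

# Hypothetical issuer key. In reality the issuer would publish a
# public verification key and keep the signing key private.
ISSUER_KEY = b"demo-issuer-key-not-for-real-use"


def issue_attestation(over_18: bool) -> dict:
    """Issuer side: sign a minimal boolean claim, no identity attached."""
    claim = {"over_18": over_18}
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}


def platform_verify(token: dict) -> bool:
    """Platform side: check the signature, learn only the yes/no answer."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"]) and token["claim"]["over_18"]


token = issue_attestation(over_18=True)
print(platform_verify(token))  # True: age confirmed, identity never shared
```

Note what the platform receives: a signed boolean, nothing else. No birthdate to store, no ID scan to leak in the next breach. Tampering with the claim after issuance breaks the signature, so a 14-year-old can't flip `over_18` to `True` on their own. The hard part isn't the math; it's that deploying this would surface exactly how many existing accounts can't obtain the token.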
But fixing it would mean admitting it's broken.
And admitting it's broken would mean acknowledging that these companies have knowingly served content to millions of children while pretending not to know.
That admission would trigger lawsuits that would make the tobacco settlements look like parking tickets.
So the theater continues. Kids keep lying. Platforms keep pretending to verify.
Parents keep believing their children are safe. And the companies keep cashing checks.
My 14-year-old cousin summed it up perfectly: "Why would they actually want to stop us? We're the ones watching all the ads."
She's not wrong. And that's the most terrifying part.
**What's your take: should platforms be forced to implement real age verification even if it kills their growth, or have we already passed the point where that's even possible? Let me know in the comments.**
---
Hey friends, thanks heaps for reading this one! 🙏
If it resonated, sparked an idea, or just made you nod along — I'd be genuinely stoked if you'd show some love. A clap on Medium or a like on Substack helps these pieces reach more people (and keeps this little writing habit going).
→ Pythonpom on Medium ← follow, clap, or just browse more!
→ Pominaus on Substack ← like, restack, or subscribe!
Zero pressure, but if you're in a generous mood and fancy buying me a virtual coffee to fuel the next late-night draft ☕, you can do that here: Buy Me a Coffee — your support (big or tiny) means the world.
Appreciate you taking the time. Let's keep chatting about tech, life hacks, and whatever comes next! ❤️