Discord is rolling out age verification

Discord plans to default every user's account to teen-level settings unless the user is willing to undergo an invasive age verification process.

On Monday, Discord, the popular social app, announced that it would be rolling out "enhanced safety features for teens" beginning in March. According to the company:

“(A)ll new and existing users worldwide will default to a teen-friendly environment, with updated communication settings, limited access to age-restricted spaces, and content filtering that preserves the privacy and meaningful connections that define Discord.”

This means that users may be required to complete an age verification process in order to access sensitive content. According to Discord, this includes “age-restricted channels, servers or commands, and select message requests.”

If you choose to opt out of the age verification process, your account will be subject to the following restrictions as stated by Discord:

  • Content Filters: Sensitive media is blurred by default, and only age-verified adults can unblur it or turn the filter off.
  • Age-Restricted Areas: Only users who are age-verified as adults will have access to age-restricted channels, servers, and application commands.
  • Message Request Inbox: Direct messages from people the user may not know are routed to a separate inbox by default, and only age-verified adults can change this setting.
  • Friend Request Notifications: Notifications for friend requests from people the user may not know are limited by default.
  • Stage Restrictions: Only age-verified adults can speak in Stage channels on servers.

The company says age verification will be done in one of two ways: by estimating the user's age from a video selfie processed on the device, or by submitting a form of ID to one of Discord's "vendor partners."

If the latter sets off any alarm bells, it should. Almost exactly four months ago, in October 2025, there was a serious data breach that may have resulted in hackers stealing the passports and driver’s licenses of 2.1 million Discord users.

For its part, Discord said only about 70,000 users were affected.

The hack did not target Discord directly, but rather a third-party customer service vendor, 5CA Systems. 5CA, for its part, denies that its systems were at fault.

Either way, Discord assures users that its verification process is privacy-preserving and secure. We are skeptical.

I would like to point out that I personally think keeping kids safe, especially on Discord, is a great idea. I just think there's a difference between solving a problem and making a mess, and I believe Discord may be doing the latter.

But let's look at why Discord is going about it this way.

Welcome to the new internet

You might think that this kind of age verification is inconvenient. You might even think it's unreasonable, given how Discord has mishandled user data in the past.

You’d be right on both counts. Unfortunately, this is the direction the internet is heading.

The story has been building for years, but the bottom line is this: US lawmakers have long sought a way to hold big companies accountable for children's online safety. While this is a good idea in theory, like most attempts to solve technology problems through legislation, it largely ends up creating bigger problems for the end user.

The latest attempt was the App Store Accountability Act (ASAA), introduced in May 2025. It was designed to give parents more tools to protect their children online.

Of course, the main way lawmakers wanted to achieve this was to lean on Google and Apple. The ASAA, which still hasn't passed at the federal level, would require age verification at the app marketplace level.

In the absence of a federal law, several states have attempted to pass their own versions. Currently, Utah and Louisiana have some form of age verification requirement on the books.

A federal judge issued an injunction against Texas' version of the law in December.

Here at AppleInsider, we assume this clumsy effort will continue to trudge forward in one form or another, federally or state by state.

Apple has had tools for this scenario for some time. Parents can already set up devices for minor users with a "child account" that can be largely managed remotely through the parent's account.

This includes limits on time spent on the device, content and privacy restrictions, and restrictions on which apps appear in the App Store. It also helps hide explicit content in podcasts, music videos, and Apple Books.

New parental control options in iOS 26

And as if that weren't enough, Apple made it even easier to set up child accounts in iOS 26, adding a simplified setup process, expanded age ratings, and new communication limit settings.

Google offers its own version of parent-managed child accounts as well. Both companies strive to stay at the forefront of keeping children safe online.

Of course, these existing tools are opt-in and require parents to set them up properly. This, incidentally, is why the federal government wants more control over the situation.

A final note on Discord

Discord is a cross-platform social communication app. You can access Discord not only from your iPhone or Android smartphone, but also from tablets and desktop computers.

There is even a browser-based version of Discord that allows users to use the app from a list of supported browsers.

As a result, Discord cannot rely solely on Apple or Google authentication tools in all cases. This is especially true for younger users who only access Discord in a browser.
