3Eyes / Blog

Discord Parental Controls - What Parents Need to Know in 2026

If your kid is between 10 and 20 years old, there's a good chance they're on Discord. And if you've never opened the app yourself, it can look confusing at first glance. This guide is written for parents who want a real understanding of what Discord is, what the risks look like, and what you can actually do about them.

What Discord Actually Is

Discord started as a voice chat app for gamers. You'd jump into a server with your friends, mute yourself while you focused, and unmute when you had something to say. It was faster and more reliable than Skype, and it caught on quickly.

That was around 2015. In 2026, Discord is something much bigger.

Discord today is a combination of group chat, voice calls, video calls, screen sharing, community forums, and live streaming -- all in one app. It has more than 200 million monthly active users, and only a fraction of them are there for gaming. Kids use Discord for all kinds of reasons: gaming groups, obviously, but also school study groups, fan communities for musicians or shows, creative writing groups, sports team coordination, and just hanging out with friends after school.

The core structure is built around "servers." A server is a community you join, and it can have dozens of channels inside it -- text channels for different topics, voice channels where people talk in real time, and announcement channels for news. Some servers are tiny (five friends from school), and some are massive (hundreds of thousands of members). Anyone can create a server for free.

Alongside servers, there's direct messaging. By default, you can DM anyone you share a server with. This is a key point we'll come back to.

The minimum age to use Discord is 13. But Discord uses a self-reported birthday during account creation, and there's no verification. A 10-year-old who knows to enter a fake birthdate will get in without any friction. Research consistently shows significant usage among kids under 13.

Why Parents Should Pay Attention

Discord isn't inherently a bad platform. A lot of kids use it completely safely. But the architecture creates some real risks that are worth understanding before you decide what rules to set.

Moderation is decentralized. YouTube and TikTok have teams of content moderators and algorithmic filters. Discord does too, but the vast majority of moderation happens at the server level -- meaning individual volunteer moderators in each community decide what goes and what doesn't. A well-run server can be a great environment. A poorly run one can be anything.

NSFW servers exist and are accessible. Discord allows servers to be marked as NSFW (Not Safe For Work). Users must confirm they're 18 to access them. Like the account creation age, this is self-reported. Anyone who knows to click "I am 18" gets in.

In 2022, Discord removed more than 33,000 servers for child safety violations. That number reflects both the scale of the problem and the fact that Discord was actively finding and removing bad actors -- but it also tells you those communities existed in the first place.

Direct messages from strangers are possible. By default, anyone who shares a server with your child can send them a direct message. If your kid is in a server with 50,000 members, that's 50,000 people with potential DM access. This is where grooming happens -- not typically in public channels where moderators might notice, but in the private space of direct messages.

There's no comprehensive parental visibility. Unlike some platforms, Discord doesn't have a parent app that lets you read your kid's messages. Discord's Family Center (more on this in a moment) shows you some activity metadata, but not content.

Discord's Built-In Safety Settings

Discord has added more safety features over the past few years. They're worth using. Here's what exists and where to find it.

Family Center

Family Center is Discord's parental monitoring feature. To use it, your child must link their account to yours through the Family Center setup. This is important: it requires their participation, so it's not something you can set up unilaterally.

Once linked, Family Center shows you:

  • Which servers your child is active in
  • Who they've been sending DMs to (names only, not content)
  • Who they've added as friends recently
  • What apps they've connected to their Discord account

It does not show you message content. It also only works if your child keeps the link active -- they can disconnect it.

Find it at: User Settings > Family Center

Safe Direct Messaging

This setting controls whether Discord scans incoming direct messages for explicit images. With the filter on, Discord will check incoming DMs and warn you before showing anything it flags.

Options are:

  • Keep Me Safe -- scans DMs from everyone
  • My Friends Are Nice -- only scans DMs from non-friends
  • Do Not Scan -- no filtering

Set this to "Keep Me Safe" for any child's account.

Find it at: User Settings > Privacy & Safety > Safe Direct Messaging

Friend Request Settings

You can control who is allowed to send your child a friend request. Options are:

  • Everyone
  • Friends of Friends
  • Server Members

Setting this to "Friends of Friends" or "Server Members" significantly reduces contact from complete strangers. "Server Members" means only people in a shared server can reach out, which at least creates some context for who's trying to connect.

Find it at: User Settings > Privacy & Safety > Friend Requests

Explicit Image Filter for DMs

Separate from the Safe DM setting, there's a per-conversation explicit image filter. This one applies to images specifically. Keep it enabled.

Find it at: User Settings > Privacy & Safety > Explicit Image Filter -- set it to "Filter All Direct Messages"

Per-Server Privacy Settings

Within each server your child belongs to, there's an option to disable direct messages from other server members for that specific server. Right-click on a server icon and look for Privacy Settings. This lets you be more restrictive in large public servers while allowing DMs from a smaller friend server.

What Discord's Controls Can't Do

It's worth being honest about the limits here.

You cannot read message content through any parental tool. Family Center shows activity but not what was said. If your child is being groomed in DMs, the content of those conversations isn't visible to you.

You cannot block specific servers by name from within Discord. There's no built-in list where you can say "block this server." You can see which servers your child is in through Family Center, but you'd have to ask them to leave or have a conversation about it.

Age gates are self-reported. If your child's Discord account shows their age as 18 (either because they lied during signup or changed it), they have access to NSFW content that Discord's filters would otherwise block.

Family Center requires consent. Your child has to agree to link accounts. They can unlink at any time. This makes it a monitoring tool for cooperative relationships, not a control tool for adversarial ones.

The filter only catches explicit images, not text. A predatory adult can write entirely in text and the Safe DM filter won't flag anything.

The Real Risk: DMs From Strangers

The most concrete danger on Discord for younger kids isn't a public channel -- it's the direct message. Grooming almost always starts in private conversation, and Discord's default settings allow it to happen easily.

Here's the lockdown sequence for your child's account:

Step 1: Restrict friend requests. Go to User Settings > Privacy & Safety. Change "Who can send you a friend request" to "Server Members" or "Friends of Friends." This means a complete stranger can't reach out unless they share a server with your child.

Step 2: Enable Safe Direct Messaging. Same Privacy & Safety menu. Set it to "Keep Me Safe." This scans incoming DMs for explicit images.

Step 3: Disable server DMs in large public servers. For any server with thousands of members, right-click the server icon and open Privacy Settings. Turn off "Allow direct messages from server members."

Step 4: Enable explicit image filtering in existing DM threads. In any open DM conversation, click the user's name and check that filtering is enabled.

Step 5: Review the server list regularly. Through Family Center or by just asking. A 12-year-old in a 200,000-person gaming server with no adult supervision is in a different risk environment than a 15-year-old in a 10-person friend group.

Discord DM Lockdown Checklist

  • Restrict friend requests: Settings > Privacy & Safety > "Who can send friend requests": "Server Members"
  • Enable Safe Direct Messaging: Settings > Privacy & Safety > Safe Direct Messaging: "Keep Me Safe"
  • Block DMs in large public servers: right-click server icon > Privacy Settings > disable "Allow DMs from server members"
  • Enable explicit image filtering: Settings > Privacy & Safety > Explicit Image Filter: "Filter All Direct Messages"
  • Set up Family Center: Settings > Family Center > connect a parent account to see activity metadata

A System-Level Approach: Blocking Discord Entirely

Some families will decide, after reading this, that Discord isn't appropriate for their child yet. That's a completely reasonable conclusion, especially for younger teens. Here's how to actually enforce that decision.

Remove the app from the device. Start with the obvious: uninstall Discord from phones, tablets, and computers. This doesn't help with web access, but it removes the most convenient path.

Block discord.com at the DNS level. Most home routers let you set a custom DNS. Services like NextDNS or Pi-hole allow you to block specific domains across your entire home network. Add discord.com and discordapp.com to the blocklist. This prevents access on any device connected to your home Wi-Fi -- phones, laptops, game consoles, everything.
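To make the mechanism concrete, here's a small Python sketch of how DNS sinkholing works: each blocked domain is mapped to an unroutable address so lookups go nowhere. The domain list is illustrative only -- Discord also uses other hosts (CDN and voice domains), which is why a dedicated tool like Pi-hole or NextDNS is more reliable than a hand-maintained list.

```python
# Sketch of DNS sinkholing: map each blocked domain to 0.0.0.0 so the
# browser can't reach it. The domain list is illustrative, not exhaustive.
BLOCKED_DOMAINS = ["discord.com", "discordapp.com", "discord.gg"]

def sinkhole_entries(domains):
    """Return hosts-file-style lines that null-route each domain and its www variant."""
    lines = []
    for domain in domains:
        lines.append(f"0.0.0.0 {domain}")
        lines.append(f"0.0.0.0 www.{domain}")
    return "\n".join(lines)

print(sinkhole_entries(BLOCKED_DOMAINS))
```

This is the same idea Pi-hole applies network-wide: every device on the home Wi-Fi asks the same resolver, and the resolver answers with a dead end for blocked names.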

Use allowlist filtering instead of blocklist filtering. Blocklists are a game of whack-a-mole. For younger kids especially, an allowlist approach -- where only approved sites work -- is much harder to bypass. The child can access the sites you've approved, and everything else is blocked by default.
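The difference between the two models is easy to see in code. Here's a minimal sketch of allowlist matching, with a hypothetical set of approved sites: a hostname is permitted only if it, or a parent domain of it, is on the list, and everything else is denied by default.

```python
# Minimal allowlist check. The approved set is hypothetical; a real filter
# would load it from the parent's configuration.
APPROVED = {"wikipedia.org", "khanacademy.org"}

def is_allowed(hostname: str) -> bool:
    parts = hostname.lower().rstrip(".").split(".")
    # Match the hostname itself and every parent domain against the allowlist,
    # so approved sites still work through their subdomains.
    return any(".".join(parts[i:]) in APPROVED for i in range(len(parts)))

print(is_allowed("en.wikipedia.org"))  # subdomain of an approved site: allowed
print(is_allowed("discord.com"))       # not on the list: denied by default
```

Note that a brand-new, never-seen domain requires no update to stay blocked -- that's the whack-a-mole advantage in one line.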

For families who allow Discord: the risk environment improves significantly with physical placement. A computer in a common area of the house, where a parent walking by can see the screen, creates a different dynamic than Discord on a phone in a bedroom at 11pm. This is less about surveillance and more about natural social norms. Kids behave differently when they know they might be seen.

Time limits also matter. Setting Discord to only be accessible during certain hours -- after school but not late at night, for example -- reduces the window for late-night conversations with people you don't know.
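A time window is just a comparison against the clock. This sketch assumes a hypothetical 3pm-to-8pm schedule; real filtering tools implement the same check at the network or device level.

```python
from datetime import time

# Hypothetical schedule: Discord reachable after school but not late at night.
WINDOW_START = time(15, 0)  # 3:00pm
WINDOW_END = time(20, 0)    # 8:00pm

def discord_reachable(now: time) -> bool:
    """True if the given time falls inside the allowed window."""
    return WINDOW_START <= now <= WINDOW_END
```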

Age-Appropriate Recommendations

These are guidelines, not rigid rules. You know your kid better than any framework does.

Under 13: Discord's own Terms of Service prohibit accounts for children under 13. If your child is using Discord and they're under 13, they either have a fake age on their account or someone set it up for them. At this age, Discord is not appropriate. The content, the community structures, and the DM access are all designed for teenagers and adults. If your child wants to communicate with friends online, look at closed, family-friendly platforms instead.

Ages 13 to 15: If you decide Discord is okay, apply all the lockdown settings described above. Small servers with known friends only. Family Center linked and active. No large public servers. The app should be on a shared device, not a personal phone, if possible. Regular check-ins about who they're talking to. At this age, the conversation is just as important as the settings -- kids who understand why certain contacts are risky are better protected than kids who just have settings they don't understand.

Ages 16 and up: Gradually loosening restrictions as trust is established makes sense here. Many 16- and 17-year-olds use Discord without incident. The emphasis shifts from restriction to ongoing conversation -- do they feel comfortable telling you if something weird happens? Do they know what grooming looks like? Do they know they can block and report someone without it being a big deal?

How 3Eyes Fits In

3Eyes takes a different approach to the problem. Rather than relying on in-app settings that can be changed, or blocklists that can be worked around, 3Eyes uses an allowlist model. You define what sites and apps are accessible on the family computer. Everything else is blocked by default.

For Discord specifically, this means you can include it in the allowlist when your family decides it's appropriate, and exclude it until then. There's no workaround from the child's end -- access is controlled at the system level, not the app level. You can also set time windows, so Discord might be accessible from 3pm to 8pm on school days, and that's it.

If your child earns more access over time -- through demonstrated responsible behavior, getting older, having good conversations about online safety -- you adjust the allowlist. If something concerning happens, you adjust it back. The control stays with you, not with app settings that a teenager can modify in thirty seconds.

Discord is genuinely useful for kids. It's how a lot of social connection happens now, especially for teenagers who have friends in different schools or cities or countries. The goal doesn't have to be eliminating it. The goal is making sure you're making an active choice about it, rather than discovering it's been running on your kid's phone for two years without your knowledge.

Start with a conversation. Then set up the settings. Then decide what level of access makes sense for your family right now. That sequence, done thoughtfully, handles most of the risk.