When Victoria Taft asked me about Discord during our conversation on The Adult in the Room, she touched the live wire that corporate media keeps pretending is just low-voltage chatter. I said then, and I will say again here, that the structure of platforms like Discord creates the perfect hunting ground for bad actors who want to radicalize, isolate, and activate vulnerable people without leaving obvious fingerprints. The last two weeks proved the point in the most tragic way possible.
Investigators say Tyler Robinson, the alleged assassin of Charlie Kirk, posted an admission in a Discord chat shortly before he was arrested. Multiple outlets reported that the FBI is interviewing dozens of people from that server while also noting there is not yet public evidence of a formal online conspiracy inside the chat room. The important takeaway is not whether a single chat room equals a criminal conspiracy. It is that the architecture of anonymous, invite-only servers lets malign operators move through crowds under the cover of friendship, community, and shared purpose, which is the easiest way to push a susceptible mind from fantasy into action.
If you think this is a one-off, look closer. Discord keeps showing up around high-stakes national security and headline violence because the product makes it almost effortless to build cult-like microcommunities with minimal friction, soft hierarchies, and a steady drip of validation for the most extreme content. The alleged Trump shooter at Butler, Pennsylvania, had a Discord account that investigators examined after the attack. The Buffalo supermarket murderer wrote a diary on a private Discord server before killing ten people. The Pentagon leaker who dropped highly sensitive documents into a group of online friends used Discord as his clubhouse. The platform’s terms say it bans hate and incitement, but in practice enforcement is reactive and the rooms are private until they are not. This is not about blaming a tool for evil. It is about recognizing that certain tools are repeatedly selected by evil because they offer the perfect blend of reach, privacy, and deniability.
People ask me if intelligence agencies are involved, and I understand why. When a pattern repeats across ideologically charged incidents, when the same channels keep surfacing, and when the same leaks and shadowy accounts shape the narrative, you have to ask who benefits from chaos and who uses chaos as pretext for more control. I cannot tell you that a specific operative ordered a specific act inside a specific Discord, and I am not going to claim what I cannot prove. I can tell you that infiltration and psychological operations are standard playbooks for every serious intelligence service on earth, including ours, and that online subcultures filled with young isolated men and women are the softest targets imaginable. Recruiters, handlers, and provocateurs have always looked for people without strong real-world community because those are the ones who crave belonging so badly that they will ignore the warning signs and follow the voice that promises meaning. If you design a system that manufactures parasocial loyalty while hiding identity, you invite the wolves.
This is not just about Discord either. We watched a years-long psyop culture grow up around Q, where the public still debates who “Q” really was, yet the most credible reporting points to a small circle of administrators and amplifiers who controlled the levers and were perfectly positioned to manipulate millions of people while claiming they were only the janitors sweeping up the crumbs. The names and timelines are messy, but the lesson is simple. The people who sit closest to the server and the moderation queue can shape reality for everyone downstream, and they do it with a smirk because they know most users will never check the source.
Here is what everyone keeps missing. Big Tech and Big Government play a double game. They allow volatile ecosystems that are tailor-made for covert activation, then they use the very volatility they helped cultivate to justify more surveillance, more censorship, and more centralized control. After the shooting of Charlie Kirk, headlines lit up about new social media crackdowns and Section 230 reviews. The calls for “safety” always get louder after a crisis, and the answer is always the same. Hand over more power to the people who failed to protect you in the first place.
We do not have to be naive about any of this. There are practical steps that creators, parents, and citizens can take right now.
First, stop outsourcing your community to companies that profit from engineered addiction and algorithmic fog. If your primary circle is a private server full of avatars and anonymous handles, you are at a structural disadvantage because you cannot test trust. Build real-world ties at church, in your neighborhood, and inside accountable digital spaces where identity is clear and the incentives reward contribution, not chaos.
Second, treat any anonymous voice that tries to escalate you toward action as hostile until proven otherwise. The activation pattern is predictable. It starts as in-group humor, moves to darker ideological bonding, shifts into “they are coming for us” urgency, and ends with a proposed act that magically aligns with the interests of people who want to paint you as a threat. At every stage there are off-ramps. Take them.
Third, if you run a show or a community, post your values clearly, publish your moderation policy on unlawful incitement, and refuse to let rage merchants hijack your audience. Free speech is not a suicide pact. It is a commitment to truth where the only things off-limits are the crimes that destroy the very freedom that lets us speak.
Fourth, demand transparency from platforms. If a company can tell advertisers exactly which viewer segments converted last night, then that same company can give the public meaningful transparency into how extremist networks propagate and which enforcement actions actually occurred. When they say they cannot do it, what they usually mean is that it would expose business practices they do not want anyone to see. Discord publicly says it bans hateful conduct and violence. Show the receipts, not the slogans.
Fifth, stop letting legacy media memory-hole the details that matter. The same outlets that ridicule concerns about online radicalization turn around and quietly confirm the online footprints when the story cannot be suppressed. In the Robinson case, mainstream reports documented the Discord confession and the widening federal inquiry into that server’s membership, yet you will hunt a long time for televised introspection about platform design, perverse incentives, or the uncomfortable history of online ops. The cycle keeps repeating because the incentives never change.
Let me zoom out for a second. The drive toward a gamified, always-on, identity-optional Internet did not happen by accident. It grew up alongside a government appetite for data collection and behavioral nudging. It grew up alongside ad-tech firms that monetize micro-emotions and platform empires that learned how to weaponize outrage. The result is a culture that feels frenzied, lonely, and programmable. If you want to resist, you have to rebuild the disciplines that make people hard to program, which starts with faith, family, and local accountability. A person rooted in Scripture, accountable to a real community, and walking in truth is a terrible target for manipulation because they are already full. A person chasing identity in an anonymous room will grab any label handed to them, including the kind that ends in handcuffs or a body bag.
As the CEO of Pickax, I am building technology that gets out of the way. No engagement algorithm steering your emotions. Human moderation with published standards. Creator tools that treat your audience like adults, not targets. If you believe in that vision, then help me prove a different model at scale. The choice is not between chaos and censorship. The choice is between systems that cultivate virtue and systems that farm attention until someone breaks.
Charlie is gone. Families are grieving. Investigators will keep combing chat logs and devices. The rest of us have work to do. Start by taking inventory of your digital life. Name your real community. Audit your inputs. Lose the anonymous rooms that are shaping your mood more than the people who love you. Refuse to be activated by strangers with agendas. Tell your kids exactly why these platforms feel like friendship but so often lead to despair. And then build something better with us.
We are either going to keep letting Big Tech and the security state train us like lab rats inside dark rooms, or we are going to step into the light, speak plainly, and take back responsibility for the communities we lead and the tools we use. Choose wisely, because the next tragedy is already being rehearsed in a server you will never see.
Catch The Jeff Dornik Show live every weekday at 1pm ET only on Rumble and Pickax, where free speech still reigns. Watch the full episode: