
Building Safer Spaces Online

Every month, 150 million people use the communications platform Discord. Leading its Trust and Safety team is John Redgrave, who spoke with Brunswick’s Ash Spiegelberg, a Partner and Global Co-Head of the firm’s Technology, Media and Telecoms sector group, about the challenges—both human and technical—of that work.

John Redgrave might not be heading Discord’s Trust and Safety team today were it not for a string of panic attacks in 2017.

Hearing friend after friend discuss the awful experiences their children were having online, Redgrave found himself unable to sleep, worried about what awaited his two young kids. “Finally, one night my wife rolled over and said, ‘Hey, I’m here for you, I support you, but I think you should do something about this.’ I was like, ‘You mean see a therapist?’ She said, ‘Or go build something.’”

The following year, Redgrave built a software company called Sentropy that used AI to detect online abuse and make the internet safer for everyone—“supercharging content moderation” is how one report described it.

The new business was a variation on a theme constant throughout Redgrave’s career: using technology to tackle societal problems. In the early 2010s, among other responsibilities at the data analytics firm Palantir, Redgrave worked with the CDC to help it swap clipboards for computers in tracking disease outbreaks.

He then joined a software company called Lattice as COO, where machines were being taught to read with human levels of comprehension—an ability we take for granted today, but one that was cutting-edge at the time. One application of that work helped combat human trafficking. “We became the extraction engine for a bunch of nefarious websites,” Redgrave said. “We’d pull addresses and phone numbers. We taught the model to detect obfuscated prices—flower varietals indicated different denominations of money. And we shared all that data with law enforcement, so they could conduct coordinated raids to take down human trafficking rings all over the world.” Apple bought Lattice in 2017.
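As a loose illustration of the extraction Redgrave describes—hypothetical code, not Lattice’s actual models, with invented code words and rates—decoding an obfuscated price amounts to mapping coded vocabulary back to dollar figures:

    # Hypothetical sketch of decoding obfuscated prices in scraped ads.
    # Not Lattice's system; the code words and rates below are invented.
    import re

    CODE_WORDS = {"roses": 1, "tulips": 5, "orchids": 10}  # flower -> dollars per unit

    def decode_price(text: str) -> int | None:
        """Turn phrases like '80 roses' into an estimated dollar figure."""
        match = re.search(r"(\d+)\s+(roses|tulips|orchids)", text, re.I)
        if match:
            count, flower = int(match.group(1)), match.group(2).lower()
            return count * CODE_WORDS[flower]
        return None

    print(decode_price("available tonight, 80 roses"))  # -> 80

In practice a model would learn such mappings from context rather than a hand-built dictionary, but the output—structured prices, addresses and phone numbers fit for law enforcement—is the same idea.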

Then came the panic attacks—and Sentropy. “You have to have something that gets you out of bed when you’re building a company. For me, it was about my kids. I fundamentally built that company for my children.”

In 2021, Sentropy was bought by Discord, a communications platform whose popularity exploded during the pandemic and which 150 million people now use to connect each month—a population that, if it were a country, would be the ninth-largest in the world. That population is disproportionately made up of young people—more than 60% of its users are 34 years old or younger.

At its core, Discord is a place for people to talk and hang out online, whether through instant messaging, voice calls, video calls, file sharing, or fun emojis. What is distinctive about Discord is its ability to host online communities of all sizes. Some of Discord’s “servers,” each akin to a chat room, have millions of members. Yet most servers have only enough people to fill a small living room. According to Redgrave, 90% of the activity on the platform happens in groups of 10 people or fewer.

If users don’t feel safe, they won’t find their friends, they won’t find belonging. So we are going to keep investing and innovating to make people safe on our platform. Period.

The challenge of balancing privacy and safety isn’t unique to Discord, but the task facing the company is distinctive. One of Discord’s strengths is its ability to create close-knit communities in a time of widespread isolation—“young adults are fighting loneliness by making friends online,” as the Washington Post reported in 2023. Yet the same features that foster those connections—small, private groups—can make maintaining safety across them difficult. In one private group, a Discord user posted photos of documents that contained classified information about the US military; years earlier, in a separate private group, a user shared journal entries that contained his intentions to carry out a shooting—which, 30 minutes later, he did.

As Redgrave explained to Brunswick’s Ash Spiegelberg, detecting and preventing horrible activities online is intensely complicated and taxing—but non-negotiable. “This work is core to who we are as a company,” Redgrave said. “If users don’t feel safe, they won’t find their friends, they won’t find belonging. So we are going to keep investing and innovating to make people safe on our platform. Period.”

A number of companies were interested in buying Sentropy; why did you pick Discord?

There are a couple of features of Discord that really attracted me. I spent time with the founders of each company we talked to. Jason [Citron] and Stan [Vishnevskiy, Discord co-founders] genuinely wanted to make not only their platform safer, but also the internet safer. We talked about a vision of being able to open-source technology, to drive collaboration across social media companies and communications companies. That’s exactly what I wanted to be doing. So that was number one.

Number two is that Discord is just different from most platforms. We don’t care about eyeballs, there’s no virality, there’s no doom scrolling. You opt into the places you want to spend time. We offer a monthly subscription service; it’s not about selling ads. It embodies the next generation of social constructs online.

Number three was that it was the best deal for my team. One of my goals was that each of the first 10 people who took the risk of joining Sentropy as an early-stage start-up would have at least a seven-figure outcome. We achieved that.

And finally, it just felt like the best home for the people that we had. Two and a half years later, most of the team is still at Discord. After an acquisition, that is very, very rare. It’s a testament to Discord’s culture, and to the type of people we hired.

Much of your team has remained, but the function you lead, Trust and Safety, looks different than when you joined.

Yes, it’s been a big evolution since 2021. The Trust and Safety team was a generalist function with light specialization. We had lots of really passionate, well-intentioned people predominantly focused on ticket-based work. That approach will never scale to the size at which Discord operates. It doesn’t let you turn fast pattern recognition into machine-learning models and rules-based engines; in essence, we weren’t leveraging technology to supercharge our efforts.

Our first change was to shift from generalists to specialists, with specific operational teams dedicated to minor safety, to extremism, to cybercrime. All of them are focused on intelligence-gathering to disrupt the worst types of actors.

The second change was that we had all these disparate functions—the Trust and Safety team, a policy team, a machine learning team, a safety engineering team, a data science team—all working in silos. We brought everybody under a single umbrella that plans and works together to make Discord safer for our users. We talk a lot at Discord about following a “Safety by Design” approach, where every single design decision takes safety into account. That has totally changed the paradigm. Now, when someone in Trust and Safety sees a challenge, instead of just figuring out how to get better operationally, we have product people in those conversations asking, “How do we change the core Discord product to fix that problem?” That allows us to make the change, which has downstream benefits for the operational teams.

The third piece is we’ve fundamentally altered the scale at which we can operate. When I joined, we were working through about 17,000 tickets every week—user cases reported to us. We’re now handling about 160,000 a week.

Part of what’s enabled us to do that is using outsourced teams in the Philippines, Malaysia and Colombia, so that we can respond to more cases more quickly, to the benefit of our users. Underlying the work those teams do—that our entire Trust and Safety team does—is a real focus on wellness.
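To make concrete the pairing of rules-based engines and machine-learning models Redgrave describes—a hypothetical sketch, not Discord’s production systems—a rules engine encodes the fast pattern recognition human reviewers develop, while a classifier score catches what the rules miss:

    # Hypothetical message-triage sketch pairing hand-written rules with an
    # ML classifier score. Illustrative only -- not Discord's actual systems.
    import re
    from dataclasses import dataclass

    @dataclass
    class Verdict:
        flagged: bool
        reason: str

    # Rules encode patterns human reviewers learned to spot quickly.
    RULES = [
        (re.compile(r"free nitro.*(http|discord\.gg)", re.I), "scam-link pattern"),
        (re.compile(r"\b(cvv|fullz)\b", re.I), "cybercrime slang"),
    ]

    def triage(message: str, model_score: float, threshold: float = 0.9) -> Verdict:
        """Rules give precise, explainable hits; the model score generalizes."""
        for pattern, reason in RULES:
            if pattern.search(message):
                return Verdict(True, reason)
        if model_score >= threshold:  # score from a trained abuse classifier
            return Verdict(True, f"classifier score {model_score:.2f}")
        return Verdict(False, "no match")

    print(triage("free nitro giveaway: discord.gg/xyz", model_score=0.2))

The division of labor is the point: rules are cheap, transparent and fast to ship when a new pattern emerges, while the classifier handles the variation no rule list can keep up with.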

We genuinely want the internet to be safer for everyone. And we believe in a model where everyone in the industry shares intelligence and technology.

What do you mean by a “focus on wellness”?

There’s been a lot of coverage about the working conditions for trust and safety teams, because essentially, we’re looking at the worst content on the internet all the time. It’s a demanding, challenging space to operate in. How can someone look at this content for eight-plus hours a day, nonstop? The short answer: They can’t.

There are a number of psychological studies showing that altering an image in certain ways reduces the mental toll of reviewing it. In our review tool, we have a number of features that blur the image, rotate it, or change its coloring to greyscale—things that science suggests really help those workers.
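The transformations Redgrave describes are simple to picture in code. A minimal sketch using the Pillow imaging library—illustrative only, not Discord’s review tool, with hypothetical function names and parameters:

    # Harm-reducing transforms applied before a moderator sees an image.
    # Hypothetical sketch, not Discord's tooling. Requires Pillow.
    from PIL import Image, ImageFilter, ImageOps

    def soften_for_review(path: str, blur_radius: float = 8.0,
                          rotate_degrees: int = 90) -> Image.Image:
        """Blur, rotate and convert an image to greyscale for review."""
        img = Image.open(path)
        img = img.filter(ImageFilter.GaussianBlur(radius=blur_radius))  # soften detail
        img = img.rotate(rotate_degrees, expand=True)                   # change orientation
        return ImageOps.grayscale(img)                                  # strip color

    soften_for_review("reported_image.png").show()

A reviewer might then step the blur down only as far as needed to make a classification decision, keeping exposure to the raw image to a minimum.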

We also work with a company called The Workplace Wellness Project, which does resilience training. They do group and individual sessions so people have an outlet to talk about what they’re going through.

And then we built the industry’s first resilience rotation. As an example, for people who are dealing with child safety issues, we will pull them out of that work and put them on another team for three months, and then we’ll rotate them back in slowly—so they can mentally unload without detaching from the work all at once. Because there are studies now showing that if you pull someone out of the work too fast, they can develop PTSD [post-traumatic stress disorder].

This is really hard work. And finally, we have chosen our outsourcing partners in large part because of how they treat their workers. That industry is rife with companies that compete on cost at the expense of working conditions. We invest more with companies like TaskUs that hold themselves to a higher standard, because we think those workers deserve the best wellness conditions possible.

What do you wish more people knew about Discord?

I’m going to focus on trust and safety, since that’s the world I sit in. Number one is we have a huge investment in safety—more than 15% of our workforce is dedicated to keeping our users safe. I don’t have a stat on what other companies invest, but I know that Discord’s investment is significant relative to peers in the industry. And we continue to invest because it’s core to who we are.

The second thing is we’re building new technologies and driving collaboration across the industry, through places like the Tech Coalition [a group working to end online child sexual exploitation and abuse]. We just launched a new CSAM [child sexual abuse material] detection capability, which is now available to companies in the Tech Coalition.

You could view a technology like that as proprietary, a competitive advantage—we don’t. We think a rising tide lifts all boats. It’s not really a win for us if bad actors leave our platform and go spread harm somewhere else. We genuinely want the internet to be safer for everyone. And we believe in a model where everyone in the industry shares intelligence and technology.

Has doing this work helped with your panic attacks?

It has. They were partly driven by a sense of helplessness, a feeling that I wasn’t doing anything about the problem. Now I am doing something about it. Every day, we’re making Discord safer, we’re collaborating with law enforcement, with governments, with coalitions, to help make the internet a safer place. I’m sleeping better…most nights.

Illustration: Thomas Fuchs

The Authors

Ash Spiegelberg

Partner, Dallas

Ash has 20 years of global experience in strategic communications advisory. He is global co-head of the firm’s technology, media and telecoms (TMT) group. Airbnb, Apple, AT&T, Coinbase, Google, Meta, NVIDIA, and Time Warner are among the 300-plus companies he has advised over his career.