Reddit moderators spent years asking for help fighting hate. The company may finally be listening

For years Jefferson Kelley watched hate bloom in his treasured online spaces.

When Kelley, a Reddit moderator, booted hateful users off threads where Black people discussed sensitive personal experiences, racial slurs piled up in his inbox. Crude remarks about women filled the comment sections under his favorite “Star Trek” GIFs. The proliferation of notorious forums, including one that perpetuated a vicious racist stereotype about Black fathers, stung Kelley, a Black father himself.

Kelley and other moderators repeatedly pleaded with the company to back them up and take stronger action against harassment and hate speech. But Reddit never quite came through.

Then, all of a sudden, that seemed to change. When Reddit announced last week it was shutting down a noxious pro-Trump group that had violated the site’s rules for years, Kelley could scarcely believe it.

Reddit’s move to overhaul content policy and ban some 2,000 subreddits, or forums, is one of the most sweeping enforcement actions the company has taken to date. To Kelley and other Black moderators, it was a sign that the company might finally begin real work to stem the flow of harassment and abuse they faced on a daily basis.

The bans — which coincided with a wave of aggressive moves by other large internet platforms including Facebook and YouTube — came after hundreds of Reddit moderators signed a letter urging the company to take racism seriously. It also followed the resignation this month of Alexis Ohanian, one of Reddit’s co-founders, from the company’s board of directors. Ohanian, who said he had been moved by the protests over the killing of George Floyd, asked that his board seat be filled by a Black candidate.

Reddit and other tech companies have long been under fire for allowing false information and discriminatory ideologies to spread on their platforms, and for weak or inconsistent enforcement of policies against hate speech and harassment. Hesitant to provoke backlash from conservative critics and far-right agitators, leaders of these companies have often argued their platforms were neutral grounds akin to public spaces, and pointed to free speech values as reason for their inaction.

But the rapid approach of a presidential election amid a global pandemic and a nationwide movement over Floyd’s killing have created a tipping point in the tech industry. The math has changed, and tech platforms have seemingly, as one journalist quipped, “decided that the grief they’re getting for tolerating hate is more trouble than the grief they’d get for not tolerating hate.”

For moderators, who had spent years trying in vain to get the ear of Reddit’s corporate leaders, the effect of this sudden shift was as if the brick wall they’d been pushing on suddenly transformed into a swinging door.

When Kelley first started lurking on Reddit in 2014, he was there mostly for the “Star Trek” content. After several years participating enthusiastically in the r/StarTrekGIFs forum, he took charge of it, volunteering as an unpaid moderator in 2016. Reddit quickly became core to his social life. Kelley made GIFs and he made friends. He even started recording a podcast, “Beyond Trek,” with the people he met through the forum.

Kelley had always noticed the stream of hate on the platform, but when he began moderating the prominent Black People Twitter subreddit in 2017, the stream turned into a torrent.

Users mockingly labeled a Black student’s admission to Harvard Medical School an affirmative action case and promoted misleading, racist narratives about “black on black crime.”

The forum was supposed to provide respite from racism, so Kelley and its other moderators came up with new rules: Comments would initially be open to all, but if bad-faith remarks piled up, the thread would be put in “Country Club” mode, in which only users the moderators had manually verified could comment. (The name is a tongue-in-cheek reference to the history of Black people being excluded from country clubs.)
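In code, the gate the moderators describe boils down to a per-thread switch plus an allowlist. Here is a minimal Python sketch of that logic; every name and threshold is hypothetical, since Reddit’s actual implementation runs through moderator settings and tooling rather than anything like this:

```python
# Hypothetical sketch of "Country Club" gating. A thread starts open to
# everyone; once bad-faith remarks pile up, it flips to verified-only.
# The threshold and all names here are invented for illustration.

REPORT_THRESHOLD = 10  # assumed cutoff; in practice this is a mod judgment call


class Thread:
    def __init__(self) -> None:
        self.country_club = False  # False = anyone may comment
        self.bad_faith_reports = 0


def record_bad_faith_report(thread: Thread) -> None:
    """Tally a bad-faith remark and lock the thread down if they pile up."""
    thread.bad_faith_reports += 1
    if thread.bad_faith_reports >= REPORT_THRESHOLD:
        thread.country_club = True


def may_comment(username: str, thread: Thread, verified_users: set[str]) -> bool:
    """In Country Club mode, only manually verified users may comment."""
    return not thread.country_club or username in verified_users
```

The point of the design is that the default stays open: manual verification costs moderators effort, so it is imposed only after a thread has already drawn bad-faith traffic.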

Although this tactic succeeded in improving discourse in the forum, Kelley, as its moderator, paid a price. On an average day, he might receive 50 messages containing the N-word or other racist abuse. (With Black Lives Matter protests surging, that number has only increased, he said.)

In its outlines, his story resembles those of legions of other moderators who manage enormous communities on Reddit. Like Kelley, many joined the platform for friends and community only to become disillusioned.

In 2015, moderators shut down more than 265 subreddits to protest the company’s firing of Victoria Taylor, a Reddit employee who had served as a key resource for moderators. The revolt was the culmination of mounting frustration that the company did not appreciate their work or provide proper moderation tools. Company co-founder Ohanian responded at the time, acknowledging the situation was handled poorly and promising to address moderators’ concerns.

@TheYellowRose, a moderator of the subreddit r/blackladies, told The Atlantic that she and her fellow “mods” were harassed in the wake of the 2014 Black Lives Matter protests in Ferguson, Mo. Her team wrote an open letter titled “We have a racist user problem and Reddit won’t take action.” She said the letter, signed by mods overseeing dozens of subreddits, received no response.

Over the years, Reddit occasionally quarantined or banned a few subreddits and tweaked its policies in response to public backlash. Overall, the changes failed to stem the flow of harassment.

It wasn’t even clear Reddit’s leadership considered that a goal. In 2018, when a user asked Reddit CEO Steve Huffman whether “obvious open racism, including slurs,” was against the company’s rules, Huffman said it wasn’t. (He added later that although racism was unwelcome, it wasn’t prohibited.)

It wasn’t until September 2019 that the company, in the course of banning a dozen white nationalist subreddits, more explicitly banned harassment and bullying on the site.

When the Black Lives Matter movement succeeded in mobilizing millions of protesters following Floyd’s death, it unleashed pressure that had been building for years, pushing Reddit users long uncomfortable with the site’s culture to act for the first time and bringing executives to the table.

A moderator of the subreddit r/AgainstHateSubreddits, @DubTeeDub, was angered by what he saw as hypocrisy in Huffman’s somber public note affirming support for Black Lives Matter. Huffman wrote: “We do not tolerate hate, racism, and violence, and while we have work to do to fight these on our platform, our values are clear.”

@DubTeeDub drafted a letter demanding change. Hundreds of moderators, including Kelley, signed the June 8 open letter to Huffman and Reddit’s board. Within a day of its publication, @DubTeeDub received a message from @ggAlex, who introduced himself as Alex Le, the company’s vice president of product.

The introduction led to @DubTeeDub and other moderators of r/AgainstHateSubreddits being invited to a series of Zoom videoconference calls with Reddit’s paid administrators and executives. The sessions were presented as part of the company’s outreach to moderators fighting hate, Black users and other marginalized groups.

The response to the letter was notable because communities one wouldn’t normally expect to offer support did so, said J. Nathan Matias, an assistant professor at Cornell University who studies digital governance and behavior. “The music discussion community, the relationship advice community, the community for talking about swimming — you wouldn’t normally see communities like that as focused on social change,” he said, “so it’s actually a big deal.”

In 2015, when Reddit banned an offensive fat-shaming subreddit, the outcry over censorship from various communities was swift and intense. The front page of Reddit, which features the site’s most-engaged-with content, was plastered nonstop with posts decrying the ban; hundreds of imitation “Fat People Hate” subreddits popped up; and users posted private information about Reddit admins who helped carry out the enforcement.

But it’s clear the culture has changed drastically since then, @DubTeeDub said. “People are getting very tired of being associated with a website that has such a dominating hateful ideology,” he said.

Kelley is part of the shift: Although he has always fought hard to make his own Reddit communities hate-free, Kelley said he had never before devoted time to broader internal efforts.

Kelley joined several of the Zoom calls with Reddit administrators and executives. He’s accumulated a laundry list of ideas for how the company can better support moderators, based on his own experience. One suggestion: erecting more obstacles for users who message mods. It’s not uncommon for someone to create six different Reddit accounts in order to spam a moderator’s inbox over and over. An extra identity-verification step might weed out people acting in bad faith, he said.
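To picture the kind of friction Kelley has in mind, consider a simple gate in front of the mod inbox that turns away messages from brand-new, unverified accounts. This is a hypothetical sketch of the idea only; it is not a feature Reddit has built, and the age threshold is invented for illustration:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical anti-spam gate for the mod inbox; not a real Reddit feature.
MIN_ACCOUNT_AGE = timedelta(days=30)  # invented threshold for illustration


class Account:
    def __init__(self, name: str, created: datetime, verified: bool) -> None:
        self.name = name
        self.created = created  # expected to be timezone-aware (UTC)
        self.verified = verified


def may_message_mods(account: Account) -> bool:
    """Turn away throwaway accounts: too new and unverified means no modmail.

    A spammer cycling through six fresh accounts fails this check on every
    one of them; an established or verified user passes without friction.
    """
    age = datetime.now(timezone.utc) - account.created
    return account.verified or age >= MIN_ACCOUNT_AGE
```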

The “Black Fathers” subreddit provides a glaring example of Reddit’s inaction on racism over the years. The name suggests a space filled with posts by Black men attempting to do their daughters’ hair and other similarly wholesome content, Kelley said. But, in fact, the subreddit was meant as one big racist joke based on the stereotype of absent Black fathers. The moderators who created the subreddit many years ago restricted posting so that the only visible message was, “There doesn’t seem to be anything here.”

r/BlackFathers remained on the site for years. The company quarantined the group in 2015 but didn’t ban it outright until last week’s sweep.

When asked about prolonged inaction on subreddits such as r/BlackFathers, Reddit pointed to a statement by Huffman.

He said that although the company had gotten better at scaling enforcement efforts and measurably reducing hateful experiences year over year, “ultimately, it’s our responsibility to support our communities by taking stronger action against those who try to weaponize parts of Reddit against other people.”

Reddit is not the only internet platform rethinking its responsibility to regulate content.

In late May, Twitter slapped warning labels on tweets by President Trump that made false claims or glorified violence toward protesters, becoming the first company to challenge his pattern of lying and bullying via social media. On June 3, Snapchat said it would no longer promote Trump’s account in the “Discover” tab of the app. On June 18, Facebook removed dozens of ads placed by Trump’s reelection campaign for using Nazi imagery, and a week later the company said it would label or remove politicians’ posts that violated its rules, including posts by Trump.

Then, on the same day Reddit handed down its bans, Amazon-owned streaming service Twitch temporarily suspended Trump’s channel over “hateful conduct,” and YouTube banned half a dozen prominent white supremacist channels, including those of David Duke and Richard Spencer.

Experts say the changes sweeping the industry have likely been triggered by a confluence of advertisers threatening to pull their ad dollars from big companies, negative press, internal pressure by employees, and diminishing public goodwill.

“When you have high-profile current events, all of these levers can be pulled, which enables more significant, drastic changes,” said Kat Lo, a researcher who studies online content moderation at the nonprofit Meedan.

Even with the broader climate finally stacked in their favor, some Reddit moderators are skeptical company executives will follow through on their promises to proactively create a more welcoming space.

The company’s history offers plenty of fodder to those who suspect its current show of interest is little more than lip service. Adrienne Massanari, an associate professor at the University of Illinois at Chicago who studies new media and digital cultures, said Reddit hurt its credibility by giving r/the_donald so many second chances, “more than it ever should have gotten.”

Still, Kelley is optimistic. “Things don’t change overnight,” he said.

Last week, he put in a request to take over the banned r/BlackFathers. A father to three young children, he wants to reclaim it and turn it into a supportive space for people like himself. A Reddit admin he met on one of the Zoom calls assured him he would be handed the reins.

He’s not sure how long it will be before the company gives him access, but he doesn’t mind the wait. It gives him time to reach out to folks he trusts to join the mod team, and help shape the future of the community.
