Under pressure from critics who say Substack profits from newsletters that promote hate speech and racism, the company’s founders said Thursday they would not ban Nazi symbols and extremist rhetoric from the platform.
“I just want to make it clear that we don’t like Nazis either — I hope no one holds those views,” Hamish McKenzie, a co-founder of Substack, said in a statement. “But some people hold those and other extreme views. Because of that, we don’t think censorship (including through demonetizing publications) makes the problem go away — in fact, it makes it worse.”
The response came weeks after The Atlantic found that at least 16 Substack newsletters had “overt Nazi symbols” in their logos or graphics, and that white supremacists were allowed to publish on, and profit from, the platform. Hundreds of newsletter writers signed a letter opposing Substack’s position and threatening to leave; about 100 others signed a letter supporting the company’s stance.
In the statement, Mr. McKenzie said that he and the company’s other founders, Chris Best and Jairaj Sethi, had come to the conclusion that censoring or demonetizing publications would not make the problem of hateful rhetoric go away.
“We believe that supporting individual rights and civil liberties while subjecting ideas to open discourse is the best way to strip bad ideas of their power,” he said.
That stance drew waves of outrage and criticism, including from popular Substack writers who said they didn’t feel comfortable working on a platform that allowed hateful rhetoric to fester or flourish.
The debate has renewed questions that have long plagued technology companies and social media platforms about how content should be moderated, if at all.
Substack, which takes a 10 percent cut of revenue from writers who charge for newsletter subscriptions, has faced similar criticism in the past, particularly after it allowed transphobic and anti-vaccine language from some writers.
Nikki Usher, a professor of communication at the University of San Diego, said many platforms face the so-called “Nazi problem”: if an online forum exists long enough, extremists will eventually show up there.
Substack has established itself as a neutral content provider, Professor Usher said, but it’s also sending a message: “We’re not going to try to police this problem because it’s complicated, so it’s easier not to take a position.”
More than 200 writers who publish newsletters on Substack have signed a letter opposing the company’s passive approach.
“Why would you choose to promote and allow the monetization of sites that traffic in white nationalism?” the letter asked.
The writers also questioned whether giving hateful people, like Richard Spencer, a prominent white nationalist, a platform was part of the company’s vision for success.
“Let us know,” the letter said. “From there we can decide if this is still where we want to go.”
Several popular writers on the platform have already pledged to leave. Rudy Foster, who has more than 40,000 subscribers, wrote on Dec. 14 that readers often say they “can’t afford Substack anymore,” and that he feels the same way.
“So here’s to a 2024 where none of us do it!” he wrote.
Other writers defended the company. A letter signed by about 100 Substack writers argues that it is better to let writers and readers moderate content than to have social media companies do it.
Elle Griffin, who has more than 13,000 subscribers on Substack, wrote in the letter that while “there is a lot of hateful content on the internet,” Substack “has come up with the best solution yet: Giving writers and readers freedom of speech without amplifying that speech to the masses.”
She argued that subscribers receive only the newsletters they sign up for, so they are unlikely to see hateful content unless they opt in. That is not the case with X and Facebook, Ms. Griffin said.
She and other signatories to the letter supporting the company emphasized that Substack is not really one platform, but thousands of individual platforms with unique and curated cultures.
Alexander Hellene, who writes sci-fi and fantasy stories, signed Ms. Griffin’s letter. In a post on Substack, he said a better approach to moderating content is “to take matters into your own hands.”
“Be an adult,” he wrote. “Block people.”
In his statement, Mr. McKenzie, the Substack co-founder, also defended his decision to host Richard Hanania, the president of the Center for the Study of Partisanship and Ideology, on the Substack podcast “The Active Voice.” The Atlantic reported that Mr. Hanania had previously described Black people on social media as “animals” who should be subjected to “more policing, incarceration, and surveillance.”
“Hanania is an influential voice for some in US politics,” Mr. McKenzie wrote, adding that “there is value in knowing his arguments.” He said he was not aware of Mr. Hanania’s writings at the time.
Mr. McKenzie also argued in his statement that censoring ideas considered hateful only spreads them.
But research in recent years suggests the opposite is true.
“Deplatforming appears to have had a positive effect in reducing the spread of far-right propaganda and Nazi content,” said Kurt Braddock, a communications professor at American University who has researched violent extremist groups.
When extremists are removed from one platform, they often move to another, but most of their audience does not follow them, and their profits eventually shrink, Professor Braddock said.
“I can appreciate someone’s dedication to freedom of speech rights, but freedom of speech rights are dictated by the government,” he said, noting that businesses can choose the types of content they host or ban.
While Substack says it does not allow users to call for violence, even that distinction can be murky, Professor Braddock said, because racists and extremists can walk up to that line without overtly crossing it. Their rhetoric can still inspire others to violence, he said.
Allowing Nazi rhetoric on a platform also normalizes it, he said.
“The more they use the kind of rhetoric that invalidates or demonizes a particular population,” Professor Braddock said, “the more OK it becomes for the general population to follow.”