
Platformer: Some thoughts on platforms and Nazis



Sent by Casey Newton of Platformer, a popular technology newsletter:

Quote

On Tuesday, I told subscribers that we are considering leaving the platform based on the company’s recent statement that it would not demonetize or remove openly Nazi accounts. After Jonathan M. Katz’s November article investigating extremism on the platform in The Atlantic, 247 Substack writers published an open letter asking the company to clarify its policies.

A few days later, Substack co-founder Hamish McKenzie responded in a blog post. While the platform would remove publications that are found to make credible threats of violence — a high bar — Substack would otherwise leave them alone, he said. “We don't think that censorship (including through demonetizing publications) makes the problem go away — in fact, it makes it worse,” he wrote. “We believe that supporting individual rights and civil liberties while subjecting ideas to open discourse is the best way to strip bad ideas of their power.”

McKenzie’s perspective — that sunlight is the best disinfectant, and that censorship backfires by making dangerous ideas seem more appealing — is reasonable for many or even most circumstances. It is a point of view that informs policies at many younger, smaller tech platforms, owing both to the techno-libertarian streak that runs through many founders in Silicon Valley and the fact that a hands-off approach to content moderation is easier and less expensive than the alternatives.

There was a time when even Facebook, which has more restrictive policies than Substack does across the board, permitted users to deny the Holocaust. CEO Mark Zuckerberg occasionally cited this policy as evidence of the company’s commitment to free speech, even though it occasionally got him into trouble.

Then, in 2020, Facebook reversed course: going forward, it said, it would remove Holocaust denial from the platform. In doing so, Zuckerberg said, Facebook was seeking to keep pace with the changing times.

Here’s Sheera Frenkel in the New York Times:

In announcing the change, Facebook cited a recent survey that found that nearly a quarter of American adults ages 18 to 39 said they believed the Holocaust either was a myth or was exaggerated, or they weren’t sure whether it happened.

“I’ve struggled with the tension between standing for free expression and the harm caused by minimizing or denying the horror of the Holocaust,” Mr. Zuckerberg wrote in his blog post. “Drawing the right lines between what is and isn’t acceptable speech isn’t straightforward, but with the current state of the world, I believe this is the right balance.”

For a time when memories of the Holocaust were fresh, and anti-Semitism had ebbed, it may have seemed less dangerous to let a few cranks peddle their lies. But as those memories faded, and attacks on Jewish people surged, Facebook felt compelled to revisit those policies.

That brings us to Substack. When it was founded in 2017, Substack offered simple infrastructure for individuals to create and grow their email newsletters. From the start, it promised not to take a heavy hand with content moderation. And because it only offered software, this approach drew little criticism. If you wrote something truly awful in Word, after all, no one would blame Microsoft. Substack benefited similarly from this distance.

Over time, though, the company evolved. It began encouraging individual writers to recommend one another, funneling tens of thousands of subscribers to like-minded people. It started to send out an algorithmically ranked digest of potentially interesting posts to anyone with a Substack account, showcasing new voices from across the network. And in April of this year, the company launched Notes, a text-based social network resembling Twitter that surfaces posts in a ranked feed.

By 2023, in other words, Substack no longer could claim to be the simple infrastructure it once was. It was a platform: a network of users, promoted via a variety of ranked surfaces. The fact that it monetized through subscriptions rather than advertising did not change the fact that, just as social networks have at times offered unwitting support to extremists, Substack was now at risk of doing the same.

And in one key respect, Substack is even more vulnerable to this criticism than social networks had been. Extremists on Facebook, Twitter, and YouTube for the most part had been posting for clout: those platforms made it difficult or even impossible for them to monetize their audiences.

On Substack, on the other hand, extremists can post for money. The pieces are now all in place for an extremist Substack to grow an audience using the platform’s recommendation systems, and monetize that audience via subscriptions. And Substack, as it does with all publications, will get 10 percent of the revenue.

Now, other platforms have developed defenses against their systems being exploited in this way. Some, like Facebook, prevent designated dangerous organizations from creating accounts; they may also ban praise for those organizations, or ban hate speech in general. Others, like YouTube, might allow some speech that makes the company uncomfortable, but bar those accounts from monetizing, or restrict them from appearing in search and recommendations.

Substack doesn’t want to do that. It wants to be seen as a pure infrastructure provider — something like Cloudflare, which seemingly only has to moderate content once every few years. But Cloudflare doesn’t recommend blogs. It does not send out a digest of websites to visit. It doesn’t run a text-based social network, or recommend posts you might like right at the top.

Recommendations might appear on their surface to be innocuous, and in most cases they are. In three years on Substack, I’ve been recommended plenty of boring posts, but no openly Nazi ones. My experience of them has been unobjectionable.

But turning a blind eye to recommended content almost always comes back to bite a platform. It was recommendations on Twitter, Facebook, and YouTube that helped turn Alex Jones from a fringe conspiracy theorist into a juggernaut that could terrorize families out of their homes. It was recommendations that turned QAnon from loopy trolling on 4Chan into a violent national movement. It was recommendations that helped to build the modern anti-vaccine movement.

The moment a platform begins to recommend content is the moment it can no longer claim to be simple software.

It is, of course, exhausting — and expensive — to have to police your platform this way. Some users really do want to censor everyone who disagrees with them, and lobby to remove all of their political opponents from the platform. Finding the real danger and harms in a sea of user reports is tedious, thankless work. And no matter how you choose to moderate content, you’ll make at least some groups of users mad.

At the same time, for the sake of your business, you have to draw a line somewhere.

Some of these lines are quite tricky — do you ban all nudity? What if a mother is breastfeeding?

Others are not. Until Substack, I was not aware of any major US consumer internet platform that stated it would not remove or even demonetize Nazi accounts. Even in a polarized world, there remains broad agreement that the slaughter of 6 million Jews during the Holocaust was an atrocity. The Nazis did not commit the only atrocity in history, but a platform that declines to remove their supporters is telling you something important about itself.

If it won’t remove the Nazis, why should we expect the platform to remove any other harm?

Our readers understand this. During the past couple weeks, dozens of paid subscribers to Platformer have canceled their memberships. “The reason is simple,” one of those readers wrote to us today. “I don't want to fund Nazis. I'm disturbed by a Substack leadership that looks at openly pro-Nazi content and says, ‘We won't de-platform you. In fact, we'll monetize you.’"

I’m proud of the Platformer readership for standing up for their principles in this way. Some of our earliest and best customers are people who work in tech policy, content moderation, and trust and safety. They’ve spent years doing the work, making the hard calls, and cleaning up the internet for all of our mutual benefit. It’s only natural that they would resist spending money on a platform that spurns their profession in this way.

Over the past few days, the Platformer team analyzed dozens of Substacks for pro-Nazi content. Earlier this week, I met with Substack to press my case that they should remove content that praises Nazis from the network. Late today, we submitted a list of accounts that we believe to be in violation of the company’s existing policies against incitement to violence. I am scheduled to meet with the company again tomorrow.

Whatever becomes of those accounts, though, I fully expect that more will spring up in their wake. So long as Substack allows itself to be perceived — encourages itself to be perceived! — as a home for Nazis, they will open accounts here and start selling subscriptions. Why wouldn’t they?

Every platform hosts its share of racists, white nationalists, and other noxious personalities. In some very real sense, there is no escaping them online. But there ought to be ways to see them less; to recommend them less; to fund them less. Other platforms have realized this as they’ve grown up. Here’s hoping Substack does the same.


I am a proponent of free speech: I believe everyone should be allowed to speak freely, and people should not be prosecuted just for their speech. However, I do not interpret free speech in absolute terms. I do not approve of hate speech in the name of free speech, nor of fascism, Nazism, extremism, or terrorism in its name. I wouldn't be comfortable using a platform that allows terrorists and extremists to speak freely.


I am all for free speech, but I absolutely disallow ugly or political speech in my forum. I run a sports community, so there is no value in divisive speech that is off topic. If your members are fighting about something off topic, that is not valuable to your community.

After years of being consistent about it, I hardly ever see a Trump or Biden reference. 


I see hardliners growing in number these days, not just in countries like Afghanistan but even in Europe and America. People are becoming less tolerant: they cannot accept diverse cultures, and they do not respect opposing views. It looks like our society is heading toward anarchy.


I think the question is really how you are going to run your forum. My personal politics are "distinct," but I am fully aware that if I talk about my politics in my forum, I will alienate a portion of my audience. Because my forum is not about politics, it is lose-lose for me to talk about my political views.

In other words, my goal is to keep my audience as large as possible and the interjection of any politics is contrary to that goal. Now if you found my Twitter feed, you would find my full embrace of free speech. 



Not everyone will like your opinion; everyone is free to pick a side, and you need to understand that not everyone can agree on the same thing. You can of course take sides, but you need to be fully aware that people will oppose you and take other sides. If you are respectful of others, other people will also respect your opinion. It is really difficult to balance controversial issues in a community.

