Substack CEO Chris Best Doesn’t Realize He’s Just Become The Nazi Bar

from the just-fucking-own-it dept

I get it. I totally get it. Every tech dude comes along and has this thought: “hey, we’ll be the free speech social media site. We won’t do any moderation beyond what’s required.” Even Twitter initially thought this. But then everyone discovers reality. Some discover it faster than others, but everyone discovers it. First, you realize that there’s spam. Or illegal content such as child sexual abuse material. And if that doesn’t do it for you, the copyright police will.

But, then you realize that beyond spam and content that breaks the rules, you end up with malicious users who cause trouble. And trouble drives away users, advertisers, or both. And if you don’t deal with the malicious users, the malicious users define you. It’s the “oh shit, this is a Nazi bar now” problem.

And, look, sure, in the US, you can run the Nazi bar, thanks to the 1st Amendment. But running a Nazi bar is not winning any free speech awards. It’s not standing up for free speech. It’s building your own brand as the Nazi bar, and abdicating your own free speech right of association: the right to kick Nazis out of your private property and to craft a different kind of community. Let the Nazis build their own bar, or everyone will just assume you’re a Nazi too.

It was understandable a decade ago, before the idea of “trust & safety” was a thing, that not everyone would understand all this. But it is unacceptable for the CEO of a social media site today to not realize this.

Enter Substack CEO Chris Best.

Substack has faced a few controversies regarding the content moderation (or lack thereof) for its main service, which allows writers to create blogs with subscription services built in. I had been a fan of the service since it launched (and had actually spoken with one of the founders pre-launch to discuss the company’s plans, and even whether or not we could do something with them as Techdirt), as I think it’s been incredibly powerful as a tool for independent media. But, the exec team there often seems to have taken a “head in sand” approach to understanding any of this.

That became ridiculously clear on Thursday when Chris Best went on Nilay Patel’s Decoder podcast at the Verge to talk about Substack’s new Notes product, which everyone is (fairly or not) comparing to Twitter. Best had to know that content moderation questions were coming, but seemed not just unprepared for them, but completely out of his depth.

This clip is damning. Chris trying to stare down Nilay just doesn’t work.

The larger discussion is worth listening to, or reading below. As Nilay notes in his commentary on the transcript, he feels that there should be much less moderation the closer you get to being an infrastructure provider (this is something I not only agree with, but have spent a lot of time discussing). Substack has long argued that its more hands-off approach in providing its platform to writers is because it’s more like infrastructure.

But the Notes feature takes the company closer to consumer facing social media, and so Nilay had some good questions about that, which Chris just refused to engage with. Here’s the full context that provides more than just the video above. The bold text is Nilay and the non-bold is Chris:

Notes is the most consumer-y feature. You’re saying it’s inheriting a bunch of expectations from the consumer social platforms, whether or not you really want it to, right? It’s inheriting the expectations of Twitter, even from Twitter itself. It’s inheriting the expectations that you should be able to flirt with people and not have to subscribe to their email lists.

In that spectrum of content moderation, it’s the tip of the spear. The expectations are that you will moderate that thing just like any big social platform will moderate. Up until now, you’ve had the out of being able to say, “Look, we are an enterprise software provider. If people don’t want to pay for this newsletter that’s full of anti-vax information, fine. If people don’t want to pay or subscribe to this newsletter where somebody has harsh views on trans people, fine.” That’s the choice. The market will do it. And because you’re the enterprise software provider, you’ve had some cover. When you run a social network that inherits all the expectations of a social network and people start posting that stuff and the feed is algorithmic and that’s what gets engagement, that’s a real problem for you. Have you thought about how you’re going to moderate Notes?

We think about this stuff a lot, you might be surprised to learn.

I know you do, but this is a very different product.

Here’s how I think about this: Substack is neither an enterprise software provider nor a social network in the mold that we’re used to experiencing them. Our self-conception, the thing that we are attempting to build, and I think if you look at the constituent pieces, in fact, the emerging reality is that we are a new thing called the subscription network, where people are subscribing directly to others, where the order in the system is sort of emergent from the empowered — not just the readers but also the writers: the people who are able to set the rules for their communities, for their piece of Substack. And we believe that we can make something different and better than what came before with social networking.

The way that I think about this is, if we draw a distinction between moderation and censorship, where moderation is, “Hey, I want to be a part of a community, of a place where there’s a vibe or there’s a set of rules or there’s a set of norms or there’s an expectation of what I’m going to see or not see that is good for me, and the thing that I’m coming to is going to try to enforce that set of rules,” versus censorship, where you come and say, “Although you may want to be a part of this thing and this other person may want to be a part of it, too, and you may want to talk to each other and send emails, a third party’s going to step in and say, ‘You shall not do that. We shall prevent that.’”

And I think, with the legacy social networks, the business model has pulled those feeds ever closer. There hasn’t been a great idea for how we do moderation without censorship, and I think, in a subscription network, that becomes possible.

Wow. I mean, I just want to be clear, if somebody shows up on Substack and says “all brown people are animals and they shouldn’t be allowed in America,” you’re going to censor that. That’s just flatly against your terms of service.

So, we do have a terms of service that have narrowly prescribed things that are not allowed.

That one I’m pretty sure is just flatly against your terms of service. You would not allow that one. That’s why I picked it.

So there are extreme cases, and I’m not going to get into the–

Wait. Hold on. In America in 2023, that is not so extreme, right? “We should not allow as many brown people in the country.” Not so extreme. Do you allow that on Substack? Would you allow that on Substack Notes?

I think the way that we think about this is we want to put the writers and the readers in charge–

No, I really want you to answer that question. Is that allowed on Substack Notes? “We should not allow brown people in the country.”

I’m not going to get into gotcha content moderation.

This is not a gotcha… I’m a brown person. Do you think people on Substack should say I should get kicked out of the country?

I’m not going to engage in content moderation, “Would you or won’t you this or that?”

That one is black and white, and I just want to be clear: I’ve talked to a lot of social network CEOs, and they would have no hesitation telling me that that was against their moderation rules.

Yeah. We’re not going to get into specific “would you or won’t you” content moderation questions.

Why?

I don’t think it’s a useful way to talk about this stuff.

But it’s the thing that you have to do. I mean, you have to make these decisions, don’t you?

The way that we think about this is, yes, there is going to be a terms of service. We have content policies that are deliberately tuned to allow lots of things that we disagree with, that we strongly disagree with. We think we have a strong commitment to freedom of speech, freedom of the press. We think these are essential ingredients in a free society. We think that it would be a failure for us to build a new kind of network that can’t support those ideals. And we want to design the network in a way where people are in control of their experience, where they’re able to do that stuff. We’re at the very early innings of that. We don’t have all the answers for how those things will work. We are making a new thing. And literally, we launched this thing one day ago. We’re going to have to figure a lot of this stuff out. I don’t think…

You have to figure out, “Should we allow overt racism on Substack Notes?” You have to figure that out.

No, I’m not going to engage in speculation or specific “would you allow this or that” content.

You know this is a very bad response to this question, right? You’re aware that you’ve blundered into this. You should just say no. And I’m wondering what’s keeping you from just saying no.

I have a blanket [policy that] I don’t think it’s useful to get into “would you allow this or that thing on Substack.”

If I read you your own terms of service, will you agree that this prohibition is in that terms of service?

I don’t think that’s a useful exercise.

Okay. I’m granting you the out that when you’re the email service provider, you should have a looser moderation rule. There are a lot of my listeners and a lot of people out there who do not agree with me on that. I’ll give you the out that, as the email service provider, you can have looser moderation rules because that is sort of a market-driven thing, but when you make the consumer product, my belief is that you should have higher moderation rules. And so, I’m just wondering, applying the blanket, I understand why that was your answer in the past. It’s just there’s a piece here that I’m missing. Now that it’s the consumer product, do you not think that it should have a different set of moderation standards?

You are free to have that belief. And I do think it’s possible that there will be different moderation standards. I do think it’s an interesting thing. I think the place that we maybe differ is you’re coming at this from a point where you think that because something is bad… let’s grant that this thing is a terrible, bad thing…

Yeah, I think you should grant that this idea is bad.

That therefore censorship of it is the most effective tool to prevent that. And I think we’ve run, in my estimation over the past five years, however long it’s been, a grand experiment in the idea that pervasive censorship successfully combats ideas that the owners of the platforms don’t like. And my read is that that hasn’t actually worked. That hasn’t been a success. It hasn’t caused those ideas not to exist. It hasn’t built trust. It hasn’t ended polarization. It hasn’t done any of those things. And I don’t think that taking the approach that the legacy platforms have taken and expecting it to have different outcomes is obviously the right answer the way that you seem to be presenting it to be. I don’t think that that’s a question of whether some particular objection or belief is right or wrong.

I understand the philosophical argument. I want to be clear. I think government speech regulations are horrible, right? I think that’s bad. I don’t think there should be government censorship in this country, but I think companies should state their values and go out into the marketplace and live up to their values. I think the platform companies, for better or worse, have missed it on their values a lot for a variety of reasons. When I ask you this question, [I’m asking], “Do you make software to spread abhorrent views, that allows abhorrent views to spread?” That’s just a statement of values. That’s why you have terms of service. I know that there’s stuff that you won’t allow Substack to be used for because I can read it in your terms of service. Here, I’m asking you something that I know is against your terms of service, and your position is that you refuse to say it’s against your terms of service. That feels like not a big philosophical conversation about freedom of speech, which I will have at the drop of a hat, as listeners to this show know. Actually, you’re saying, “You know what? I don’t want to state my values.” And I’m just wondering why that is.

I think the conversation about freedom of speech is the essential conversation to have. I don’t think this “let me play a gotcha and ask this or that”–

Substack is not the government. Substack is a company that competes in the marketplace.

Substack is not the government, but we still believe that it’s essential to promote freedom of the press and freedom of speech. We don’t think that that is a thing that’s limited to…

So if Substack Notes becomes overrun by racism and transphobia, that’s fine with you?

We’re going to have to work very hard to make Substack Notes be a great place to have the readers and the writers be in charge, where you can have the kinds of conversations that you find valuable. That’s the exciting challenge that we have ahead of us.

I get the academic aspect of where Chris is coming from. He’s correct that content moderation hasn’t made crazy ideas go away. These are the reasons I coined the Streisand Effect years ago, to point out the futility of just trying to stifle speech. And these are the reasons I talk about “protocols, not platforms” as a way to explore enabling more speech without centralized systems that suppress speech.

But Substack is a centralized system. And a centralized system that doesn’t do trust & safety… is the Nazi bar. And if you have some other system that you think allows for “moderation but not censorship” then be fucking explicit about what it is. There are all sorts of interventions short of removing content that have been shown to work well (though, with other social media, they still get accused of “censorship” for literally expressing more speech). But the details matter. A lot.

I get that he thinks his focus is on providing tools, but even so, two things stand out: (1) he’s wrong about how all this works, and (2) even if he believes that Substack doesn’t need to moderate, he has to own that in the interview rather than claiming that Nilay is playing gotcha with him.

If you’re not going to moderate, and you don’t care that the biggest draws on your platform are pure nonsense peddlers preying on the most gullible people to get their subscriptions, fucking own it, Chris.

Say it. Say that you’re the Nazi bar and you’re proud of it.

Say “we believe that writers on our platform can publish anything they want, no matter how ridiculous, or hateful, or wrong.” Don’t hide from the question. You claim you’re enabling free speech, so own it. Don’t hide behind some lofty goals about “freedom of the press” when you’re really enabling “freedom of the grifters.”

You have every right to allow that on your platform. But the whole point of everyone eventually coming to terms with the content moderation learning curve, and the fact that private businesses are private and not the government, is that what you allow on your platform is what sticks to you. It’s your reputation at play.

And your reputation when you refuse to moderate is not “the grand enabler of free speech.” Because it’s the internet itself that is the grand enabler of free speech. When you’re a private centralized company and you don’t deal with hateful content on your site, you’re the Nazi bar.

Most companies that want to get large recognize that playing to the grifters and the nonsense peddlers works only for a limited amount of time, before you get the Nazi bar reputation and your growth stalls. And, yes, in the US, you’re legally allowed to become the Nazi bar, but you should at least embrace that, rather than pretend you have some grand principled strategy.

This is what Nilay was getting at. When you’re not the government, you can set whatever rules you want, and the rules you set are the rules that will define what you are as a service. Chris Best wants to pretend that Substack isn’t the Nazi bar, while he’s eagerly making it clear that it is.

It’s stupidly short-sighted, and no, it won’t support free speech. Because people who don’t want to hang out at the Nazi bar will just go elsewhere.


