This is juvenile enough that I feel guilty finding it funny, but it’s a good demonstration of the problems with this backlash against content moderation.
link to ‘https://www.techdirt.com/2022/09/26/subreddit-discriminates-against-anyone-who-doesnt-call-texas-gov’
We need to do more work to divorce free speech from content moderation. The world without content moderation would be a much worse world, and we don’t want to live in it. Sure, social media platforms are too powerful, but this is not the answer.
link to ‘Texas has teed up a Supreme Court fight for the future of the internet - The Verge’
I appreciate the way that Masnick uses examples from the news to call out how dumb some of these laws are.
link to ‘Twitter Removes Florida Political Candidate Advocating Shooting Federal Agents; If DeSantis Won His Lawsuit, Twitter Would Need To Leave It Up | Techdirt’
This is why the EFF and others have concerns about overreach even in clearly well-intentioned content moderation. CSAM is despicable, but automated content moderation can make mistakes, and the consequences of those mistakes aren’t small.
link to ‘A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal. - The New York Times’
Masnick makes two good points here: the GOP seems to care about content moderation only in self-serving ways, and we should be wary of political mandates for content moderation.
link to ‘Google Maps Is Misleading Users Searching For Abortion Clinics… And The GOP Is Threatening The Company If It Fixes That | Techdirt’
Here’s the EFF pointing out that “free speech” on these platforms means something very particular rather than a broad, deep commitment to legally-protected expression.
link to ‘Self-Proclaimed Free Speech Platforms Are Censoring Nude Content. Here’s Why You Should Care | Electronic Frontier Foundation’
So, here’s a case where TikTok’s Chinese ownership is actually a really big deal—though, of course, YouTube and other U.S. companies have also been quicker to moderate than to archive material that could be valuable in a similar way.
link to ‘TikTok resists calls to preserve Ukraine content for war crime investigations | Ars Technica’
Good example here of how content moderation can absolutely overreach. Arguments that platforms shouldn’t moderate are nonsense, but I appreciate Masnick’s emphasis on the need to be very careful about how we moderate.
link to ‘Impossibility Theorem Strikes Again: YouTube Deletes January 6th Committee Video | Techdirt’
This is a peak example of what performative concerns about “free speech” boil down to.
link to ‘Trump’s ‘Free Speech’ Social Network, Truth Social, Is Banning People For Truthing The Truth About January 6 Hearings | Techdirt’
Content moderation is a good thing, and ‘free speech’ should not be our primary concern when it comes to social media platforms.
link to ‘Racist and Violent Ideas Jump From Web’s Fringes to Mainstream Sites - The New York Times’
Last week, I was interviewed by a reporter at WEKU about social media and content moderation in the context of the horrific recent shooting in Buffalo, and I was pleased to see the interview appear on the WEKU website this morning.
I wish that the headline didn’t frame this as a question of “free speech”—and that I’d perhaps been more forceful in emphasizing that these really aren’t questions of free speech so much as content moderation.
If QAnon is excited, the rest of us should be worried—though I think there is a possibility that Musk realizes just how bad his ideas re: limiting moderation are and fails to deliver.
link to ‘QAnon Thinks Elon Musk Is Going to Let Them Back On Twitter’
The quotes in here underline how often ‘free speech’ is used to mean ‘problematic right-wing talking points.’
link to ‘Trump says he won’t leave Truth Social, despite Musk’s Twitter takeover - The Verge’
EFF cares about and actually understands free speech and content moderation, so their voice is especially important today.
link to ‘Twitter Has a New Owner. Here’s What He Should Do. | Electronic Frontier Foundation’
Fascinated by this article for so many reasons. First, it’s a great example of how online communities adapt their language in response to automated moderation; second, it brings the discussion back to the need for more, smaller platforms.
link to ‘Of ‘Algospeak’ and the Crudeness of Automated Moderation | by Clive Thompson | Apr, 2022 | OneZero’
I have only been reading Techdirt for a short time, but I increasingly appreciate Masnick’s perspectives on issues like this.
link to ‘Elon Musk Demonstrates How Little He Understands About Content Moderation | Techdirt’
Really appreciate Masnick’s perspective here—especially the point that EVERYONE believes in content moderation even if there are disagreements on how to do it. It’s irresponsible for so many (on the right) to describe moderation as censorship.
link to ‘Why Moderating Content Actually Does More To Support The Principles Of Free Speech | Techdirt’
Glad to see reporting on Rumble, but disappointed to see uncritical repeating of claims about “free speech,” “neutrality,” and “censorship.” There are no neutral platforms, and content moderation is the real key idea here.
link to ‘Rumble, the Right’s Go-To Video Site, Has Much Bigger Ambitions - The New York Times’
I missed most of this yesterday, but Masnick sums up my thoughts so much better than I could.
link to ‘Performative Conservatives Are Mad That A Search Engine Wants To Downrank Disinformation | Techdirt’
Even if Spotify could demonstrate it isn’t a publisher here, platforms don’t get a free pass on content. Besides, exclusive podcast platforms run counter to the open nature of podcasting, so Spotify’s push to dominate the medium is just as troubling as the costs it’s willing to pay to do so.
link to ‘Spotify CEO Daniel Ek defends Joe Rogan deal in tense company town hall - The Verge’
I have not read (and do not care to read) much about the Spotify thing, but podcasts are meant to be a platformless, open medium, one of the few left on the web. Being one of the last bastions of the open internet evidently comes at a cost, though: so long as Apple and Spotify are trying to corner the podcast market, they absolutely take on the responsibility of moderating their content.
link to ‘Election Falsehoods Surged on Podcasts Before Capitol Riots, Researchers Find - The New York Times’
There are clear cases where platforms need to be moderating more content, but let’s not forget the seemingly well-intentioned but overreaching cases either.
link to ‘Tumblr goes overboard censoring tags on iOS to comply with Apple’s guidelines - The Verge’
Interesting article. I’m particularly interested in the idea of focusing on algorithms rather than content.
link to ‘Facebook whistleblower hearing: Frances Haugen finally got Republicans to stop yapping about anti-conservative bias.’