Content Moderation
-
Infrastructure companies need a distinct approach to moderation that focuses on neutrality and due process.
-
Twitter went too far last week for reasons that go back to 2016 and the unfair blaming of tech for media’s mistakes.
-
The question of what should be moderated, and when, is an increasingly frequent one in tech. There is no bright line, but there are ways to get closer to an answer.
-
Another Congressional Hearing, The Genesis of Section 230, The Battle over Section 230
Section 230, which shields Internet companies from liability, is getting more attention; the only attention it deserves is as a model for other regulations.
-
Spotify’s New Hate Policy, Twitter’s Behavior Policy, YouTube Music and YouTube Premium
Spotify’s new hate policy and Twitter’s behavior policy seem like good things at first glance, but what they suggest about the companies’ power is worrisome. Plus, YouTube’s subscription plans are as confusing as ever.
-
Facebook Content Guidelines, Facebook Video, Amazon Prime Video on Apple TV
Facebook faces a daunting challenge when it comes to policing content, but it is a challenge the company brought on itself. Then, Facebook’s video tab is competing against YouTube, not Amazon or Netflix, and business models explain why — and probably explain the Amazon-Apple truce.
-
Tech Morality, Facebook and the BBC, Wikileaks’ CIA Trove
The analysis of technology cannot escape questions of morality, and certainty is dangerous. Plus, the CIA leak, and the difference between exploits and encryption.
-
Fake News and Facebook, Filter Bubbles and People, Google’s Featured Snippets Problem
Research says truly fake news isn’t much of a problem; filter bubbles are, but algorithms bear less responsibility than it might seem. That, though, is why Google in particular has a responsibility to do better.
-
Fake News
Facebook is under fire for fake news and filter bubbles; they are a problem, but most of the proposed solutions are far worse.
-
Curation and Algorithms
More and more companies are announcing new products based on human curation, even as the most important content players — Google and Facebook — rely on algorithms. When does curation make sense, and when are algorithms better? And ultimately, who is responsible for both?
