OT The 1996 Law that Ruined the Internet

In 1996, Congress passed the Communications Decency Act, a law meant to crack down on digital smut. From a decency perspective, the legal standard that had emerged from the CompuServe and Prodigy lawsuits seemed, well, perverse. Prodigy was liable because it had tried to do the right thing; CompuServe was immune because it had not. So Section 230 of the act stipulated that providers of internet forums would not be liable for user-posted speech, even if they selectively censored some material.


Much of the Communications Decency Act was quickly struck down by the Supreme Court, but Section 230 survived. It quietly reshaped our world. Courts interpreted the law as giving internet services a so-called safe harbor from liability for almost anything involving user-generated material. The Electronic Frontier Foundation describes Section 230 as “one of the most valuable tools for protecting freedom of expression and innovation on the Internet.” The internet predated the law. Yet the legal scholar Jeff Kosseff describes the core of Section 230 as “the twenty-six words that created the internet,” because without it, the firms that dominate the internet as we have come to know it could not exist. Maybe that would be a good thing.

Services such as Facebook, Twitter, and YouTube are not mere distributors. They make choices that shape what we see. Some posts are circulated widely. Others are suppressed. We are surveilled, and ads are targeted into our feeds. Without Section 230 protections, these firms would be publishers, liable for all the obscenity, defamation, threats, and abuse that the worst of their users might post. They would face a bitter, perhaps existential, dilemma. They are advertising businesses, reliant on reader clicks. A moderation algorithm that erred on the side of a lawyer’s caution would catch too much in its net, leaving posters angrily muzzled and readers with nothing more provocative than cat pics to scroll through. An algorithm that erred the other way would open a floodgate of lawsuits.

https://www.theatlantic.com/ideas/archive/2021/01/trump-fighting-section-230-wrong-reason/617497/

One of the few things I agree with @realDonaldTrump on is that we need to remove Section 230 from the US Code. Where I disagree with him is on the reasoning for why we need to remove it. I have thought this for quite a long time, and I remember discussing the merits of this law years ago with my brother, who went to law school. I highly recommend reading this article, as I think it does a very good job articulating the problem with Section 230 and the ramifications this law has had on American society.
 
Sounds like throwing the baby out with the bath water?

I wouldn't say that--the section that survived the courts basically removes accountability from companies for how their platforms are used. Newspapers weren't allowed to claim that they just own the paper but they have no control over what's published within it, and I don't think the digital equivalents--the companies that basically ended newspapers, so thoroughly they replaced them--should be able to claim it.
 

Basically yes, this is in essence what the article is stating. More than ten years ago I was having this conversation with my brother about Facebook, and about why it, along with Twitter, was allowed to get away with not moderating the content on its site. I remember how ridiculous the stuff being posted about Obama was in the lead-up to the 2008 election, and being in disbelief that these companies could allow obviously false things, like the birther conspiracy theory crap, to be posted and not be sued. At the time we were talking about how that kind of conspiracy peddling would be bad for democracy, a slippery slope that would continue to embolden hate speech, lies, conspiracy theories, and damaging rhetoric. Man, at the time I had no idea how prescient that conversation was.
 
But newspapers were in charge of their content and creators therein. Social media is exactly that, society. Where freedom of speech is protected. You talk about slippery slopes, moderating speech in the public sphere.
 

Not quite. Did you read the article? The general premise is that there would no longer be protections for the publishers of your content. They would assume the financial liability for anything any of their users posted that could be defamatory. This is not like a Chinese-style great firewall, where millions of content censors work directly for the government. This just puts the onus back where it belongs: on the social media companies, to help mitigate some of the worst aspects of the internet.
 
Just to be clear, anyone could still post whatever they want, but FB and Twitter would be at risk of a defamation suit if they didn't moderate it. Think of how this would fundamentally change social media, and I would argue for the better. If you wanted to publish some lunatic crap, you would be more than welcome to do it; you would simply need to create your own blog, and you would then be the one accepting the risk of liability for whatever you post. This would greatly curtail the hate speech, hate mongering, and general deterioration of civil discourse.
 
But newspapers were in charge of their content and creators therein. Social media is exactly that, society. Where freedom of speech is protected.

"Freedom of speech" means freedom from the government censoring speech, which is protected in newspapers too. This isn't a free speech issue. Saying social media is "society" because it has "social" in the name doesn't make any sense. If newspapers had called themselves "social papers" and said they weren't responsible for anything printed in it, including the stuff readers write in, would you consider that fine?

"Social media" is not society. It's big corporations, just like major news agencies are/were. The corporations own the platforms and profit from them, it's not people talking on the street. Granting them immunity from anything published in their owned and operated platforms is simply a huge give-away to corporate interests.
 
"Freedom of speech" means freedom from the government censoring speech, which is protected in newspapers too. This isn't a free speech issue. Saying social media is "society" because it has "social" in the name doesn't make any sense. If newspapers had called themselves "social papers" and said they weren't responsible for anything printed in it, including the stuff readers write in, would you consider that fine?

"Social media" is not society.

What readers write in to a paper is only published at the paper's behest. Content on social media is created primarily by the users, not the platforms or their employees.

If there was ever a time when it's evident that society and social media are one and the same, it's the last ten months. Technology is the new street; the two of us conversing right now is a prime example.
 
I wouldn't say that--the section that survived the courts basically removes accountability from companies for how their platforms are used. Newspapers weren't allowed to claim that they just own the paper but they have no control over what's published within it, and I don't think the digital equivalents--the companies that basically ended newspapers, so thoroughly they replaced them--should be able to claim it.
Then get ready for forums like this to shut down, since the company that runs the server will be sued out of existence.
 
The issue with this is: where do you stop the accountability? At the hosting provider for private blogs? At the network provider for people who host their own servers?

This is like holding the loggers who supplied the raw material for the paper responsible for newspaper articles, or holding the power company responsible because someone tacked a defamatory note to one of its wooden poles. Once it's the general public doing the conversing, regulating the technology providers becomes a problem.
 
The phone company should be liable for what anyone says on the phone. The phone company isn't just a provider. It's the creator of its content. Therefore, like message board owners and moderators, phone company managers should be in jail.
 