Both sides agree that changes need to be made to Section 230, and they both propose similar changes, but for completely opposite reasons. This is a very delicate situation where minor differences in phrasing can produce the exact opposite result.
In short:
Conservatives want websites that heavily moderate their content to be treated as publishers, meaning they share legal liability for anything posted on their site.
Liberals want all websites to be treated as publishers, meaning they share legal liability for anything posted on their site.
The former would encourage many big websites to reduce their moderation in order to stay legally covered, while protecting small websites that let users post as they please. Some of the bigger websites might instead choose to moderate/censor more heavily, accepting legal liability but censoring so aggressively that it would be hard to bring a case against them.
The latter would force all websites to moderate/censor heavily to avoid having a case brought against them. In practice, the government would likely put out strict guidelines you'd have to follow to "earn" your liability protection. Because the government wouldn't outright be passing a law that says you must follow these guidelines, it would likely scoot right past the First Amendment. It's a way to force you without really "forcing" you.
The main difference will be what constitutes "earning" your liability protection. Per the spirit of the law, it absolutely should be the conservative version: if you don't censor, you're taking your hands off and letting people post as they please. If you do aggressively censor, then when people see content get posted, they practically assume it was vetted first, which means you're co-opting it, so you share some liability if the content is illegal.
But the liberal way would set arbitrary standards that have nothing to do with being a publisher versus a platform. Instead, it would simply be used as a tool to make things that aren't illegal, like "misinformation," de facto illegal.
It's not so much a gray area that needs to be solved as something that needs to be determined on a case-by-case basis. It's nuanced enough that it takes a jury to decide. Many laws work like this.
The litmus test should be this (using libel as an example): if a reasonable, sensible person would get a greater impression of truth from a statement appearing on your website than from it being published directly on the author's own website, then you should share part of the blame.
That is: on early Facebook, no sensible person would believe a statement was more likely to be true just because it was on Facebook. But on modern Facebook, if a sensible person sees something a bit "spicier" posted and widely shared, something that would normally be quickly "fact checked" or outright removed, and it has not been fact checked, it could be reasonable to believe the statement is so true that even Facebook couldn't find anything fallacious about it. You could easily end up believing something you otherwise wouldn't, simply because it's uncensored on Facebook.