this post was submitted on 05 Nov 2023
264 points (96.5% liked)

Technology


It's not just about facts: Democrats and Republicans have sharply different attitudes about removing misinformation from social media

One person's content moderation is another's censorship when it comes to Democrats' and Republicans' views on handling misinformation.

top 50 comments
[–] lolcatnip@reddthat.com 40 points 1 year ago* (last edited 1 year ago) (1 children)

Democrats and Republicans have sharply different attitudes about whether disinformation is desirable.

[–] nilloc@discuss.tchncs.de 10 points 1 year ago (1 children)

It benefits Republicans, so of course the side it benefits wants to keep it around.

[–] Blamemeta@lemm.ee 21 points 1 year ago (10 children)

Part of the problem is who decides what is misinformation. As soon as the state gets to decide what is and isn't true, and thus what can and cannot be said, you no longer have free speech.

[–] scarabic@lemmy.world 54 points 1 year ago* (last edited 1 year ago) (1 children)

The state deciding on speech is a red line, yes, but that's not even on the table here. This is about social media moderation. It seems suspiciously disingenuous to bring the state up here.

OP: Thread about social media moderation

You: The state deciding what’s true is the death of free speech!

Actually your comment is one of the big problems in this debate. People can’t tell the difference between a private social media firm moderating hate content and the government taking away their freedom of speech. You just slurred the two together yourself by bringing this up here.

[–] bamboo@lemm.ee 4 points 1 year ago (3 children)

Centralized for-profit companies policing speech doesn’t really solve free speech concerns. It doesn’t violate the US first amendment, but corporate-approved speech isn’t really free speech either. No person or organization is really suitable to be the arbiter of truth, but at the same time unmoderated misinformation presents its own problems.

[–] scarabic@lemmy.world 6 points 1 year ago (6 children)

Yes it solves it. Companies are not required to carry your voice around the world, which is what their platforms do. Stop equating guaranteed amplification with your freedom of speech. It’s wrong and dumb. I’ve lived in countries that actually restrict speech and whatever the Facebook mod did to you is NOTHING. The only reason Americans even fall into this stupid way of thinking is because their speech is so free. When your speech has never truly been restricted you have no idea what that freedom even means.

[–] Pxtl@lemmy.ca 3 points 1 year ago

No person or organization is really suitable to be the arbiter of truth

Courtrooms are arbiters of truth literally all the time. There are plenty of laws for which truth is a defence, and dishonesty is punished.

When battling misinformation, the problem is not that lying on the internet is legal - it is still actionable. Fraud is still illegal. False or misleading advertisements are still illegal. Defamation is still illegal. Perjury is illegal in the criminal law sense, not just torts. Ask Martha Stewart who the "arbiter of truth" is.

The problem is that it's functionally impossible to enforce on the scale of social media. If 50,000 people call you a pedophile because it became a meme even though it was completely untrue, and this costs you your job and you start getting death threats, what are you going to do about that? Sue them all?

So we throw up our hands and let corporations handle it through abuse policies, because the actual law is unworkable - it's "this is illegal but enforcing it is so impractical that it's legal". Twitter and Facebook don't have to deal with that crap so we let them do a vague implementation of the law but without the whole "due process" thing and all the justice they can mete out is bans.

If you disagree, then I've got a Nigerian prince who'd like to get your banking info, and also you're all cannibals.

[–] echo64@lemmy.world 29 points 1 year ago* (last edited 1 year ago) (1 children)

You do not have free speech on social media today, private platforms decide what they want to have.

The state does not have to be the one to decide these things, nor is it a case of "deciding" what is true, we have a long history of using proofs to solidify something as fact, or propaganda, or somewhere in between. This is functionally what history studies are about.

[–] Blamemeta@lemm.ee 8 points 1 year ago (12 children)

That brings up another thing. At what point does it become a "public space"?

There's an old Supreme Court case about a company town that claimed someone was trespassing on a sidewalk. The Supreme Court ruled it was a public space, and thus the person could pass out leaflets there.

https://firstamendment.mtsu.edu/article/marsh-v-alabama-1946/

Imo, a lot of big sites have gotten to that stage, and should be treated as such.

[–] Lith@lemmy.sdf.org 13 points 1 year ago (4 children)

I think this is an underrated point. A lot of people are quick to say "private companies aren't covered by free speech", but I'm sure everyone agrees legal ≠ moral. We rely on these platforms so much that they've effectively become our public squares. Our government even uses them in official capacities, e.g. the president announcing things on Twitter.

When being censored on a private platform is effectively social and informational murder, I think it's time for us to revisit our centuries-old definitions. Whether you agree or disagree that these instances should be covered by free speech laws, this is becoming an important discussion that I never see brought up, but instead I keep seeing the same bad faith argument that companies are allowed to do this because they're allowed to do it.

[–] gregorum@lemm.ee 13 points 1 year ago (6 children)

This is an argument for a publicly-funded “digital public square”, not an argument for stripping private companies of their rights.

[–] wizardbeard@lemmy.dbzer0.com 8 points 1 year ago (2 children)

Why not both?

While I agree that punishing companies for success isn't a good idea, we aren't talking about small startups or local businesses run by individual entrepreneurs or members of the community here. We're talking about absurdly huge corporations with reach and influence of a kind few businesses ever attain. I don't think it's unreasonable to apply a different set of rules to them, as they are distinctly different situations.

[–] ChairmanMeow@programming.dev 7 points 1 year ago

It's different because the company built and maintains the space. Same goes for a concert hall, a pub, etc...

Nobody believes that someone being thrown out of a pub for spouting Nazistic hate speech is their "free speech being trampled". Why should it be any different if it's a website?

You rarely see the discussion, because there's rarely a good argument here. It boils down to "it's a big website, so I should be allowed to post whatever I want there", which makes little to no sense and opens up a massive quagmire of legal issues.

[–] TrickDacy@lemmy.world 2 points 1 year ago (1 children)

bad faith argument that companies are allowed to do this because they're allowed to do it.

So let's get this straight, it's "bad faith" to point out facts but "good faith" to support bigotry and hatred like you're "accidentally" doing with your argument?

[–] Lith@lemmy.sdf.org 4 points 1 year ago (5 children)

It's bad faith to argue that companies should be allowed to do things because they're already allowed to do those things. I see a little bit of that creeping in even here with the concept of "rights", as if corporations were humans. Laws can change.

It's good faith to ask if companies have too much power over what has become our default mode of communication. It's also good faith to challenge this question with non-circular logic.

Your assumption that I'm defending racism and bigotry is exactly why I think this stuff is important. You've implied I'm an insidious alt-rightist trying to dog whistle, and now I'm terrified of getting banned or otherwise censored. I'm interested in expressing myself. I do not want to express bigotry. But if one person decides what I said is even linked to bigotry, suddenly I'm a target, and I can lose a decades-old social account and all of its connections. And if that happens I just have to accept it because it's currently legal. It's so fucking stressful to say anything online anymore.

[–] SexyTimeSasquatch@lemmy.world 5 points 1 year ago (2 children)

There is a key difference here. Social media companies have some liability with what gets shared on the platform. They also have a financial interest in what gets said and how it gets promoted by algorithms. The fact is, these are not public spaces. These are not streets. They're more akin to newspapers, or really the people printing and publishing leaflets. The Internet itself is the street in your analogy.

[–] puppy@lemmy.world 5 points 1 year ago (2 children)

Your newspaper analogy isn't accurate either. The writers of a newspaper are paid by the company, and everyone knows the writers execute the newspaper's agenda. Nothing gets published without review, and everything aligns with the company's vision. Information flows one way: readers buy the paper to consume it. They don't expect their voice to be heard, and the newspaper doesn't pretend readers have that ability either. This isn't comparable to a social media site at all.

[–] BellaDonna@mujico.org 3 points 1 year ago

Companies probably shouldn't be liable, then, for what individuals share/post; the individuals themselves should be. Social media platforms already control the push/promotion of posts, using algorithms to decide what gets shown/shared and when.

I hate this so much. I want real, linear feeds from all the friends I'm following, not a personally curated, sanitized feed tuned to my interests and sensibilities.

[–] Corgana@startrek.website 25 points 1 year ago (3 children)

Nobody (besides maybe extreme conservatives) is advocating for "the state" to decide what "is and isn't true". That's not what this is about.

Furthermore, "misinformation" and "disinformation" can refer to things that are true! Propagandists don't always need to invent false facts in order to use facts in deceptive ways. To suggest that the government should stay out of the matter unless it wields a perfectly foolproof fact-o-meter is, IMO, shortsighted. "The state" makes policy decisions with imperfect facts all the time.

[–] dhork@lemmy.world 18 points 1 year ago (3 children)

Except there have always been limits on speech, centered mainly on truth. Your freedom of speech doesn't extend to yelling "Fire" in a crowded theater when there is no fire, for instance.

But we live in an age of alternative facts now, where science isn't trusted if it comes up with conclusions that conflict with your world view. Do you get a pass if you are yelling "Fire" because you are certain there are cell phone jammers in the theater that are setting your brain on fire because you got the COVID shot and now the 5G nanoparticles can't transmit back to Fauci's mind control lair?

[–] FireTower@lemmy.world 7 points 1 year ago (2 children)

Do you get a pass if you are yelling "Fire" because you are certain there are cell phone jammers in the theater that are setting your brain on fire

Yes. Anyone in good faith attempting to warn others of any potential harm that they believe to be true to the best of their abilities should have their speech protected.

[–] dhork@lemmy.world 10 points 1 year ago* (last edited 1 year ago) (6 children)

Anyone in good faith attempting to warn others of any potential harm that they believe to be true to the best of their abilities

But what if their beliefs are verifiably false? I don't mean that in a sense of a religious belief, which cannot be proven and must be taken on faith. I mean that the facts are clear that there are no 5G nanoparticles in the vaccine for cell phone jammers to interfere with in the first place. That isn't even a thing.

It's one thing to tolerate different opinions in public. It's another thing entirely to present as true things that can be objectively disproven, just because you've tied them to a political movement. Can that really still be considered good faith?


I wrote a comment about this earlier today. People who have been brainwashed to believe total nonsense often act in ways that are rational to them, but irrational to people who see the world through different eyes.

That's fine until it's violent action.

The alcoholic who thinks he's "fine to drive" believes he's perfectly rational. He's drunk all the time and no accidents. That's wonderful until he kills a family some night.

[–] Pxtl@lemmy.ca 15 points 1 year ago

Uh, you know that happens regularly in courtrooms right? Like, almost every court battle hinges on what's true and what's not. And courts are an arm of the state.

In some cases it's directly about the truth of speech. Fraud, defamation, perjury, filing a false report, etc. are all cases where a court will be deciding whether a statement made publicly is true and punishing a party if it was not. Ask a CEO involved in a merger how much "free speech" they have.

[–] TrickDacy@lemmy.world 10 points 1 year ago (1 children)

Oh weird, you coincidentally are a conservative mod lol

Gee so surprising you're mad about cEnSoRsHiP

[–] scarabic@lemmy.world 5 points 1 year ago

Well, here’s how that was framed for participants of this study:

identified as misinformation based on a bipartisan fact check

And even with this, Republicans didn’t care if it was true or not.

We’re actually past the point of anyone being able to be considered truthful by Republicans. It either tickles their feelings right or it doesn’t and that is all.

[–] tastysnacks@programming.dev 4 points 1 year ago

Section 230 gets the state involved from the get-go. Remove those state-granted liability protections and everything else will shake out; make little tweaks from there as necessary. The broad protection of Section 230 is what's causing this issue.

[–] cheese_greater@lemmy.world 3 points 1 year ago (1 children)

Isn't a grand jury enough to deal with this kind of thing? It acts before damage is done, so I don't see why that mechanism couldn't be useful here too.
