[–] seathru@lemmy.sdf.org 18 points 1 day ago* (last edited 1 day ago) (1 children)

I liberally use that block button. Even on whole communities & instances. Worrying about blocking a toxic user's speech from anyone's view but your own isn't worth the effort (unless you happen to be a mod).

[–] ComradeMiao@beehaw.org 5 points 1 day ago (1 children)

That's helped me a lot as well :) but I wish the block wasn't one-sided. It's an odd choice that the other user can still see our content.

[–] TheFogan@programming.dev 13 points 1 day ago (1 children)

Whatever we post is public... you can't stop someone from seeing public things. (Even if it worked the way you would like, they could browse anonymously or on a different account to see it.) Blocking makes it convenient for you (so you don't have to look at public things that you don't want to see).

[–] ComradeMiao@beehaw.org 5 points 1 day ago (2 children)

Good point, but typically blocking is two-way, not a one-way mirror.

[–] missingno@fedia.io 9 points 1 day ago

Back in my day, one-way used to be the norm. Two-way is a more recent thing on some newer platforms, and I'm of the opinion that it does more harm than good. Especially in a public forum like this, it can be abused by bad actors as a way of hiding misinformation from those that would push back against it.

I know this because when Reddit changed their block system from one-way to two-way, that's exactly how it ended up getting abused.

[–] jarfil@beehaw.org 7 points 1 day ago (1 children)

For 2-way blocking, check Threads. It has more trolls and spam, but also more options like:

  • User "Mute": 1-way block, like Lemmy
  • User "Block": 3-way block, you don't see them, they don't see you, nobody sees their replies to your comments
  • Reply "Hide for everyone": hide replies to your comments
  • Comment "Limit who can reply": Anyone / only Followed / only Mentioned

Although it's a Meta spawn, it ends up being relatively clean since users can "ban" each other from discussions, which works as a de-escalation mechanism.
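
To make the difference between those options concrete, here's a rough sketch in Python (the data model and names are invented for illustration; this is not Threads' actual API):

```python
from dataclasses import dataclass, field

# Rough sketch of the four mechanisms above; the data model and names are
# made up for illustration, this is not Threads' actual API.

@dataclass
class User:
    name: str
    muted: set = field(default_factory=set)    # "Mute": 1-way, affects only my own view
    blocked: set = field(default_factory=set)  # "Block": mutual invisibility

@dataclass
class Comment:
    author: User
    parent: "Comment | None" = None
    hidden_by_parent_author: bool = False      # "Hide for everyone"
    reply_policy: str = "anyone"               # "Limit who can reply": anyone/followed/mentioned

def can_see(viewer: User, c: Comment) -> bool:
    a = c.author
    if a.name in viewer.muted:                                 # Mute hides them from me only
        return False
    if a.name in viewer.blocked or viewer.name in a.blocked:   # Block hides both directions...
        return False
    if c.parent is not None and a.name in c.parent.author.blocked:
        return False                                           # ...and their replies to the blocker
    if c.hidden_by_parent_author and viewer is not a:          # hidden replies stay visible
        return False                                           # only to their own author
    return True
```

The key asymmetry is that "Mute" only changes the muter's own feed, while "Block" and "Hide for everyone" change what third parties see too.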

[–] TheFogan@programming.dev 5 points 1 day ago (1 children)

I mean, maybe a de-escalation, but it's also ripe for its own forms of abuse.

I.e., someone wants to spread misinformation... they block anyone fact-checking or disproving their nonsense.

Now, I fully agree the misinformation rabbit holes have diminishing returns the longer the thread and argument go on.

I.e., let's say:

Misinformer: posts a blatant lie.

Person1: Rebuts the lie, includes multiple credible sources for the rebuttal.

Misinformer: Claims all the credible sources are part of a conspiracy or agenda.

Person1: argues back.

At this point it's just wasting everyone's time... but IMO the initial fact check is important for people coming across the thread.

So with the Lemmy method:

Person 1 can debunk the claim, block the person... and leave it up to others whether they actually want to bother engaging, etc.

The Threads method, on the other hand, sounds to me like... the fake claimer can either whack-a-mole-block comments that disagree, or shut off discussion altogether, leaving the claim unchecked. To me that seems a bigger problem. The fact is there are a lot of falsehoods that sound convincing to the general public but are easily disprovable with a bit of research, and IMO they need to be challenged where the claims are made.

[–] jarfil@beehaw.org 4 points 1 day ago (1 children)

That is true, but it only works at the single-thread level:

  • Mallory posts some misinformation - A
  • Alice replies with a rebuttal - B
  • Bob replies to Alice with further fact-checking - C
  • Mallory hides Alice's comment B, leaving Bob's C only visible to Alice
  • Eve adds a supporting reply - D
  • Charlie replies to Eve with a rebuttal - E
  • Eve can hide Charlie's E, but Mallory can't

Now Mallory has to decide whether to:

  1. Hide D+E, losing Eve's support D
  2. Hope for Eve to hide E
  3. Leave Eve's support D with Charlie's rebuttal E visible

If Mallory keeps hiding replies, her post A will have less engagement, with a notification of "Some additional replies are unavailable".
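
As a toy replay of that sequence (again, just a sketch in Python, not any platform's real API), this is roughly which comments survive in the public thread:

```python
from dataclasses import dataclass, field

# Toy replay of the scenario above; purely illustrative, not any platform's real API.

@dataclass
class Comment:
    author: str
    text: str
    replies: list = field(default_factory=list)
    hidden: bool = False

    def reply(self, author, text):
        r = Comment(author, text)
        self.replies.append(r)
        return r

    def hide_reply(self, actor, reply):
        # Only the author of the parent comment can hide a direct reply to it.
        if actor == self.author and reply in self.replies:
            reply.hidden = True

def public_view(c, depth=0):
    # Hidden replies, and everything nested under them, drop out of the public thread.
    if c.hidden:
        return
    print("  " * depth + f"{c.author}: {c.text}")
    for r in c.replies:
        public_view(r, depth + 1)

A = Comment("Mallory", "misinformation (A)")
B = A.reply("Alice", "rebuttal (B)")
C = B.reply("Bob", "further fact-checking (C)")
D = A.reply("Eve", "supporting reply (D)")
E = D.reply("Charlie", "rebuttal of D (E)")

A.hide_reply("Mallory", B)   # Mallory can hide B (taking C down with it)...
A.hide_reply("Mallory", E)   # ...but not E: E replies to Eve's D, so only Eve could hide it.
public_view(A)               # Public thread now shows A, D and E; B and C are gone.
```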

Meanwhile... Alice doesn't need to stop rebutting A:

  • Alice reposts Mallory's A as a quote with her own comment - B(A)
  • Mallory can do nothing about B(A) since it's under Alice's control
  • Alice replies to her own B(A) with a quote of Bob's C - C2
  • If Alice got to see Charlie's E, she can also quote it - E2

If people like Alice's rebuttal, then it can get more engagement than Mallory's misinformation, which makes the algorithm show it to more people.

So while the system can create echo chambers at a single thread level, as long as a post is open to comments and resharing, which are essential to spreading it, anyone can also grab it and create their own chamber around it.

It's common to see these kinds of reposts, with separate discussions, sometimes linking to each other and creating larger discussion pools.

[–] TheFogan@programming.dev 1 points 20 hours ago (1 children)

I think that's where the problem is though: so Alice posts it on her page.

Now there are two ways people will see it... One is the algorithm looking at it, so that's a popularity contest, assuming the algorithm goes by engagement etc... Which unfortunately, I have to say, historically BS tends to gather larger crowds than the truth does.

More importantly, if we are talking algorithms, they tend to push people towards the type of content they regularly consume. I.e., the algorithm is going to push people who are susceptible to BS (some of whom may be susceptible but not so far gone as to be immune to the truth) to Mallory's page. Meanwhile Alice's page will be drawing the skeptics, the ones who would like to push back against it... but can't. I see the Mallory page like the /r/conservative subreddit: a fucking cesspool, and most importantly very, very determined to push out any views that disturb the narrative... yet with about 10x the views of any specifically left subreddit I can find (though admittedly only 1/8th of general politics, which is still leftish by US standards).

[–] jarfil@beehaw.org 1 points 4 hours ago* (last edited 4 hours ago)

Ultimately, outside of friends and followers, all media discovery is a popularity contest; you can't really discover the least popular content... and that's usually for a good reason.

Threads is not a perfect solution, but I think it does have elements going in the right direction. Mallory doesn't have a "page" like a subreddit; there is no group of mods with power over the whole conversation. Even if multiple people were to share an account, even if they added an "automod" bot... they still only have direct power over direct replies, not sub-replies. Astroturfing, gang upvoting, and bot saturation are still a thing, but the ability to shape conversations by selective pruning and cherry-picking is much more limited. Mallory's options are: either let people disagree, create multiple fake accounts, or fall off the popularity contest.

Then, each comment/post/repost is its own ecosystem; the only common mod ruleset is from "daddy Meta"... which has its own issues, but not nearly the issues of a subreddit.

At the end of the day, all communication platforms fall somewhere between "single person dictatorship" (static web pages) and "anything goes" (4chan). There is no magic bullet, so far.

IMHO, right now Threads is more chaotic than Reddit or Lemmy, but has the tools to avoid becoming a 4chan or even a Facebook (somewhat ironically).