this post was submitted on 23 Jan 2025
917 points (97.3% liked)

Technology

TLDR if you don't wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. They repeat this 5 times, each time changing their location to a random city in the US.

Below is the number of shorts after which alt-right content was recommended. Left-wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)

There was, however, a certain pattern to this. First, non-political shorts were recommended. After that, AI Jesus shorts started to be recommended (with either AI Jesus talking to you, or an AI narrator reading verses from the Bible). After this came non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.). Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said this seemed to be the norm for Chicago, as they had observed the same in another similar experiment (which dealt with long-form content instead of shorts). After some shorts, there came one where an AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He went on about how voting for "Kamilia" would lose you "10000 rizz", and how voting for Trump would get you "1 million rizz".

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion, and thus rank higher in the algorithm. They say the algorithm isn't necessarily left wing or right wing, but that alt-right creators have better understood how to capture and grow their audience.
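That hypothesis — a ranker that is politically agnostic but rewards whatever provokes the strongest engagement — can be pictured with a toy scoring sketch. To be clear, the class, signal names, and weights below are all made up for illustration; this is not YouTube's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Short:
    title: str
    watch_fraction: float  # average fraction of the clip viewers watch
    comment_rate: float    # comments per 1000 views (proxy for emotional response)
    share_rate: float      # shares per 1000 views

def engagement_score(s: Short) -> float:
    # Hypothetical weights: the ranker only sees engagement signals,
    # never the political leaning of the content itself.
    return (0.6 * s.watch_fraction
            + 0.25 * (s.comment_rate / 10)
            + 0.15 * (s.share_rate / 10))

shorts = [
    Short("calm nature clip", watch_fraction=0.55, comment_rate=2, share_rate=1),
    Short("outrage-bait rant", watch_fraction=0.80, comment_rate=30, share_rate=12),
]

# The emotionally provocative clip wins purely on engagement numbers.
ranked = sorted(shorts, key=engagement_score, reverse=True)
print([s.title for s in ranked])
```

Under a model like this, nothing in the code "prefers" one ideology — content that reliably makes people comment and rewatch simply floats to the top, which matches the video's thesis.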

[–] LandedGentry@lemmy.zip 9 points 16 hours ago

This is basically the central thesis of The Social Dilemma.

[–] MITM0@lemmy.world 4 points 13 hours ago (2 children)

So... in the US, then?

[–] ohlaph@lemmy.world 15 points 19 hours ago (1 children)

If I see any alt-right content, I immediately block the account and report it. I don't see any now. I go to YouTube for entertainment only. I don't want that trash propaganda.

[–] KariKariCrunch@lemmy.world 4 points 16 hours ago

Same. I watched one Rogan video in like, 2019, and it was like opening a floodgate. Almost immediately, nearly every other recommendation was some right-wing personality's opinion about "cancel culture" or "political correctness." It eventually calmed down once I started blocking those channels and anything that looked like it might lead to that kind of content. I can only imagine what would pop up now.

[–] thezeesystem@lemmy.blahaj.zone 11 points 18 hours ago

All platforms are now excessively catering to Elon Nazi trump America. It's pretty much propaganda. And it's extreme and excessive.

[–] jared@mander.xyz 15 points 20 hours ago

Don't let the algorithm feed you!

[–] bulwark@lemmy.world 10 points 20 hours ago

I noticed my feed almost immediately changed after Trump was elected. I didn't change my viewing habits. I'm positive YouTube tweaked the algorithm to lean more right.

[–] ohellidk@sh.itjust.works 9 points 20 hours ago

Crazy stuff. So not only does YouTube make you generally dumber, it's now pushing the audience toward more conservative viewpoints because of the "emotional engagement" that keeps 'em watching. And YouTube probably sells more premium subscriptions that way. Fuck Google!

[–] Hope@lemmy.world 8 points 20 hours ago* (last edited 20 hours ago) (1 children)

Just scrolling through shorts on a given day, I'm usually recommended at least one short by someone infamously hostile to the LGBTQIA+ community. I get that it could be from people with my interests hate-watching, but I don't want to be shown any of it. (Nearly all of my YouTube subscriptions are to LGBTQIA+ creators. I'm not subscribed to anyone who has ever even mentioned support for right leaning policies on anything.)

[–] UraniumBlazer@lemm.ee 1 points 14 hours ago

Oh same! There's also casual hate towards the queer community in really random videos.

[–] random_character_a@lemmy.world 0 points 15 hours ago (1 children)

You get what you usually click?

[–] psx_crab@lemmy.zip 3 points 14 hours ago* (last edited 14 hours ago)

I didn't watch the video, but it's YT Shorts, you just swipe like TikTok. The few ways to curate the algorithm are to swipe away quickly, click the "not interested" button, downvote, or delete watched shorts from your history. If you don't interact with any of this and watch the full length of the video, the algorithm is gonna assume you like this kind of content. They'll also introduce content you've never watched before to gauge your interest, a lot of the time not even related to what you currently watch, and if you don't do any curation, they'll feed you that exact type for some time. I don't know how they manage the curation, but that's the gist of it from my experience. My feed has 0 politics, mostly cats. I control the feed strictly, so I get what I demand.
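The curation signals this comment lists (swipe away quickly, "not interested", downvote, delete from history) can be pictured as negative feedback nudging a per-topic weight down, while watching a short to the end nudges it up. This is a toy model with made-up action names and deltas, not the real recommender:

```python
# Hypothetical weight adjustments for each user action on a short.
FEEDBACK = {
    "watched_full": +1.0,
    "swiped_away_quickly": -0.5,
    "not_interested": -2.0,
    "downvoted": -1.5,
    "deleted_from_history": -1.0,
}

def update_weight(weights: dict[str, float], topic: str, action: str) -> None:
    """Apply one feedback action to the running weight for a topic."""
    weights[topic] = weights.get(topic, 0.0) + FEEDBACK[action]

weights: dict[str, float] = {}
update_weight(weights, "cats", "watched_full")        # finished the cat video
update_weight(weights, "politics", "not_interested")  # actively curated away
update_weight(weights, "politics", "downvoted")

# Topics with higher weight would be served more often.
feed_order = sorted(weights, key=weights.get, reverse=True)
print(feed_order)
```

The point the commenter makes falls out of the model: doing nothing (no negative action) while a video plays to the end is itself a positive signal, so passive viewing alone steers the feed.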
