[–] Madison_rogue@kbin.social 73 points 1 year ago* (last edited 1 year ago) (1 children)

Seriously though, she picked a show the algorithm happened to surface, she watched it, and more content of that type was suggested to her by the algorithm.

This isn't quite rocket science.

[–] reflex@kbin.social 31 points 1 year ago* (last edited 1 year ago) (5 children)

and more content of that type was suggested

That, or they might have figured it out from her search patterns alone—like how Target figured out that one woman was gregnant before she did.

[–] Legolution@feddit.uk 15 points 1 year ago

It was never proven that the baby was Greg's.

[–] shinjiikarus@mylem.eu 14 points 1 year ago (1 children)

Has this story ever been confirmed by Target directly? Since this happened in America and her father was outraged about it, it would have been awfully convenient to “blame” the algorithm for “discovering” she was pregnant. It takes quite a data analyst to figure out the trend before someone even knows they are pregnant. It doesn’t take a genius to figure out the pattern for someone who knows they are pregnant and is just hiding it from their dad.

[–] what_is_a_name@lemmy.world 17 points 1 year ago (1 children)

Yes. It was many years ago, but this was confirmed. Target still does their targeting, but now scatters unrelated items in the ads to hide what they know.

[–] NotSpez@lemm.ee 9 points 1 year ago

target still does their targeting

Awesome sentence

[–] oldGregg@lemm.ee 9 points 1 year ago
[–] Madison_rogue@kbin.social 8 points 1 year ago (2 children)

They didn't figure anything out. There's no sentience in the algorithm, only in its creators. It only chose content based on input, so it all comes down to the choices of the article's author.

Same thing with the pregnant woman: the algorithm made suggestions based on the shopper's purchase history. It made the connection that product A was also bought by pregnant shoppers, therefore this shopper might be interested in product B, something an expecting mother would buy.
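
(If anyone's curious what that "connection" looks like in practice, here's a minimal sketch of the co-occurrence idea, with made-up baskets and product names. It's just the general technique, not Target's or Netflix's actual system.)

```python
from collections import Counter
from itertools import combinations

# Toy purchase baskets; the product names are invented for illustration.
baskets = [
    {"unscented lotion", "prenatal vitamins", "cotton balls"},
    {"unscented lotion", "prenatal vitamins", "diapers"},
    {"prenatal vitamins", "diapers", "baby wipes"},
    {"diet coke", "chips"},
]

# Count how often each ordered pair of products appears in the same basket.
co_occurrence = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_occurrence[(a, b)] += 1
        co_occurrence[(b, a)] += 1

def recommend(purchased, top_n=3):
    """Suggest products that most often co-occur with what the shopper already bought."""
    scores = Counter()
    for item in purchased:
        for (a, b), count in co_occurrence.items():
            if a == item and b not in purchased:
                scores[b] += count
    return [product for product, _ in scores.most_common(top_n)]

# "Product A" in, "product B" out: no sentience, just co-purchase counts.
print(recommend({"unscented lotion", "prenatal vitamins"}))
# -> ['diapers', 'cotton balls', 'baby wipes']
```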

[–] reflex@kbin.social 15 points 1 year ago (1 children)

They didn't figure anything out.

Ugh, I was agreeing with you, and you go pedant. Come on, you should know "figure out" doesn't necessarily imply sentience. It can also be used synonymously with "determine."

[–] Madison_rogue@kbin.social 7 points 1 year ago

Sorry, I misunderstood your tone. I apologize for going all pedantic…it’s a character flaw.

[–] ExLisper@linux.community 4 points 1 year ago

I believe in the case of the pregnant woman she was offered diapers and such, based on the food she bought. So it's not simply "you bought diet coke, maybe try diet chocolate?". In the case of Netflix there's no "a show only gay people watch", so her complaints are silly.