this post was submitted on 08 Jun 2025
499 points (95.4% liked)

Technology

top 50 comments
[–] RampantParanoia2365@lemmy.world 6 points 31 minutes ago

Fucking obviously. Until Data's positronic brain becomes reality, AI is not actual intelligence.

[–] Auli@lemmy.ca 17 points 3 hours ago

No shit. This isn't new.

[–] GaMEChld@lemmy.world 18 points 5 hours ago (6 children)

Most humans don't reason. They just parrot shit too. The design is very human.

[–] joel_feila@lemmy.world 1 points 4 minutes ago

That's why CEOs love them. When your job is 90% spewing BS, a machine that does that is impressive.

[–] elbarto777@lemmy.world 17 points 3 hours ago

LLMs deal with tokens. Essentially, they predict the next token in a sequence.

Humans do much, much, much, much, much, much, much more than that.
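
To make the token-prediction point concrete, here is a minimal toy sketch (purely illustrative, not from the thread, and nothing like a real transformer): a bigram model that picks the most likely next token given only the previous one. A real LLM conditions on a long context with a neural network, but the output step is the same idea, a probability distribution over possible next tokens.

```python
# Toy illustration of "predicting the next token": a bigram model that
# picks the most frequent follower of the current token. Real LLMs use a
# neural network over a long context, but the output step is analogous.

from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each token follows each other token.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(token: str) -> str:
    """Return the most frequent follower of `token` seen in the corpus."""
    candidates = following.get(token)
    if not candidates:
        return "<unk>"  # token never appeared in a leading position
    return candidates.most_common(1)[0][0]

# Generate a short continuation, one token at a time.
token = "the"
sequence = [token]
for _ in range(5):
    token = predict_next(token)
    sequence.append(token)

print(" ".join(sequence))  # e.g. "the cat sat on the cat"
```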

[–] mavu@discuss.tchncs.de 47 points 7 hours ago

No way!

Statistical Language models don't reason?

But OpenAI, robots taking over!

[–] ZILtoid1991@lemmy.world 15 points 7 hours ago (1 children)

Thank you, Captain Obvious! Only those who think LLMs are like "little people in the computer" didn't already know this.

[–] TheFriar@lemm.ee 5 points 3 hours ago (1 children)

Yeah, well, there are a ton of people literally falling into psychosis, led on by LLMs. So unfortunately, not that many people already knew it.

[–] BlaueHeiligenBlume@feddit.org 10 points 7 hours ago (1 children)

Of course; that's obvious to anyone with basic knowledge of neural networks, no?

[–] Endmaker@ani.social 0 points 2 hours ago

I still remember Geoff Hinton's criticisms of backpropagation.

IMO it is still remarkable what NNs managed to achieve: some form of emergent intelligence.

[–] vala@lemmy.world 29 points 9 hours ago
[–] bjoern_tantau@swg-empire.de 38 points 11 hours ago
[–] surph_ninja@lemmy.world 10 points 9 hours ago (3 children)

You assume humans do the opposite? We literally institutionalize humans who don't follow set patterns.

[–] petrol_sniff_king@lemmy.blahaj.zone 16 points 8 hours ago (6 children)

Maybe you failed all your high school classes, but that ain't got none to do with me.

[–] LemmyIsReddit2Point0@lemmy.world 12 points 8 hours ago

We also reward people who can memorize and regurgitate even if they don't understand what they are doing.
