this post was submitted on 17 Aug 2023
484 points (96.0% liked)

Technology


cross-posted from: https://nom.mom/post/121481

OpenAI could be fined up to $150,000 for each piece of infringing content.

https://arstechnica.com/tech-policy/2023/08/report-potential-nyt-lawsuit-could-force-openai-to-wipe-chatgpt-and-start-over/#comments

[–] PupBiru@kbin.social 2 points 1 year ago

i think the distinction each side is drawing here is that you think humans are inherently different from a neural network, whereas i think the only difference is complexity: if we had a neural network at the same scale as the human brain, there's nothing stopping those electronic neurons from connecting and responding in a way that's indistinguishable from a human

the fact that we're not there yet i don't see as particularly relevant, because we're talking about concepts rather than specifics… of course an LLM doesn't display the same characteristics as a human: it's not at the same scale, and the training is different. but functionally there's nothing different between chemical neurons firing and neurons made of transistors firing

we learn in the same way: by reinforcing connections between our neurons
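(the "reinforcing connections" idea can be sketched as a Hebbian update — a deliberately simplified learning rule, not the actual mechanism in either brains or LLMs; the function name and learning rate here are purely illustrative:)

```python
def hebbian_step(weights, pre, post, lr=0.1):
    """One Hebbian update: the connection between a pre-synaptic
    and a post-synaptic neuron is strengthened in proportion to
    how strongly both are active ("fire together, wire together")."""
    for i, a in enumerate(pre):
        for j, b in enumerate(post):
            weights[i][j] += lr * a * b
    return weights

# repeated co-activation keeps strengthening the same connection
w = [[0.0, 0.0], [0.0, 0.0]]
hebbian_step(w, [1, 0], [0, 1])  # neuron 0 fires with output neuron 1
hebbian_step(w, [1, 0], [0, 1])  # same pattern again: weight grows
```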

[–] walrusintraining@lemmy.world 2 points 1 year ago

A few points:

  • Humans are more than just a brain. There's the entire experience of ego, individualism, and the body.

  • Another massive distinction is autonomy and liberty, which no AI models currently possess.

  • We don’t know all there is to know about the human brain. We can’t say it is functionally equivalent to a neural network.

  • If complexity is irrelevant, then the simplest neural network trained on a single work of writing is equivalent to the most advanced models for the purposes of this discussion. Such a network would, again, output a copy of the work it was trained on.

When we’ve developed a true self-aware AI that can move and think freely, the idea that there is little difference will have more weight to it.
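The memorization point in the fourth bullet can be demonstrated concretely. Below is a minimal one-layer next-character model trained on a single short text (the model shape and hyperparameters are illustrative, and the demo only works cleanly for a text where each character has one unique successor). With nothing pushing it to generalize, it just memorizes its training data and plays it back:

```python
import math

def train_next_char(text, epochs=200, lr=0.5):
    """Train a tiny one-layer softmax next-character predictor on a
    single text. With only one training example and no capacity
    pressure, it simply memorizes the sequence."""
    chars = sorted(set(text))
    idx = {c: i for i, c in enumerate(chars)}
    n = len(chars)
    W = [[0.0] * n for _ in range(n)]  # scores: current char -> next char
    for _ in range(epochs):
        for a, b in zip(text, text[1:]):
            row, target = W[idx[a]], idx[b]
            # softmax over the row (shifted by the max for stability)
            m = max(row)
            exps = [math.exp(s - m) for s in row]
            z = sum(exps)
            # cross-entropy gradient step toward the observed next char
            for k in range(n):
                row[k] -= lr * (exps[k] / z - (1.0 if k == target else 0.0))
    return W, chars, idx

def generate(W, chars, idx, start, length):
    """Greedy decoding: always emit the highest-scoring next character."""
    out = start
    for _ in range(length):
        row = W[idx[out[-1]]]
        out += chars[max(range(len(row)), key=row.__getitem__)]
    return out

# train on one "work" and sample from it: the output is a verbatim copy
W, chars, idx = train_next_char("copyright")
print(generate(W, chars, idx, "c", 8))  # reproduces "copyright"
```

Whether large models behave the same way at scale is exactly what the thread is arguing about; the sketch only shows that for the degenerate single-work case, training and copying coincide.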