this post was submitted on 31 Aug 2023
595 points (97.9% liked)

I'm rather curious to see how the EU's privacy laws are going to handle this.

(The original article is from Fortune, but Yahoo Finance doesn't have a paywall.)

[–] SpiderShoeCult@sopuli.xyz 1 points 1 year ago (1 children)

I'm just going to leave this here.

some random article

A quote from the article that I found especially interesting:

"As a result, no one on Earth fully understands the inner workings of LLMs. Researchers are working to gain a better understanding, but this is a slow process that will take years—perhaps decades—to complete."

Quite an interesting read, and I'm sure you can find others if you look hard enough.

[–] Veraticus@lib.lgbt 0 points 1 year ago

This is a somewhat sensationalist and frankly uninteresting way to describe neural networks. Obviously it would take years of analysis to understand the weights of each individual node and what they're accomplishing (if they are even understandable in a way that would make sense to anyone without a very advanced math degree). But that doesn't mean we don't understand the model or what it does. We can and we do.

You have misunderstood this article if what you took from it is this:

"It's also very similar in the way that nobody actually can tell precisely how it works, for some reason it just does."

We do understand how it works -- as an overall system. Inspecting the individual nodes is about as useful for understanding an LLM as cataloguing every tree in a forest is for learning the name of the city next to that forest.
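
To make the distinction concrete, here's a toy sketch in Python/NumPy (a hypothetical example, not any real model's code; the dimensions and random weights are made up). The mechanism below -- the computation a single attention layer performs -- is completely specified and well understood. The article is talking about a separate question: why billions of *trained* weight values produce the behaviors they do.

```python
import numpy as np

# Toy single-head self-attention. The mechanism is transparent, inspectable
# math; only the trained values of the weight matrices are opaque. All the
# numbers here are random stand-ins, not real model weights.
rng = np.random.default_rng(0)
d = 8                          # embedding dimension (toy size)
x = rng.normal(size=(4, d))    # 4 "tokens", each a d-dimensional vector

# In a real LLM these come from training; here they are random placeholders.
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

Q, K, V = x @ W_q, x @ W_k, x @ W_v   # project tokens to queries/keys/values
attn = softmax(Q @ K.T / np.sqrt(d))  # how much each token attends to the others
out = attn @ V                        # weighted mix of value vectors

print(out.shape)  # (4, 8)
```

Every line of that is the "forest": a fully understood system. The article's point is about the "trees" -- the specific learned numbers inside W_q, W_k, and W_v -- and cataloguing those is the part that takes years.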