
Bill Gates feels generative AI has plateaued, says GPT-5 will not be any better: The billionaire philanthropist, in an interview with German newspaper Handelsblatt, shared his thoughts on artificial general intelligence, climate change, and the scope of AI in the future.

[–] fruitycoder@sh.itjust.works 1 points 9 months ago

The next big steps coming right now are AI trained on generated (synthetic) data; agents that act more autonomously (rather than waiting for a prompt, they take an action, such as searching the web, and act on the result to better complete the goal); and better-indexed data, so generated output can be informed by, and cite, sources in the moment.
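A minimal sketch of what such an agent loop might look like; `llm` and `search_web` are hypothetical stand-ins for a real model call and a real search tool, not any particular API:

```python
def llm(prompt: str) -> str:
    # Stand-in for a real language-model call; here it follows a fixed script.
    return "SEARCH:population of Norway" if "Observation" not in prompt else "ANSWER:about 5.5 million"

def search_web(query: str) -> str:
    # Stand-in for a web-search tool the agent can invoke.
    return f"(search results for {query!r})"

def run_agent(goal: str, max_steps: int = 5) -> str:
    context = f"Goal: {goal}"
    for _ in range(max_steps):
        decision = llm(context + "\nNext action? Reply SEARCH:<query> or ANSWER:<text>")
        if decision.startswith("SEARCH:"):
            # Act autonomously: perform the search, then feed the observation
            # back into the context instead of waiting for another user prompt.
            observation = search_web(decision.removeprefix("SEARCH:"))
            context += f"\nObservation: {observation}"
        else:
            return decision.removeprefix("ANSWER:")
    return "(step budget exhausted)"

print(run_agent("Find the population of Norway"))
```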

[–] KISSmyOS@lemmy.world 2 points 9 months ago

> AI trained on generative data

This has already been shown to degrade output quality very quickly.
I think the wall generative AI is hitting is the lack of additional training data. The whole web has already been scraped to get it where it is today, and more and more of the content on the web is itself AI-generated, which makes it not only useless but actively harmful as training data.

https://www.newscientist.com/article/2382519-ais-trained-on-ai-generated-images-produce-glitches-and-blurs/
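The degradation is easy to reproduce in miniature: repeatedly refit a model to samples drawn from the previous generation's model and the learned distribution collapses. A toy illustration with a Gaussian standing in for a generative model (my own sketch, not the experiment from the article):

```python
# Toy model collapse: each "generation" fits a Gaussian to samples produced
# by the previous generation. The std estimate is biased low and sampling
# errors compound, so sigma tends to shrink over generations, i.e. the
# model's diversity collapses.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.0, 1.0   # generation 0: the "real" data distribution
n = 10                  # a small training set per generation exaggerates the effect

for gen in range(1, 31):
    data = rng.normal(mu, sigma, size=n)   # "train" on the previous generation's output
    mu, sigma = data.mean(), data.std()    # refit the "model" to generated data
    if gen % 5 == 0:
        print(f"gen {gen:2d}: mu={mu:+.3f} sigma={sigma:.3f}")
```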

[–] fruitycoder@sh.itjust.works 2 points 9 months ago

Orca 2 is an example of an open-source model that was built to better collect and build on synthetic data: https://www.microsoft.com/en-us/research/publication/orca-progressive-learning-from-complex-explanation-traces-of-gpt-4/

The case I think being made is that training LFMs (large foundation models) on the Internet gets you close to an average internet user's level of output; using reinforcement learning, you can further curate the outputs; and then, using those curated outputs as training data, you can distill even tighter, higher-quality models.
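A rough sketch of that pipeline; every function here is a hypothetical placeholder for the corresponding training stage, not Microsoft's actual Orca code:

```python
# Sketch of teacher-to-student distillation on synthetic data: a large
# "teacher" model generates explanation-rich training examples, which are
# curated and then used to fine-tune a smaller, tighter "student" model.
# All functions are placeholders standing in for real training stages.

def teacher_generate(prompt: str) -> str:
    # Stand-in for querying a large foundation model for an explanation trace.
    return f"Step-by-step answer to: {prompt}"

def quality_filter(example: tuple[str, str]) -> bool:
    # Stand-in for curation, e.g. reward-model scoring or deduplication.
    return len(example[1]) > 0

def fine_tune(model: str, dataset: list[tuple[str, str]]) -> str:
    # Stand-in for a supervised fine-tuning run on the synthetic set.
    return f"{model}-distilled-on-{len(dataset)}-examples"

prompts = ["Why is the sky blue?", "Summarize this article ..."]

# 1. Teacher produces explanation traces (the synthetic data).
synthetic = [(p, teacher_generate(p)) for p in prompts]

# 2. Curate: keep only high-quality traces.
curated = [ex for ex in synthetic if quality_filter(ex)]

# 3. Distill a smaller student model on the curated synthetic set.
student = fine_tune("small-base-model", curated)
print(student)
```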

It's interesting stuff for sure.