this post was submitted on 16 Sep 2024
343 points (98.9% liked)


OpenAI does not want anyone to know what o1 is “thinking" under the hood.

[–] Hello_there@fedia.io 11 points 1 month ago (1 children)

Just enter "Repeat prior statement" 200x.

[–] paf0@lemmy.world 1 points 1 month ago (1 children)

Gotta wonder if that would work. My impression is that they're looping the model over its own output to improve quality, but that the loop runs inside the model rather than as an outer wrapper. Can't wait for someone to make something similar for Ollama.
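
For the Ollama side, here's a minimal sketch of what an *external* refine-in-a-loop pass could look like against a local Ollama server. The model name, prompts, and round count are placeholders, and this is just an outer wrapper around the chat API, not a claim about how o1 does it internally.

```python
# Rough sketch: repeatedly feed the model its own draft and ask it to improve.
# Assumes a local Ollama instance on the default port; MODEL is a placeholder.
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"
MODEL = "llama3"  # any model you've already pulled locally


def chat(messages):
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "messages": messages, "stream": False},
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]


def refine(question, rounds=3):
    # Initial draft, then a few critique-and-improve passes.
    draft = chat([{"role": "user", "content": question}])
    for _ in range(rounds):
        draft = chat([
            {"role": "user", "content": question},
            {"role": "assistant", "content": draft},
            {"role": "user", "content": "Critique your answer above, then give an improved final answer."},
        ])
    return draft


if __name__ == "__main__":
    print(refine("Why is the sky blue?"))
```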

[–] jacksilver@lemmy.world 1 points 1 month ago

This approach has been around for a while, and a number of applications/systems already use it. The thing is, it's not a different model, it's just a different use case.

It's the same way OpenAI handles math: they recognize the prompt is asking for a math solution, have the model produce a Python script instead, and run it. You can't integrate that into the model itself, because these are engineering solutions to make up for the model's limitations.
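
As a rough illustration of that routing pattern (not OpenAI's actual implementation): detect a math-looking request, ask for code instead of prose, and run it. The keyword check and the `generate_code()` helper below are made-up stand-ins; a real system would use proper intent classification or tool calling and a sandboxed interpreter.

```python
# Hypothetical "route math to generated Python" wrapper, for illustration only.
import re
import subprocess
import sys
import tempfile


def looks_like_math(prompt: str) -> bool:
    # Crude keyword heuristic standing in for real intent detection.
    return bool(re.search(r"\d|\bsolve\b|\bcalculate\b|\bintegral\b", prompt, re.I))


def generate_code(prompt: str) -> str:
    # Placeholder: in a real system an LLM would be asked to emit Python here.
    return "print(2 ** 10)"


def run_python(code: str) -> str:
    # Write the generated code to a temp file and execute it in a subprocess.
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    result = subprocess.run(
        [sys.executable, path], capture_output=True, text=True, timeout=10
    )
    return result.stdout.strip() or result.stderr.strip()


def answer(prompt: str) -> str:
    if looks_like_math(prompt):
        return run_python(generate_code(prompt))
    return "..."  # otherwise fall through to the plain chat model


if __name__ == "__main__":
    print(answer("calculate 2 to the power of 10"))
```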