this post was submitted on 07 Dec 2023
28 points (85.0% liked)

Programming

[–] autotldr@lemmings.world 3 points 9 months ago

This is the best summary I could come up with:


Alongside its Gemini generative AI model, Google this morning took the wraps off of AlphaCode 2, an improved version of the code-generating AlphaCode introduced by Google’s DeepMind lab roughly a year ago.

AlphaCode 2 can understand programming challenges involving “complex” math and theoretical computer science.

And, among other reasonably sophisticated techniques, AlphaCode 2 is capable of dynamic programming, explains DeepMind research scientist Rémi Leblond in a prerecorded video.

Dynamic programming entails simplifying a complex problem by repeatedly breaking it down into easier sub-problems; Leblond says that AlphaCode 2 knows not only when to implement this strategy properly but also where to apply it.
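For readers unfamiliar with the term, a minimal illustration of dynamic programming (this is a textbook example, not AlphaCode 2's actual output) — the minimum number of coins needed for an amount is computed by reusing cached answers to smaller sub-amounts rather than recomputing them:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def min_coins(amount: int, coins: tuple[int, ...] = (1, 3, 4)) -> float:
    """Fewest coins from `coins` summing to `amount`."""
    if amount == 0:
        return 0
    if amount < 0:
        return float("inf")  # unreachable sub-amount
    # Each sub-problem (amount - c) is solved once and memoized by lru_cache.
    return 1 + min(min_coins(amount - c, coins) for c in coins)

print(min_coins(6))   # 2 coins: 3 + 3
```

The caching is what distinguishes dynamic programming from naive recursion: overlapping sub-problems are solved exactly once.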

According to the whitepaper, AlphaCode 2 requires a lot of trial and error, is too costly to operate at scale and relies heavily on being able to filter out obviously bad code samples.
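The "filter out obviously bad code samples" step can be pictured as generate-and-filter: produce many candidate programs, then keep only those that reproduce the problem's example input/output pairs. This sketch is an assumption about the general technique, not code from the whitepaper:

```python
from typing import Callable, Iterable

def filter_candidates(
    candidates: Iterable[Callable[[int], int]],
    examples: list[tuple[int, int]],
) -> list[Callable[[int], int]]:
    """Keep only candidate programs that reproduce every example output."""
    survivors = []
    for prog in candidates:
        try:
            if all(prog(x) == y for x, y in examples):
                survivors.append(prog)
        except Exception:
            pass  # crashing candidates are discarded outright
    return survivors

# Hypothetical candidates for "square the input":
candidates = [lambda x: x + x, lambda x: x * x, lambda x: x ** 3]
good = filter_candidates(candidates, examples=[(2, 4), (3, 9)])
```

The cost problem the whitepaper describes follows directly: every candidate must be generated and executed, so the expense scales with the number of samples drawn.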

“One of the things that was most exciting to me about the latest results is that when programmers collaborate with [AlphaCode 2 powered by] Gemini, by defining certain properties for the code to follow, the performance [of the model] gets even better,” Collins said.
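One way to read "defining certain properties for the code to follow" is as programmer-supplied invariants that generated solutions must satisfy. The sketch below is purely illustrative — the function names and checking scheme are assumptions, not Gemini's or AlphaCode 2's interface:

```python
# Programmer-defined properties a sorting routine should satisfy.
def sorted_output(fn, sample):
    return fn(sample) == sorted(sample)

def same_elements(fn, sample):
    # Output must be a permutation of the input (no dropped duplicates).
    return sorted(fn(sample)) == sorted(sample)

PROPERTIES = [sorted_output, same_elements]

def satisfies(fn, samples):
    """True only if `fn` meets every property on every sample input."""
    return all(p(fn, s) for p in PROPERTIES for s in samples)

# A generated candidate that silently drops duplicates fails `same_elements`:
candidate = lambda xs: sorted(set(xs))
ok = satisfies(candidate, samples=[[3, 1, 2], [1, 1]])
```

Under this reading, the properties act as an extra filter on model output, which is consistent with Collins' remark that stating them improves the model's effective performance.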


The original article contains 567 words, the summary contains 181 words. Saved 68%. I'm a bot and I'm open source!