this post was submitted on 01 Dec 2023
125 points (83.4% liked)

[–] autotldr@lemmings.world 3 points 9 months ago

This is the best summary I could come up with:


Their work, which is yet to be peer reviewed, shows that while training massive AI models is incredibly energy intensive, it’s only one part of the puzzle.

For each of the tasks, such as text generation, Luccioni ran 1,000 prompts, and measured the energy used with a tool she developed called Code Carbon.

Generating 1,000 images with a powerful AI model, such as Stable Diffusion XL, produces roughly as much carbon dioxide as driving 4.1 miles in an average gasoline-powered car.
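As a back-of-envelope check of that figure, the per-image footprint can be worked out from the article's numbers plus the EPA's commonly cited average of roughly 400 g of CO2 per mile for a gasoline car (the 400 g/mile value is an assumption, not from the article):

```python
# Rough per-image CO2 estimate from the article's 4.1-mile comparison.
G_CO2_PER_MILE = 400          # assumption: EPA average for a gasoline car, grams/mile
miles = 4.1                   # equivalent driving distance, from the article
images = 1_000                # batch size measured in the study

total_g = miles * G_CO2_PER_MILE      # total grams of CO2 for the batch
per_image_g = total_g / images        # grams of CO2 per generated image
print(f"{total_g:.0f} g total, {per_image_g:.2f} g per image")
```

On these assumptions, one image from a large image model comes out to a bit over a gram and a half of CO2.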

AI startup Hugging Face has undertaken the tech sector’s first attempt to estimate the broader carbon footprint of a large language model.

The generative-AI boom has led big tech companies to integrate powerful AI models into many different products, from email to word processing.

Luccioni tested different versions of Hugging Face’s multilingual AI model BLOOM to see how many uses would be needed before cumulative inference emissions overtook the one-time training cost.
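That kind of break-even comparison is simple division: the one-time training footprint over the per-query footprint gives the number of queries at which inference catches up. A minimal sketch, where both numbers are illustrative placeholders rather than BLOOM's measured values:

```python
# Hypothetical break-even calculation: after how many inference queries do
# cumulative inference emissions match the one-time training emissions?
# Both figures below are placeholder assumptions, not measured values.
training_kg_co2 = 25_000       # assumed one-time training footprint, kg CO2
per_query_kg_co2 = 0.0019      # assumed footprint of a single query, kg CO2

break_even_queries = training_kg_co2 / per_query_kg_co2
print(f"~{break_even_queries:,.0f} queries to match training emissions")
```

The useful point is the shape of the result: because the per-query cost is tiny, heavily used models amortize their training footprint quickly, while rarely used ones never do.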


The original article contains 1,021 words, the summary contains 153 words. Saved 85%. I'm a bot and I'm open source!