this post was submitted on 25 Jan 2025
510 points (95.4% liked)

Technology

Paywall removed: https://archive.is/MqHc4

[–] just_an_average_joe@lemmy.dbzer0.com 10 points 5 days ago (1 children)

The US companies already scraped the data while they could. If anything, data scraping is now far, far more difficult for everyone, for technical reasons.
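To put "technical reasons" in concrete terms, here's a minimal sketch (Python standard library only; example.com is just a placeholder target) of the robots.txt rules and rate-limit responses a polite scraper now runs into:

```python
# Two of the hurdles a scraper hits today: robots.txt disallow rules
# and HTTP 429 rate limiting. Standard library only; the URLs are
# placeholders, not a real target.
import time
import urllib.error
import urllib.request
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser("https://example.com/robots.txt")
rp.read()

if not rp.can_fetch("MyScraper/1.0", "https://example.com/articles"):
    print("robots.txt disallows this path; polite scrapers stop here")
else:
    try:
        urllib.request.urlopen("https://example.com/articles", timeout=10)
    except urllib.error.HTTPError as err:
        if err.code == 429:  # Too Many Requests: the server is rate-limiting us
            time.sleep(60)   # back off before any retry
```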

Most of the new models are trained on synthetic data, on higher-quality curated data, or with RLHF. The reason DeepSeek is able to perform so well is likely that LLMs are still very new, so there is a lot of low-hanging fruit. It's no longer just about the data; we hit that limit quite some time ago.
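For a concrete sense of the synthetic-data approach, here's a toy sketch in Python: answers from an existing strong "teacher" model become supervised training pairs for a new model, so no fresh web scraping is needed. `teacher_generate` is a hypothetical stand-in for a real LLM API call.

```python
# Toy sketch of synthetic-data generation: a strong teacher model's
# answers become supervised fine-tuning examples for a new model.
import json

def teacher_generate(prompt: str) -> str:
    # Hypothetical: in practice this would call a strong existing model.
    # A canned reply keeps the sketch runnable end to end.
    return f"[teacher's answer to: {prompt}]"

seed_prompts = [
    "Explain RLHF in two sentences.",
    "Write a Python function that reverses a string.",
]

def build_synthetic_dataset(prompts: list[str]) -> list[dict]:
    # Each (prompt, completion) pair is one supervised training example.
    return [{"prompt": p, "completion": teacher_generate(p)} for p in prompts]

if __name__ == "__main__":
    with open("synthetic_sft.jsonl", "w") as f:
        for example in build_synthetic_dataset(seed_prompts):
            f.write(json.dumps(example) + "\n")
```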

[–] Naia@lemmy.blahaj.zone 1 points 3 days ago

Honestly, even from the beginning it was pretty obvious that scraped data was going to have a ton of issues. There's too much nonsense out there, both misinformation and people who just can't communicate clearly.

That's before you get into the ethical aspects of stealing other people's content and the way these things are being misused.