this post was submitted on 29 Oct 2023
75 points (96.3% liked)

Privacy

Hi, I'm building a personal website and I don't want it to be used to train AI. In my robots.txt file I blocked:

  • ChatGPT-User
  • GPTBot
  • Google-Extended
  • FacebookBot

What bots should I also add? Are there any other ways to block AI bots?

IMPORTANT: I don't want to block search engine crawlers, only bots that are used to train AI.
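
A minimal robots.txt sketch of the rules above (bot names as published by the vendors; CCBot, Common Crawl's crawler, is a common extra since its corpus is widely used for AI training):

```text
# Block AI-training crawlers; search crawlers are unaffected
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: FacebookBot
Disallow: /

# Common addition: Common Crawl's bot, whose corpus feeds many training sets
User-agent: CCBot
Disallow: /
```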

[–] hperrin@lemmy.world 37 points 8 months ago (4 children)

Pollute your site with nonsense that’s invisible to users. Things like pages that are linked to with invisible links that are just walls and walls of random text.
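
A hidden entry link could look something like this (`/trap/entry` is a hypothetical path, not something the site already has); the `aria-hidden` and `tabindex` attributes keep it away from screen readers and keyboard users as well:

```html
<!-- Sketch only: invisible to users, but naive crawlers will follow it -->
<a href="/trap/entry" style="position:absolute;left:-9999px"
   tabindex="-1" aria-hidden="true">archive</a>
```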

[–] chevy9294@monero.town 12 points 8 months ago (1 children)

Good idea. I will make an invisible link to "traps for bots". One trap will show random text, one will be a redirect loop, and one will be a random link generator that links back to itself. I will also make every response randomly slow, for example 0.5 to 1.5 seconds.
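
A rough Python sketch of those traps (all names here are made up for illustration, not a finished tool): walls of random filler text, pages whose links only point back into the trap, and a random 0.5 to 1.5 second delay:

```python
import random
import string

def random_text(words=200, seed=None):
    """A wall of random 'words' to feed crawlers that ignore robots.txt."""
    rng = random.Random(seed)
    return " ".join(
        "".join(rng.choices(string.ascii_lowercase, k=rng.randint(3, 10)))
        for _ in range(words)
    )

def trap_page(links=5, seed=None):
    """An HTML page of random text plus links that lead deeper into the trap."""
    rng = random.Random(seed)
    anchors = "".join(
        f'<a href="/trap/{rng.randrange(10**9)}">more</a>\n' for _ in range(links)
    )
    return f"<html><body><p>{random_text(seed=seed)}</p>{anchors}</body></html>"

def trap_delay():
    """Random 0.5-1.5 s pause before answering, to slow bots down."""
    return random.uniform(0.5, 1.5)
```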

The good thing is that I can also block search engine crawlers from accessing only the traps.
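
In robots.txt that split could look like this, assuming the traps live under a path like `/trap/`: well-behaved search crawlers skip it, while bots that ignore robots.txt (the ones the traps are for) still walk in:

```text
# Keep search engines out of the traps only; the rest of the site stays indexable
User-agent: Googlebot
Disallow: /trap/

User-agent: Bingbot
Disallow: /trap/
```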

[–] c24w@lemmy.world 4 points 8 months ago

If you're interested in traps, you can add a honeypot to your robots.txt. It comes with some risk of blocking legitimate users, though.
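
The honeypot idea in a nutshell: disallow a path in robots.txt that no real page links to, then ban any client that requests it anyway. A minimal sketch (path name and ban logic are assumptions, not an existing tool):

```python
# Listed as "Disallow: /do-not-crawl/" in robots.txt and never linked anywhere,
# so only a bot ignoring robots.txt ever requests it.
HONEYPOT_PATH = "/do-not-crawl/"
banned_ips = set()

def handle_request(ip, path):
    """Return True if the request may proceed, False if the IP is banned."""
    if path.startswith(HONEYPOT_PATH):
        banned_ips.add(ip)  # caller hit the honeypot: ban it
    return ip not in banned_ips
```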

[–] Pantherina@feddit.de 10 points 8 months ago (1 children)
[–] chevy9294@monero.town 4 points 8 months ago (1 children)

Nice idea, but a lot of random text that the user doesn't see would slow down the website.

[–] Pantherina@feddit.de 2 points 8 months ago* (last edited 8 months ago)

I don't think that's really a big problem. Just make every keyword useless, and somehow automate the process.

There should be a tool for this, damn. There is at least one Unicode character that doesn't even display as a blank in a terminal.

Like... modern web crap doesn't even load without JavaScript or animations. So a bit more HTML won't hurt.

[–] folkrav@lemmy.ca 8 points 8 months ago (1 children)

OP still wants search indexing, in which case it's a big no-no: it can be perceived as spam by search engines, and it associates your pages with tons of unrelated keywords.

[–] chevy9294@monero.town 8 points 8 months ago

I can block search engine crawlers from specific paths so that should be solved.

[–] stewsters@lemmy.world 3 points 8 months ago

As long as you do not rely on SEO to get traffic. This has a good chance of affecting how Google sees your site as well.