

Bots only identify themselves and their organization in the user agent; they don't tell you specifically what they do with the data, so stop your fairytales. They do give you a really handy URL, though, with user agents and even IPs in JSON, if you want to fully block the crawlers but not the search bots sent by user prompts.
Your ad revenue can be secured.
https://platform.openai.com/docs/bots/
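To spell it out: if you want the training crawler gone but the search and user-triggered bots still allowed, the robots.txt split looks roughly like this (bot names taken from that docs page; check the current list before copying):

    # Block the training crawler
    User-agent: GPTBot
    Disallow: /

    # Leave the search / user-triggered bots alone so referrals keep working
    User-agent: OAI-SearchBot
    Allow: /

    User-agent: ChatGPT-User
    Allow: /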
If for some reason you can't be bothered to edit your own robots.txt (because it's hard to tell which bots are the search bots behind your precious ad money), then maybe hire someone.
So far none of your ramblings disproves what I said. Yeah, there are probably crawlers for niche collecting, but nobody crawls the entire internet when they can use the weekly updated Common Crawl. Unless you or anyone else has access to unknown internal OpenAI policies explaining why they would intentionally reinvent the wheel, your fake anecdotes (lol, bots literally announcing in the user agent that they're scraping for training) don't cut it. You're probably seeing search bots.
If you didn't care about ad money and search engine exposure, bozo, you'd block everything in robots.txt and be done with it instead of whining about specific bots you don't like.
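For the record, the nuke-everything version is two lines:

    User-agent: *
    Disallow: /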
You didn't link to this, but go on, take their IP JSON files and block them; something like the sketch below does it.
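A minimal sketch, assuming the per-bot JSON files linked from the docs page use a Google-style prefixes/ipv4Prefix layout (the exact URL and key names here are placeholders, so check the files you actually download):

    #!/usr/bin/env python3
    """Turn a published bot IP-range JSON file into nginx 'deny' rules."""
    import json
    import urllib.request

    # Placeholder URL: grab the real per-bot links from
    # https://platform.openai.com/docs/bots/
    RANGES_URL = "https://openai.com/gptbot.json"

    def fetch_prefixes(url: str) -> list[str]:
        """Download the JSON file and pull out every CIDR prefix it lists."""
        with urllib.request.urlopen(url, timeout=10) as resp:
            data = json.load(resp)
        prefixes = []
        # Assumed layout: {"prefixes": [{"ipv4Prefix": "1.2.3.0/24"}, ...]}
        for entry in data.get("prefixes", []):
            for key in ("ipv4Prefix", "ipv6Prefix"):
                if key in entry:
                    prefixes.append(entry[key])
        return prefixes

    def to_nginx_deny(prefixes: list[str]) -> str:
        """Emit one 'deny <cidr>;' line per prefix, ready to include in nginx."""
        return "\n".join(f"deny {cidr};" for cidr in prefixes)

    if __name__ == "__main__":
        print(to_nginx_deny(fetch_prefixes(RANGES_URL)))

Pipe the output into a file your nginx config includes, or swap to_nginx_deny for whatever format your firewall wants.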