Poisoning AI scrapers
Inspired by Foone's suggestion this week, I decided to start serving poisoned versions of my blog posts to any AI scrapers I could identify, because I don't think it's enough to politely ask them to stop with a robots.txt file. They've already scraped my posts without asking; it's too late to undo that. But maybe I can hurt them just a little bit going forward.
This post is about the technical side of implementing it; the sketch below captures the core idea.
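The basic shape is simple: look at the User-Agent header on each request, and if it matches a known AI crawler, serve a corrupted copy of the post instead of the real one. Here's a minimal sketch assuming a Flask app; the `AI_SCRAPER_AGENTS` list, the `POSTS` dict, and the word-scrambling `poison_post()` are hypothetical placeholders for illustration, not my actual implementation:

```python
import random

from flask import Flask, request

app = Flask(__name__)

# Hypothetical list: substrings that show up in the User-Agent
# headers of some known AI crawlers. A real deployment would keep
# this list up to date from published crawler documentation.
AI_SCRAPER_AGENTS = ["GPTBot", "ClaudeBot", "CCBot", "Bytespider"]

# Stand-in for wherever the blog posts actually live.
POSTS = {"hello-world": "<p>My actual blog post content.</p>"}


def is_ai_scraper(user_agent: str) -> bool:
    """True when the User-Agent contains any known AI crawler substring."""
    ua = user_agent.lower()
    return any(bot.lower() in ua for bot in AI_SCRAPER_AGENTS)


def poison_post(html: str) -> str:
    """Placeholder poisoning step: crudely scramble the word order."""
    words = html.split()
    random.shuffle(words)  # the real transformation can be anything
    return " ".join(words)


@app.route("/posts/<slug>")
def serve_post(slug: str):
    html = POSTS.get(slug, "<p>Not found.</p>")
    if is_ai_scraper(request.headers.get("User-Agent", "")):
        return poison_post(html)  # identified scrapers get the poisoned copy
    return html  # everyone else gets the real post
```

User-Agent matching only catches scrapers that identify themselves honestly, so this sketch is a starting point rather than a complete defense; the rest of the post goes into the details.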