What are the ways that US domains can block AI scrapers? I figure paywalls and captchas, but is there anything we can add to robots.txt that has any teeth against AI scraping? Would we even know if they obeyed it? How do we set traps and keep this shit out?
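For context, the best I've found so far is targeting the published AI crawler user-agent tokens in robots.txt. It's pure honor system, and the list goes stale fast as new bots appear, but something like this covers the big well-known ones:

```
# Block known AI training/scraping crawlers (honor system only)
User-agent: GPTBot           # OpenAI
Disallow: /

User-agent: ClaudeBot        # Anthropic
Disallow: /

User-agent: Google-Extended  # opt-out token for Gemini training (separate from Googlebot)
Disallow: /

User-agent: CCBot            # Common Crawl, which lots of models train on
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: Bytespider       # ByteDance
Disallow: /
```

As for traps and knowing whether they obey: the classic move is a honeypot path that robots.txt disallows and that nothing visible ever links to, so the only clients that hit it are ones ignoring robots.txt (or mining it for paths). Here's a rough sketch of the idea in Python/Flask; names like /trap/ and the in-memory BANNED_IPS set are made up for illustration, and in production you'd ban at the proxy/WAF layer instead:

```python
# Honeypot sketch: robots.txt disallows /trap/, so any request to it
# came from a client that ignored (or mined) robots.txt.
from flask import Flask, request, abort

app = Flask(__name__)
BANNED_IPS = set()  # in-memory for the sketch; use redis/fail2ban/WAF rules for real


@app.route("/robots.txt")
def robots():
    # Advertise the trap path as off-limits.
    return "User-agent: *\nDisallow: /trap/\n", 200, {"Content-Type": "text/plain"}


@app.route("/trap/<path:anything>")
def trap(anything):
    # Anyone here violated robots.txt: log the evidence and ban them.
    ip = request.remote_addr  # behind a proxy you'd parse X-Forwarded-For instead
    BANNED_IPS.add(ip)
    app.logger.warning("robots.txt violator: %s UA=%s", ip, request.user_agent)
    abort(403)


@app.before_request
def block_banned():
    # Refuse all further requests from known violators.
    if request.remote_addr in BANNED_IPS:
        abort(403)
```

The nice part is it answers the "would we even know" question directly: every hit on the trap path is proof of a crawler that read (or skipped) your robots.txt and did whatever it wanted anyway.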
Despite what Google wants you to think, organic search is not the only way to get traffic. Fuck 'em. Time to disallow Googlebot in robots.txt and noindex your content.
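If you go that route, one gotcha: Disallow and noindex fight each other. Google only honors a noindex if it can actually crawl the page, and a URL blocked by robots.txt can still show up in results if other sites link to it. So pick one mechanism per page rather than stacking both:

```
# robots.txt — stops Googlebot from crawling at all,
# but linked URLs can still appear in results without a snippet
User-agent: Googlebot
Disallow: /
```

```html
<!-- noindex removes the page from results, but only if the page
     stays crawlable — so don't also Disallow it in robots.txt -->
<meta name="robots" content="noindex">
```

The same noindex can also be sent as an X-Robots-Tag: noindex HTTP header for non-HTML files like PDFs.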