A GitHub-hosted project offers a curated robots.txt file that lists the user-agent strings of known AI crawlers and disallows them from accessing website content; because robots.txt is advisory, it deters only crawlers that honor the protocol.
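As a rough illustration, here is a minimal sketch of the kind of entries such a file contains. The user-agent names below (GPTBot, ClaudeBot, CCBot, Google-Extended, PerplexityBot) are real, publicly documented AI-crawler identifiers, but the exact list the project curates may differ:

```
# A group may name several user agents before its rules (RFC 9309).
# Disallow: / asks each listed crawler to skip every path on the site.
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: CCBot
User-agent: Google-Extended
User-agent: PerplexityBot
Disallow: /
```

To use it, a site owner serves the file at the web root (e.g. `https://example.com/robots.txt`); well-behaved crawlers fetch that path before crawling and honor the rules, though nothing technically prevents a crawler from ignoring them.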