Issue #1426: Manage /robots.txt in Datasette core, block robots by default
State: open · Comments: 7 · Opened by user 9599 (OWNER) · Created 2021-08-09T19:56:56Z · Updated 2021-08-19T21:36:28Z
See accompanying Twitter thread: https://twitter.com/simonw/status/1424820203603431439
I have a lot of Datasette instances deployed now, and tailing their logs shows they are being hammered by search engine crawlers, even though many of them are not interesting enough to warrant indexing.
I'm starting to think blocking crawlers would actually be a better default for most people, provided the behaviour was well documented and it was easy to understand how to allow crawling again.
Default-deny is usually a better policy than default-allow!
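A default-deny policy could be as simple as Datasette serving the following at `/robots.txt` (a sketch of standard Robots Exclusion Protocol rules; the issue doesn't yet specify the exact directives that core would emit):

```
User-agent: *
Disallow: /
```

Opting back in would then mean replacing `Disallow: /` with an empty `Disallow:` line (or removing the rule entirely), which tells all well-behaved crawlers the site may be indexed.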