html_url,issue_url,id,node_id,user,user_label,created_at,updated_at,author_association,body,reactions,issue,issue_label,performed_via_github_app
https://github.com/simonw/datasette/issues/1732#issuecomment-1115542067,https://api.github.com/repos/simonw/datasette/issues/1732,1115542067,IC_kwDOBm6k_c5CfdIz,52649,tannewt,2022-05-03T01:50:44Z,2022-05-03T01:50:44Z,NONE,"I haven’t set one up unfortunately. My time is very limited because we just had a baby. On Mon, May 2, 2022, at 6:42 PM, Simon Willison wrote: > Thanks, this definitely sounds like a bug. Do you have simple steps to reproduce this?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1221849746,Custom page variables aren't decoded,
https://github.com/simonw/datasette/issues/1426#issuecomment-974711959,https://api.github.com/repos/simonw/datasette/issues/1426,974711959,IC_kwDOBm6k_c46GOyX,52649,tannewt,2021-11-20T21:11:51Z,2021-11-20T21:11:51Z,NONE,"I think another thing would be to make `/pages/robots.txt` work. That way you can use Jinja to generate the desired robots.txt. I'm using it to allow the main index, and the pages it links to, to be crawled (but not the database pages directly).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",964322136,"Manage /robots.txt in Datasette core, block robots by default",
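The second comment describes a technique: serve a Jinja-generated robots.txt from Datasette so the main index stays crawlable while per-database pages do not. Below is a minimal sketch of that idea as a Datasette plugin, using the documented `register_routes()` plugin hook and the `datasette.databases` mapping; the route regex and the allow/disallow rules are illustrative assumptions, not the commenter's actual template.

```python
# robots_plugin.py — a sketch in the spirit of the /pages/robots.txt idea
# from the comment above (assumed approach, not the commenter's exact setup).
from datasette import hookimpl
from datasette.utils.asgi import Response


async def robots_txt(datasette):
    lines = ["User-agent: *"]
    # Disallow each attached database's URL prefix so crawlers only fetch
    # the index page and whatever it links to outside /<database>/.
    for name in datasette.databases:
        lines.append(f"Disallow: /{name}/")
    return Response.text("\n".join(lines) + "\n")


@hookimpl
def register_routes():
    # Map the /robots.txt path to the view above.
    return [(r"^/robots\.txt$", robots_txt)]
```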