{"html_url": "https://github.com/simonw/datasette/issues/731#issuecomment-618070791", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/731", "id": 618070791, "node_id": "MDEyOklzc3VlQ29tbWVudDYxODA3MDc5MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-04-22T22:21:27Z", "updated_at": "2020-04-22T22:21:27Z", "author_association": "OWNER", "body": "I linked to this from https://github.com/zeit/now/discussions/4055", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 605110015, "label": "Option to automatically configure based on directory layout"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/717#issuecomment-609905777", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/717", "id": 609905777, "node_id": "MDEyOklzc3VlQ29tbWVudDYwOTkwNTc3Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-04-06T16:41:04Z", "updated_at": "2020-04-06T16:43:05Z", "author_association": "OWNER", "body": "Full traceback from Zeit Now logs:\r\n```\r\nERROR: conn=, sql = \"select name from sqlite_master where type='table'\", params = None: disk I/O error\r\nTraceback (most recent call last):\r\nFile \"/var/task/datasette/utils/asgi.py\", line 121, in route_path\r\nreturn await view(new_scope, receive, send)\r\nFile \"/var/task/datasette/utils/asgi.py\", line 193, in view\r\nrequest, **scope[\"url_route\"][\"kwargs\"]\r\nFile \"/var/task/datasette/views/base.py\", line 61, in head\r\nresponse = await self.get(*args, **kwargs)\r\nFile \"/var/task/datasette/views/index.py\", line 27, in get\r\ntable_names = await db.table_names()\r\nFile \"/var/task/datasette/database.py\", line 221, in table_names\r\n\"select name from sqlite_master where type='table'\"\r\nFile \"/var/task/datasette/database.py\", line 167, in execute\r\nsql_operation_in_thread\r\nFile 
\"/var/task/datasette/database.py\", line 114, in execute_against_connection_in_thread\r\nself.ds.executor, in_thread\r\nFile \"/var/lang/lib/python3.6/concurrent/futures/thread.py\", line 56, in run\r\nresult = self.fn(*self.args, **self.kwargs)\r\nFile \"/var/task/datasette/database.py\", line 111, in in_thread\r\nreturn fn(conn)\r\nFile \"/var/task/datasette/database.py\", line 137, in sql_operation_in_thread\r\ncursor.execute(sql, params or {})\r\nsqlite3.OperationalError: disk I/O error\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 594189527, "label": "See if I can get Datasette working on Zeit Now v2"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/705#issuecomment-603560898", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/705", "id": 603560898, "node_id": "MDEyOklzc3VlQ29tbWVudDYwMzU2MDg5OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-03-24T23:41:32Z", "updated_at": "2020-03-24T23:41:32Z", "author_association": "OWNER", "body": "I can switch over to deploying that using Cloud Run. 
Unfortunately if I move away from Zeit Now v1 (since it's no longer supported and might stop working) I don't think I'll be able to deploy a permanent URL for every commit hash that I push any more, which is a real shame.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 585626199, "label": "latest.datasette.io is no longer updating"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/697#issuecomment-596264937", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/697", "id": 596264937, "node_id": "MDEyOklzc3VlQ29tbWVudDU5NjI2NDkzNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-03-08T23:18:48Z", "updated_at": "2020-03-08T23:18:48Z", "author_association": "OWNER", "body": "Cancel that plan: I'm pretty sure the Travis configuration that publishes a demo to Zeit Now and builds a Docker image isn't designed to handle releases that don't correspond to current master. I guess I'll release 0.38 instead.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 577578306, "label": "index.html is not reliably loaded from a plugin"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/513#issuecomment-503199253", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/513", "id": 503199253, "node_id": "MDEyOklzc3VlQ29tbWVudDUwMzE5OTI1Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-06-18T15:56:29Z", "updated_at": "2019-06-18T15:56:29Z", "author_association": "OWNER", "body": "Unfortunately not - I really wish this was possible. I have not yet found a great serverless solution for publishing 1GB+ databases - they're too big for Heroku, Cloud Run OR Zeit Now. 
Once databases get that big the only option I've found is to run a VPS (or an EC2 instance) with a mounted hard drive volume and execute `datasette serve` on that instance, with an nginx running on port 80 that proxies traffic back to Datasette.\r\n\r\nI'd love to figure out a way to make hosting larger databases as easy as it currently is to host small ones.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 457201907, "label": "Is it possible to publish to Heroku despite slug size being too large?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/409#issuecomment-472875713", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/409", "id": 472875713, "node_id": "MDEyOklzc3VlQ29tbWVudDQ3Mjg3NTcxMw==", "user": {"value": 209967, "label": "michaelmcandrew"}, "created_at": "2019-03-14T14:14:39Z", "updated_at": "2019-03-14T14:14:39Z", "author_association": "NONE", "body": "also linking this zeit issue in case it is helpful: https://github.com/zeit/now-examples/issues/163#issuecomment-440125769", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 408376825, "label": "Zeit API v1 does not work for new users - need to migrate to v2"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/374#issuecomment-448437245", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/374", "id": 448437245, "node_id": "MDEyOklzc3VlQ29tbWVudDQ0ODQzNzI0NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-12-19T01:35:59Z", "updated_at": "2018-12-19T01:35:59Z", "author_association": "OWNER", "body": "Closing this as Zeit went on a different direction with Now v2, so the 100MB limit is no longer a concern.", "reactions": "{\"total_count\": 0, 
\"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 377518499, "label": "Get Datasette working with Zeit Now v2's 100MB image size limit"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/374#issuecomment-439762759", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/374", "id": 439762759, "node_id": "MDEyOklzc3VlQ29tbWVudDQzOTc2Mjc1OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-11-19T03:41:36Z", "updated_at": "2018-11-19T03:41:36Z", "author_association": "OWNER", "body": "It turned out Zeit didn't end up shipping the new 100MB-limit Docker-based Zeit 2.0 after all - they ended up going in a completely different direction, towards lambdas instead (which don't really fit the Datasette model): https://zeit.co/blog/now-2\r\n\r\nBut... as far as I can tell they have introduced the 100MB image size for all free Zeit accounts ever against their 1.0 platform. 
So we still need to solve this, or free Zeit users won't be able to use `datasette publish now` even while 1.0 is still available.\r\n\r\nI made some notes on this here: https://simonwillison.net/2018/Nov/19/smaller-python-docker-images/\r\n\r\nI've got it working for the Datasette Publish webapp, but I still need to fix `datasette publish now` to create much smaller patterns.\r\n\r\nI know how to do this for regular datasette, but I haven't yet figured out an Alpine Linux pattern for spatialite extras:\r\n\r\nhttps://github.com/simonw/datasette/blob/5e3a432a0caa23837fa58134f69e2f82e4f632a6/datasette/utils.py#L287-L300", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 377518499, "label": "Get Datasette working with Zeit Now v2's 100MB image size limit"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/374#issuecomment-435976262", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/374", "id": 435976262, "node_id": "MDEyOklzc3VlQ29tbWVudDQzNTk3NjI2Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-11-05T18:11:10Z", "updated_at": "2018-11-05T18:11:10Z", "author_association": "OWNER", "body": "I think there is a useful way forward here though: the image size may be limited to 100MB, but once the instance launches it gets access to a filesystem with a lot more space than that (possibly as much as 15GB given my initial poking around).\r\n\r\nSo... 
one potential solution here is to teach Datasette to launch from a smaller image and then download a larger SQLite file from a known URL as part of its initial startup.\r\n\r\nCombined with the ability to get Now to always run at least one copy of an instance this could allow Datasette to host much larger SQLite databases on that platform while playing nicely with the Zeit v2 platform.\r\n\r\nSee also https://github.com/zeit/now-cli/issues/1523", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 377518499, "label": "Get Datasette working with Zeit Now v2's 100MB image size limit"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/371#issuecomment-435767775", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/371", "id": 435767775, "node_id": "MDEyOklzc3VlQ29tbWVudDQzNTc2Nzc3NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-11-05T06:27:33Z", "updated_at": "2018-11-05T06:27:33Z", "author_association": "OWNER", "body": "This would be fantastic - that tutorial looks like many of the details needed for this.\r\n\r\nDo you know if Digital Ocean have the ability to provision URLs for a droplet without you needing to buy your own domain name? 
Heroku have https://example.herokuapp.com/ and Zeit have https://blah.now.sh/ - does Digital Ocean have an equivalent?\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 377156339, "label": "datasette publish digitalocean plugin"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/366#issuecomment-433680598", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/366", "id": 433680598, "node_id": "MDEyOklzc3VlQ29tbWVudDQzMzY4MDU5OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-10-28T06:38:43Z", "updated_at": "2018-10-28T06:38:43Z", "author_association": "OWNER", "body": "I've just started running into this as well. Looks like I'll have to anchor to v1 for the moment - I'm hoping the discussion on https://github.com/zeit/now-cli/issues/1523 encourages an increase in this limit policy :/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 369716228, "label": "Default built image size over Zeit Now 100MiB limit"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/366#issuecomment-429737929", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/366", "id": 429737929, "node_id": "MDEyOklzc3VlQ29tbWVudDQyOTczNzkyOQ==", "user": {"value": 416374, "label": "gfrmin"}, "created_at": "2018-10-15T07:32:57Z", "updated_at": "2018-10-15T07:32:57Z", "author_association": "CONTRIBUTOR", "body": "Very hacky solution is to write now.json file forcing the usage of v1 of Zeit cloud, see https://github.com/slygent/datasette/commit/3ab824793ec6534b6dd87078aa46b11c4fa78ea3\r\n\r\nThis does work, at least.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, 
\"rocket\": 0, \"eyes\": 0}", "issue": {"value": 369716228, "label": "Default built image size over Zeit Now 100MiB limit"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/333#issuecomment-405975025", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/333", "id": 405975025, "node_id": "MDEyOklzc3VlQ29tbWVudDQwNTk3NTAyNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-07-18T15:36:11Z", "updated_at": "2018-07-18T15:40:04Z", "author_association": "OWNER", "body": "A `force_https_api_urls` config option would work here - if set, Datasette will ignore the incoming protocol and always use https. The `datasette deploy now` command could then add that as an option passed to `datasette serve`.\r\n\r\nThis is the pattern which is producing incorrect URLs on Zeit Now, because the Sanic `request.url` property is not being correctly set.\r\n\r\nhttps://github.com/simonw/datasette/blob/6e37f091edec35e2706197489f54fff5d890c63c/datasette/views/table.py#L653-L655\r\n\r\nSuggested help text:\r\n\r\n> Always use https:// for URLs output as part of Datasette API responses", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 338768551, "label": "Datasette on Zeit Now returns http URLs for facet and next links"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/308#issuecomment-405971920", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/308", "id": 405971920, "node_id": "MDEyOklzc3VlQ29tbWVudDQwNTk3MTkyMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-07-18T15:27:12Z", "updated_at": "2018-07-18T15:27:12Z", "author_association": "OWNER", "body": "It looks like there are a few extra options we should support:\r\n\r\nhttps://devcenter.heroku.com/articles/heroku-cli-commands\r\n\r\n```\r\n -t, --team=team team to 
use\r\n --region=region specify region for the app to run in\r\n --space=space the private space to create the app in\r\n```\r\n\r\nSince these differ from the options for Zeit Now I think this means splitting up `datasette publish now` and `datasette publish heroku` into separate subcommands.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 330826972, "label": "Support extra Heroku apps:create options - region, space, team"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/238#issuecomment-384362028", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/238", "id": 384362028, "node_id": "MDEyOklzc3VlQ29tbWVudDM4NDM2MjAyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-25T17:07:11Z", "updated_at": "2018-04-25T17:07:11Z", "author_association": "OWNER", "body": "On further thought: this is actually only an issue for immutable deployments to platforms like Zeit Now and Heroku.\r\n\r\nAs such, adding it to `datasette serve` feels clumsy. 
Maybe `datasette publish` should instead gain the ability to optionally install an extra mechanism that periodically pulls a fresh copy of `metadata.json` from a URL.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 317714268, "label": "External metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/157#issuecomment-350496277", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/157", "id": 350496277, "node_id": "MDEyOklzc3VlQ29tbWVudDM1MDQ5NjI3Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-12-09T18:29:41Z", "updated_at": "2017-12-09T18:29:41Z", "author_association": "OWNER", "body": "Example usage:\r\n\r\n datasette package --static css:extra-css/ --static js:extra-js/ \\\r\n \tsf-trees.db --template-dir templates/ --tag sf-trees --branch master\r\n\r\nThis creates a local Docker image that includes copies of the templates/,\r\nextra-css/ and extra-js/ directories. 
You can then run it like this:\r\n\r\n\tdocker run -p 8001:8001 sf-trees\r\n\r\nFor publishing to Zeit now:\r\n\r\n\tdatasette publish now --static css:extra-css/ --static js:extra-js/ \\\r\n\t\tsf-trees.db --template-dir templates/ --name sf-trees --branch master\r\n\r\nExample: https://sf-trees-wbihszoazc.now.sh/sf-trees-02c8ef1/Street_Tree_List\r\n\r\nFor publishing to Heroku:\r\n\r\n\tdatasette publish heroku --static css:extra-css/ --static js:extra-js/ \\\r\n\t\tsf-trees.db --template-dir templates/ --branch master\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 278190321, "label": "Teach \"datasette publish\" about custom template directories"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/160#issuecomment-350496258", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/160", "id": 350496258, "node_id": "MDEyOklzc3VlQ29tbWVudDM1MDQ5NjI1OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-12-09T18:29:28Z", "updated_at": "2017-12-09T18:29:28Z", "author_association": "OWNER", "body": "Example usage:\r\n\r\n datasette package --static css:extra-css/ --static js:extra-js/ \\\r\n \tsf-trees.db --template-dir templates/ --tag sf-trees --branch master\r\n\r\nThis creates a local Docker image that includes copies of the templates/,\r\nextra-css/ and extra-js/ directories. 
You can then run it like this:\r\n\r\n\tdocker run -p 8001:8001 sf-trees\r\n\r\nFor publishing to Zeit now:\r\n\r\n\tdatasette publish now --static css:extra-css/ --static js:extra-js/ \\\r\n\t\tsf-trees.db --template-dir templates/ --name sf-trees --branch master\r\n\r\nExample: https://sf-trees-wbihszoazc.now.sh/sf-trees-02c8ef1/Street_Tree_List\r\n\r\nFor publishing to Heroku:\r\n\r\n\tdatasette publish heroku --static css:extra-css/ --static js:extra-js/ \\\r\n\t\tsf-trees.db --template-dir templates/ --branch master\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 278208011, "label": "Ability to bundle and serve additional static files"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/94#issuecomment-344472313", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/94", "id": 344472313, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQ3MjMxMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-15T03:08:00Z", "updated_at": "2017-11-15T03:08:00Z", "author_association": "OWNER", "body": "Works for me. I'm going to land this.\r\n\r\nJust one thing:\r\n\r\n simonw$ docker run --rm -t -i -p 9001:8001 c408e8cfbe40 datasette publish now\r\n The publish command requires \"now\" to be installed and configured \r\n Follow the instructions at https://zeit.co/now#whats-now\r\n\r\nMaybe we should have the Docker container install the \"now\" client? Not sure how much size that would add though. 
I think it's OK without for the moment.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273961179, "label": "Initial add simple prod ready Dockerfile refs #57"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/71#issuecomment-343788581", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/71", "id": 343788581, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc4ODU4MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T01:48:17Z", "updated_at": "2017-11-13T01:48:17Z", "author_association": "OWNER", "body": "I had to add a rule like this to get letsencrypt certificates on now.sh working: https://github.com/zeit/now-cli/issues/188#issuecomment-270105052\r\n\r\n\"page_rules__datasettes_com___cloudflare_-_web_performance___security\"\r\n\r\nI also have to flip this switch off every time I want to add a new alias:\r\n\r\n\"crypto__datasettes_com___cloudflare_-_web_performance___security\"\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273278840, "label": "Set up some example datasets on a Cloudflare-backed domain"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/71#issuecomment-343780539", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/71", "id": 343780539, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc4MDUzOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T00:13:29Z", "updated_at": "2017-11-13T00:19:46Z", "author_association": "OWNER", "body": "https://zeit.co/docs/features/dns is docs\r\n\r\n now domain add -e datasettes.com\r\n\r\nI had to set up a custom TXT record on `_now.datasettes.com` to get this to work.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, 
\"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273278840, "label": "Set up some example datasets on a Cloudflare-backed domain"}, "performed_via_github_app": null}