{"html_url": "https://github.com/simonw/sqlite-utils/issues/230#issuecomment-778812684", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/230", "id": 778812684, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODgxMjY4NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-14T17:45:16Z", "updated_at": "2021-02-14T17:45:16Z", "author_association": "OWNER", "body": "Running this could take any CSV (or TSV) file and automatically detect the delimiter. If no header row is detected it could add `unknown1,unknown2` headers:\r\n\r\n sqlite-utils insert db.db data file.csv --sniff\r\n\r\n(Using `--sniff` would imply `--csv`)\r\n\r\nThis could be called `--sniffer` instead but I like `--sniff` better.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 808008305, "label": "--sniff option for sniffing delimiters"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/228#issuecomment-778812050", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/228", "id": 778812050, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODgxMjA1MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-14T17:41:30Z", "updated_at": "2021-02-14T17:41:30Z", "author_association": "OWNER", "body": "I just spotted that `csv.Sniffer` in the Python standard library has a `.has_header(sample)` method which detects if the first row appears to be a header or not, which is interesting. https://docs.python.org/3/library/csv.html#csv.Sniffer", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 807437089, "label": "--no-headers option for CSV and TSV"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/228#issuecomment-778811934", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/228", "id": 778811934, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODgxMTkzNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-14T17:40:48Z", "updated_at": "2021-02-14T17:40:48Z", "author_association": "OWNER", "body": "Another pattern that might be useful is to generate a header that is just \"unknown1,unknown2,unknown3\" for each of the columns in the rest of the file. This makes it easy to e.g. 
facet-explore within Datasette to figure out the correct names, then use `sqlite-utils transform --rename` to rename the columns.\r\n\r\nI needed to do that for the https://bl.iro.bl.uk/work/ns/3037474a-761c-456d-a00c-9ef3c6773f4c example.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 807437089, "label": "--no-headers option for CSV and TSV"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/228#issuecomment-778511347", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/228", "id": 778511347, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODUxMTM0Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-12T23:27:50Z", "updated_at": "2021-02-12T23:27:50Z", "author_association": "OWNER", "body": "For the moment, a workaround can be to `cat` an additional row onto the start of the file.\r\n\r\n echo \"name,url,description\" | cat - missing_headings.csv | sqlite-utils insert blah.db table - --csv", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 807437089, "label": "--no-headers option for CSV and TSV"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/131#issuecomment-778510528", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/131", "id": 778510528, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODUxMDUyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-12T23:25:06Z", "updated_at": "2021-02-12T23:25:06Z", "author_association": "OWNER", "body": "If `-c` isn't available, maybe `-t` or `--type` would work for specifying column types:\r\n```\r\nsqlite-utils insert db.db images images.tsv \\\r\n --tsv \\\r\n --type id int \\\r\n --type score float\r\n```\r\nor\r\n```\r\nsqlite-utils insert db.db images images.tsv \\\r\n --tsv \\\r\n -t id int \\\r\n -t score float\r\n```", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 675753042, "label": "sqlite-utils insert: options for column types"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/131#issuecomment-778508887", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/131", "id": 778508887, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODUwODg4Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-12T23:20:11Z", "updated_at": "2021-02-12T23:20:11Z", "author_association": "OWNER", "body": "Annoyingly `-c` is currently a shortcut for `--csv` - so I'd have to do a major version bump to use that.\r\n\r\nhttps://github.com/simonw/sqlite-utils/blob/726219c3503e77440975cd15b74d006639feb0f8/sqlite_utils/cli.py#L601-L603\r\n\r\nParticularly annoying because I attempted to remove the `-c` shortcut in https://github.com/simonw/sqlite-utils/commit/2c00567aac6d9c79087cfff0d054f64922b1473d#diff-76294b3d4afeb27e74e738daa01c26dd4dc9ccb6f4477451483a2ece1095902eL48 but forgot to remove it from the input options (I removed it from the output options).", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 675753042, "label": "sqlite-utils insert: options for column types"}, 
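The `csv.Sniffer` approach discussed in the comments above can be tried out with nothing but the standard library; the following is a minimal sketch rather than sqlite-utils code, and the `unknown1, unknown2, ...` fallback headers and the file name are illustrative:

```python
import csv

def sniff_csv(path, sample_size=8192):
    # Guess the delimiter and whether the first row is a header,
    # using only the standard library's csv.Sniffer.
    with open(path, newline="") as fp:
        sample = fp.read(sample_size)
    sniffer = csv.Sniffer()
    dialect = sniffer.sniff(sample)          # detects the delimiter (comma, tab, ...)
    has_header = sniffer.has_header(sample)  # heuristic: does row 1 look like a header?

    with open(path, newline="") as fp:
        reader = csv.reader(fp, dialect)
        first_row = next(reader)
        if has_header:
            headers, rows = first_row, list(reader)
        else:
            # No header detected: fall back to unknown1, unknown2, ... placeholders
            headers = ["unknown{}".format(i + 1) for i in range(len(first_row))]
            rows = [first_row] + list(reader)
    return headers, rows

# headers, rows = sniff_csv("file.csv")  # file name is hypothetical
```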
"performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1220#issuecomment-778467759", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1220", "id": 778467759, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODQ2Nzc1OQ==", "user": {"value": 30607, "label": "aborruso"}, "created_at": "2021-02-12T21:35:17Z", "updated_at": "2021-02-12T21:35:17Z", "author_association": "NONE", "body": "Thank you", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 806743116, "label": "Installing datasette via docker: Path 'fixtures.db' does not exist"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1220#issuecomment-778439617", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1220", "id": 778439617, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODQzOTYxNw==", "user": {"value": 7476523, "label": "bobwhitelock"}, "created_at": "2021-02-12T20:33:27Z", "updated_at": "2021-02-12T20:33:27Z", "author_association": "CONTRIBUTOR", "body": "That Docker command will mount your current directory inside the Docker container at `/mnt` - so you shouldn't need to change anything locally, just run\r\n\r\n```\r\ndocker run -p 8001:8001 -v `pwd`:/mnt \\\r\n datasetteproject/datasette \\\r\n datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db\r\n```\r\n\r\nand it will use the `fixtures.db` file within your current directory", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 806743116, "label": "Installing datasette via docker: Path 'fixtures.db' does not exist"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/60#issuecomment-770069864", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60", "id": 770069864, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MDA2OTg2NA==", "user": {"value": 22578954, "label": "daniel-butler"}, "created_at": "2021-01-29T21:52:05Z", "updated_at": "2021-02-12T18:29:43Z", "author_association": "CONTRIBUTOR", "body": "For the purposes below I am assuming the organization I would get all the repositories and their related commits from is called `gh-organization`. The github's owner id of gh-orgnization is `123456789`.\r\n\r\n```bash\r\ngithub-to-sqlite repos github.db gh-organization\r\n```\r\n\r\nI'm on a windows computer running git bash to be able to use the `|` command. 
This works for me\r\n```bash\r\nsqlite3 github.db \"SELECT full_name FROM repos WHERE owner = '123456789';\" | tr '\\n\\r' ' ' | xargs | { read repos; github-to-sqlite commits github.db $repos; }\r\n```\r\n\r\nOn a pure linux system I think this would work because the new line character is normally `\\n`\r\n```bash\r\nsqlite3 github.db \"SELECT full_name FROM repos WHERE owner = '123456789';\" | tr '\\n' ' ' | xargs | { read repos; github-to-sqlite commits github.db $repos; }`\r\n```\r\n\r\nAs expected I ran into rate limit issues #51 \r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 797097140, "label": "Use Data from SQLite in other commands"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/228#issuecomment-778349672", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/228", "id": 778349672, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODM0OTY3Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-12T18:00:43Z", "updated_at": "2021-02-12T18:00:43Z", "author_association": "OWNER", "body": "I could combine this with #131 to allow types to be specified in addition to column names.\r\n\r\nProbably need an option that means \"ignore the existing heading row and use this one instead\".", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 807437089, "label": "--no-headers option for CSV and TSV"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/227#issuecomment-778349142", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/227", "id": 778349142, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODM0OTE0Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-12T17:59:35Z", "updated_at": "2021-02-12T17:59:35Z", "author_association": "OWNER", "body": "It looks like I can at least bump this size limit up to the maximum allowed by Python - I'll take a look at that. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 807174161, "label": "Error reading csv files with large column data"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778246347", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33", "id": 778246347, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODI0NjM0Nw==", "user": {"value": 41546558, "label": "RhetTbull"}, "created_at": "2021-02-12T15:00:43Z", "updated_at": "2021-02-12T15:00:43Z", "author_association": "CONTRIBUTOR", "body": "Yes, Big Sur Photos database doesn't have `ZGENERICASSET` table. 
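The shell pipeline quoted in that github-to-sqlite #60 comment can also be written in Python, which sidesteps the `\n` versus `\r\n` differences between platforms; a rough sketch, assuming the same `github.db` database, owner id and an installed `github-to-sqlite` CLI:

```python
import sqlite3
import subprocess

db_path = "github.db"
owner_id = "123456789"  # the gh-organization owner id used in the comment above

# Read the repository names straight out of the SQLite database...
conn = sqlite3.connect(db_path)
repos = [row[0] for row in conn.execute(
    "SELECT full_name FROM repos WHERE owner = ?", (owner_id,)
)]
conn.close()

# ...and hand them to the commits sub-command in one call - no tr/xargs needed.
subprocess.run(["github-to-sqlite", "commits", db_path, *repos], check=True)
```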
PR #31 will fix this.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 803338729, "label": "photo-to-sqlite: command not found"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778014990", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33", "id": 778014990, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODAxNDk5MA==", "user": {"value": 675335, "label": "leafgarland"}, "created_at": "2021-02-12T06:54:14Z", "updated_at": "2021-02-12T06:54:14Z", "author_association": "NONE", "body": "Ahh, that might be because macOS Big Sur has changed the structure of the photos db. Might need to wait for a later release, there is a PR which adds support for Big Sur. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 803338729, "label": "photo-to-sqlite: command not found"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1220#issuecomment-778008752", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1220", "id": 778008752, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODAwODc1Mg==", "user": {"value": 30607, "label": "aborruso"}, "created_at": "2021-02-12T06:37:34Z", "updated_at": "2021-02-12T06:37:34Z", "author_association": "NONE", "body": "I have used my path, I'm running it from the folder in wich I have the db.\n\nDo I must an absolute path?\n\nDo I must create exactly that folder?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 806743116, "label": "Installing datasette via docker: Path 'fixtures.db' does not exist"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778002092", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33", "id": 778002092, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODAwMjA5Mg==", "user": {"value": 11855322, "label": "robmarkcole"}, "created_at": "2021-02-12T06:19:32Z", "updated_at": "2021-02-12T06:19:32Z", "author_association": "NONE", "body": "hi @leafgarland that results in a new error:\r\n```\r\n(venv) (base) Robins-MacBook:datasette robin$ dogsheep-photos apple-photos photos.db\r\nTraceback (most recent call last):\r\n File \"/Users/robin/datasette/venv/bin/dogsheep-photos\", line 8, in \r\n sys.exit(cli())\r\n File \"/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py\", line 829, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py\", line 1259, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py\", line 1066, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py\", line 610, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/Users/robin/datasette/venv/lib/python3.8/site-packages/dogsheep_photos/cli.py\", line 206, in apple_photos\r\n db.conn.execute(\r\nsqlite3.OperationalError: no such table: 
attached.ZGENERICASSET\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 803338729, "label": "photo-to-sqlite: command not found"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-777951854", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33", "id": 777951854, "node_id": "MDEyOklzc3VlQ29tbWVudDc3Nzk1MTg1NA==", "user": {"value": 675335, "label": "leafgarland"}, "created_at": "2021-02-12T03:54:39Z", "updated_at": "2021-02-12T03:54:39Z", "author_association": "NONE", "body": "I think that is a typo in the docs, you can use\r\n\r\n > dogsheep-photos apple-photos photos.db", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 803338729, "label": "photo-to-sqlite: command not found"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1223#issuecomment-777949755", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1223", "id": 777949755, "node_id": "MDEyOklzc3VlQ29tbWVudDc3Nzk0OTc1NQ==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-02-12T03:45:31Z", "updated_at": "2021-02-12T03:45:31Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=h1) Report\n> Merging [#1223](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=desc) (d1cd1f2) into [main](https://codecov.io/gh/simonw/datasette/commit/9603d893b9b72653895318c9104d754229fdb146?el=desc) (9603d89) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1223/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1223 +/- ##\n=======================================\n Coverage 91.42% 91.42% \n=======================================\n Files 32 32 \n Lines 3955 3955 \n=======================================\n Hits 3616 3616 \n Misses 339 339 \n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=footer). Last update [9603d89...d1cd1f2](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=lastupdated). 
Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 806918878, "label": "Add compile option to Dockerfile to fix failing test (fixes #696)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1220#issuecomment-777927946", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1220", "id": 777927946, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NzkyNzk0Ng==", "user": {"value": 7476523, "label": "bobwhitelock"}, "created_at": "2021-02-12T02:29:54Z", "updated_at": "2021-02-12T02:29:54Z", "author_association": "CONTRIBUTOR", "body": "According to https://github.com/simonw/datasette/blob/master/docs/installation.rst#using-docker it should be\r\n\r\n```\r\ndocker run -p 8001:8001 -v `pwd`:/mnt \\\r\n datasetteproject/datasette \\\r\n datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db\r\n```\r\n\r\nThis uses `/mnt/fixtures.db` whereas you're using `fixtures.db` - did you try using this path instead?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 806743116, "label": "Installing datasette via docker: Path 'fixtures.db' does not exist"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1221#issuecomment-777901052", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1221", "id": 777901052, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NzkwMTA1Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-12T01:09:54Z", "updated_at": "2021-02-12T01:09:54Z", "author_association": "OWNER", "body": "I also tested this manually. I generated certificate files like so:\r\n\r\n cd /tmp\r\n python -m trustme\r\n\r\nThis created `/tmp/server.pem`, `/tmp/client.pem` and `/tmp/server.key`\r\n\r\nThen I started Datasette like this:\r\n\r\n datasette --memory --ssl-keyfile=/tmp/server.key --ssl-certfile=/tmp/server.pem\r\n\r\nAnd exercise it using `curl` like so:\r\n\r\n /tmp % curl --cacert /tmp/client.pem 'https://localhost:8001/_memory.json'\r\n {\"database\": \"_memory\", \"path\": \"/_memory\", \"size\": 0, \"tables\": [], \"hidden_count\": 0, \"views\": [], \"queries\": [],\r\n \"private\": false, \"allow_execute_sql\": true, \"query_ms\": 0.8843200000114848}\r\n\r\nNote that without the `--cacert` option I get an error:\r\n\r\n```\r\n/tmp % curl 'https://localhost:8001/_memory.json' \r\ncurl: (60) SSL certificate problem: Invalid certificate chain\r\nMore details here: https://curl.haxx.se/docs/sslcerts.html\r\n\r\ncurl failed to verify the legitimacy of the server and therefore could not\r\nestablish a secure connection to it. 
To learn more about this situation and\r\nhow to fix it, please visit the web page mentioned above.\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 806849424, "label": "Support SSL/TLS directly"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1221#issuecomment-777887190", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1221", "id": 777887190, "node_id": "MDEyOklzc3VlQ29tbWVudDc3Nzg4NzE5MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-12T00:29:18Z", "updated_at": "2021-02-12T00:29:18Z", "author_association": "OWNER", "body": "I can use this recipe to start a `datasette` server in a sub-process during the pytest run and exercise it with real HTTP requests: https://til.simonwillison.net/pytest/subprocess-server", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 806849424, "label": "Support SSL/TLS directly"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1221#issuecomment-777883452", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1221", "id": 777883452, "node_id": "MDEyOklzc3VlQ29tbWVudDc3Nzg4MzQ1Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-12T00:19:30Z", "updated_at": "2021-02-12T00:19:40Z", "author_association": "OWNER", "body": "Uvicorn supports these options: https://www.uvicorn.org/#command-line-options\r\n```\r\n --ssl-keyfile TEXT SSL key file\r\n --ssl-certfile TEXT SSL certificate file\r\n --ssl-keyfile-password TEXT SSL keyfile password\r\n --ssl-version INTEGER SSL version to use (see stdlib ssl module's)\r\n [default: 2]\r\n\r\n --ssl-cert-reqs INTEGER Whether client certificate is required (see\r\n stdlib ssl module's) [default: 0]\r\n\r\n --ssl-ca-certs TEXT CA certificates file\r\n --ssl-ciphers TEXT Ciphers to use (see stdlib ssl module's)\r\n [default: TLSv1]\r\n```\r\nFor the moment I'm going to support just `--ssl-keyfile` and `--ssl-certfile` as arguments to `datasette serve`. 
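The `--ssl-keyfile`/`--ssl-certfile` flags discussed above appear to map directly onto Uvicorn's TLS options, which can also be exercised programmatically; a minimal sketch with a stand-in ASGI app, reusing the certificate paths produced by `python -m trustme` in the comment above:

```python
import uvicorn

async def app(scope, receive, send):
    # Tiny ASGI app standing in for Datasette itself.
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"hello over TLS\n"})

if __name__ == "__main__":
    uvicorn.run(
        app,
        host="0.0.0.0",
        port=8001,
        ssl_keyfile="/tmp/server.key",   # generated by python -m trustme
        ssl_certfile="/tmp/server.pem",
    )
```

The server can then be checked with the same `curl --cacert /tmp/client.pem https://localhost:8001/` style of request shown above.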
I'll add other options if people ask for them.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 806849424, "label": "Support SSL/TLS directly"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/evernote-to-sqlite/pull/10#issuecomment-777839351", "issue_url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/10", "id": 777839351, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NzgzOTM1MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-11T22:37:55Z", "updated_at": "2021-02-11T22:37:55Z", "author_association": "MEMBER", "body": "I've merged these changes by hand now, thanks!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 770712149, "label": "BugFix for encoding and not update info."}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/evernote-to-sqlite/issues/7#issuecomment-777827396", "issue_url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/7", "id": 777827396, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NzgyNzM5Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-11T22:13:14Z", "updated_at": "2021-02-11T22:13:14Z", "author_association": "MEMBER", "body": "My best guess is that you have an older version of `sqlite-utils` installed here - the `replace=True` argument was added in version 2.0. I've bumped the dependency in `setup.py`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 743297582, "label": "evernote-to-sqlite on windows 10 give this error: TypeError: insert() got an unexpected keyword argument 'replace'"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/evernote-to-sqlite/issues/9#issuecomment-777821383", "issue_url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/9", "id": 777821383, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NzgyMTM4Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-11T22:01:28Z", "updated_at": "2021-02-11T22:01:28Z", "author_association": "MEMBER", "body": "Aha! I think I've figured out what's going on here.\r\n\r\nThe CData blocks containing the notes look like this:\r\n\r\n`
This note includes two images.

...`\r\n\r\nThe DTD at http://xml.evernote.com/pub/enml2.dtd includes some entities:\r\n\r\n```\r\n\r\n\r\n\r\n%HTMLlat1;\r\n\r\n\r\n%HTMLsymbol;\r\n\r\n\r\n%HTMLspecial;\r\n```\r\nSo I need to be able to handle all of those different entities. I think I can do that using `html.entities.entitydefs` from the Python standard library, which looks a bit like this:\r\n\r\n```python\r\n{'Aacute': '\u00c1',\r\n 'aacute': '\u00e1',\r\n 'Aacute;': '\u00c1',\r\n 'aacute;': '\u00e1',\r\n 'Abreve;': '\u0102',\r\n 'abreve;': '\u0103',\r\n 'ac;': '\u223e',\r\n 'acd;': '\u223f',\r\n# ...\r\n}\r\n```\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 748372469, "label": "ParseError: undefined entity š"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/evernote-to-sqlite/issues/11#issuecomment-777798330", "issue_url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/11", "id": 777798330, "node_id": "MDEyOklzc3VlQ29tbWVudDc3Nzc5ODMzMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-11T21:18:58Z", "updated_at": "2021-02-11T21:18:58Z", "author_association": "MEMBER", "body": "Thanks for the fix!", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 792851444, "label": "XML parse error"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/evernote-to-sqlite/issues/11#issuecomment-777690332", "issue_url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/11", "id": 777690332, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NzY5MDMzMg==", "user": {"value": 3613583, "label": "dskrad"}, "created_at": "2021-02-11T18:16:01Z", "updated_at": "2021-02-11T18:16:01Z", "author_association": "NONE", "body": "I solved this issue by modifying line 31 of utils.py\r\n\r\n content = ET.tostring(ET.fromstring(content_xml.strip())).decode(\"utf-8\")", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 792851444, "label": "XML parse error"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1200#issuecomment-777178728", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1200", "id": 777178728, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NzE3ODcyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-11T03:13:59Z", "updated_at": "2021-02-11T03:13:59Z", "author_association": "OWNER", "body": "I came up with the need for this while playing with this tool: 
https://calands.datasettes.com/calands?sql=select%0D%0A++AsGeoJSON(geometry)%2C+*%0D%0Afrom%0D%0A++CPAD_2020a_SuperUnits%0D%0Awhere%0D%0A++PARK_NAME+like+'%25mini%25'+and%0D%0A++Intersects(GeomFromGeoJSON(%3Afreedraw)%2C+geometry)+%3D+1%0D%0A++and+CPAD_2020a_SuperUnits.rowid+in+(%0D%0A++++select%0D%0A++++++rowid%0D%0A++++from%0D%0A++++++SpatialIndex%0D%0A++++where%0D%0A++++++f_table_name+%3D+'CPAD_2020a_SuperUnits'%0D%0A++++++and+search_frame+%3D+GeomFromGeoJSON(%3Afreedraw)%0D%0A++)&freedraw={\"type\"%3A\"MultiPolygon\"%2C\"coordinates\"%3A[[[[-122.42202758789064%2C37.82280243352759]%2C[-122.39868164062501%2C37.823887203271454]%2C[-122.38220214843751%2C37.81846319511331]%2C[-122.35061645507814%2C37.77071473849611]%2C[-122.34924316406251%2C37.74465712069939]%2C[-122.37258911132814%2C37.703380457832374]%2C[-122.39044189453125%2C37.690340943717715]%2C[-122.41241455078126%2C37.680559803205135]%2C[-122.44262695312501%2C37.67295135774715]%2C[-122.47283935546876%2C37.67295135774715]%2C[-122.52502441406251%2C37.68382032669382]%2C[-122.53463745117189%2C37.6892542140253]%2C[-122.54699707031251%2C37.690340943717715]%2C[-122.55798339843751%2C37.72945260537781]%2C[-122.54287719726564%2C37.77831314799672]%2C[-122.49893188476564%2C37.81303878836991]%2C[-122.46185302734376%2C37.82822612280363]%2C[-122.42889404296876%2C37.82822612280363]%2C[-122.42202758789064%2C37.82280243352759]]]]} - before I fixed https://github.com/simonw/datasette-leaflet-geojson/issues/16 it was loading a LOT of maps, which felt bad. I wanted to be able to link people to that page with a hard limit on the number of rows displayed on that page.\r\n\r\nIt's mainly to guard against unexpected behaviour from limit-less queries though. It's not a very high priority feature!", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 792890765, "label": "?_size=10 option for the arbitrary query page would be useful"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1200#issuecomment-777132761", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1200", "id": 777132761, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NzEzMjc2MQ==", "user": {"value": 7476523, "label": "bobwhitelock"}, "created_at": "2021-02-11T00:29:52Z", "updated_at": "2021-02-11T00:29:52Z", "author_association": "CONTRIBUTOR", "body": "I'm probably missing something but what's the use case here - what would this offer over adding `limit 10` to the query?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 792890765, "label": "?_size=10 option for the arbitrary query page would be useful"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1219#issuecomment-775442039", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1219", "id": 775442039, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NTQ0MjAzOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-08T20:39:52Z", "updated_at": "2021-02-08T22:13:00Z", "author_association": "OWNER", "body": "This comment helped me find a pattern for running Scalene against the Datasette test suite: https://github.com/emeryberger/scalene/issues/70#issuecomment-755245858\r\n\r\n```\r\npip install scalene\r\n```\r\nThen I created a file called `run_tests.py` with the following contents:\r\n```python\r\nif 
__name__ == \"__main__\":\r\n import sys, pytest\r\n pytest.main(sys.argv)\r\n```\r\nThen I ran this:\r\n```\r\nscalene --profile-all run_tests.py -sv -x .\r\n```\r\nBut... it quit with a segmentation fault!\r\n```\r\n(datasette) datasette % scalene --profile-all run_tests.py -sv -x .\r\n======================================================================== test session starts ========================================================================\r\nplatform darwin -- Python 3.8.6, pytest-6.0.1, py-1.9.0, pluggy-0.13.1 -- python\r\ncachedir: .pytest_cache\r\nrootdir: /Users/simon/Dropbox/Development/datasette, configfile: pytest.ini\r\nplugins: asyncio-0.14.0, timeout-1.4.2\r\ncollecting ... Fatal Python error: Segmentation fault\r\n\r\nCurrent thread 0x0000000110c1edc0 (most recent call first):\r\n File \"/Users/simon/Dropbox/Development/datasette/datasette/utils/__init__.py\", line 553 in detect_json1\r\n File \"/Users/simon/Dropbox/Development/datasette/datasette/filters.py\", line 168 in Filters\r\n File \"/Users/simon/Dropbox/Development/datasette/datasette/filters.py\", line 94 in \r\n File \"\", line 219 in _call_with_frames_removed\r\n File \"\", line 783 in exec_module\r\n File \"\", line 671 in _load_unlocked\r\n File \"\", line 975 in _find_and_load_unlocked\r\n File \"\", line 991 in _find_and_load\r\n File \"/Users/simon/Dropbox/Development/datasette/datasette/views/table.py\", line 27 in \r\n File \"\", line 219 in _call_with_frames_removed\r\n File \"\", line 783 in exec_module\r\n File \"\", line 671 in _load_unlocked\r\n File \"\", line 975 in _find_and_load_unlocked\r\n File \"\", line 991 in _find_and_load\r\n File \"/Users/simon/Dropbox/Development/datasette/datasette/app.py\", line 42 in \r\n File \"\", line 219 in _call_with_frames_removed\r\n File \"\", line 783 in exec_module\r\n File \"\", line 671 in _load_unlocked\r\n File \"\", line 975 in _find_and_load_unlocked\r\n File \"\", line 991 in _find_and_load\r\n File \"/Users/simon/Dropbox/Development/datasette/tests/test_api.py\", line 1 in \r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/assertion/rewrite.py\", line 170 in exec_module\r\n File \"\", line 671 in _load_unlocked\r\n File \"\", line 975 in _find_and_load_unlocked\r\n File \"\", line 991 in _find_and_load\r\n File \"\", line 1014 in _gcd_import\r\n File \"/usr/local/opt/python@3.8/Frameworks/Python.framework/Versions/3.8/lib/python3.8/importlib/__init__.py\", line 127 in import_module\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/pathlib.py\", line 520 in import_path\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/python.py\", line 552 in _importtestmodule\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/python.py\", line 484 in _getobj\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/python.py\", line 288 in obj\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/python.py\", line 500 in _inject_setup_module_fixture\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/python.py\", line 487 in collect\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/runner.py\", line 324 in \r\n File 
\"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/runner.py\", line 294 in from_call\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/runner.py\", line 324 in pytest_make_collect_report\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/callers.py\", line 187 in _multicall\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/manager.py\", line 84 in \r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/manager.py\", line 93 in _hookexec\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/hooks.py\", line 286 in __call__\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/runner.py\", line 441 in collect_one_node\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py\", line 768 in genitems\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py\", line 771 in genitems\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py\", line 568 in _perform_collect\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py\", line 516 in perform_collect\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py\", line 306 in pytest_collection\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/callers.py\", line 187 in _multicall\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/manager.py\", line 84 in \r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/manager.py\", line 93 in _hookexec\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/hooks.py\", line 286 in __call__\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py\", line 295 in _main\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py\", line 240 in wrap_session\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/main.py\", line 289 in pytest_cmdline_main\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/callers.py\", line 187 in _multicall\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/manager.py\", line 84 in \r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/manager.py\", line 93 in _hookexec\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/pluggy/hooks.py\", line 286 in __call__\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/_pytest/config/__init__.py\", line 157 in main\r\n File \"run_tests.py\", line 3 in \r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/scalene/scalene_profiler.py\", line 1525 in main\r\n File 
\"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/scalene/__main__.py\", line 7 in main\r\n File \"/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/scalene/__main__.py\", line 14 in \r\n File \"/usr/local/opt/python@3.8/Frameworks/Python.framework/Versions/3.8/lib/python3.8/runpy.py\", line 87 in _run_code\r\n File \"/usr/local/opt/python@3.8/Frameworks/Python.framework/Versions/3.8/lib/python3.8/runpy.py\", line 194 in _run_module_as_main\r\nScalene error: received signal SIGSEGV\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 803929694, "label": "Try profiling Datasette using scalene"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1219#issuecomment-775497449", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1219", "id": 775497449, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NTQ5NzQ0OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-08T22:11:34Z", "updated_at": "2021-02-08T22:11:34Z", "author_association": "OWNER", "body": "https://github.com/emeryberger/scalene/issues/110 reports a \"received signal SIGSEGV\" error that was fixed by upgrading to the latest Scalene version, but I'm running that already.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 803929694, "label": "Try profiling Datasette using scalene"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/pocket-to-sqlite/issues/9#issuecomment-774730656", "issue_url": "https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/9", "id": 774730656, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NDczMDY1Ng==", "user": {"value": 635179, "label": "merwok"}, "created_at": "2021-02-07T18:45:04Z", "updated_at": "2021-02-07T18:45:04Z", "author_association": "NONE", "body": "That URL uses TLS 1.3, but maybe only if the client supports it.\r\nIt could be your Python version or your SSL library that\u2019s not recent enough.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 801780625, "label": "SSL Error"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/pocket-to-sqlite/issues/9#issuecomment-774726123", "issue_url": "https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/9", "id": 774726123, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NDcyNjEyMw==", "user": {"value": 12669260, "label": "jfeiwell"}, "created_at": "2021-02-07T18:21:08Z", "updated_at": "2021-02-07T18:21:08Z", "author_association": "NONE", "body": "@simonw any ideas here?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 801780625, "label": "SSL Error"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1217#issuecomment-774528913", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1217", "id": 774528913, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NDUyODkxMw==", "user": {"value": 639730, "label": "virtadpt"}, "created_at": "2021-02-06T19:23:41Z", "updated_at": "2021-02-06T19:23:41Z", "author_association": "NONE", "body": "I've had a lot of success running it as 
an OpenFaaS lambda.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 802513359, "label": "Possible to deploy as a python app (for Rstudio connect server)?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1217#issuecomment-774385092", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1217", "id": 774385092, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NDM4NTA5Mg==", "user": {"value": 6165713, "label": "plpxsk"}, "created_at": "2021-02-06T02:49:11Z", "updated_at": "2021-02-06T02:49:11Z", "author_association": "NONE", "body": "A good reference seems to be the note to run `datasette` as a module in https://github.com/simonw/datasette/pull/556\r\n", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 802513359, "label": "Possible to deploy as a python app (for Rstudio connect server)?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/223#issuecomment-774373829", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/223", "id": 774373829, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NDM3MzgyOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-06T01:39:47Z", "updated_at": "2021-02-06T01:39:47Z", "author_association": "OWNER", "body": "Documentation: https://sqlite-utils.datasette.io/en/stable/cli.html#cli-insert-csv-tsv-delimiter", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 788527932, "label": "--delimiter option for CSV import"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1208#issuecomment-774286962", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1208", "id": 774286962, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NDI4Njk2Mg==", "user": {"value": 4488943, "label": "kbaikov"}, "created_at": "2021-02-05T21:02:39Z", "updated_at": "2021-02-05T21:02:39Z", "author_association": "CONTRIBUTOR", "body": "@simonw could you please take a look at the PR 1211 that fixes this issue?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 794554881, "label": "A lot of open(file) functions are used without a context manager thus producing ResourceWarning: unclosed file <_io.TextIOWrapper"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/203#issuecomment-774217792", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/203", "id": 774217792, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NDIxNzc5Mg==", "user": {"value": 1049910, "label": "drkane"}, "created_at": "2021-02-05T18:44:13Z", "updated_at": "2021-02-05T18:44:13Z", "author_association": "NONE", "body": "Thanks for looking at this - home schooling kids has prevented me from replying. \r\n\r\nI'd struggled with how to adapt the API for the foreign keys too - I definitely tried the String/Tuple approach. I hadn't considered the breaking changes that would introduce though. 
I can take a look at this and try and make the change - see which of your options works best.\r\n\r\nI've got a workaround for the use-case I was looking at this for, so it wouldn't be a problem for me if it was put on the back burner until a hypothetical v4.0 anyway.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 743384829, "label": "changes to allow for compound foreign keys"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1210#issuecomment-773977128", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1210", "id": 773977128, "node_id": "MDEyOklzc3VlQ29tbWVudDc3Mzk3NzEyOA==", "user": {"value": 525780, "label": "heyarne"}, "created_at": "2021-02-05T11:30:34Z", "updated_at": "2021-02-05T11:30:34Z", "author_association": "NONE", "body": "Thanks for your quick reply! Having changed my `metadata.yml`, queries AND database I can't really reproduce it anymore, sorry. But at least I'm happy to say that it works now! :) Thanks again for the super nifty tool, very appreciated.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 796234313, "label": "Immutable Database w/ Canned Queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1216#issuecomment-772796111", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1216", "id": 772796111, "node_id": "MDEyOklzc3VlQ29tbWVudDc3Mjc5NjExMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-03T20:20:48Z", "updated_at": "2021-02-03T20:20:48Z", "author_association": "OWNER", "body": "Relevant code: https://github.com/simonw/datasette/blob/1600d2a3ec3ada1f6fb5b1eb73bdaeccb5f80530/datasette/app.py#L620-L632", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 800669347, "label": "/-/databases should reflect connection order, not alphabetical order"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/56#issuecomment-772408273", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56", "id": 772408273, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MjQwODI3Mw==", "user": {"value": 42315895, "label": "gsajko"}, "created_at": "2021-02-03T10:36:36Z", "updated_at": "2021-02-03T10:36:36Z", "author_association": "NONE", "body": "I figured it out.\r\nThose tweets are in database, because somebody quote tweeted them, or retweeted them.\r\nAnd if you grab quoted tweet or reweeted tweet from other tweet json, It doesn't grab all of the details.\r\n\r\nSo if someone quote tweeted a quote tweet, the second quote tweet won't have `quoted_status`. 
\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 796736607, "label": "Not all quoted statuses get fetched?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1212#issuecomment-772007663", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1212", "id": 772007663, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MjAwNzY2Mw==", "user": {"value": 4488943, "label": "kbaikov"}, "created_at": "2021-02-02T21:36:56Z", "updated_at": "2021-02-02T21:36:56Z", "author_association": "CONTRIBUTOR", "body": "How do you get 4-5 minutes?\r\nI run my tests in WSL 2, so may be i need to try a real linux VM.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 797651831, "label": "Tests are very slow. "}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1214#issuecomment-772001787", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1214", "id": 772001787, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MjAwMTc4Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-02T21:28:53Z", "updated_at": "2021-02-02T21:28:53Z", "author_association": "OWNER", "body": "Fix is now live on https://latest.datasette.io/fixtures/searchable?_search=terry - clearing \"terry\" and re-submitting the form now works as expected.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 799693777, "label": "Re-submitting filter form duplicates _x querystring arguments"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1214#issuecomment-771992628", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1214", "id": 771992628, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MTk5MjYyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-02T21:15:18Z", "updated_at": "2021-02-02T21:15:18Z", "author_association": "OWNER", "body": "The cause of this bug is form fields which begin with `_` but ARE displayed as form inputs on the page - hence should not be duplicated in an `` element.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 799693777, "label": "Re-submitting filter form duplicates _x querystring arguments"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1214#issuecomment-771992025", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1214", "id": 771992025, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MTk5MjAyNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-02T21:14:16Z", "updated_at": "2021-02-02T21:14:16Z", "author_association": "OWNER", "body": "As a result, navigating to https://github-to-sqlite.dogsheep.net/github/labels?_search=help and clearing out the `_search` field then submitting the form does NOT clear the search term.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 799693777, "label": "Re-submitting filter form duplicates _x querystring arguments"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/datasette/issues/1212#issuecomment-771976561", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1212", "id": 771976561, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MTk3NjU2MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-02T20:53:27Z", "updated_at": "2021-02-02T20:53:27Z", "author_association": "OWNER", "body": "It would be great if we could get `python-xdist` to run too - I tried it in the past and gave up when I ran into those race conditions, but I've not done any further digging to see if there's a way to fix that.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 797651831, "label": "Tests are very slow. "}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1212#issuecomment-771975941", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1212", "id": 771975941, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MTk3NTk0MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-02T20:52:36Z", "updated_at": "2021-02-02T20:52:36Z", "author_association": "OWNER", "body": "37 minutes, wow! They're a little slow for me (4-5 minutes perhaps) but not nearly that bad.\r\n\r\nThanks for running that profile. I think you're right: figuring out how to use more session scopes would definitely help.\r\n\r\nThe `:memory:` idea is interesting too. The new `memory_name=` feature added in #1151 (released in Datasette 0.54) could help a lot here, since it allows Datasette instances to share the same in-memory database across multiple HTTP requests and connections.\r\n\r\nNote that `memory_name=` also persists within test runs themselves, independently of any `scope=` options on the fixtures. That might actually help us here!\r\n\r\nI'd be delighted if you explored this issue further, especially the option of using `memory_name=` for the fixtures databases used by the tests.\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 797651831, "label": "Tests are very slow. 
"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1213#issuecomment-771968675", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1213", "id": 771968675, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MTk2ODY3NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-02T20:41:55Z", "updated_at": "2021-02-02T20:41:55Z", "author_association": "OWNER", "body": "So maybe I could a special response header which ASGI middleware can pick up that means \"Don't attempt to gzip this, just stream it through\".", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 799663959, "label": "gzip support for HTML (and JSON) responses"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1213#issuecomment-771968177", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1213", "id": 771968177, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MTk2ODE3Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-02T20:41:13Z", "updated_at": "2021-02-02T20:41:13Z", "author_association": "OWNER", "body": "Starlette accumulates the full response body in a `body` variable and then does this:\r\n```python\r\n elif message_type == \"http.response.body\":\r\n # Remaining body in streaming GZip response.\r\n body = message.get(\"body\", b\"\")\r\n more_body = message.get(\"more_body\", False)\r\n\r\n self.gzip_file.write(body)\r\n if not more_body:\r\n self.gzip_file.close()\r\n\r\n message[\"body\"] = self.gzip_buffer.getvalue()\r\n self.gzip_buffer.seek(0)\r\n self.gzip_buffer.truncate()\r\n\r\n await self.send(message)\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 799663959, "label": "gzip support for HTML (and JSON) responses"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1213#issuecomment-771965281", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1213", "id": 771965281, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MTk2NTI4MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-02T20:37:08Z", "updated_at": "2021-02-02T20:39:24Z", "author_association": "OWNER", "body": "Starlette's gzip middleware implementation is here: https://github.com/encode/starlette/blob/0.14.2/starlette/middleware/gzip.py", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 799663959, "label": "gzip support for HTML (and JSON) responses"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1211#issuecomment-771127458", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1211", "id": 771127458, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MTEyNzQ1OA==", "user": {"value": 4488943, "label": "kbaikov"}, "created_at": "2021-02-01T20:13:39Z", "updated_at": "2021-02-01T20:13:39Z", "author_association": "CONTRIBUTOR", "body": "Ping @simonw ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 797649915, "label": "Use context manager instead of plain open"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/datasette/pull/1159#issuecomment-770865698", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1159", "id": 770865698, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MDg2NTY5OA==", "user": {"value": 552629, "label": "lovasoa"}, "created_at": "2021-02-01T13:42:29Z", "updated_at": "2021-02-01T13:42:29Z", "author_association": "NONE", "body": "@simonw : Could you have a look at this ? I think this really improves readability.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 774332247, "label": "Improve the display of facets information"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1211#issuecomment-770343684", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1211", "id": 770343684, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MDM0MzY4NA==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-01-31T08:03:40Z", "updated_at": "2021-01-31T08:03:40Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=h1) Report\n> Merging [#1211](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=desc) (e33ccaa) into [main](https://codecov.io/gh/simonw/datasette/commit/dde3c500c73ace33529672f7d862b76753d309cc?el=desc) (dde3c50) will **decrease** coverage by `0.00%`.\n> The diff coverage is `92.85%`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1211/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1211 +/- ##\n==========================================\n- Coverage 91.54% 91.53% -0.01% \n==========================================\n Files 32 32 \n Lines 3948 3959 +11 \n==========================================\n+ Hits 3614 3624 +10 \n- Misses 334 335 +1 \n```\n\n\n| [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=tree) | Coverage \u0394 | |\n|---|---|---|\n| [datasette/cli.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2NsaS5weQ==) | `77.29% <66.66%> (-0.31%)` | :arrow_down: |\n| [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `95.62% <100.00%> (+<0.01%)` | :arrow_up: |\n| [datasette/publish/cloudrun.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3B1Ymxpc2gvY2xvdWRydW4ucHk=) | `96.96% <100.00%> (+0.09%)` | :arrow_up: |\n| [datasette/publish/heroku.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3B1Ymxpc2gvaGVyb2t1LnB5) | `87.73% <100.00%> (+0.60%)` | :arrow_up: |\n| [datasette/utils/\\_\\_init\\_\\_.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL19faW5pdF9fLnB5) | `94.13% <100.00%> (+0.02%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=footer). Last update [dde3c50...e33ccaa](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=lastupdated). 
Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 797649915, "label": "Use context manager instead of plain open"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/51#issuecomment-770150526", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/51", "id": 770150526, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MDE1MDUyNg==", "user": {"value": 22578954, "label": "daniel-butler"}, "created_at": "2021-01-30T03:44:19Z", "updated_at": "2021-01-30T03:47:24Z", "author_association": "CONTRIBUTOR", "body": "I don't have much experience with github's rate limiting. In my day job we use the [tenacity library](https://github.com/jd/tenacity) to handle http errors we get.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 703246031, "label": "github-to-sqlite should handle rate limits better"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/60#issuecomment-770112248", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60", "id": 770112248, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MDExMjI0OA==", "user": {"value": 22578954, "label": "daniel-butler"}, "created_at": "2021-01-30T00:01:03Z", "updated_at": "2021-01-30T01:14:42Z", "author_association": "CONTRIBUTOR", "body": "Yes that would be cool! I wouldn't mind helping. Is this the meat of it? https://github.com/dogsheep/twitter-to-sqlite/blob/21fc1cad6dd6348c67acff90a785b458d3a81275/twitter_to_sqlite/utils.py#L512\r\n\r\nIt looks like the cli option is added with this decorator : https://github.com/dogsheep/twitter-to-sqlite/blob/21fc1cad6dd6348c67acff90a785b458d3a81275/twitter_to_sqlite/cli.py#L14\r\n\r\nI looked a bit at utils.py in the GitHub repository. I was surprised at the amount of manual mapping of the API response you had to do to get this to work.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 797097140, "label": "Use Data from SQLite in other commands"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/60#issuecomment-770071568", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60", "id": 770071568, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MDA3MTU2OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-29T21:56:15Z", "updated_at": "2021-01-29T21:56:15Z", "author_association": "MEMBER", "body": "I really like the way you're using pipes here - really smart. It's similar to how I build the demo database in this GitHub Actions workflow:\r\n\r\nhttps://github.com/dogsheep/github-to-sqlite/blob/62dfd3bc4014b108200001ef4bc746feb6f33b45/.github/workflows/deploy-demo.yml#L52-L82\r\n\r\n`twitter-to-sqlite` actually has a mechanism for doing this kind of thing, documented at https://github.com/dogsheep/twitter-to-sqlite#providing-input-from-a-sql-query-with---sql-and---attach\r\n\r\nIt lets you do things like:\r\n\r\n```\r\n$ twitter-to-sqlite users-lookup my.db --sql=\"select follower_id from following\" --ids\r\n```\r\nMaybe I should add something similar to `github-to-sqlite`? 
Feels like it could be really useful.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 797097140, "label": "Use Data from SQLite in other commands"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/56#issuecomment-769973212", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56", "id": 769973212, "node_id": "MDEyOklzc3VlQ29tbWVudDc2OTk3MzIxMg==", "user": {"value": 42315895, "label": "gsajko"}, "created_at": "2021-01-29T18:29:02Z", "updated_at": "2021-01-29T18:31:55Z", "author_association": "NONE", "body": "I think it was with `twitter-to-sqlite home-timeline home.db -a auth.json --since`\r\nand Im using only this command to grab tweets \r\n\r\nfrom cron tab\r\n`2,7,12,17,22,27,32,37,42,47,52,57 * * * * run-one /home/gsajko/miniconda3/bin/twitter-to-sqlite home-timeline /home/gsajko/work/custom_twitter_feed/home.db -a /home/gsajko/work/custom_twitter_feed/auth/auth.json --since`\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 796736607, "label": "Not all quoted statuses get fetched?"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/56#issuecomment-769957751", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56", "id": 769957751, "node_id": "MDEyOklzc3VlQ29tbWVudDc2OTk1Nzc1MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-29T17:59:40Z", "updated_at": "2021-01-29T17:59:40Z", "author_association": "MEMBER", "body": "This is interesting - how did you create that initial table? Was this using the `twitter-to-sqlite import archive.db ~/Downloads/twitter-2019-06-25-b31f2.zip` command, or something else?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 796736607, "label": "Not all quoted statuses get fetched?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1207#issuecomment-769534187", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1207", "id": 769534187, "node_id": "MDEyOklzc3VlQ29tbWVudDc2OTUzNDE4Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-29T02:37:19Z", "updated_at": "2021-01-29T02:37:19Z", "author_association": "OWNER", "body": "https://docs.datasette.io/en/latest/testing_plugins.html#using-pdb-for-errors-thrown-inside-datasette", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 793881756, "label": "Document the Datasette(..., pdb=True) testing pattern"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1209#issuecomment-769455370", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1209", "id": 769455370, "node_id": "MDEyOklzc3VlQ29tbWVudDc2OTQ1NTM3MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-28T23:00:21Z", "updated_at": "2021-01-28T23:00:21Z", "author_association": "OWNER", "body": "Good catch on the workaround here. 
The root problem is that `datasette-template-sql` looks for the first available database if you don't provide it with a `database=` argument, and in Datasette 0.54 the first available database changed to being the new `_internal` database.\r\n\r\nIs this a bug? I think it is - because the documented behaviour on https://docs.datasette.io/en/stable/internals.html#get-database-name is this:\r\n\r\n> `name` - string, optional\r\n>\r\n> The name to be used for this database - this will be used in the URL path, e.g. `/dbname`. If not specified Datasette will pick one based on the filename or memory name.\r\n\r\nSince the new behaviour differs from what was in the documentation I'm going to treat this as a bug and fix it.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 795367402, "label": "v0.54 500 error from sql query in custom template; code worked in v0.53; found a workaround"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1205#issuecomment-769453074", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1205", "id": 769453074, "node_id": "MDEyOklzc3VlQ29tbWVudDc2OTQ1MzA3NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-28T22:54:49Z", "updated_at": "2021-01-28T22:55:02Z", "author_association": "OWNER", "body": "\r\nI also checked that the following works:\r\n\r\n    echo '{\"foo\": \"bar\"}' | sqlite-utils insert _memory.db demo -\r\n    datasette _memory.db --memory\r\n\r\nSure enough, it results in the following Datasette homepage - thanks to #509\r\n\r\n\"Datasette___memory___memory_2\"", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 793027837, "label": "Rename /:memory: to /_memory"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1205#issuecomment-769452084", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1205", "id": 769452084, "node_id": "MDEyOklzc3VlQ29tbWVudDc2OTQ1MjA4NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-28T22:52:23Z", "updated_at": "2021-01-28T22:52:23Z", "author_association": "OWNER", "body": "Here are the redirect tests: https://github.com/simonw/datasette/blob/1600d2a3ec3ada1f6fb5b1eb73bdaeccb5f80530/tests/test_api.py#L635-L648", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 793027837, "label": "Rename /:memory: to /_memory"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1205#issuecomment-769442165", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1205", "id": 769442165, "node_id": "MDEyOklzc3VlQ29tbWVudDc2OTQ0MjE2NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-28T22:30:16Z", "updated_at": "2021-01-28T22:30:27Z", "author_association": "OWNER", "body": "I'm going to do this, with redirects from `/:memory:*`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 793027837, "label": "Rename /:memory: to /_memory"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1210#issuecomment-769274591", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/1210", "id": 769274591, "node_id": "MDEyOklzc3VlQ29tbWVudDc2OTI3NDU5MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-28T18:10:02Z", "updated_at": "2021-01-28T18:10:02Z", "author_association": "OWNER", "body": "That definitely sounds like a bug! Can you provide a copy of your `metadata.JSON` and the command-line you are using to launch Datasette?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 796234313, "label": "Immutable Database w/ Canned Queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/54#issuecomment-767888743", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54", "id": 767888743, "node_id": "MDEyOklzc3VlQ29tbWVudDc2Nzg4ODc0Mw==", "user": {"value": 19328961, "label": "henry501"}, "created_at": "2021-01-26T23:07:41Z", "updated_at": "2021-01-26T23:07:41Z", "author_association": "NONE", "body": "My import got much further with the applied fixes than 0.21.3, but not 100%. I do appear to have all of the tweets imported at least. \r\nNot sure when I'll have a chance to look further to try to fix or see what didn't make it into the import.\r\n\r\nHere's my output:\r\n\r\n```\r\ndirect-messages-group: not yet implemented\r\nbranch-links: not yet implemented\r\nperiscope-expired-broadcasts: not yet implemented\r\ndirect-messages: not yet implemented\r\nmute: not yet implemented\r\nperiscope-comments-made-by-user: not yet implemented\r\nperiscope-ban-information: not yet implemented\r\nperiscope-profile-description: not yet implemented\r\nscreen-name-change: not yet implemented\r\nmanifest: not yet implemented\r\nfleet: not yet implemented\r\nuser-link-clicks: not yet implemented\r\nperiscope-broadcast-metadata: not yet implemented\r\ncontact: not yet implemented\r\nfleet-mute: not yet implemented\r\ndevice-token: not yet implemented\r\nprotected-history: not yet implemented\r\ndirect-message-mute: not yet implemented\r\nTraceback (most recent call last):\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/bin/twitter-to-sqlite\", line 33, in \r\n sys.exit(load_entry_point('twitter-to-sqlite==0.21.3', 'console_scripts', 'twitter-to-sqlite')())\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py\", line 829, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py\", line 1259, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py\", line 1066, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py\", line 610, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/cli.py\", line 772, in import_\r\n archive.import_from_file(db, filepath.name, open(filepath, 
\"rb\").read())\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/archive.py\", line 233, in import_from_file\r\n to_insert = transformer(data)\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/archive.py\", line 21, in callback\r\n return {filename: [fn(item) for item in data]}\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/archive.py\", line 21, in \r\n return {filename: [fn(item) for item in data]}\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/archive.py\", line 88, in ageinfo\r\n return item[\"ageMeta\"][\"ageInfo\"]\r\nKeyError: 'ageInfo'\r\n\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 779088071, "label": "Archive import appears to be broken on recent exports"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1208#issuecomment-767823684", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1208", "id": 767823684, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NzgyMzY4NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-26T20:58:51Z", "updated_at": "2021-01-26T20:58:51Z", "author_association": "OWNER", "body": "This is a good catch - I've been lazy about this, but you're right that it's an issue that needs cleaning up. Would be very happy to apply a PR, thanks!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 794554881, "label": "A lot of open(file) functions are used without a context manager thus producing ResourceWarning: unclosed file <_io.TextIOWrapper"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1151#issuecomment-767762551", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1151", "id": 767762551, "node_id": "MDEyOklzc3VlQ29tbWVudDc2Nzc2MjU1MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-26T19:07:44Z", "updated_at": "2021-01-26T19:07:44Z", "author_association": "OWNER", "body": "Mentioned in https://simonwillison.net/2021/Jan/25/datasette/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 770448622, "label": "Database class mechanism for cross-connection in-memory databases"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/991#issuecomment-767761155", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/991", "id": 767761155, "node_id": "MDEyOklzc3VlQ29tbWVudDc2Nzc2MTE1NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-26T19:05:21Z", "updated_at": "2021-01-26T19:06:36Z", "author_association": "OWNER", "body": "Idea: implement this using the existing table view, with a custom template called `table-internal-bb0ec0-tables.html` - that's the custom template listed in the HTML comments at the bottom of https://latest.datasette.io/_internal/tables", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, 
\"eyes\": 0}", "issue": {"value": 714377268, "label": "Redesign application homepage"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1201#issuecomment-766991680", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1201", "id": 766991680, "node_id": "MDEyOklzc3VlQ29tbWVudDc2Njk5MTY4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-25T17:42:21Z", "updated_at": "2021-01-25T17:42:21Z", "author_association": "OWNER", "body": "https://docs.datasette.io/en/stable/changelog.html#v0-54", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 792904595, "label": "Release notes for Datasette 0.54"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1206#issuecomment-766589070", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1206", "id": 766589070, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NjU4OTA3MA==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-01-25T06:50:30Z", "updated_at": "2021-01-25T17:31:11Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=h1) Report\n> Merging [#1206](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=desc) (06480e1) into [main](https://codecov.io/gh/simonw/datasette/commit/a5ede3cdd455e2bb1a1fb2f4e1b5a9855caf5179?el=desc) (a5ede3c) will **not change** coverage.\n> The diff coverage is `100.00%`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1206/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1206 +/- ##\n=======================================\n Coverage 91.53% 91.53% \n=======================================\n Files 32 32 \n Lines 3947 3947 \n=======================================\n Hits 3613 3613 \n Misses 334 334 \n```\n\n\n| [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=tree) | Coverage \u0394 | |\n|---|---|---|\n| [datasette/version.py](https://codecov.io/gh/simonw/datasette/pull/1206/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZlcnNpb24ucHk=) | `100.00% <100.00%> (\u00f8)` | |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=footer). Last update [a5ede3c...571476d](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=lastupdated). 
Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 793086333, "label": "Release 0.54"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1206#issuecomment-766588371", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1206", "id": 766588371, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NjU4ODM3MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-25T06:49:06Z", "updated_at": "2021-01-25T06:49:06Z", "author_association": "OWNER", "body": "Last thing to do: write up the annotated version of these release notes, assign it a URL on my blog and link to it from the release notes here so I can publish them simultaneously.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 793086333, "label": "Release 0.54"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1206#issuecomment-766588020", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1206", "id": 766588020, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NjU4ODAyMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-25T06:48:20Z", "updated_at": "2021-01-25T06:48:20Z", "author_association": "OWNER", "body": "Issues to reference in the commit message: #509, #1091, #1150, #1151, #1166, #1167, #1178, #1181, #1182, #1184, #1185, #1186, #1187, #1194, #1198", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 793086333, "label": "Release 0.54"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1201#issuecomment-766586151", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1201", "id": 766586151, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NjU4NjE1MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-25T06:44:43Z", "updated_at": "2021-01-25T06:44:43Z", "author_association": "OWNER", "body": "OK, release notes are ready to merge from that branch. I'll ship the release in the morning, to give me time to write the accompanying annotated release notes.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 792904595, "label": "Release notes for Datasette 0.54"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1201#issuecomment-766545604", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1201", "id": 766545604, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NjU0NTYwNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-25T05:14:31Z", "updated_at": "2021-01-25T05:14:31Z", "author_association": "OWNER", "body": "The two big ticket items are `