{"id": 1299129869, "node_id": "PR_kwDOJHON9s5NbyYN", "number": 13, "state": "open", "locked": 0, "title": "use universal command", "user": {"value": 14314871, "label": "amlestin"}, "body": null, "created_at": "2023-04-02T15:10:54Z", "updated_at": "2023-04-02T15:37:34Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "b40fdee5efac03f10257f749ee7f69e4692ad6c5", "assignee": null, "milestone": null, "draft": 0, "head": "8111718e747f59dddcb5bf7820ce922e0723c04a", "base": "e55a802d37a896475b6cf475c1ba947af63cca73", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 611552758, "label": "apple-notes-to-sqlite"}, "url": "https://github.com/dogsheep/apple-notes-to-sqlite/pull/13", "merged_by": null, "auto_merge": null} {"id": 1501923826, "node_id": "PR_kwDOJHON9s5ZhYny", "number": 14, "state": "open", "locked": 0, "title": "fix: fix the problem of Chinese character garbling", "user": {"value": 2698003, "label": "barretlee"}, "body": "1. The code uses two different ways of writing encoding formats, `mac_roman` and `macroman`. It is uncertain whether there are any typo errors.\r\n2. When there are Chinese characters in the content, exporting it results in garbled code. Changing it to `utf8` can fix the issue.", "created_at": "2023-09-04T23:48:28Z", "updated_at": "2023-09-04T23:48:28Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "66b5b73948b1bedc275432956dda43cfe151c78c", "assignee": null, "milestone": null, "draft": 0, "head": "5febe19b8922aa818e7dc265bdee30bcc5004eb4", "base": "e55a802d37a896475b6cf475c1ba947af63cca73", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 611552758, "label": "apple-notes-to-sqlite"}, "url": "https://github.com/dogsheep/apple-notes-to-sqlite/pull/14", "merged_by": null, "auto_merge": null} {"id": 726990680, "node_id": "MDExOlB1bGxSZXF1ZXN0NzI2OTkwNjgw", "number": 35, "state": "open", "locked": 0, "title": "Support for Datasette's --base-url setting", "user": {"value": 2670795, "label": "brandonrobertz"}, "body": "This makes it so you can use Dogsheep if you're using Datasette with the `--base-url /some-path/` setting.", "created_at": "2021-09-03T17:47:45Z", "updated_at": "2021-09-03T17:47:45Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "0f5931da2099303111c49ec726b78bae814f755e", "assignee": null, "milestone": null, "draft": 0, "head": "e6679d287b2e97fc94f50da64e1a7b91c1fbbf67", "base": "a895bc360f2738c7af43deda35c847f1ee5bff51", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "url": "https://github.com/dogsheep/dogsheep-beta/pull/35", "merged_by": null, "auto_merge": null} {"id": 434162316, "node_id": "MDExOlB1bGxSZXF1ZXN0NDM0MTYyMzE2", "number": 29, "state": "closed", "locked": 0, "title": "Fixed bug in SQL query for photo scores", "user": {"value": 41546558, "label": "RhetTbull"}, "body": "The join on ZCOMPUTEDASSETATTRIBUTES used the wrong columns. 
In most of the Photos database tables, table.ZASSET joins with ZGENERICASSET.Z_PK", "created_at": "2020-06-14T15:39:22Z", "updated_at": "2020-12-04T22:32:36Z", "closed_at": "2020-12-04T22:32:27Z", "merged_at": "2020-12-04T22:32:27Z", "merge_commit_sha": "edc80a0d361006f478f2904a90bfe6c730ed6194", "assignee": null, "milestone": null, "draft": 0, "head": "f961a90788cb2059d40b9a0810900ac81e6859f6", "base": "45ce3f8bfb8c70f57ca5d8d82f22368fea1eb391", "author_association": "CONTRIBUTOR", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "url": "https://github.com/dogsheep/dogsheep-photos/pull/29", "merged_by": null, "auto_merge": null} {"id": 448355680, "node_id": "MDExOlB1bGxSZXF1ZXN0NDQ4MzU1Njgw", "number": 30, "state": "open", "locked": 0, "title": "Handle empty bucket on first upload. Allow specifying the endpoint_url for services other than S3 (like b2 and digitalocean spaces)", "user": {"value": 110038, "label": "scanner"}, "body": "Finally got around to trying dogsheep-photos but I want to use backblaze's b2 service instead of AWS S3.\r\nHad to add a way to optionally specify the endpoint_url to connect to. Then with the bucket being empty the initial key retrieval would fail. Probably a better way to see that the bucket is empty than doing a test inside the paginator loop.\r\n\r\nAlso probably a better way to specify the endpoint_url as we get and test for it twice using the same code in two different places but did not want to spend too much time worrying about it.", "created_at": "2020-07-13T16:15:26Z", "updated_at": "2020-07-13T16:15:26Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "583b26f244166aadf2dcc680e39d1ca59765da37", "assignee": null, "milestone": null, "draft": 0, "head": "647d4b42c6f4d1fba4b99f73fe163946cea6ee36", "base": "45ce3f8bfb8c70f57ca5d8d82f22368fea1eb391", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "url": "https://github.com/dogsheep/dogsheep-photos/pull/30", "merged_by": null, "auto_merge": null} {"id": 543015825, "node_id": "MDExOlB1bGxSZXF1ZXN0NTQzMDE1ODI1", "number": 31, "state": "open", "locked": 0, "title": "Update for Big Sur", "user": {"value": 41546558, "label": "RhetTbull"}, "body": "Refactored out the SQL for extracting aesthetic scores to use osxphotos -- adds compatbility for Big Sur via osxphotos which has been updated for new table names in Big Sur. 
Have not yet refactored the SQL for extracting labels which is still compatible with Big Sur.", "created_at": "2020-12-20T04:36:45Z", "updated_at": "2023-08-08T15:52:52Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "0e571b07430024d4ce00d5e8ba28591cefd27d6f", "assignee": null, "milestone": null, "draft": 0, "head": "39c12f8cda206ad621ec9940cce538570513e764", "base": "edc80a0d361006f478f2904a90bfe6c730ed6194", "author_association": "CONTRIBUTOR", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "url": "https://github.com/dogsheep/dogsheep-photos/pull/31", "merged_by": null, "auto_merge": null} {"id": 727390835, "node_id": "MDExOlB1bGxSZXF1ZXN0NzI3MzkwODM1", "number": 36, "state": "open", "locked": 0, "title": "Correct naming of tool in readme", "user": {"value": 2129, "label": "badboy"}, "body": null, "created_at": "2021-09-05T12:05:40Z", "updated_at": "2022-01-06T16:04:46Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "358678c6b48072769f2985fe6be8fc5e54ed2e06", "assignee": null, "milestone": null, "draft": 0, "head": "bf26955c250e601a0d9e751311530940b704f81e", "base": "edc80a0d361006f478f2904a90bfe6c730ed6194", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "url": "https://github.com/dogsheep/dogsheep-photos/pull/36", "merged_by": null, "auto_merge": null} {"id": 986925985, "node_id": "PR_kwDOD079W84600uh", "number": 37, "state": "open", "locked": 0, "title": "Fix former command name in readme", "user": {"value": 578773, "label": "DanLipsitt"}, "body": "Looks like a previous commit missed a `photo-to-sqlite`\u2192 `dogsheep-photos` replacement.", "created_at": "2022-07-05T02:09:13Z", "updated_at": "2022-07-05T02:09:13Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "1fa5a3b9ddab2a954aea21ea4292b944e826866a", "assignee": null, "milestone": null, "draft": 0, "head": "b0d256c5bc480450627d98d8c8a5e3d8c61dc2ae", "base": "325aa38cb23d0757bb1335ee2ea94a082475a66e", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "url": "https://github.com/dogsheep/dogsheep-photos/pull/37", "merged_by": null, "auto_merge": null} {"id": 1454719622, "node_id": "PR_kwDOD079W85WtUKG", "number": 38, "state": "closed", "locked": 0, "title": "photos-to-sql not found?", "user": {"value": 319473, "label": "coldclimate"}, "body": "I wonder if `photos-to-sql` is an old name for `dogsheep-photos`, because I can't find it anywhere.\r\n\r\nI can't actually get this command to work (`sqlite3.OperationalError: no such table: attached.ZGENERICASSET` thrown) but I don't think that's related", "created_at": "2023-07-29T09:59:42Z", "updated_at": "2023-07-29T10:01:27Z", "closed_at": "2023-07-29T10:01:23Z", "merged_at": null, "merge_commit_sha": "78898d8344bd37379b9ef44384629c26ed8c4558", "assignee": null, "milestone": null, "draft": 0, "head": "dbf0213a98cfb92f49a60695d0e0094b851b1cbb", "base": "325aa38cb23d0757bb1335ee2ea94a082475a66e", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "url": "https://github.com/dogsheep/dogsheep-photos/pull/38", "merged_by": null, "auto_merge": null} {"id": 1454726308, "node_id": "PR_kwDOD079W85WtVyk", "number": 39, "state": "open", "locked": 0, "title": "Missing option in datasette instructions", "user": {"value": 319473, "label": "coldclimate"}, "body": "Gotta tell it where to look", "created_at": "2023-07-29T10:34:48Z", "updated_at": "2023-07-29T10:34:48Z", "closed_at": null, 
"merged_at": null, "merge_commit_sha": "bd9c51b4e3e110122f921fc6ebf10b69d7fcbb7a", "assignee": null, "milestone": null, "draft": 0, "head": "17c0ddf5113d8587247d4736e1390fe07ec33b8c", "base": "325aa38cb23d0757bb1335ee2ea94a082475a66e", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "url": "https://github.com/dogsheep/dogsheep-photos/pull/39", "merged_by": null, "auto_merge": null} {"id": 338647378, "node_id": "MDExOlB1bGxSZXF1ZXN0MzM4NjQ3Mzc4", "number": 1, "state": "closed", "locked": 0, "title": "Add parkrun-to-sqlite", "user": {"value": 1101318, "label": "mrw34"}, "body": "", "created_at": "2019-11-08T12:05:32Z", "updated_at": "2020-10-12T00:35:16Z", "closed_at": "2020-10-12T00:35:16Z", "merged_at": "2020-10-12T00:35:16Z", "merge_commit_sha": "58ca0c785fbf34250042379dd0269bf2d0c5ea7e", "assignee": null, "milestone": null, "draft": 0, "head": "ccb86548e0ae6f02a83f1feb0974476ad0f2f2d8", "base": "2972bb001ab5f675eced62f7ba5adef2d3eba2ad", "author_association": "CONTRIBUTOR", "repo": {"value": 214746582, "label": "dogsheep.github.io"}, "url": "https://github.com/dogsheep/dogsheep.github.io/pull/1", "merged_by": null, "auto_merge": null} {"id": 357974326, "node_id": "MDExOlB1bGxSZXF1ZXN0MzU3OTc0MzI2", "number": 3, "state": "closed", "locked": 0, "title": "Add todoist-to-sqlite", "user": {"value": 706257, "label": "bcongdon"}, "body": "Really enjoying getting into the dogsheep/datasette ecosystem. I made a downloader for Todoist, and I think/hope others might find this useful", "created_at": "2019-12-30T04:02:59Z", "updated_at": "2020-10-12T00:35:58Z", "closed_at": "2020-10-12T00:35:57Z", "merged_at": "2020-10-12T00:35:57Z", "merge_commit_sha": "85af27dbff7e08a92656639fbf0cfa15c7d30b5c", "assignee": null, "milestone": null, "draft": 0, "head": "49bc87a43555d10696044e8e40d700d93611a190", "base": "58ca0c785fbf34250042379dd0269bf2d0c5ea7e", "author_association": "CONTRIBUTOR", "repo": {"value": 214746582, "label": "dogsheep.github.io"}, "url": "https://github.com/dogsheep/dogsheep.github.io/pull/3", "merged_by": null, "auto_merge": null} {"id": 370024697, "node_id": "MDExOlB1bGxSZXF1ZXN0MzcwMDI0Njk3", "number": 4, "state": "closed", "locked": 0, "title": "Add beeminder-to-sqlite", "user": {"value": 706257, "label": "bcongdon"}, "body": "", "created_at": "2020-02-02T15:51:36Z", "updated_at": "2020-10-12T00:36:16Z", "closed_at": "2020-10-12T00:36:16Z", "merged_at": "2020-10-12T00:36:16Z", "merge_commit_sha": "7e4c6ecdabc249c77e8049cd172b1b5af08a3371", "assignee": null, "milestone": null, "draft": 0, "head": "6713b5c50178b95a9ec50227d4ef5793e71e8b0a", "base": "2972bb001ab5f675eced62f7ba5adef2d3eba2ad", "author_association": "CONTRIBUTOR", "repo": {"value": 214746582, "label": "dogsheep.github.io"}, "url": "https://github.com/dogsheep/dogsheep.github.io/pull/4", "merged_by": null, "auto_merge": null} {"id": 505076418, "node_id": "MDExOlB1bGxSZXF1ZXN0NTA1MDc2NDE4", "number": 5, "state": "open", "locked": 0, "title": "Add fitbit-to-sqlite", "user": {"value": 4632208, "label": "mrphil007"}, "body": "", "created_at": "2020-10-16T20:04:05Z", "updated_at": "2020-10-16T20:04:05Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "9b9a677a4fcb6a31be8c406b3050cfe1c6e7e398", "assignee": null, "milestone": null, "draft": 0, "head": "db64d60ee92448b1d2a7e190d9da20eb306326b0", "base": "d0686ebed6f08e9b18b4b96c2b8170e043a69adb", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 214746582, "label": "dogsheep.github.io"}, "url": 
"https://github.com/dogsheep/dogsheep.github.io/pull/5", "merged_by": null, "auto_merge": null} {"id": 602261092, "node_id": "MDExOlB1bGxSZXF1ZXN0NjAyMjYxMDky", "number": 6, "state": "closed", "locked": 0, "title": "Add testres-db tool", "user": {"value": 1151557, "label": "ligurio"}, "body": "", "created_at": "2021-03-28T15:43:23Z", "updated_at": "2022-02-16T05:12:05Z", "closed_at": "2022-02-16T05:12:05Z", "merged_at": null, "merge_commit_sha": "eceb016506b5db29b9c21bc7fcf5e6e77259c7b4", "assignee": null, "milestone": null, "draft": 0, "head": "91cfa6f7dcab032e2d21e80657c81e69119e2018", "base": "92c6bb77629feeed661c7b8d9183a11367de39e0", "author_association": "NONE", "repo": {"value": 214746582, "label": "dogsheep.github.io"}, "url": "https://github.com/dogsheep/dogsheep.github.io/pull/6", "merged_by": null, "auto_merge": null} {"id": 673872974, "node_id": "MDExOlB1bGxSZXF1ZXN0NjczODcyOTc0", "number": 7, "state": "open", "locked": 0, "title": "Add instagram-to-sqlite", "user": {"value": 36654812, "label": "gavindsouza"}, "body": "The tool covers only chat imports at the time of opening this PR but I'm planning to import everything else that I feel inquisitive about\r\n\r\nref: https://github.com/gavindsouza/instagram-to-sqlite", "created_at": "2021-06-19T12:26:16Z", "updated_at": "2021-07-28T07:58:59Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "66e9828db4a8ddc4049ab9932e1304288e571821", "assignee": null, "milestone": null, "draft": 0, "head": "4e4c6baf41778071a960d288b0ef02bd01cb6376", "base": "92c6bb77629feeed661c7b8d9183a11367de39e0", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 214746582, "label": "dogsheep.github.io"}, "url": "https://github.com/dogsheep/dogsheep.github.io/pull/7", "merged_by": null, "auto_merge": null} {"id": 542406910, "node_id": "MDExOlB1bGxSZXF1ZXN0NTQyNDA2OTEw", "number": 10, "state": "closed", "locked": 0, "title": "BugFix for encoding and not update info.", "user": {"value": 1277270, "label": "riverzhou"}, "body": "Bugfix 1:\r\n\r\nTraceback (most recent call last):\r\n File \"d:\\anaconda3\\lib\\runpy.py\", line 194, in _run_module_as_main\r\n return _run_code(code, main_globals, None,\r\n File \"d:\\anaconda3\\lib\\runpy.py\", line 87, in _run_code\r\n exec(code, run_globals)\r\n File \"D:\\Anaconda3\\Scripts\\evernote-to-sqlite.exe\\__main__.py\", line 7, in \r\n File \"d:\\anaconda3\\lib\\site-packages\\click\\core.py\", line 829, in __call__\r\n File \"d:\\anaconda3\\lib\\site-packages\\click\\core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"d:\\anaconda3\\lib\\site-packages\\click\\core.py\", line 1259, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"d:\\anaconda3\\lib\\site-packages\\click\\core.py\", line 610, in invoke\r\n return callback(*args, **kwargs)\r\n File \"d:\\anaconda3\\lib\\site-packages\\evernote_to_sqlite\\cli.py\", line 30, in enex\r\n for tag, note in find_all_tags(fp, [\"note\"], progress_callback=bar.update):\r\n File \"d:\\anaconda3\\lib\\site-packages\\evernote_to_sqlite\\utils.py\", line 11, in find_all_tags\r\n chunk = fp.read(1024 * 1024)\r\nUnicodeDecodeError: 'gbk' codec can't decode byte 0xa4 in position 383: illegal multibyte sequence\r\n\r\nBugfix 2:\r\n\r\nTraceback (most recent call last):\r\n File \"D:\\Anaconda3\\Scripts\\evernote-to-sqlite-script.py\", line 33, in \r\n sys.exit(load_entry_point('evernote-to-sqlite==0.3', 'console_scripts', 'evernote-to-sqlite')())\r\n File 
\"D:\\Anaconda3\\lib\\site-packages\\click\\core.py\", line 829, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"D:\\Anaconda3\\lib\\site-packages\\click\\core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"D:\\Anaconda3\\lib\\site-packages\\click\\core.py\", line 1259, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"D:\\Anaconda3\\lib\\site-packages\\click\\core.py\", line 1066, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"D:\\Anaconda3\\lib\\site-packages\\click\\core.py\", line 610, in invoke\r\n return callback(*args, **kwargs)\r\n File \"D:\\Anaconda3\\lib\\site-packages\\evernote_to_sqlite-0.3-py3.8.egg\\evernote_to_sqlite\\cli.py\", line 31, in enex\r\n File \"D:\\Anaconda3\\lib\\site-packages\\evernote_to_sqlite-0.3-py3.8.egg\\evernote_to_sqlite\\utils.py\", line 28, in save_note\r\nAttributeError: 'NoneType' object has no attribute 'text'", "created_at": "2020-12-18T08:58:54Z", "updated_at": "2021-02-11T22:37:56Z", "closed_at": "2021-02-11T22:37:56Z", "merged_at": null, "merge_commit_sha": "4425daeccd43ce3c7bb45deaae577984f978e40f", "assignee": null, "milestone": null, "draft": 0, "head": "7b8b96b69f43cb2247875c3ca6d39878edf77a78", "base": "92254b71075c8806bca258c939e24af8397cdf98", "author_association": "NONE", "repo": {"value": 303218369, "label": "evernote-to-sqlite"}, "url": "https://github.com/dogsheep/evernote-to-sqlite/pull/10", "merged_by": null, "auto_merge": null} {"id": 645100848, "node_id": "MDExOlB1bGxSZXF1ZXN0NjQ1MTAwODQ4", "number": 12, "state": "open", "locked": 0, "title": "Recovering of malformed ENEX file", "user": {"value": 8431437, "label": "engdan77"}, "body": "Hey .. Awesome work developing this project, that I found very useful to me and saved me some work.. Thanks.. :)\r\n\r\nSome background to this PR... \r\nI've been searching around for a tool allowing me to transforming my personal collection of Evernote notes to a format easier to search and potentially easier import to future services. \r\n\r\nNow I discovered problem processing my large data ~5GB using the existing source using Pythons builtin xml-parser that unfortunately was unable to succeed without exception breaking the process. \r\n\r\nMy first attempt I tried to adapt to more robust lxml package allowing huge data and with \"recover\", but even if it worked better it also failed processing the whole data. Even using the memory efficient etree.iterparse() it also unfortunately got into trouble.\r\n\r\nAnd with no luck finding any other libraries successfully parsing this enormous file I instead chose to build a \"hugexmlparser\" module that allows parsing this huge file using yield (on a byte-to-byte-level) and allows you to set a maximum size for to cater for potential malformed or undesirable large attachments to export, should succeed covering potential exceptions. Some cases found where the parses discover malformed XML within so also in those cases try to save as much as possible by escaping (to be dealt at a later stage, better than nothing), and if a missing end before new (malformed?) it would add this after encounter a new start-tag.\r\n\r\nThe code for the recovery process is a bit rough and for certain room for refactoring, but at the moment is seem to achieve what I wanted.\r\n\r\nNow with the above we pass this a minor changed version of save_note_recovery() assure the existing works.\r\nAlso adding this as a new recover-enex command to click and kept the original options. 
\r\nA couple of new tests was added as well to check against using this command.\r\n\r\nNow this currently works to me, but thought I might share a PR in such as you find use for this yourself or found useful to others finding this repository.\r\n\r\nAs a second step .. When the time allows it would have been nice to also be able to easily export from SQLite to formatted HTML/MD and attachments saved... but that might perhaps be better a separate project ... or if you or someone else have something that might shared to save some trouble, I would be interested ;-) ", "created_at": "2021-05-15T07:49:31Z", "updated_at": "2021-05-15T19:57:50Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "95f21ca163606db74babd036e6fa44b7d484d137", "assignee": null, "milestone": null, "draft": 0, "head": "a5839dadaa43694f208ad74a53670cebbe756956", "base": "0bc6ba503eecedb947d2624adbe1327dd849d7fe", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 303218369, "label": "evernote-to-sqlite"}, "url": "https://github.com/dogsheep/evernote-to-sqlite/pull/12", "merged_by": null, "auto_merge": null} {"id": 771790589, "node_id": "PR_kwDOEhK-wc4uAJb9", "number": 15, "state": "open", "locked": 0, "title": "include note tags in the export", "user": {"value": 436138, "label": "d-rep"}, "body": "When parsing the Evernote `` elements, the script will now also parse any nested `` elements, writing them out into a separate sqlite table.\r\n\r\nHere is an example of how to query the data after the script has run:\r\n```\r\nselect notes.*,\r\n\t(select group_concat(tag) from notes_tags where notes_tags.note_id=notes.id) as tags\r\nfrom notes;\r\n```\r\n\r\nMy .enex source file is 3+ years old so I am assuming the structure hasn't changed. Interestingly, my _notebook names_ show up in the _tags_ list where the tag name is prefixed with `notebook_`, so this could maybe help work around the first limitation mentioned in the [evernote-to-sqlite blog post](https://simonwillison.net/2020/Oct/16/building-evernote-sqlite-exporter/).\r\n", "created_at": "2021-11-02T20:04:31Z", "updated_at": "2021-11-02T20:04:31Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "ee36aba995b0a5385bdf9a451851dcfc316ff7f6", "assignee": null, "milestone": null, "draft": 0, "head": "8cc3aa49c6e61496b04015c14048c5dac58d6b42", "base": "fff89772b4404995400e33fe1d269050717ff4cf", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 303218369, "label": "evernote-to-sqlite"}, "url": "https://github.com/dogsheep/evernote-to-sqlite/pull/15", "merged_by": null, "auto_merge": null} {"id": 525371029, "node_id": "MDExOlB1bGxSZXF1ZXN0NTI1MzcxMDI5", "number": 8, "state": "closed", "locked": 0, "title": "fix import error if note has no \"updated\" element", "user": {"value": 4028322, "label": "mkorosec"}, "body": "I got the following error when executing evernote-to-sqlite enex evernote.db evernote.enex\r\n``` \r\n... 
\r\n File \"evernote_to_sqlite/cli.py\", line 31, in enex\r\n save_note(db, note)\r\n File \"evernote_to_sqlite/utils.py\", line 28, in save_note\r\n updated = note.find(\"updated\").text\r\nAttributeError: 'NoneType' object has no attribute 'text'\r\n``` \r\n\r\nSeems that in some cases the updated element is not added to the note, this is a part of the problematic note:\r\n\r\n``` \r\n20201019T074518Z\r\n\r\n web.clip7\r\n webclipper.evernote\r\n\r\n```", "created_at": "2020-11-22T22:51:05Z", "updated_at": "2021-02-11T22:34:06Z", "closed_at": "2021-02-11T22:34:06Z", "merged_at": "2021-02-11T22:34:06Z", "merge_commit_sha": "1c8457ddaa487aa2e677963d37217fcb6d544e59", "assignee": null, "milestone": null, "draft": 0, "head": "03b0c240d5f12c2d651c4cb25f92b0fecc7f7419", "base": "1c355e5678877e14eefa2a5fab5a267342a03335", "author_association": "CONTRIBUTOR", "repo": {"value": 303218369, "label": "evernote-to-sqlite"}, "url": "https://github.com/dogsheep/evernote-to-sqlite/pull/8", "merged_by": null, "auto_merge": null} {"id": 469651732, "node_id": "MDExOlB1bGxSZXF1ZXN0NDY5NjUxNzMy", "number": 48, "state": "closed", "locked": 0, "title": "Add pull requests", "user": {"value": 755825, "label": "adamjonas"}, "body": "ref #46 \r\n\r\nIssues don't have merge information on them, which means that PRs need to be pulled separately.\r\n\r\nDid my best to mimic the API of issues.", "created_at": "2020-08-18T17:58:44Z", "updated_at": "2020-11-29T23:51:09Z", "closed_at": "2020-11-29T23:51:09Z", "merged_at": "2020-11-29T23:51:09Z", "merge_commit_sha": "b37f55549461cfe0731b57623f315860b3db49d0", "assignee": null, "milestone": null, "draft": 0, "head": "3a0d5c498f9faae4e40aab204cd01b965a4f61f3", "base": "16d271253f4ea71b261d2d228b926c7bc1a7e660", "author_association": "CONTRIBUTOR", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "url": "https://github.com/dogsheep/github-to-sqlite/pull/48", "merged_by": null, "auto_merge": null} {"id": 543246535, "node_id": "MDExOlB1bGxSZXF1ZXN0NTQzMjQ2NTM1", "number": 59, "state": "closed", "locked": 0, "title": "Remove unneeded exists=True for -a/--auth flag.", "user": {"value": 631242, "label": "frosencrantz"}, "body": "The file does not need to exist when using an environment variable.", "created_at": "2020-12-21T06:03:55Z", "updated_at": "2021-05-22T14:06:19Z", "closed_at": "2021-05-19T16:08:12Z", "merged_at": "2021-05-19T16:08:12Z", "merge_commit_sha": "70dffca351375e6f542969c72ebc43c6d393d99c", "assignee": null, "milestone": null, "draft": 0, "head": "79745bed50b7344c5cbb17a08215dc20d58b9416", "base": "d19d7db034bf7c3adcae37b9ab6f365d569605b3", "author_association": "CONTRIBUTOR", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "url": "https://github.com/dogsheep/github-to-sqlite/pull/59", "merged_by": null, "auto_merge": null} {"id": 564172140, "node_id": "MDExOlB1bGxSZXF1ZXN0NTY0MTcyMTQw", "number": 61, "state": "closed", "locked": 0, "title": "fixing typo in get cli help text", "user": {"value": 22578954, "label": "daniel-butler"}, "body": "", "created_at": "2021-01-29T18:57:04Z", "updated_at": "2021-05-19T16:07:09Z", "closed_at": "2021-05-19T16:07:09Z", "merged_at": "2021-05-19T16:07:09Z", "merge_commit_sha": "ba8cf3e9bb5f4f8740bd4b9eed28f1464d7f6b9a", "assignee": null, "milestone": null, "draft": 0, "head": "7ac6efc3a873facafa72192b58e28c6e8a79f744", "base": "62dfd3bc4014b108200001ef4bc746feb6f33b45", "author_association": "CONTRIBUTOR", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "url": 
"https://github.com/dogsheep/github-to-sqlite/pull/61", "merged_by": null, "auto_merge": null} {"id": 672053811, "node_id": "MDExOlB1bGxSZXF1ZXN0NjcyMDUzODEx", "number": 65, "state": "open", "locked": 0, "title": "basic support for events", "user": {"value": 231498, "label": "khimaros"}, "body": "a quick first pass at implementing the feature requested in https://github.com/dogsheep/github-to-sqlite/issues/64\r\n\r\ntesting instructions:\r\n\r\n```\r\n$ github-to-sqlite events events.db user/khimaros\r\n```\r\n\r\nif the specified user is the authenticated user, it will also include private events.\r\n\r\ncaveat: pagination appears to be broken (i don't see `next` in the response JSON from GitHub)", "created_at": "2021-06-17T00:51:30Z", "updated_at": "2022-10-03T22:35:03Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "0a252a06a15e307c8a67b2e0aac0907e2566bf19", "assignee": null, "milestone": null, "draft": 0, "head": "82da9f91deda81d92ec64c9eda960aa64340c169", "base": "0e45b72312a0756e5a562effbba08cb8de1e480b", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "url": "https://github.com/dogsheep/github-to-sqlite/pull/65", "merged_by": null, "auto_merge": null} {"id": 716357982, "node_id": "MDExOlB1bGxSZXF1ZXN0NzE2MzU3OTgy", "number": 66, "state": "open", "locked": 0, "title": "Add --merged-by flag to pull-requests sub command", "user": {"value": 30531572, "label": "sarcasticadmin"}, "body": "## Description\r\n\r\nProposing a solution to the API limitation for `merged_by` in pull_requests. Specifically the following called out in the readme:\r\n\r\n```\r\nNote that the merged_by column on the pull_requests table will only be populated for pull requests that are loaded using the --pull-request option - the GitHub API does not return this field for pull requests that are loaded in bulk.\r\n```\r\n\r\nThis approach might cause larger repos to hit rate limits called out in https://github.com/dogsheep/github-to-sqlite/issues/51 but seems to work well in the repos I tested and included below.\r\n\r\n## Old Behavior\r\n- Had to list out the pull-requests individually via multiple `--pull-request` flags\r\n\r\n## New Behavior\r\n\r\n- `--merged-by` flag for getting 'merge_by' information out of pull-requests without having to specify individual PR numbers.\r\n\r\n# Testing\r\n\r\nPicking some repo that has more than one merger (datasette only has 1 \ud83d\ude09 )\r\n\r\n```\r\n$ github-to-sqlite pull-requests ./github.db opnsense/tools --merged-by\r\n$ echo \"select id, url, merged_by from pull_requests;\" | sqlite3 ./github.db 
\r\n83533612|https://github.com/opnsense/tools/pull/39|1915288\r\n102632885|https://github.com/opnsense/tools/pull/43|1915288\r\n149114810|https://github.com/opnsense/tools/pull/57|1915288\r\n160394495|https://github.com/opnsense/tools/pull/64|1915288\r\n163308408|https://github.com/opnsense/tools/pull/67|1915288\r\n169723264|https://github.com/opnsense/tools/pull/69|1915288\r\n171381422|https://github.com/opnsense/tools/pull/72|1915288\r\n179938195|https://github.com/opnsense/tools/pull/77|1915288\r\n196233824|https://github.com/opnsense/tools/pull/82|1915288\r\n215289964|https://github.com/opnsense/tools/pull/93|\r\n219696100|https://github.com/opnsense/tools/pull/97|1915288\r\n223664843|https://github.com/opnsense/tools/pull/99|\r\n228446172|https://github.com/opnsense/tools/pull/103|1915288\r\n238930434|https://github.com/opnsense/tools/pull/110|1915288\r\n255507110|https://github.com/opnsense/tools/pull/119|1915288\r\n255980675|https://github.com/opnsense/tools/pull/120|1915288\r\n261906770|https://github.com/opnsense/tools/pull/125|\r\n263800503|https://github.com/opnsense/tools/pull/127|1915288\r\n264038685|https://github.com/opnsense/tools/pull/128|1915288\r\n264696704|https://github.com/opnsense/tools/pull/129|1915288\r\n266660547|https://github.com/opnsense/tools/pull/130|1915288\r\n273120409|https://github.com/opnsense/tools/pull/133|1915288\r\n274370803|https://github.com/opnsense/tools/pull/135|\r\n276600629|https://github.com/opnsense/tools/pull/139|\r\n277303655|https://github.com/opnsense/tools/pull/141|1915288\r\n293033714|https://github.com/opnsense/tools/pull/145|\r\n294827649|https://github.com/opnsense/tools/pull/146|\r\n295140008|https://github.com/opnsense/tools/pull/147|1915288\r\n305690829|https://github.com/opnsense/tools/pull/150|9783985\r\n307077931|https://github.com/opnsense/tools/pull/152|1915288\r\n321782100|https://github.com/opnsense/tools/pull/155|\r\n337265672|https://github.com/opnsense/tools/pull/160|\r\n337267484|https://github.com/opnsense/tools/pull/161|1915288\r\n368251763|https://github.com/opnsense/tools/pull/169|\r\n428262505|https://github.com/opnsense/tools/pull/181|\r\n437557011|https://github.com/opnsense/tools/pull/182|1915288\r\n447079893|https://github.com/opnsense/tools/pull/185|\r\n461822092|https://github.com/opnsense/tools/pull/191|\r\n463290142|https://github.com/opnsense/tools/pull/193|1915288\r\n470112962|https://github.com/opnsense/tools/pull/194|1915288\r\n472644649|https://github.com/opnsense/tools/pull/195|1915288\r\n488696898|https://github.com/opnsense/tools/pull/198|\r\n513289902|https://github.com/opnsense/tools/pull/201|\r\n522530265|https://github.com/opnsense/tools/pull/203|\r\n564443347|https://github.com/opnsense/tools/pull/213|\r\n597579516|https://github.com/opnsense/tools/pull/220|1915288\r\n602860357|https://github.com/opnsense/tools/pull/221|1915288\r\n608744738|https://github.com/opnsense/tools/pull/222|1915288\r\n623279673|https://github.com/opnsense/tools/pull/228|1915288\r\n664656182|https://github.com/opnsense/tools/pull/233|\r\n664781786|https://github.com/opnsense/tools/pull/234|1915288\r\n670683636|https://github.com/opnsense/tools/pull/235|1915288\r\n683150764|https://github.com/opnsense/tools/pull/237|\r\n685016233|https://github.com/opnsense/tools/pull/238|\r\n687099825|https://github.com/opnsense/tools/pull/239|1915288\r\n715705652|https://github.com/opnsense/tools/pull/244|1915288\r\n715721248|https://github.com/opnsense/tools/pull/245|1915288\r\n```\r\n`userid` are now present for those PRs that were 
merged.\r\n\r\nWithout the flag the `merged_by` behavior remains missing as expected when get PRs bulk:\r\n\r\n```\r\n$ github-to-sqlite pull-requests ./github.db opnsense/tools\r\n$ echo \"select id, url, merged_by from pull_requests;\" | sqlite3 ./github.db \r\n83533612|https://github.com/opnsense/tools/pull/39|\r\n102632885|https://github.com/opnsense/tools/pull/43|\r\n149114810|https://github.com/opnsense/tools/pull/57|\r\n160394495|https://github.com/opnsense/tools/pull/64|\r\n163308408|https://github.com/opnsense/tools/pull/67|\r\n169723264|https://github.com/opnsense/tools/pull/69|\r\n171381422|https://github.com/opnsense/tools/pull/72|\r\n179938195|https://github.com/opnsense/tools/pull/77|\r\n196233824|https://github.com/opnsense/tools/pull/82|\r\n215289964|https://github.com/opnsense/tools/pull/93|\r\n219696100|https://github.com/opnsense/tools/pull/97|\r\n223664843|https://github.com/opnsense/tools/pull/99|\r\n228446172|https://github.com/opnsense/tools/pull/103|\r\n238930434|https://github.com/opnsense/tools/pull/110|\r\n255507110|https://github.com/opnsense/tools/pull/119|\r\n255980675|https://github.com/opnsense/tools/pull/120|\r\n261906770|https://github.com/opnsense/tools/pull/125|\r\n263800503|https://github.com/opnsense/tools/pull/127|\r\n264038685|https://github.com/opnsense/tools/pull/128|\r\n264696704|https://github.com/opnsense/tools/pull/129|\r\n266660547|https://github.com/opnsense/tools/pull/130|\r\n273120409|https://github.com/opnsense/tools/pull/133|\r\n274370803|https://github.com/opnsense/tools/pull/135|\r\n276600629|https://github.com/opnsense/tools/pull/139|\r\n277303655|https://github.com/opnsense/tools/pull/141|\r\n293033714|https://github.com/opnsense/tools/pull/145|\r\n294827649|https://github.com/opnsense/tools/pull/146|\r\n295140008|https://github.com/opnsense/tools/pull/147|\r\n305690829|https://github.com/opnsense/tools/pull/150|\r\n307077931|https://github.com/opnsense/tools/pull/152|\r\n321782100|https://github.com/opnsense/tools/pull/155|\r\n337265672|https://github.com/opnsense/tools/pull/160|\r\n337267484|https://github.com/opnsense/tools/pull/161|\r\n368251763|https://github.com/opnsense/tools/pull/169|\r\n428262505|https://github.com/opnsense/tools/pull/181|\r\n437557011|https://github.com/opnsense/tools/pull/182|\r\n447079893|https://github.com/opnsense/tools/pull/185|\r\n461822092|https://github.com/opnsense/tools/pull/191|\r\n463290142|https://github.com/opnsense/tools/pull/193|\r\n470112962|https://github.com/opnsense/tools/pull/194|\r\n472644649|https://github.com/opnsense/tools/pull/195|\r\n488696898|https://github.com/opnsense/tools/pull/198|\r\n513289902|https://github.com/opnsense/tools/pull/201|\r\n522530265|https://github.com/opnsense/tools/pull/203|\r\n564443347|https://github.com/opnsense/tools/pull/213|\r\n597579516|https://github.com/opnsense/tools/pull/220|\r\n602860357|https://github.com/opnsense/tools/pull/221|\r\n608744738|https://github.com/opnsense/tools/pull/222|\r\n623279673|https://github.com/opnsense/tools/pull/228|\r\n664656182|https://github.com/opnsense/tools/pull/233|\r\n664781786|https://github.com/opnsense/tools/pull/234|\r\n670683636|https://github.com/opnsense/tools/pull/235|\r\n683150764|https://github.com/opnsense/tools/pull/237|\r\n685016233|https://github.com/opnsense/tools/pull/238|\r\n687099825|https://github.com/opnsense/tools/pull/239|\r\n715705652|https://github.com/opnsense/tools/pull/244|\r\n715721248|https://github.com/opnsense/tools/pull/245|\r\n```\r\n\r\nIndividual PRs passed via `--pull-request` 
flag behaves as expected (unchanged):\r\n\r\n```\r\n$ github-to-sqlite pull-requests ./github.db opnsense/tools --pull-request 39 --pull-request 237\r\n$ echo \"select id, url, merged_by from pull_requests;\" | sqlite3 ./github.db\r\n83533612|https://github.com/opnsense/tools/pull/39|1915288\r\n683150764|https://github.com/opnsense/tools/pull/237|\r\n```\r\n> Picking 1 PR that has a merged_by (39) and one that does not (237)", "created_at": "2021-08-20T00:57:55Z", "updated_at": "2021-09-28T21:50:31Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "6b4276d9469e4579c81588ac9e3d128026d919a0", "assignee": null, "milestone": null, "draft": 0, "head": "a92a31d5d446022baeaf7f3c9ea107094637e64d", "base": "ed3752022e45b890af63996efec804725e95d0d4", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "url": "https://github.com/dogsheep/github-to-sqlite/pull/66", "merged_by": null, "auto_merge": null} {"id": 721686721, "node_id": "MDExOlB1bGxSZXF1ZXN0NzIxNjg2NzIx", "number": 67, "state": "open", "locked": 0, "title": "Replacing step ID key with step_id", "user": {"value": 16374374, "label": "jshcmpbll"}, "body": "Workflows that have an `id` in any step result in the following error when running `workflows`:\r\n\r\ne.g.`github-to-sqlite workflows github.db nixos/nixpkgs`\r\n\r\n```Traceback (most recent call last):\r\n File \"/usr/local/bin/github-to-sqlite\", line 8, in \r\n sys.exit(cli())\r\n File \"/usr/local/lib/python3.8/dist-packages/click/core.py\", line 1137, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/usr/local/lib/python3.8/dist-packages/click/core.py\", line 1062, in main\r\n rv = self.invoke(ctx)\r\n File \"/usr/local/lib/python3.8/dist-packages/click/core.py\", line 1668, in invoke```Traceback (most recent call last):\r\n File \"/usr/local/bin/github-to-sqlite\", line 8, in \r\n sys.exit(cli())\r\n File \"/usr/local/lib/python3.8/dist-packages/click/core.py\", line 1137, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/usr/local/lib/python3.8/dist-packages/click/core.py\", line 1062, in main\r\n rv = self.invoke(ctx)\r\n File \"/usr/local/lib/python3.8/dist-packages/click/core.py\", line 1668, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/usr/local/lib/python3.8/dist-packages/click/core.py\", line 1404, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/usr/local/lib/python3.8/dist-packages/click/core.py\", line 763, in invoke\r\n return __callback(*args, **kwargs)\r\n File \"/usr/local/lib/python3.8/dist-packages/github_to_sqlite/cli.py\", line 601, in workflows\r\n utils.save_workflow(db, repo_id, filename, content)\r\n File \"/usr/local/lib/python3.8/dist-packages/github_to_sqlite/utils.py\", line 865, in save_workflow\r\n db[\"steps\"].insert_all(\r\n File \"/usr/local/lib/python3.8/dist-packages/sqlite_utils/db.py\", line 2596, in insert_all\r\n self.insert_chunk(\r\n File \"/usr/local/lib/python3.8/dist-packages/sqlite_utils/db.py\", line 2378, in insert_chunk\r\n result = self.db.execute(query, params)\r\n File \"/usr/local/lib/python3.8/dist-packages/sqlite_utils/db.py\", line 419, in execute\r\n return self.conn.execute(sql, parameters)\r\nsqlite3.IntegrityError: datatype mismatch\r\n```\r\n\r\n - [Information about the ID key in a step for GHA](https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#jobsjob_idstepsid)\r\n - [An example workflow from a public 
repo](https://github.com/NixOS/nixpkgs/blob/b4cc66827745e525ce7bb54659845ac89788a597/.github/workflows/direct-push.yml#L16)\r\n\r\n# Changes\r\nI'm proposing that the key for `id` in step is replaced with `step_id` so that it no longer interferes with the table `id` for tracking the record.\r\n\r\nSpecial thanks to @sarcasticadmin @egiffen and @ruebenramirez for helping a bit on this \ud83d\ude04 ", "created_at": "2021-08-28T01:26:41Z", "updated_at": "2021-08-28T01:27:00Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "9f73c9bf29dec9a1482d9af56b9fac271869585c", "assignee": null, "milestone": null, "draft": 0, "head": "9b5acceb25cf48b00e9c6c8293358b036440deb2", "base": "ed3752022e45b890af63996efec804725e95d0d4", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "url": "https://github.com/dogsheep/github-to-sqlite/pull/67", "merged_by": null, "auto_merge": null} {"id": 747742034, "node_id": "PR_kwDODFdgUs4skaNS", "number": 68, "state": "open", "locked": 0, "title": "Add support for retrieving teams / members", "user": {"value": 68329, "label": "philwills"}, "body": "Adds a method for retrieving all the teams within an organisation and all the members in those teams. The latter is stored as a join table `team_members` beteween `teams` and `users`.", "created_at": "2021-10-01T15:55:02Z", "updated_at": "2021-10-01T15:59:53Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "f46e276c356c893370d5893296f4b69f08baf02c", "assignee": null, "milestone": null, "draft": 0, "head": "cc838e87b1eb19b299f277a07802923104f35ce2", "base": "ed3752022e45b890af63996efec804725e95d0d4", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "url": "https://github.com/dogsheep/github-to-sqlite/pull/68", "merged_by": null, "auto_merge": null} {"id": 862538586, "node_id": "PR_kwDODFdgUs4zaUta", "number": 70, "state": "open", "locked": 0, "title": "scrape-dependents: enable paging through package menu option if present", "user": {"value": 36061055, "label": "stanbiryukov"}, "body": "Some repos organize network dependents by a Package toggle. 
This PR adds the ability to page through those options and scrape underlying dependents.", "created_at": "2022-02-24T15:07:25Z", "updated_at": "2022-02-24T15:07:25Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "36cca3584a07d88d1e505111d1b23294d66ba73e", "assignee": null, "milestone": null, "draft": 0, "head": "cc8f276a474525e55ed0bcacb0cd8cc560f89614", "base": "751bc900366ca52e662ea383b858cbf4365093d9", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "url": "https://github.com/dogsheep/github-to-sqlite/pull/70", "merged_by": null, "auto_merge": null} {"id": 959140599, "node_id": "PR_kwDODFdgUs45K1L3", "number": 73, "state": "closed", "locked": 0, "title": "Fixing 'NoneType' object has no attribute 'items'", "user": {"value": 1224205, "label": "empjustine"}, "body": "Under some conditions, GitHub caches removed starred repositories and ends up leaving dangling `None` user references.\r\n\r\n Traceback (most recent call last):\r\n File \"/home/dogsheep/dogsheep/github-to-sqlite/bin/github-to-sqlite\", line 8, in \r\n sys.exit(cli())\r\n File \"/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py\", line 1130, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py\", line 1055, in main\r\n rv = self.invoke(ctx)\r\n File \"/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py\", line 1657, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py\", line 1404, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py\", line 760, in invoke\r\n return __callback(*args, **kwargs)\r\n File \"/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/cli.py\", line 181, in starred\r\n utils.save_stars(db, user, stars)\r\n File \"/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/utils.py\", line 494, in save_stars\r\n repo_id = save_repo(db, repo)\r\n File \"/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/utils.py\", line 308, in save_repo\r\n to_save[\"owner\"] = save_user(db, to_save[\"owner\"])\r\n File \"/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/utils.py\", line 229, in save_user\r\n for key, value in user.items()\r\n AttributeError: 'NoneType' object has no attribute 'items'", "created_at": "2022-06-06T13:58:11Z", "updated_at": "2022-07-18T19:40:12Z", "closed_at": "2022-07-18T19:40:12Z", "merged_at": "2022-07-18T19:40:12Z", "merge_commit_sha": "dbac2e5dd8a562b45d8255a265859cf8020ca22a", "assignee": null, "milestone": null, "draft": 0, "head": "d7c06886f3bb95085a3af3b2a21547e41556cc6e", "base": "a6e237f75a4b86963d91dcb5c9582e3a1b3349d6", "author_association": "CONTRIBUTOR", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "url": "https://github.com/dogsheep/github-to-sqlite/pull/73", "merged_by": null, "auto_merge": null} {"id": 1047561919, "node_id": "PR_kwDODFdgUs4-cIa_", "number": 76, "state": "open", "locked": 0, "title": "Add organization support to repos command", "user": {"value": 2757699, "label": "OverkillGuy"}, "body": "New --organization flag to signify all given \"usernames\" are 
private\r\norgs. Adapts API URL to the organization path instead.\r\n\r\nNot the best implementation, but a first draft to talk around\r\n\r\nFixes #75 (badly, no tests, overly vague, untested)", "created_at": "2022-09-06T13:21:42Z", "updated_at": "2022-09-06T13:59:08Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "1514acfa87f57261547bc3d7fc4f161e34285d76", "assignee": null, "milestone": null, "draft": 0, "head": "bb959b46e8a7647755c14dee180fdd5209451954", "base": "ace13ec3d98090d99bd71871c286a4a612c96a50", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "url": "https://github.com/dogsheep/github-to-sqlite/pull/76", "merged_by": null, "auto_merge": null} {"id": 335980246, "node_id": "MDExOlB1bGxSZXF1ZXN0MzM1OTgwMjQ2", "number": 8, "state": "closed", "locked": 0, "title": "stargazers command, refs #4", "user": {"value": 9599, "label": "simonw"}, "body": "Needs tests. Refs #4.", "created_at": "2019-11-03T00:37:36Z", "updated_at": "2020-05-02T20:00:27Z", "closed_at": "2020-05-02T20:00:26Z", "merged_at": null, "merge_commit_sha": "db25bdf8cee4c3e2d730cf269eb9a903b51cdb41", "assignee": null, "milestone": null, "draft": 0, "head": "ea07274667a08c67907e8bfbbccb6f0fb95ce817", "base": "ae9035f8fe5aff1c54bff4c6b4c2e808a44f0f2a", "author_association": "MEMBER", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "url": "https://github.com/dogsheep/github-to-sqlite/pull/8", "merged_by": null, "auto_merge": null} {"id": 948892757, "node_id": "PR_kwDODFE5qs44jvRV", "number": 11, "state": "open", "locked": 0, "title": "Update README.md", "user": {"value": 11887, "label": "ashanan"}, "body": "Fix typo", "created_at": "2022-05-27T03:13:59Z", "updated_at": "2022-05-27T03:13:59Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "3d479a1052f2661de61b15c50b7a5b2daa20a33a", "assignee": null, "milestone": null, "draft": 0, "head": "d4af1554a9b5ddedcd0b241450f7b935f38b9bf7", "base": "e54e544427f1cc3ea8189f0e95f54046301a8645", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 206649770, "label": "google-takeout-to-sqlite"}, "url": "https://github.com/dogsheep/google-takeout-to-sqlite/pull/11", "merged_by": null, "auto_merge": null} {"id": 1505067804, "node_id": "PR_kwDODFE5qs5ZtYMc", "number": 13, "state": "open", "locked": 0, "title": "use poetry for packages, asdf for versioning, and gh actions for ci", "user": {"value": 150855, "label": "iloveitaly"}, "body": "- build: use poetry for package management, asdf for python version\n- build: cleanup poetry config, add keywords, ignore dist\n- ci: migrate circleci to gh actions\n- fix: dup method definition\n", "created_at": "2023-09-06T17:59:16Z", "updated_at": "2023-09-06T17:59:16Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "cd4d8c4a7ecd231f6c5a8886245271934177f104", "assignee": null, "milestone": null, "draft": 0, "head": "b5f0ebe91755c46e01dc4aefb808f0292848fbed", "base": "e54e544427f1cc3ea8189f0e95f54046301a8645", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 206649770, "label": "google-takeout-to-sqlite"}, "url": "https://github.com/dogsheep/google-takeout-to-sqlite/pull/13", "merged_by": null, "auto_merge": null} {"id": 577953727, "node_id": "MDExOlB1bGxSZXF1ZXN0NTc3OTUzNzI3", "number": 5, "state": "open", "locked": 0, "title": "WIP: Add Gmail takeout mbox import", "user": {"value": 306240, "label": "UtahDave"}, "body": "WIP\r\n\r\nThis PR adds the ability to import emails from a Gmail mbox export from Google 
Takeout.\r\n\r\nThis is my first PR to a datasette/dogsheep repo. I've tested this on my personal Google Takeout mbox with ~520,000 emails going back to 2004. This took around ~20 minutes to process.\r\n\r\nTo provide some feedback on the progress of the import I added the \"rich\" python module. I'm happy to remove that if adding a dependency is discouraged. However, I think it makes a nice addition to give feedback on the progress of a long import.\r\n\r\nDo we want to log emails that have errors when trying to import them?\r\n\r\nDealing with encodings with emails is a bit tricky. I'm very open to feedback on how to deal with those better. As well as any other feedback for improvements.", "created_at": "2021-02-22T21:30:40Z", "updated_at": "2021-07-28T07:18:56Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "65182811d59451299e75f09b4366bb221bc32b20", "assignee": null, "milestone": null, "draft": 0, "head": "a3de045eba0fae4b309da21aa3119102b0efc576", "base": "e54e544427f1cc3ea8189f0e95f54046301a8645", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 206649770, "label": "google-takeout-to-sqlite"}, "url": "https://github.com/dogsheep/google-takeout-to-sqlite/pull/5", "merged_by": null, "auto_merge": null} {"id": 698423667, "node_id": "MDExOlB1bGxSZXF1ZXN0Njk4NDIzNjY3", "number": 8, "state": "open", "locked": 0, "title": "Add Gmail takeout mbox import (v2)", "user": {"value": 28565, "label": "maxhawkins"}, "body": "WIP\r\n\r\nThis PR builds on #5 to continue implementing gmail import support.\r\n\r\nBuilding on @UtahDave's work, these commits add a few performance and bug fixes:\r\n\r\n* Decreased memory overhead for import by manually parsing mbox headers.\r\n* Fixed error where some messages in the mbox would yield a row with NULL in all columns.\r\n\r\nI will send more commits to fix any errors I encounter as I run the importer on my personal takeout data.", "created_at": "2021-07-28T07:05:32Z", "updated_at": "2023-09-08T01:22:49Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "d2809fd3fd835358d01ad10401228a562539b29e", "assignee": null, "milestone": null, "draft": 0, "head": "8e6d487b697ce2e8ad885acf613a157bfba84c59", "base": "e54e544427f1cc3ea8189f0e95f54046301a8645", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 206649770, "label": "google-takeout-to-sqlite"}, "url": "https://github.com/dogsheep/google-takeout-to-sqlite/pull/8", "merged_by": null, "auto_merge": null} {"id": 775078665, "node_id": "PR_kwDODFE5qs4uMsMJ", "number": 9, "state": "open", "locked": 0, "title": "Removed space from filename My Activity.json", "user": {"value": 91880982, "label": "widadmogral"}, "body": "File name from google takeout has no space. The code only runs without error if filename is \"MyActivity.json\" and not \"My Activity.json\". 
Is it a new change by Google?", "created_at": "2021-11-08T00:04:31Z", "updated_at": "2021-11-08T00:04:31Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "236da5c8302c09a20fcd4164c563cd9fa5c9595c", "assignee": null, "milestone": null, "draft": 0, "head": "6d111f65687e13ffd8b39aa05f1f8f4a351e7788", "base": "e54e544427f1cc3ea8189f0e95f54046301a8645", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 206649770, "label": "google-takeout-to-sqlite"}, "url": "https://github.com/dogsheep/google-takeout-to-sqlite/pull/9", "merged_by": null, "auto_merge": null} {"id": 1038926741, "node_id": "PR_kwDODtX3eM497MOV", "number": 5, "state": "open", "locked": 0, "title": "The program fails when the user has no submissions", "user": {"value": 2467, "label": "fernand0"}, "body": "Tested with:\r\n \r\n hacker-news-to-sqlite user hacker-news.db fernand0\r\n\r\nResult:\r\n`\r\nTraceback (most recent call last):\r\n File \"/home/ftricas/.pyenv/versions/3.10.6/bin/hacker-news-to-sqlite\", line 8, in \r\n sys.exit(cli())\r\n File \"/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py\", line 1130, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py\", line 1055, in main\r\n rv = self.invoke(ctx)\r\n File \"/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py\", line 1657, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py\", line 1404, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py\", line 760, in invoke\r\n return __callback(*args, **kwargs)\r\n File \"/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/hacker_news_to_sqlite/cli.py\", line 27, in user\r\n submitted = user.pop(\"submitted\", None) or []\r\nAttributeError: 'NoneType' object has no attribute 'pop'\r\n`\r\n\r\nThere is a problem of style with the patch (but not sure what to do) because with the new inicialization ( submitted = []) the part \r\n\r\n or []\r\n\r\nis not needed. Maybe there is a more adequate way of doing this.", "created_at": "2022-08-28T17:25:45Z", "updated_at": "2022-08-28T17:25:45Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "f0d7414305fc6cba4bcb7506b76a94938ccc7886", "assignee": null, "milestone": null, "draft": 0, "head": "ea97e640ad7a24020821fde5c647240120bd7099", "base": "c5585c103d124b23ba1e163f8857d4ba49fe452a", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 248903544, "label": "hacker-news-to-sqlite"}, "url": "https://github.com/dogsheep/hacker-news-to-sqlite/pull/5", "merged_by": null, "auto_merge": null} {"id": 1290512937, "node_id": "PR_kwDODtX3eM5M66op", "number": 6, "state": "open", "locked": 0, "title": "Add permalink virtual field to items table", "user": {"value": 1231935, "label": "xavdid"}, "body": "I added a virtual column (no storage overhead) to the output that easily links back to the source. It works nicely out of the box with datasette:\r\n\r\n![](https://cdn.zappy.app/faf43661d539ee0fee02c0421de22d65.png)\r\n\r\nI got bit a bit by https://github.com/simonw/sqlite-utils/issues/411, so I went with a manual `table_xinfo` and creating the table via execute. 
Happy to adjust if that issue moves, but this seems like it works.\r\n\r\nI also added my best-guess instructions for local development on this package. I'm shooting in the dark, so feel free to replace with how you work on it locally.", "created_at": "2023-03-26T22:22:38Z", "updated_at": "2023-03-29T18:38:52Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "99bda9434e0adaa8459bc0abbe6262785cd4086c", "assignee": null, "milestone": null, "draft": 0, "head": "b04d6c76c26820f2e0b04da58dd82789e83cbb42", "base": "c5585c103d124b23ba1e163f8857d4ba49fe452a", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 248903544, "label": "hacker-news-to-sqlite"}, "url": "https://github.com/dogsheep/hacker-news-to-sqlite/pull/6", "merged_by": null, "auto_merge": null} {"id": 521054612, "node_id": "MDExOlB1bGxSZXF1ZXN0NTIxMDU0NjEy", "number": 13, "state": "open", "locked": 0, "title": "SQLite does not have case sensitive columns", "user": {"value": 1689944, "label": "tomaskrehlik"}, "body": "This solves a weird issue when there is record with metadata key\r\nthat is only different in letter cases.\r\n\r\nSee the test for details.", "created_at": "2020-11-14T20:12:32Z", "updated_at": "2021-08-24T13:28:26Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "38856acbc724ffdb8beb9e9f4ef0dbfa8ff51ad1", "assignee": null, "milestone": null, "draft": 0, "head": "3e1b2945bc7c31be59e89c5fed86a5d2a59ebd5a", "base": "71e36e1cf034b96de2a8e6652265d782d3fdf63b", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 197882382, "label": "healthkit-to-sqlite"}, "url": "https://github.com/dogsheep/healthkit-to-sqlite/pull/13", "merged_by": null, "auto_merge": null} {"id": 561512503, "node_id": "MDExOlB1bGxSZXF1ZXN0NTYxNTEyNTAz", "number": 15, "state": "open", "locked": 0, "title": "added try / except to write_records ", "user": {"value": 9857779, "label": "ryancheley"}, "body": "to keep the data write from failing if it came across an error during processing. 
In particular when trying to convert my HealthKit zip file (and that of my wife's) it would consistently error out with the following:\r\n\r\n```\r\ndb.py 1709 insert_chunk\r\nresult = self.db.execute(query, params)\r\n\r\ndb.py 226 execute\r\nreturn self.conn.execute(sql, parameters)\r\n\r\nsqlite3.OperationalError:\r\ntoo many SQL variables\r\n\r\n---------------------------------------------------------------------------------------------------------------------------------------------------------------------\r\ndb.py 1709 insert_chunk\r\nresult = self.db.execute(query, params)\r\n\r\ndb.py 226 execute\r\nreturn self.conn.execute(sql, parameters)\r\n\r\nsqlite3.OperationalError:\r\ntoo many SQL variables\r\n\r\n---------------------------------------------------------------------------------------------------------------------------------------------------------------------\r\ndb.py 1709 insert_chunk\r\nresult = self.db.execute(query, params)\r\n\r\ndb.py 226 execute\r\nreturn self.conn.execute(sql, parameters)\r\n\r\nsqlite3.OperationalError:\r\ntable rBodyMass has no column named metadata_HKWasUserEntered\r\n\r\n---------------------------------------------------------------------------------------------------------------------------------------------------------------------\r\nhealthkit-to-sqlite 8 \r\nsys.exit(cli())\r\n\r\ncore.py 829 __call__\r\nreturn self.main(*args, **kwargs)\r\n\r\ncore.py 782 main\r\nrv = self.invoke(ctx)\r\n\r\ncore.py 1066 invoke\r\nreturn ctx.invoke(self.callback, **ctx.params)\r\n\r\ncore.py 610 invoke\r\nreturn callback(*args, **kwargs)\r\n\r\ncli.py 57 cli\r\nconvert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf)\r\n\r\nutils.py 42 convert_xml_to_sqlite\r\nwrite_records(records, db)\r\n\r\nutils.py 143 write_records\r\ndb[table].insert_all(\r\n\r\ndb.py 1899 insert_all\r\nself.insert_chunk(\r\n\r\ndb.py 1720 insert_chunk\r\nself.insert_chunk(\r\n\r\ndb.py 1720 insert_chunk\r\nself.insert_chunk(\r\n\r\ndb.py 1714 insert_chunk\r\nresult = self.db.execute(query, params)\r\n\r\ndb.py 226 execute\r\nreturn self.conn.execute(sql, parameters)\r\n\r\nsqlite3.OperationalError:\r\ntable rBodyMass has no column named metadata_HKWasUserEntered\r\n```\r\n\r\nAdding the try / except in the `write_records` seems to fix that issue. 
", "created_at": "2021-01-26T03:56:21Z", "updated_at": "2021-01-26T03:56:21Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "8527278a87e448f57c7c6bd76a2d85f12d0233dd", "assignee": null, "milestone": null, "draft": 0, "head": "7f1b168c752b5af7c1f9052dfa61e26afc83d574", "base": "71e36e1cf034b96de2a8e6652265d782d3fdf63b", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 197882382, "label": "healthkit-to-sqlite"}, "url": "https://github.com/dogsheep/healthkit-to-sqlite/pull/15", "merged_by": null, "auto_merge": null} {"id": 592364255, "node_id": "MDExOlB1bGxSZXF1ZXN0NTkyMzY0MjU1", "number": 16, "state": "open", "locked": 0, "title": "Add a fallback ID, print if no ID found", "user": {"value": 1234956, "label": "n8henrie"}, "body": "Fixes https://github.com/dogsheep/healthkit-to-sqlite/issues/14\n", "created_at": "2021-03-13T13:38:29Z", "updated_at": "2021-03-13T14:44:04Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "16ab307b2138891f226a66e4954c5470de753a0f", "assignee": null, "milestone": null, "draft": 0, "head": "27b3d54ccfe7d861770a9d0b173f6503580fea4a", "base": "71e36e1cf034b96de2a8e6652265d782d3fdf63b", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 197882382, "label": "healthkit-to-sqlite"}, "url": "https://github.com/dogsheep/healthkit-to-sqlite/pull/16", "merged_by": null, "auto_merge": null} {"id": 596627780, "node_id": "MDExOlB1bGxSZXF1ZXN0NTk2NjI3Nzgw", "number": 18, "state": "open", "locked": 0, "title": "Add datetime parsing", "user": {"value": 1234956, "label": "n8henrie"}, "body": "Parses the datetime columns so they are subsequently properly recognized as\ndatetime.\n\nFixes https://github.com/dogsheep/healthkit-to-sqlite/issues/17\n", "created_at": "2021-03-19T14:34:22Z", "updated_at": "2021-03-19T14:34:22Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "c87f4e8aa88ec277c6b5a000670c2cb42a10c03d", "assignee": null, "milestone": null, "draft": 0, "head": "e0e7a0f99f844db33964b27c29b0b8d5f160202b", "base": "71e36e1cf034b96de2a8e6652265d782d3fdf63b", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 197882382, "label": "healthkit-to-sqlite"}, "url": "https://github.com/dogsheep/healthkit-to-sqlite/pull/18", "merged_by": null, "auto_merge": null} {"id": 718734191, "node_id": "MDExOlB1bGxSZXF1ZXN0NzE4NzM0MTkx", "number": 22, "state": "open", "locked": 0, "title": "Make sure that case-insensitive column names are unique", "user": {"value": 32016596, "label": "FabianHertwig"}, "body": "This closes #21.\r\n\r\nWhen there are metadata entries with the same case insensitive string, then there is an error when trying to create a new column for that metadata entry in the database table, because a column with that case insensitive name already exists.\r\n\r\n```xml\r\n \r\n \r\n \r\n \r\n```\r\n\r\nThe code added in this PR checks if a key already exists in a record and if so adds a number at its end. The resulting column names look like the example below then. 
Interestingly, the column names viewed with Datasette are not case insensitive.\r\n\r\n```text\r\nstartDate, endDate, value, unit, sourceName, sourceVersion, creationDate, metadata_meal, metadata_Meal_2, metadata_Mahlzeit\r\n```\r\n", "created_at": "2021-08-24T13:13:38Z", "updated_at": "2021-08-24T13:26:20Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "c757d372c10284cd6fa58d144549bc89691341c3", "assignee": null, "milestone": null, "draft": 0, "head": "b16fb556f84a0eed262a518ca7ec82a467155d23", "base": "9fe3cb17e03d6c73222b63e643638cf951567c4c", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 197882382, "label": "healthkit-to-sqlite"}, "url": "https://github.com/dogsheep/healthkit-to-sqlite/pull/22", "merged_by": null, "auto_merge": null} {"id": 1182000455, "node_id": "PR_kwDOC8tyDs5Gc-VH", "number": 23, "state": "open", "locked": 0, "title": "Include workout statistics", "user": {"value": 2129, "label": "badboy"}, "body": "Not sure when this changed (iOS 16 maybe?), but the `WorkoutStatistics` now has a whole bunch of information about workouts, e.g. for runs it contains the distance (as a `` element).\r\n\r\nAdding it as another column at leat allows me to pull these out (using SQLite's JSON support).\r\nI'm running with this patch on my own data now.", "created_at": "2023-01-01T17:29:56Z", "updated_at": "2023-01-01T17:29:57Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "e0ad45055f363810085119f26df87a6804451056", "assignee": null, "milestone": null, "draft": 0, "head": "d5b9e3609961515cc52bcc5ef070e3b83b473339", "base": "9fe3cb17e03d6c73222b63e643638cf951567c4c", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 197882382, "label": "healthkit-to-sqlite"}, "url": "https://github.com/dogsheep/healthkit-to-sqlite/pull/23", "merged_by": null, "auto_merge": null} {"id": 300580221, "node_id": "MDExOlB1bGxSZXF1ZXN0MzAwNTgwMjIx", "number": 8, "state": "closed", "locked": 0, "title": "Use less RAM", "user": {"value": 9599, "label": "simonw"}, "body": "Closes #7", "created_at": "2019-07-24T06:35:01Z", "updated_at": "2019-07-24T06:35:52Z", "closed_at": "2019-07-24T06:35:52Z", "merged_at": "2019-07-24T06:35:52Z", "merge_commit_sha": "c8392df78ee3e1643d18b747a4abf585d84d5d88", "assignee": null, "milestone": null, "draft": 0, "head": "6261500b01274a739176480774e82b31f2926e7f", "base": "5d7e14d40d5a4cfd133ca5faa442312f607784c5", "author_association": "MEMBER", "repo": {"value": 197882382, "label": "healthkit-to-sqlite"}, "url": "https://github.com/dogsheep/healthkit-to-sqlite/pull/8", "merged_by": null, "auto_merge": null} {"id": 526847823, "node_id": "MDExOlB1bGxSZXF1ZXN0NTI2ODQ3ODIz", "number": 7, "state": "closed", "locked": 0, "title": "Fixed conflicting CLI flags", "user": {"value": 8944, "label": "tlockney"}, "body": "The `-a` used for the auth credentials and the shortened form of the `--all` flags were in conflict on the `fetch` command. 
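A rough sketch of the key de-duplication described in the healthkit-to-sqlite PR just above (keys that collide case-insensitively get a numeric suffix); the helper name and exact suffix rule are assumptions, not the PR's code:

```python
def dedupe_keys_case_insensitive(record):
    """Return a copy of record where keys differing only by case get a
    numeric suffix, e.g. metadata_meal, metadata_Meal_2, metadata_Mahlzeit."""
    counts = {}
    deduped = {}
    for key, value in record.items():
        folded = key.lower()
        counts[folded] = counts.get(folded, 0) + 1
        new_key = key if counts[folded] == 1 else f"{key}_{counts[folded]}"
        deduped[new_key] = value
    return deduped

print(dedupe_keys_case_insensitive(
    {"metadata_meal": "x", "metadata_Meal": "y", "metadata_Mahlzeit": "z"}
))
# {'metadata_meal': 'x', 'metadata_Meal_2': 'y', 'metadata_Mahlzeit': 'z'}
```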
To be consistent with other `-to-sqlite` libraries in the Dogsheep ecosystem, I removed the shortened form of the `--all` flag.", "created_at": "2020-11-24T23:25:12Z", "updated_at": "2022-08-21T21:11:56Z", "closed_at": "2022-08-21T21:11:56Z", "merged_at": "2022-08-21T21:11:56Z", "merge_commit_sha": "4d88c84a66a501e4cb0dd2de9949072b8d42b859", "assignee": null, "milestone": null, "draft": 0, "head": "02576f9b1c234128c6a3d52123761af8486beb57", "base": "b956a01464007fe227895fe6eb6c942ed71298c8", "author_association": "CONTRIBUTOR", "repo": {"value": 213286752, "label": "pocket-to-sqlite"}, "url": "https://github.com/dogsheep/pocket-to-sqlite/pull/7", "merged_by": null, "auto_merge": null} {"id": 501791663, "node_id": "MDExOlB1bGxSZXF1ZXN0NTAxNzkxNjYz", "number": 10, "state": "closed", "locked": 0, "title": "Update utils.py to fix sqlite3.OperationalError", "user": {"value": 29426418, "label": "mattiaborsoi"}, "body": "Fixes the errors:\r\n- sqlite3.OperationalError: table posts has no column named text\r\n- sqlite3.OperationalError: table photos has no column named hasSticker\r\n\r\nThat will cause sqlite-utils to notice if there's a missing column and add it. As recommended by @simonw", "created_at": "2020-10-12T20:17:53Z", "updated_at": "2020-10-12T20:25:10Z", "closed_at": "2020-10-12T20:25:09Z", "merged_at": "2020-10-12T20:25:09Z", "merge_commit_sha": "a5a2b5feb56fef4f2b627699b7d628ee9d2d63db", "assignee": null, "milestone": null, "draft": 0, "head": "c7bdb0207708a9eb40ba095039f0918fd103b176", "base": "f4a82633da927cde672c9d9af92930bfca2e3ddf", "author_association": "CONTRIBUTOR", "repo": {"value": 205429375, "label": "swarm-to-sqlite"}, "url": "https://github.com/dogsheep/swarm-to-sqlite/pull/10", "merged_by": null, "auto_merge": null} {"id": 1073492809, "node_id": "PR_kwDODD6af84__DNJ", "number": 14, "state": "open", "locked": 0, "title": "Photo links", "user": {"value": 6782721, "label": "redmanmale"}, "body": "* add to `checkin_details` view new column for a calculated photo links\r\n* supported multiple links split by newline\r\n* create `events` table if there's no events in the history to avoid SQL errors\r\n\r\nFixes #9.", "created_at": "2022-10-01T09:44:15Z", "updated_at": "2022-11-18T17:10:49Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "6ca283dc30a2713bd3fda0dc35df1c7186a5996e", "assignee": null, "milestone": null, "draft": 0, "head": "5541d9496bad73c9edce98f5562a3135359d57d6", "base": "719b6e96a016d0ca8b316d3bed9c2a7a0cb499ee", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 205429375, "label": "swarm-to-sqlite"}, "url": "https://github.com/dogsheep/swarm-to-sqlite/pull/14", "merged_by": null, "auto_merge": null} {"id": 357640186, "node_id": "MDExOlB1bGxSZXF1ZXN0MzU3NjQwMTg2", "number": 6, "state": "closed", "locked": 0, "title": "don't break if source is missing", "user": {"value": 78035, "label": "mfa"}, "body": "broke for me. 
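For the swarm-to-sqlite utils.py fix above ("table posts has no column named text"), the usual sqlite-utils way to have a missing column added automatically is the `alter=True` flag; a hedged illustration only, since the table and row are made up and whether the PR uses exactly this flag is an assumption:

```python
import sqlite_utils

db = sqlite_utils.Database("swarm.db")
# If the "posts" table exists without a "text" column, alter=True adds the
# column instead of raising sqlite3.OperationalError.
db["posts"].insert_all(
    [{"id": 1, "text": "a checkin comment"}],
    pk="id",
    alter=True,
)
```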
very old checkins in 2010 had no source set.", "created_at": "2019-12-29T10:46:47Z", "updated_at": "2020-03-28T02:28:11Z", "closed_at": "2020-03-28T02:28:11Z", "merged_at": "2020-03-28T02:28:11Z", "merge_commit_sha": "d3c4ab2848ea606417150f377a82e66ca7887c54", "assignee": null, "milestone": null, "draft": 0, "head": "a41b5bcd63012f64fe6746825d7101cc3d071483", "base": "f2c89dd613fb8a7f14e5267ccc2145463b996190", "author_association": "CONTRIBUTOR", "repo": {"value": 205429375, "label": "swarm-to-sqlite"}, "url": "https://github.com/dogsheep/swarm-to-sqlite/pull/6", "merged_by": null, "auto_merge": null} {"id": 327051673, "node_id": "MDExOlB1bGxSZXF1ZXN0MzI3MDUxNjcz", "number": 15, "state": "closed", "locked": 0, "title": "twitter-to-sqlite import command, refs #4", "user": {"value": 9599, "label": "simonw"}, "body": "", "created_at": "2019-10-11T06:37:14Z", "updated_at": "2019-10-11T06:45:01Z", "closed_at": "2019-10-11T06:45:01Z", "merged_at": "2019-10-11T06:45:01Z", "merge_commit_sha": "2019ee908731054c6eaa3d5123dfbdf7d2d70fc4", "assignee": null, "milestone": null, "draft": 0, "head": "df1d85897118310a2d3c1b9e5aad108165302cf2", "base": "436a170d74ec70903d1b4ca430c2c6b6435cdfcc", "author_association": "MEMBER", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "url": "https://github.com/dogsheep/twitter-to-sqlite/pull/15", "merged_by": null, "auto_merge": null} {"id": 329324368, "node_id": "MDExOlB1bGxSZXF1ZXN0MzI5MzI0MzY4", "number": 24, "state": "closed", "locked": 0, "title": "Tweet source extraction and new migration system", "user": {"value": 9599, "label": "simonw"}, "body": "Closes #12 and #23", "created_at": "2019-10-17T15:24:56Z", "updated_at": "2019-10-17T15:49:29Z", "closed_at": "2019-10-17T15:49:24Z", "merged_at": "2019-10-17T15:49:24Z", "merge_commit_sha": "c9295233f219c446fa2085cace987067488a31b9", "assignee": null, "milestone": null, "draft": 0, "head": "39f822a624685e321dbca8a4318741dd1e42548b", "base": "619f724a722b3f23f4364f67d3164b93e8ba2a70", "author_association": "MEMBER", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "url": "https://github.com/dogsheep/twitter-to-sqlite/pull/24", "merged_by": null, "auto_merge": null} {"id": 372273608, "node_id": "MDExOlB1bGxSZXF1ZXN0MzcyMjczNjA4", "number": 33, "state": "closed", "locked": 0, "title": "Upgrade to sqlite-utils 2.2.1", "user": {"value": 9599, "label": "simonw"}, "body": "", "created_at": "2020-02-07T07:32:12Z", "updated_at": "2020-03-20T19:21:42Z", "closed_at": "2020-03-20T19:21:41Z", "merged_at": null, "merge_commit_sha": "5338f6baab3ec1424431133968d8b64a656ce4c4", "assignee": null, "milestone": null, "draft": 0, "head": "08f51271d6309aad698b9e8a7587fcebbbd67781", "base": "35c18a09fa664324dcb75e5e58ccb90644456d02", "author_association": "MEMBER", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "url": "https://github.com/dogsheep/twitter-to-sqlite/pull/33", "merged_by": null, "auto_merge": null} {"id": 469944999, "node_id": "MDExOlB1bGxSZXF1ZXN0NDY5OTQ0OTk5", "number": 49, "state": "closed", "locked": 0, "title": "Document the use of --stop_after with favorites, refs #20", "user": {"value": 370930, "label": "mikepqr"}, "body": "(I discovered this trawling the issues for how to use --since with favorites)", "created_at": "2020-08-19T06:10:52Z", "updated_at": "2021-08-20T00:02:11Z", "closed_at": "2021-08-20T00:02:11Z", "merged_at": "2021-08-20T00:02:10Z", "merge_commit_sha": "b6a4da8be3b6d4b74c6a5fac8924bf22a6824f2c", "assignee": null, "milestone": null, "draft": 0, "head": 
"7ace806c81faf31c1aace0f0b2a4c17dd72cfa06", "base": "21fc1cad6dd6348c67acff90a785b458d3a81275", "author_association": "CONTRIBUTOR", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "url": "https://github.com/dogsheep/twitter-to-sqlite/pull/49", "merged_by": null, "auto_merge": null} {"id": 549204063, "node_id": "MDExOlB1bGxSZXF1ZXN0NTQ5MjA0MDYz", "number": 55, "state": "closed", "locked": 0, "title": "Fix archive imports", "user": {"value": 21148, "label": "jacobian"}, "body": "This fixes the issues discussed in #54", "created_at": "2021-01-05T15:54:48Z", "updated_at": "2021-08-20T00:02:49Z", "closed_at": "2021-08-20T00:02:49Z", "merged_at": "2021-08-20T00:02:48Z", "merge_commit_sha": "bf622dcb82203c1cd87e914901b53afe6f90e668", "assignee": null, "milestone": null, "draft": 0, "head": "ffb127844f133fcb6a1af5cd3557995d303fb53f", "base": "21fc1cad6dd6348c67acff90a785b458d3a81275", "author_association": "CONTRIBUTOR", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "url": "https://github.com/dogsheep/twitter-to-sqlite/pull/55", "merged_by": null, "auto_merge": null} {"id": 724317650, "node_id": "MDExOlB1bGxSZXF1ZXN0NzI0MzE3NjUw", "number": 59, "state": "closed", "locked": 0, "title": "Fix for since_id bug, closes #58", "user": {"value": 42904, "label": "rubenv"}, "body": "Fixes remaining instances of this bug", "created_at": "2021-09-01T09:49:09Z", "updated_at": "2021-09-21T17:37:40Z", "closed_at": "2021-09-21T17:37:40Z", "merged_at": "2021-09-21T17:37:40Z", "merge_commit_sha": "91aa5f578e871a7976ca0a861862f9b9dd162464", "assignee": null, "milestone": null, "draft": 0, "head": "27369e4d1c9702de34ebc125f92ef3fc9d74abed", "base": "74726190d4031bfa36db93e189555e273b35e283", "author_association": "CONTRIBUTOR", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "url": "https://github.com/dogsheep/twitter-to-sqlite/pull/59", "merged_by": null, "auto_merge": null} {"id": 872242672, "node_id": "PR_kwDODEm0Qs4z_V3w", "number": 65, "state": "open", "locked": 0, "title": "Update Twitter dev link, clarify apps vs projects", "user": {"value": 2657547, "label": "rixx"}, "body": "Twitter pushes you heavily towards v2 projects instead of v1 apps \u2013 I know the README mentions v1 API compatibility at the top, but I still nearly got turned around here.", "created_at": "2022-03-05T11:56:08Z", "updated_at": "2022-03-05T11:56:08Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "765a450845ba26fac102d9154980cd936399546c", "assignee": null, "milestone": null, "draft": 0, "head": "b7cfe9dcb7dbccc7ba8171cfe74f19227c4351ec", "base": "f09d611782a8372cfb002792dfa727325afb4db6", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "url": "https://github.com/dogsheep/twitter-to-sqlite/pull/65", "merged_by": null, "auto_merge": null} {"id": 943518450, "node_id": "PR_kwDODEm0Qs44PPLy", "number": 66, "state": "open", "locked": 0, "title": "Ageinfo workaround", "user": {"value": 11887, "label": "ashanan"}, "body": "I'm not sure if this is due to a new format or just because my ageinfo file is blank, but trying to import an archive would crash when it got to that file. This PR adds a guard clause in the `ageinfo` transformer and sets a default value that doesn't throw an exception. Seems likely to be the same issue mentioned by danp in https://github.com/dogsheep/twitter-to-sqlite/issues/54, my ageinfo file looks the same. 
Added that same ageinfo file to the test archive as well to help confirm my workaround didn't break anything.\r\n\r\nLet me know if you want any changes!", "created_at": "2022-05-21T21:08:29Z", "updated_at": "2022-05-21T21:09:16Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "c22e8eba634b70e914de9f72e452b1ebea55c6ef", "assignee": null, "milestone": null, "draft": 0, "head": "75ae7c94120d14083217bc76ebd603b396937104", "base": "f09d611782a8372cfb002792dfa727325afb4db6", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "url": "https://github.com/dogsheep/twitter-to-sqlite/pull/66", "merged_by": null, "auto_merge": null} {"id": 1179812287, "node_id": "PR_kwDODEm0Qs5GUoG_", "number": 67, "state": "open", "locked": 0, "title": "Add support for app-only bearer tokens", "user": {"value": 26161409, "label": "sometimes-i-send-pull-requests"}, "body": "Previously, twitter-to-sqlite only supported OAuth1 authentication, and the token must be on behalf of a user. However, Twitter also supports application-only bearer tokens, documented here:\r\nhttps://developer.twitter.com/en/docs/authentication/oauth-2-0/bearer-tokens This PR adds support to twitter-to-sqlite for using application-only bearer tokens. To use, the auth.json file just needs to contain a \"bearer_token\" key instead of \"api_key\", \"api_secret_key\", etc.", "created_at": "2022-12-28T23:31:20Z", "updated_at": "2022-12-28T23:31:20Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "7825cd68047088cbdc9666586f1af9b7e1fa88c2", "assignee": null, "milestone": null, "draft": 0, "head": "52050d06eeb85f3183b086944b7b75ae758096cd", "base": "f09d611782a8372cfb002792dfa727325afb4db6", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "url": "https://github.com/dogsheep/twitter-to-sqlite/pull/67", "merged_by": null, "auto_merge": null} {"id": 1179812491, "node_id": "PR_kwDODEm0Qs5GUoKL", "number": 68, "state": "open", "locked": 0, "title": "Archive: Import mute table", "user": {"value": 26161409, "label": "sometimes-i-send-pull-requests"}, "body": null, "created_at": "2022-12-28T23:32:06Z", "updated_at": "2022-12-28T23:32:06Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "47d4d3bda6d4123f58d8dbd634f9f146d97b037e", "assignee": null, "milestone": null, "draft": 0, "head": "e1cd68ea0244c4689a3c49799c6b24371cdc4978", "base": "f09d611782a8372cfb002792dfa727325afb4db6", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "url": "https://github.com/dogsheep/twitter-to-sqlite/pull/68", "merged_by": null, "auto_merge": null} {"id": 1179812620, "node_id": "PR_kwDODEm0Qs5GUoMM", "number": 69, "state": "open", "locked": 0, "title": "Archive: Import new tweets table name", "user": {"value": 26161409, "label": "sometimes-i-send-pull-requests"}, "body": "Given the code here, it seems like in the past this file was named \"tweet.js\". In recent exports, it's named \"tweets.js\". The archive importer needs to be modified to take this into account. Existing logic is reused for importing this table. 
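A hedged sketch of the auth handling described in the app-only bearer token PR above: if `auth.json` contains a `"bearer_token"` key, use it directly; otherwise fall back to the user-context OAuth1 keys. The header construction is illustrative, not the PR's code:

```python
import json

with open("auth.json") as fp:
    auth = json.load(fp)

if "bearer_token" in auth:
    # Application-only auth: a single token sent as an Authorization header
    headers = {"Authorization": f"Bearer {auth['bearer_token']}"}
    print("Using app-only bearer token")
else:
    # User-context OAuth1 path: api_key, api_secret_key, access tokens, etc.
    print("Using OAuth1 keys:", sorted(auth))
```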
(However, the resulting table name will be different, matching the different file name -- archive_tweets, rather than archive_tweet).", "created_at": "2022-12-28T23:32:44Z", "updated_at": "2022-12-28T23:32:44Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "1a8c02a8d349c8fd4074139a6a3eed552676bdf3", "assignee": null, "milestone": null, "draft": 0, "head": "11e8fa64ca30cebde047a4268e65f376c42e2b60", "base": "f09d611782a8372cfb002792dfa727325afb4db6", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "url": "https://github.com/dogsheep/twitter-to-sqlite/pull/69", "merged_by": null, "auto_merge": null} {"id": 1179812730, "node_id": "PR_kwDODEm0Qs5GUoN6", "number": 70, "state": "open", "locked": 0, "title": "Archive: Import Twitter Circle data", "user": {"value": 26161409, "label": "sometimes-i-send-pull-requests"}, "body": null, "created_at": "2022-12-28T23:33:09Z", "updated_at": "2022-12-28T23:33:09Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "1d2683101571550adf4a3b7bdf8e9ffbd8b77b61", "assignee": null, "milestone": null, "draft": 0, "head": "cc80cb31a9afb9a50295d6202f509e5b500607a0", "base": "f09d611782a8372cfb002792dfa727325afb4db6", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "url": "https://github.com/dogsheep/twitter-to-sqlite/pull/70", "merged_by": null, "auto_merge": null} {"id": 1179812838, "node_id": "PR_kwDODEm0Qs5GUoPm", "number": 71, "state": "open", "locked": 0, "title": "Archive: Fix \"ni devices\" typo in importer", "user": {"value": 26161409, "label": "sometimes-i-send-pull-requests"}, "body": null, "created_at": "2022-12-28T23:33:31Z", "updated_at": "2022-12-28T23:33:31Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "7905dbd6e36bcabcfd9106c70ebb36ecf9e38260", "assignee": null, "milestone": null, "draft": 0, "head": "0d3c62e8ba6e545785069cc0ffc8dc1bad03db80", "base": "f09d611782a8372cfb002792dfa727325afb4db6", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "url": "https://github.com/dogsheep/twitter-to-sqlite/pull/71", "merged_by": null, "auto_merge": null} {"id": 500256485, "node_id": "MDExOlB1bGxSZXF1ZXN0NTAwMjU2NDg1", "number": 1000, "state": "closed", "locked": 0, "title": "datasette.client internal requests mechanism", "user": {"value": 9599, "label": "simonw"}, "body": "Refs #943", "created_at": "2020-10-08T23:58:25Z", "updated_at": "2020-10-09T16:11:26Z", "closed_at": "2020-10-09T16:11:25Z", "merged_at": "2020-10-09T16:11:25Z", "merge_commit_sha": "8f97b9b58e77f82fef1f10e9c9f6754b993544b6", "assignee": {"value": 9599, "label": "simonw"}, "milestone": {"value": 5971510, "label": "Datasette 0.50"}, "draft": 0, "head": "8a80c79deb640bc1a1864132a3564ccca59e8858", "base": "7249ac5ca04b5ddc6517750326ee7e522cc49145", "author_association": "OWNER", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1000", "merged_by": null, "auto_merge": null} {"id": 500798091, "node_id": "MDExOlB1bGxSZXF1ZXN0NTAwNzk4MDkx", "number": 1008, "state": "open", "locked": 0, "title": "Add json_loads and json_dumps jinja2 filters", "user": {"value": 649467, "label": "mhalle"}, "body": "", "created_at": "2020-10-09T20:11:34Z", "updated_at": "2020-12-15T02:30:28Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "e33e91ca7c9b2fdeab9d8179ce0d603918b066aa", "assignee": null, "milestone": null, "draft": 0, "head": 
"40858989d47043743d6b1c9108528bec6a317e43", "base": "1bdbc8aa7f4fd7a768d456146e44da86cb1b36d1", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1008", "merged_by": null, "auto_merge": null} {"id": 501579088, "node_id": "MDExOlB1bGxSZXF1ZXN0NTAxNTc5MDg4", "number": 1017, "state": "closed", "locked": 0, "title": "Update janus requirement from <0.6,>=0.4 to >=0.4,<0.7", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "body": "Updates the requirements on [janus](https://github.com/aio-libs/janus) to permit the latest version.\n
Changelog (sourced from janus's changelog):

Changes

0.5.0 (2020-04-23)
• Remove explicit loop arguments and forbid creating queues outside event loops #246

0.4.0 (2018-07-28)
• Add py.typed macro #89
• Drop python 3.4 support and fix minimal version python3.5.3 #88
• Add property with that indicates if queue is closed #86

0.3.2 (2018-07-06)
• Fixed python 3.7 support #97

0.3.1 (2018-01-30)
• Fixed bug with join() in case tasks are added by sync_q.put() #75

0.3.0 (2017-02-21)
• Expose unfinished_tasks property #34

0.2.4 (2016-12-05)
• Restore tarball deploying

0.2.3 (2016-07-12)
• Fix exception type

0.2.2 (2016-07-11)
• Update asyncio.async() to use asyncio.ensure_future() #6

0.2.1 (2016-03-24)
• Fix python setup.py test command #4

Commits
• d186724 Fix yaml
• dbb2d7b Fix deploy script
• 18df625 Bump to 0.6.0
• a50b7ec Test on ubuntu only, the library has no platform specific dependencies
• b599d94 Fix workflow
• 9897fca Setup github workflows
• cde6918 Drop Python 3.5, test on Python 3.9, format with black/isort
• 5f04d79 Support Python 3.9 officially
• ac23eb7 janus: remove unused type ignores (#287)
• 0da8f95 Make all tests non-skipped again
• Additional commits viewable in compare view
\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "created_at": "2020-10-12T13:29:46Z", "updated_at": "2020-10-14T21:52:08Z", "closed_at": "2020-10-14T21:52:07Z", "merged_at": "2020-10-14T21:52:07Z", "merge_commit_sha": "7f2edb5dd2074dce0090659021991695a984844b", "assignee": null, "milestone": null, "draft": 0, "head": "f30d9da06b02a53f880b6fe7a7af9d3ff719dede", "base": "acf07a67722aa74828744726187690b59d342494", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1017", "merged_by": null, "auto_merge": null} {"id": 501579315, "node_id": "MDExOlB1bGxSZXF1ZXN0NTAxNTc5MzE1", "number": 1018, "state": "closed", "locked": 0, "title": "Update asgiref requirement from ~=3.2.10 to >=3.2.10,<3.4.0", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "body": "Updates the requirements on [asgiref](https://github.com/django/asgiref) to permit the latest version.\n
Changelog (sourced from asgiref's changelog):

3.3.0 (2020-10-09)
• sync_to_async now defaults to thread-sensitive mode being on
• async_to_sync now works inside of forked processes
• WsgiToAsgi now correctly clamps its response body when Content-Length is set

3.2.10 (2020-08-18)
• Fixed bugs due to bad WeakRef handling introduced in 3.2.8

3.2.9 (2020-06-16)
• Fixed regression with exception handling in 3.2.8 related to the contextvars fix.

3.2.8 (2020-06-15)
• Fixed small memory leak in local.Local
• contextvars are now persisted through AsyncToSync

3.2.7 (2020-03-24)
• Bug fixed in local.Local where deleted Locals would occasionally inherit their storage into new Locals due to memory reuse.

3.2.6 (2020-03-23)
• local.Local now works in all threading situations, no longer requires periodic garbage collection, and works with libraries that monkeypatch threading (like gevent)

3.2.5 (2020-03-11)
• self is now preserved on methods by async_to_sync

3.2.4 (2020-03-10)

Commits
• 7dba5ff Releasing 3.3.0
• e1e0dd9 Added ZeroCopy extension
• 3834d13 Added rpc.py to Implementations (#198)
• 03b0dbb Clamped WsgiToAsgi response body using Content-Length value
• cfd82e4 Fix linting with unused import removal
• cc1877e Fix import sorting in previous commit.
• 7becc9d Making thread_sensitive=True the default
• 66a6e68 Fixed #194: Made async_to_sync work inside a fork
• 4ab9d8e Fixed #193: Bumped docs version to 3.0
• 1c9d063 Clarified "Optional" meaning (#190)
• Additional commits viewable in compare view
\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "created_at": "2020-10-12T13:30:09Z", "updated_at": "2020-10-14T21:51:36Z", "closed_at": "2020-10-14T21:51:35Z", "merged_at": "2020-10-14T21:51:35Z", "merge_commit_sha": "b4a8e70957517ff44d6a9121422d266a3c5fd664", "assignee": null, "milestone": null, "draft": 0, "head": "4b021be087d0dba2b4ac0b872b2512f5b2203397", "base": "acf07a67722aa74828744726187690b59d342494", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1018", "merged_by": null, "auto_merge": null} {"id": 503685077, "node_id": "MDExOlB1bGxSZXF1ZXN0NTAzNjg1MDc3", "number": 1022, "state": "closed", "locked": 0, "title": "Fix table name in spatialite example command", "user": {"value": 639012, "label": "jsfenfen"}, "body": "The example query for creating a new point geometry seems to be using a table called 'museums' but at one point it instead uses 'events'. I *believe* it is intended to be museums (the example makes more sense if so). ", "created_at": "2020-10-14T22:19:34Z", "updated_at": "2020-10-14T23:46:46Z", "closed_at": "2020-10-14T23:46:46Z", "merged_at": "2020-10-14T23:46:46Z", "merge_commit_sha": "4f7c0ebd85ccd8c1853d7aa0147628f7c1b749cc", "assignee": null, "milestone": null, "draft": 0, "head": "7cef70a5528af4626302729ff0ebc88d92b5f4ca", "base": "7f2edb5dd2074dce0090659021991695a984844b", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1022", "merged_by": null, "auto_merge": null} {"id": 505339515, "node_id": "MDExOlB1bGxSZXF1ZXN0NTA1MzM5NTE1", "number": 1029, "state": "closed", "locked": 0, "title": "fix(docs): broken link", "user": {"value": 17075617, "label": "jthodge"}, "body": "This PR fixes a broken markdown link in the `Publish` docs page.", "created_at": "2020-10-17T20:03:20Z", "updated_at": "2020-10-17T20:05:04Z", "closed_at": "2020-10-17T20:05:04Z", "merged_at": "2020-10-17T20:05:04Z", "merge_commit_sha": "568bd7bbf590861687db8c318f3d8cfcd1dfb47a", "assignee": null, "milestone": null, "draft": 0, "head": "a65d30e832d7e65adc65dcce8ab006227e4dafe4", "base": "4f7c0ebd85ccd8c1853d7aa0147628f7c1b749cc", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1029", "merged_by": null, "auto_merge": null} {"id": 505453900, "node_id": "MDExOlB1bGxSZXF1ZXN0NTA1NDUzOTAw", "number": 1030, "state": "open", "locked": 0, "title": "Make `package` command deal with a configuration directory argument", "user": {"value": 299380, "label": "frankier"}, "body": "Currently if we run `datasette package` on a configuration directory we'll get an exception when we try to hard link to the directory. 
This PR copies the tree and makes the Dockerfile run inspect on all *.db files.", "created_at": "2020-10-18T11:07:02Z", "updated_at": "2020-10-19T08:01:51Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "124142e4d2710525b09ff2bd2a7a787cbed163a4", "assignee": null, "milestone": null, "draft": 0, "head": "e0825334692967fec195e104cb6aa11095807a8e", "base": "c37a0a93ecb847e66cfe7b6f9452ba210fcae91b", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1030", "merged_by": null, "auto_merge": null} {"id": 505769462, "node_id": "MDExOlB1bGxSZXF1ZXN0NTA1NzY5NDYy", "number": 1031, "state": "closed", "locked": 0, "title": "Fallback to databases in inspect-data.json when no -i options are passed", "user": {"value": 299380, "label": "frankier"}, "body": "Currenlty `Datasette.__init__` checks immutables against None to decide whether to fallback to inspect-data.json. This patch modifies the serve command to pass None when no -i options are passed so this fallback works correctly.", "created_at": "2020-10-19T07:51:06Z", "updated_at": "2021-03-29T01:46:45Z", "closed_at": "2021-03-29T00:23:41Z", "merged_at": null, "merge_commit_sha": "3ee6b39e96ef684e1ac393bb269d804e957fee1d", "assignee": null, "milestone": null, "draft": 0, "head": "7e7eaa4e712b01de0b5a8a1b90145bdc1c3cd731", "base": "c37a0a93ecb847e66cfe7b6f9452ba210fcae91b", "author_association": "FIRST_TIME_CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1031", "merged_by": null, "auto_merge": null} {"id": 507267087, "node_id": "MDExOlB1bGxSZXF1ZXN0NTA3MjY3MDg3", "number": 1038, "state": "closed", "locked": 0, "title": "DOC: Fix syntax error", "user": {"value": 194147, "label": "gerrymanoim"}, "body": "If I understand https://docs.datasette.io/en/stable/plugin_hooks.html#register-routes correctly, `register_routes` should return a `List[Tuple[str, Callable]]`. I believe the current code in documentation has a syntax error (extra `)`). 
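The documented shape for `register_routes` is a list of `(regex, view function)` tuples; a small sketch of a plugin hook with that return type (the route and view here are made up, and this is not the documentation's exact snippet):

```python
from datasette import hookimpl
from datasette.utils.asgi import Response

@hookimpl
def register_routes():
    async def hello(request):
        return Response.text("hello from a plugin route")

    # List[Tuple[str, Callable]] -- note: no stray closing parenthesis
    return [
        (r"^/-/hello$", hello),
    ]
```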
", "created_at": "2020-10-21T05:45:38Z", "updated_at": "2020-10-21T22:57:21Z", "closed_at": "2020-10-21T22:44:17Z", "merged_at": "2020-10-21T22:44:17Z", "merge_commit_sha": "6e26b057996c6f3fefa8ad528e2759e53c738844", "assignee": null, "milestone": null, "draft": 0, "head": "7fc0cce6a2d13ccc82c3584996acec236ae65df6", "base": "66120a7a1cb592e8a21164cf537f62a4d7ab1dfc", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1038", "merged_by": null, "auto_merge": null} {"id": 152870030, "node_id": "MDExOlB1bGxSZXF1ZXN0MTUyODcwMDMw", "number": 104, "state": "closed", "locked": 0, "title": "[WIP] Add publish to heroku support", "user": {"value": 21148, "label": "jacobian"}, "body": "\r\n\r\nRefs #90 ", "created_at": "2017-11-15T19:56:22Z", "updated_at": "2017-11-21T20:55:05Z", "closed_at": "2017-11-21T20:55:05Z", "merged_at": "2017-11-21T20:55:05Z", "merge_commit_sha": "e47117ce1d15f11246a3120aa49de70205713d05", "assignee": null, "milestone": null, "draft": 0, "head": "de42240afd1e3829fd21cbe77a89ab0eaab20d78", "base": "0331666e346c68b86de4aa19fbb37f3a408d37ca", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/104", "merged_by": null, "auto_merge": null} {"id": 507903392, "node_id": "MDExOlB1bGxSZXF1ZXN0NTA3OTAzMzky", "number": 1040, "state": "closed", "locked": 0, "title": "/db/table/-/blob/pk/column.blob download URL", "user": {"value": 9599, "label": "simonw"}, "body": "Refs #1036. Still needs:\r\n\r\n- [x] Comprehensive tests across all of the code branches, plus permissions\r\n- [x] A bit more refactoring to share logic cleanly with `RowView`\r\n- ~~A configuration option to disable this feature (probably)~~", "created_at": "2020-10-21T22:39:15Z", "updated_at": "2020-10-24T23:09:20Z", "closed_at": "2020-10-24T23:09:19Z", "merged_at": "2020-10-24T23:09:19Z", "merge_commit_sha": "5a1519796037105bc20bcf2f91a76e022926c204", "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "draft": 0, "head": "4f3165f25fd9241fcf1291c797f4c77766b954dc", "base": "bf82b3d6a605c9ddadd5fb739249dfe6defaf635", "author_association": "OWNER", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1040", "merged_by": null, "auto_merge": null} {"id": 508719567, "node_id": "MDExOlB1bGxSZXF1ZXN0NTA4NzE5NTY3", "number": 1043, "state": "closed", "locked": 0, "title": "Include LICENSE in sdist", "user": {"value": 45380, "label": "bollwyvl"}, "body": "Hi, thanks for `datasette`! \r\n\r\nThis PR adds the `LICENSE` to source distributions, which seems the norm for Apache-2.0 stuff.\r\n\r\nI noticed the [0.50.2 sdist](https://files.pythonhosted.org/packages/f2/ba/1b5f182c3f1769c0863bcaa77406bdcb81c92e31bb579959c01b1d8951c0/datasette-0.50.2.tar.gz) doesn't ship `LICENSE`, but the 0.5.2 `whl` does, so I'm assuming the intent _is_ to ship... and it's a one-liner!\r\n\r\nMotivation: \r\n\r\nIt might be a bit of a slog, but I'm looking to see about getting `datasette` (and friends!) available on conda-forge. 
There are a few missing upstreams (`asgi-csrf`, `python-basecov`, `mergedeep`) and some of the plugins don't even appear to _have_ tarballs (just `whl`!), but the little stuff like licenses are nice to get out handled upstream vs separately grabbing them.", "created_at": "2020-10-23T05:04:12Z", "updated_at": "2020-10-26T00:14:57Z", "closed_at": "2020-10-23T20:54:35Z", "merged_at": "2020-10-23T20:54:35Z", "merge_commit_sha": "976e5f74aae1fa0d406df6691dc8b5feeebe8788", "assignee": null, "milestone": null, "draft": 0, "head": "dc4129cb37060b52775d96e756f7cdb65ee76bc3", "base": "d0cc6f4c32e1f89238ddec782086b3122f445bd4", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1043", "merged_by": null, "auto_merge": null} {"id": 508720660, "node_id": "MDExOlB1bGxSZXF1ZXN0NTA4NzIwNjYw", "number": 1044, "state": "closed", "locked": 0, "title": "Add minimum supported python", "user": {"value": 45380, "label": "bollwyvl"}, "body": "Thanks for `datasette`!\r\n\r\nThis PR adds `python_requires` to formally signal the [minimum supported python version](https://packaging.python.org/guides/dropping-older-python-versions/#specify-the-version-ranges-for-supported-python-distributions) (which is pointed out with classifiers, so seems pretty straightforward).", "created_at": "2020-10-23T05:08:03Z", "updated_at": "2020-10-23T20:53:08Z", "closed_at": "2020-10-23T20:53:08Z", "merged_at": "2020-10-23T20:53:08Z", "merge_commit_sha": "cab8e65261b117b493af6a0b21aa2e1ae4564419", "assignee": null, "milestone": null, "draft": 0, "head": "6453ab18e56b36bc912b6f24c4a43002c6084ade", "base": "d0cc6f4c32e1f89238ddec782086b3122f445bd4", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1044", "merged_by": null, "auto_merge": null} {"id": 509590205, "node_id": "MDExOlB1bGxSZXF1ZXN0NTA5NTkwMjA1", "number": 1049, "state": "closed", "locked": 0, "title": "Add template block prior to extra URL loaders", "user": {"value": 82988, "label": "psychemedia"}, "body": "To handle packages that require Javascript state setting prior to loading a package (eg [`thebelab`](https://thebelab.readthedocs.io/en/latest/examples/minimal_example.html), provide a template block before the URLs are loaded.", "created_at": "2020-10-25T13:08:55Z", "updated_at": "2020-10-29T09:20:52Z", "closed_at": "2020-10-29T09:20:34Z", "merged_at": null, "merge_commit_sha": "99f994b14e2dbe22fda18b67dd5c824d359443fb", "assignee": null, "milestone": null, "draft": 0, "head": "50a743ad35684f09d3c3880f6af2019e59271237", "base": "42f4851e3e7885f1092f104d6c883cea40b12f02", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1049", "merged_by": null, "auto_merge": null} {"id": 511005542, "node_id": "MDExOlB1bGxSZXF1ZXN0NTExMDA1NTQy", "number": 1056, "state": "closed", "locked": 0, "title": "Radical new colour scheme and base styles, courtesy of @natbat", "user": {"value": 9599, "label": "simonw"}, "body": "", "created_at": "2020-10-27T19:31:48Z", "updated_at": "2020-10-27T19:39:57Z", "closed_at": "2020-10-27T19:39:56Z", "merged_at": "2020-10-27T19:39:56Z", "merge_commit_sha": "e5f5034bcdc71e4bc62a6a155ca60eb41910c335", "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "draft": 0, "head": "a7b2aabd5148c0ee382b583de68a4f0538f7dfb1", "base": "26bb4a268127da2c38f4241abe45444b2a6f7874", "author_association": 
"OWNER", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1056", "merged_by": null, "auto_merge": null} {"id": 511549374, "node_id": "MDExOlB1bGxSZXF1ZXN0NTExNTQ5Mzc0", "number": 1059, "state": "closed", "locked": 0, "title": "Update aiofiles requirement from <0.6,>=0.4 to >=0.4,<0.7", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "body": "Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.\n
Commits
\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "created_at": "2020-10-28T13:32:40Z", "updated_at": "2020-10-28T17:08:29Z", "closed_at": "2020-10-28T17:08:28Z", "merged_at": "2020-10-28T17:08:28Z", "merge_commit_sha": "879617265262024edd93722adcdcb6c21e57f5f7", "assignee": null, "milestone": null, "draft": 0, "head": "e46327a745c5a4130b65bf340d19dccc76441841", "base": "7d9fedc176717a7e3d22a96575ae0aada5a65440", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1059", "merged_by": null, "auto_merge": null} {"id": 511868153, "node_id": "MDExOlB1bGxSZXF1ZXN0NTExODY4MTUz", "number": 1060, "state": "closed", "locked": 0, "title": "New explicit versioning mechanism", "user": {"value": 9599, "label": "simonw"}, "body": "- Remove all references to versioneer\r\n- Re-implement versioning to use a static string baked into the repo\r\n- Ensure that string is output by `datasette --version` and `/-/versions`\r\n\r\nRefs #1054", "created_at": "2020-10-28T22:14:55Z", "updated_at": "2020-10-29T03:38:17Z", "closed_at": "2020-10-29T03:38:16Z", "merged_at": "2020-10-29T03:38:16Z", "merge_commit_sha": "cefd058c1c216a184bb63c79abba66893977c18e", "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "draft": 0, "head": "4725d46780783e9875bde5957f053ba19cf92ff0", "base": "abcf0222496d8148b2e585ffa0ff192270a04b06", "author_association": "OWNER", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1060", "merged_by": null, "auto_merge": null} {"id": 512545364, "node_id": "MDExOlB1bGxSZXF1ZXN0NTEyNTQ1MzY0", "number": 1061, "state": "closed", "locked": 0, "title": ".blob output renderer", "user": {"value": 9599, "label": "simonw"}, "body": "- [x] Remove the `/-/...blob/...` route I added in #1040 in place of the new `.blob` renderer URLs\r\n- [x] Link to new `.blob` download links on the arbitrary query page (using `_blob_hash=...`) - plus tests for this\r\n\r\nCloses #1050, Closes #1051", "created_at": "2020-10-29T20:25:08Z", "updated_at": "2020-10-29T22:01:40Z", "closed_at": "2020-10-29T22:01:39Z", "merged_at": "2020-10-29T22:01:39Z", "merge_commit_sha": "78b3eeaad9189eb737014f53212082684f4bb0d4", "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "draft": 0, "head": "1196d084de6a7a6f68c7705a6cc096bb8df132e3", "base": "d6f9ff71378c4eab34dad181c23cfc143a4aef2d", "author_association": "OWNER", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1061", "merged_by": null, "auto_merge": null} {"id": 512736705, "node_id": "MDExOlB1bGxSZXF1ZXN0NTEyNzM2NzA1", "number": 1065, "state": "closed", "locked": 0, "title": "Nav menu plus menu_links() hook", "user": {"value": 9599, "label": "simonw"}, "body": "Closes #1064, refs #690.", "created_at": "2020-10-30T03:40:18Z", "updated_at": "2020-10-30T03:45:17Z", "closed_at": "2020-10-30T03:45:16Z", "merged_at": "2020-10-30T03:45:16Z", "merge_commit_sha": "18a64fbb29271ce607937110bbdb55488c43f4e0", "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "draft": 0, "head": "5f118b56afbeff5348acd50a8b87537210e731ee", "base": "1a861be19e326e0c88230a711a1b6536366697d7", "author_association": "OWNER", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1065", "merged_by": null, "auto_merge": null} {"id": 513106026, "node_id": "MDExOlB1bGxSZXF1ZXN0NTEzMTA2MDI2", "number": 1069, "state": "closed", "locked": 0, "title": "load_template() plugin hook", "user": 
{"value": 9599, "label": "simonw"}, "body": "Refs #1042", "created_at": "2020-10-30T15:59:45Z", "updated_at": "2020-10-30T17:47:20Z", "closed_at": "2020-10-30T17:47:19Z", "merged_at": "2020-10-30T17:47:19Z", "merge_commit_sha": "81dea4b07ab2b6f4eaaf248307d2b588472054a1", "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "draft": 0, "head": "92f3840882a24da29d0d4073e5ed9d77fce438fc", "base": "fcf43589eb6a1f1d0432772a639fd35711c48e0c", "author_association": "OWNER", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1069", "merged_by": null, "auto_merge": null} {"id": 152914480, "node_id": "MDExOlB1bGxSZXF1ZXN0MTUyOTE0NDgw", "number": 107, "state": "closed", "locked": 0, "title": "add support for ?field__isnull=1", "user": {"value": 3433657, "label": "raynae"}, "body": "Is this what you had in mind for [this issue](https://github.com/simonw/datasette/issues/64)?", "created_at": "2017-11-15T23:36:36Z", "updated_at": "2017-11-17T15:12:29Z", "closed_at": "2017-11-17T13:29:22Z", "merged_at": "2017-11-17T13:29:22Z", "merge_commit_sha": "ed2b3f25beac720f14869350baacc5f62b065194", "assignee": null, "milestone": null, "draft": 0, "head": "14d5bb572fadbd45973580bd9ad2a16c2bf12909", "base": "b7c4165346ee8b6a6fbd72d6ba2275a24a8a8ae3", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/107", "merged_by": null, "auto_merge": null} {"id": 518988879, "node_id": "MDExOlB1bGxSZXF1ZXN0NTE4OTg4ODc5", "number": 1085, "state": "closed", "locked": 0, "title": "Use FTS4 in fixtures", "user": {"value": 9599, "label": "simonw"}, "body": "Refs #1081", "created_at": "2020-11-11T06:44:30Z", "updated_at": "2020-11-12T00:02:59Z", "closed_at": "2020-11-12T00:02:58Z", "merged_at": "2020-11-12T00:02:58Z", "merge_commit_sha": "e8e0a6f284ca953b2980186c4356594c07bd1929", "assignee": null, "milestone": null, "draft": 0, "head": "51e7651c66aaf1804274ce68a6b5218bbba76338", "base": "2a981e2ac1d13125973904b777d00ea75e8df4e6", "author_association": "OWNER", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1085", "merged_by": null, "auto_merge": null} {"id": 521276296, "node_id": "MDExOlB1bGxSZXF1ZXN0NTIxMjc2Mjk2", "number": 1097, "state": "closed", "locked": 0, "title": "Use f-strings", "user": {"value": 9599, "label": "simonw"}, "body": "Since Datasette now requires Python 3.6, how about some f-strings?\r\n\r\nI ran this in the `datasette` root checkout:\r\n```\r\npip install flynt\r\nflynt .\r\nblack .\r\n```", "created_at": "2020-11-15T23:12:36Z", "updated_at": "2020-11-15T23:24:24Z", "closed_at": "2020-11-15T23:24:23Z", "merged_at": "2020-11-15T23:24:23Z", "merge_commit_sha": "30e64c8d3b3728a86c3ca42a75322cc3feb5b0c8", "assignee": null, "milestone": null, "draft": 0, "head": "e89211d21eebb7a2e4588b06927da84416e3a555", "base": "6fd35be64de221eba4945ca24e8e1678f6142a73", "author_association": "OWNER", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1097", "merged_by": null, "auto_merge": null} {"id": 528997614, "node_id": "MDExOlB1bGxSZXF1ZXN0NTI4OTk3NjE0", "number": 1112, "state": "closed", "locked": 0, "title": "Fix --metadata doc usage", "user": {"value": 50527, "label": "jefftriplett"}, "body": "I stumbled on this while trying to figure out how to configure datasette-ripgrep via https://github.com/simonw/datasette-ripgrep/issues/15\r\n\r\nYou may not want to update the 
changelog (those are annoying) so I added two commits in case that's easier. ", "created_at": "2020-11-28T19:19:51Z", "updated_at": "2020-11-28T23:28:21Z", "closed_at": "2020-11-28T19:53:48Z", "merged_at": "2020-11-28T19:53:48Z", "merge_commit_sha": "bbde835a1fec01458e8d00929e7bab6d6a5ba948", "assignee": null, "milestone": {"value": 6055094, "label": "Datasette 0.52"}, "draft": 0, "head": "1a30fc259205df736daf068c57a0a6ae2c21ffa9", "base": "37d18a5bce08c9ee53c080f613bae84fc2ccc853", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1112", "merged_by": null, "auto_merge": null} {"id": 529783275, "node_id": "MDExOlB1bGxSZXF1ZXN0NTI5NzgzMjc1", "number": 1117, "state": "closed", "locked": 0, "title": "Support for generated columns", "user": {"value": 9599, "label": "simonw"}, "body": "Refs #1116. My first attempt at this worked on my laptop but broke in CI, so I'm going to iterate on it in a pull request instead.", "created_at": "2020-11-30T20:10:46Z", "updated_at": "2020-11-30T22:23:19Z", "closed_at": "2020-11-30T21:29:58Z", "merged_at": "2020-11-30T21:29:58Z", "merge_commit_sha": "461670a0b87efa953141b449a9a261919864ceb3", "assignee": null, "milestone": null, "draft": 0, "head": "ccdf2c650278b8b9465d3a2d7c916f3bb06c4f01", "base": "dea3c508b39528e566d711c38a467b3d372d220b", "author_association": "OWNER", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1117", "merged_by": null, "auto_merge": null} {"id": 529887861, "node_id": "MDExOlB1bGxSZXF1ZXN0NTI5ODg3ODYx", "number": 1120, "state": "closed", "locked": 0, "title": "generated_columns table in fixtures.py", "user": {"value": 9599, "label": "simonw"}, "body": "Refs #1119", "created_at": "2020-12-01T00:17:19Z", "updated_at": "2020-12-01T00:28:03Z", "closed_at": "2020-12-01T00:28:02Z", "merged_at": "2020-12-01T00:28:02Z", "merge_commit_sha": "17cbbb1f7f230b39650afac62dd16476626001b5", "assignee": null, "milestone": null, "draft": 0, "head": "ddad8db2cc952eaf4f66f42324ccece115627b02", "base": "461670a0b87efa953141b449a9a261919864ceb3", "author_association": "OWNER", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1120", "merged_by": null, "auto_merge": null} {"id": 530125695, "node_id": "MDExOlB1bGxSZXF1ZXN0NTMwMTI1Njk1", "number": 1122, "state": "closed", "locked": 0, "title": "Fix misaligned table actions cog", "user": {"value": 3243482, "label": "abdusco"}, "body": "Fixes https://github.com/simonw/datasette/issues/1121", "created_at": "2020-12-01T08:41:46Z", "updated_at": "2020-12-03T10:56:40Z", "closed_at": "2020-12-03T00:33:37Z", "merged_at": "2020-12-03T00:33:36Z", "merge_commit_sha": "daae35be46ec5cb8a207aa20986a4fa62e94777e", "assignee": null, "milestone": null, "draft": 0, "head": "94ea22f7b6b6c55b490c97b385f6eb6c1ea2121c", "base": "a970276b9999687b96c5e11ea1c817d814f5d267", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1122", "merged_by": null, "auto_merge": null} {"id": 532342025, "node_id": "MDExOlB1bGxSZXF1ZXN0NTMyMzQyMDI1", "number": 1128, "state": "closed", "locked": 0, "title": "Fix startup error on windows", "user": {"value": 3243482, "label": "abdusco"}, "body": "Fixes https://github.com/simonw/datasette/issues/1094\r\n\r\nThis import isn't used at all, and causes error on startup on Windows.", "created_at": "2020-12-04T07:12:26Z", "updated_at": 
"2020-12-06T08:41:45Z", "closed_at": "2020-12-05T19:35:04Z", "merged_at": "2020-12-05T19:35:04Z", "merge_commit_sha": "705d1a1555c4791e9be3b884285b047223ab184f", "assignee": null, "milestone": null, "draft": 0, "head": "7004c3b1462675ba3845b1efc82c816f1d2199e0", "base": "49d8fc056844d5a537d6cfd96dab0dd5686fe718", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1128", "merged_by": null, "auto_merge": null} {"id": 532348919, "node_id": "MDExOlB1bGxSZXF1ZXN0NTMyMzQ4OTE5", "number": 1130, "state": "open", "locked": 0, "title": "Fix footer not sticking to bottom in short pages", "user": {"value": 3243482, "label": "abdusco"}, "body": "Fixes https://github.com/simonw/datasette/issues/1129", "created_at": "2020-12-04T07:29:01Z", "updated_at": "2021-06-15T13:27:48Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "af3aa34786f134af8073342a3c4bb74b968750fd", "assignee": null, "milestone": null, "draft": 0, "head": "8d4c69c6fb0ef741a19070f5172017ea3522e83c", "base": "49d8fc056844d5a537d6cfd96dab0dd5686fe718", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1130", "merged_by": null, "auto_merge": null} {"id": 153201945, "node_id": "MDExOlB1bGxSZXF1ZXN0MTUzMjAxOTQ1", "number": 114, "state": "closed", "locked": 0, "title": "Add spatialite, switch to debian and local build", "user": {"value": 54999, "label": "ingenieroariel"}, "body": "Improves the Dockerfile to support spatial datasets, work with the local datasette code (Friendly with git tags and Dockerhub) and moves to slim debian, a small image easy to extend via apt packages for sqlite.", "created_at": "2017-11-17T02:37:09Z", "updated_at": "2017-11-17T03:50:52Z", "closed_at": "2017-11-17T03:50:52Z", "merged_at": "2017-11-17T03:50:52Z", "merge_commit_sha": "8b4c600d98b85655b3a1454ebf64f858b5fe54c8", "assignee": null, "milestone": null, "draft": 0, "head": "6c6b63d890529eeefcefb7ab126ea3bd7b2315c1", "base": "b7c4165346ee8b6a6fbd72d6ba2275a24a8a8ae3", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/114", "merged_by": null, "auto_merge": null} {"id": 539489525, "node_id": "MDExOlB1bGxSZXF1ZXN0NTM5NDg5NTI1", "number": 1145, "state": "closed", "locked": 0, "title": "Update pytest requirement from <6.2.0,>=5.2.2 to >=5.2.2,<6.3.0", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "body": "Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.\n
\nRelease notes\nSourced from pytest's releases.\n\npytest 6.2.0 (2020-12-12)\n\nBreaking Changes\n- #7808: pytest now supports python3.6+ only.\n\nDeprecations\n- #7469: Directly constructing/calling the following classes/functions is now deprecated: _pytest.cacheprovider.Cache, _pytest.cacheprovider.Cache.for_config(), _pytest.cacheprovider.Cache.clear_cache(), _pytest.cacheprovider.Cache.cache_dir_from_config(), _pytest.capture.CaptureFixture, _pytest.fixtures.FixtureRequest, _pytest.fixtures.SubRequest, _pytest.logging.LogCaptureFixture, _pytest.pytester.Pytester, _pytest.pytester.Testdir, _pytest.recwarn.WarningsRecorder, _pytest.recwarn.WarningsChecker, _pytest.tmpdir.TempPathFactory, _pytest.tmpdir.TempdirFactory. These have always been considered private, but now issue a deprecation warning, which may become a hard error in pytest 7.0.0.\n- #7530: The --strict command-line option has been deprecated, use --strict-markers instead. We have plans to maybe reintroduce --strict in the future and make it an encompassing flag for all strictness-related options (--strict-markers and --strict-config at the moment, more might be introduced in the future).\n- #7988: The @pytest.yield_fixture decorator/function is now deprecated. Use pytest.fixture instead. yield_fixture has been an alias for fixture for a very long time, so it can be search/replaced safely.\n\nFeatures\n- #5299: pytest now warns about unraisable exceptions and unhandled thread exceptions that occur in tests on Python>=3.8. See unraisable for more information.\n- #7425: New pytester fixture, which is identical to testdir but its methods return pathlib.Path when appropriate instead of py.path.local. This is part of the movement to use pathlib.Path objects internally, in order to remove the dependency on py in the future. Internally, the old Testdir (_pytest.pytester.Testdir) is now a thin wrapper around Pytester (_pytest.pytester.Pytester), preserving the old interface.\n\nChangelog\nSourced from pytest's changelog.\n\nCommits\n- e7073af Prepare release version 6.2.0\n- 683f29f Merge pull request #8129 from bluetech/docs-pygments-workaround\n- 0feeddf doc: temporary workaround for pytest-pygments lexing error\n- b478275 Merge pull request #8128 from bluetech/skip-reason-empty\n- 3302ff9 terminal: when the skip/xfail is empty, don't show it as \"()\"\n- 59bd0f6 Merge pull request #8126 from bluetech/tox-regen-pretend-scm2\n- 6298ff1 tox: use pip legacy resolver for regen job\n- d51ecbd Merge pull request #8125 from bluetech/tox-rm-pip-req\n- f237b07 tox: remove requires: pip>=20.3.1\n- 95e0e19 Merge pull request #8124 from bluetech/s0undt3ch-feature/skip-context-hook\n- Additional commits viewable in compare view\n
\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "created_at": "2020-12-14T14:22:16Z", "updated_at": "2021-01-24T21:20:29Z", "closed_at": "2020-12-16T21:44:39Z", "merged_at": "2020-12-16T21:44:39Z", "merge_commit_sha": "6119bd797366a899119f1bba51c1c8cba2efc8fc", "assignee": null, "milestone": {"value": 6346396, "label": "Datasette 0.54"}, "draft": 0, "head": "a8588f95568138c268e6802de0d1a4daffb7bda8", "base": "0c616f732cee79db80cad830917666f41b344262", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1145", "merged_by": null, "auto_merge": null} {"id": 153306882, "node_id": "MDExOlB1bGxSZXF1ZXN0MTUzMzA2ODgy", "number": 115, "state": "closed", "locked": 0, "title": "Add keyboard shortcut to execute SQL query", "user": {"value": 198537, "label": "rgieseke"}, "body": "Very cool tool, thanks a lot!\r\n\r\nThis PR adds a `Shift-Enter` short cut to execute the SQL query. I used CodeMirrors keyboard handling.", "created_at": "2017-11-17T14:13:33Z", "updated_at": "2017-11-17T15:16:34Z", "closed_at": "2017-11-17T14:22:56Z", "merged_at": "2017-11-17T14:22:56Z", "merge_commit_sha": "eda848b37f8452dba7913583ef101f39d9b130ba", "assignee": null, "milestone": null, "draft": 0, "head": "bb514164e69400fc0be4e033c27f45f90b1ef651", "base": "ed2b3f25beac720f14869350baacc5f62b065194", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/115", "merged_by": null, "auto_merge": null} {"id": 544923437, "node_id": "MDExOlB1bGxSZXF1ZXN0NTQ0OTIzNDM3", "number": 1158, "state": "closed", "locked": 0, "title": "Modernize code to Python 3.6+", "user": {"value": 6774676, "label": "eumiro"}, "body": "- compact dict and set building\r\n- remove redundant parentheses\r\n- simplify chained conditions\r\n- change method name to lowercase\r\n- use triple double quotes for docstrings\r\n\r\nplease feel free to accept/reject any of these independent commits", "created_at": "2020-12-23T16:21:38Z", "updated_at": "2021-01-24T21:20:50Z", "closed_at": "2020-12-23T17:04:32Z", "merged_at": "2020-12-23T17:04:32Z", "merge_commit_sha": "a882d679626438ba0d809944f06f239bcba8ee96", "assignee": null, "milestone": {"value": 6346396, "label": "Datasette 0.54"}, "draft": 0, "head": "37ce72f086d7807a32ea9012d6e6b5d235349152", "base": "90eba4c3ca569c57e96bce314e7ac8caf67d884e", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1158", "merged_by": null, "auto_merge": null} {"id": 545264436, "node_id": "MDExOlB1bGxSZXF1ZXN0NTQ1MjY0NDM2", "number": 1159, "state": "open", "locked": 0, "title": "Improve the display of facets information", "user": {"value": 552629, "label": "lovasoa"}, "body": "This PR changes the display of facets to hopefully make them more readable.\r\n\r\nBefore | After\r\n---|---\r\n![image](https://user-images.githubusercontent.com/552629/103084609-b1ec2980-45df-11eb-85bc-68ab8df3e8d9.png) | ![image](https://user-images.githubusercontent.com/552629/103085220-620e6200-45e1-11eb-8189-5dd5d3e2569e.png)\r\n", "created_at": "2020-12-24T11:01:47Z", "updated_at": "2023-07-31T18:57:59Z", "closed_at": null, "merged_at": null, "merge_commit_sha": "0276c5609da34bfb660f65212e1a367e637979d7", "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "draft": 0, "head": "c820abd0bcb34d1ea5a03be64a2158ae7c42920c", "base": "a882d679626438ba0d809944f06f239bcba8ee96", "author_association": "FIRST_TIME_CONTRIBUTOR", 
"repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1159", "merged_by": null, "auto_merge": null} {"id": 153324301, "node_id": "MDExOlB1bGxSZXF1ZXN0MTUzMzI0MzAx", "number": 117, "state": "closed", "locked": 0, "title": "Don't prevent tabbing to `Run SQL` button", "user": {"value": 198537, "label": "rgieseke"}, "body": "Mentioned in #115 \r\n\r\nHere you go!", "created_at": "2017-11-17T15:27:50Z", "updated_at": "2017-11-19T20:30:24Z", "closed_at": "2017-11-18T00:53:43Z", "merged_at": "2017-11-18T00:53:43Z", "merge_commit_sha": "6d39429daa4655e3cf7a6a7671493292a20a30a1", "assignee": null, "milestone": null, "draft": 0, "head": "7b4d00e87ed8ac931e6f5458599aece1a95d4e82", "base": "eda848b37f8452dba7913583ef101f39d9b130ba", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/117", "merged_by": null, "auto_merge": null} {"id": 548271472, "node_id": "MDExOlB1bGxSZXF1ZXN0NTQ4MjcxNDcy", "number": 1170, "state": "closed", "locked": 0, "title": "Install Prettier via package.json", "user": {"value": 3637, "label": "benpickles"}, "body": "This adds a package.json with Prettier and means that developers/CI will use the same version. It also ensures that NPM packages are cached on GitHub Actions which fixes #1169.", "created_at": "2021-01-04T14:18:03Z", "updated_at": "2021-01-24T21:21:01Z", "closed_at": "2021-01-04T19:52:34Z", "merged_at": "2021-01-04T19:52:33Z", "merge_commit_sha": "3054e0f7307da4c31850b74bd73238b33d6c750a", "assignee": null, "milestone": {"value": 6346396, "label": "Datasette 0.54"}, "draft": 0, "head": "a5761ccb8676ef1b98d95d8174211c98f140e3de", "base": "1e8fa3ac7cb2d6e516c47c306c86ed2334fc3dc0", "author_association": "CONTRIBUTOR", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1170", "merged_by": null, "auto_merge": null} {"id": 153432045, "node_id": "MDExOlB1bGxSZXF1ZXN0MTUzNDMyMDQ1", "number": 118, "state": "closed", "locked": 0, "title": "Foreign key information on row and table pages", "user": {"value": 9599, "label": "simonw"}, "body": "", "created_at": "2017-11-18T03:13:27Z", "updated_at": "2017-11-18T03:15:57Z", "closed_at": "2017-11-18T03:15:50Z", "merged_at": "2017-11-18T03:15:50Z", "merge_commit_sha": "1b04662585ea1539014bfbd616a8112b650d5699", "assignee": null, "milestone": null, "draft": 0, "head": "2fa60bc5e3c9d75c19e21a2384f52b58e1872fa8", "base": "6d39429daa4655e3cf7a6a7671493292a20a30a1", "author_association": "OWNER", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/118", "merged_by": null, "auto_merge": null} {"id": 560725714, "node_id": "MDExOlB1bGxSZXF1ZXN0NTYwNzI1NzE0", "number": 1203, "state": "closed", "locked": 0, "title": "Easier way to run Prettier locally", "user": {"value": 9599, "label": "simonw"}, "body": "Refs #1167", "created_at": "2021-01-25T01:39:06Z", "updated_at": "2021-01-25T01:41:46Z", "closed_at": "2021-01-25T01:41:46Z", "merged_at": "2021-01-25T01:41:46Z", "merge_commit_sha": "ffff3a4c5398a9f40b61d59736f386444da19289", "assignee": null, "milestone": null, "draft": 0, "head": "98acc8865aa7826a40a7a076ab548ba8597af734", "base": "b6a7b58fa01af0cd5a5e94bd17d686d283a46819", "author_association": "OWNER", "repo": {"value": 107914493, "label": "datasette"}, "url": "https://github.com/simonw/datasette/pull/1203", "merged_by": null, "auto_merge": null}