{"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/14#issuecomment-798468572", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/14", "id": 798468572, "node_id": "MDEyOklzc3VlQ29tbWVudDc5ODQ2ODU3Mg==", "user": {"value": 1234956, "label": "n8henrie"}, "created_at": "2021-03-13T14:47:31Z", "updated_at": "2021-03-13T14:47:31Z", "author_association": "NONE", "body": "Ok, new PR works. I'm not `git` enough so I just force-pushed over the old one.\r\n\r\nI still end up with a lot of activities that are missing an `id` and therefore skipped (since this is used as the primary key). For example:\r\n\r\n```\r\n{'workoutActivityType': 'HKWorkoutActivityTypeRunning', 'duration': '35.31666666666667', 'durationUnit': 'min', 'totalDistance': '4.010870267636999', 'totalDistanceUnit': 'mi', 'totalEnergyBurned': '660.3516235351562', 'totalEnergyBurnedUnit': 'Cal', 'sourceName': 'Strava', 'sourceVersion': '22810', 'creationDate': '2020-07-16 13:38:26 -0700', 'startDate': '2020-07-16 06:38:26 -0700', 'endDate': '2020-07-16 07:13:45 -0700'}\r\n```\r\n\r\nI also end up with some unhappy characters (in the skipped events), such as: `'sourceName': 'Nathan\u2019s Apple\\xa0Watch',`.\r\n\r\nBut it's successfully making it through the file, and the resulting db opens in datasette, so I'd call that progress.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 771608692, "label": "UNIQUE constraint failed: workouts.id"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/14#issuecomment-798436026", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/14", "id": 798436026, "node_id": "MDEyOklzc3VlQ29tbWVudDc5ODQzNjAyNg==", "user": {"value": 1234956, "label": "n8henrie"}, "created_at": "2021-03-13T14:23:16Z", "updated_at": "2021-03-13T14:23:16Z", "author_association": "NONE", "body": "This PR allows my import to succeed.\r\n\r\nIt looks like some events don't have an `id`, but do have `HKExternalUUID` (which gets turned into `metadata_HKExternalUUID`), so I use this as a fallback.\r\n\r\nIf a record has neither of these, I changed it to just print the record (for debugging) and `return`.\r\n\r\nFor some odd reason this ran fine at first, and now (after removing the generated db and trying again) I'm getting a different error (duplicate column name).\r\n\r\nLooks like it may have run when I had two successive runs without remembering to delete the db in between. Will try to refactor.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 771608692, "label": "UNIQUE constraint failed: workouts.id"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/838#issuecomment-795950636", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/838", "id": 795950636, "node_id": "MDEyOklzc3VlQ29tbWVudDc5NTk1MDYzNg==", "user": {"value": 79913, "label": "tsibley"}, "created_at": "2021-03-10T19:24:13Z", "updated_at": "2021-03-10T19:24:13Z", "author_association": "NONE", "body": "I think this could be solved by one of:\r\n\r\n1. Stop generating absolute URLs, e.g. ones that include an origin. Relative URLs with absolute paths are fine, as long as they take `base_url` into account (as they do now, yay!).\r\n2. 
Extend `base_url` to include the expected frontend origin, and then use that information when generating absolute URLs.\r\n3. Document which HTTP headers the reverse proxy should set (e.g. the `X-Forwarded-*` family of conventional headers) to pass the frontend origin information to Datasette, and then use that information when generating absolute URLs.\r\n\r\nOption 1 seems like the easiest to me, if you can get away with never having to generate an absolute URL.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637395097, "label": "Incorrect URLs when served behind a proxy with base_url set"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/838#issuecomment-795939998", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/838", "id": 795939998, "node_id": "MDEyOklzc3VlQ29tbWVudDc5NTkzOTk5OA==", "user": {"value": 79913, "label": "tsibley"}, "created_at": "2021-03-10T19:16:55Z", "updated_at": "2021-03-10T19:16:55Z", "author_association": "NONE", "body": "Nod. The problem with the tests is that they're ignoring the origin (hostname, port) of links. In a reverse proxy situation, the frontend request origin is different than the backend request origin. The problem is Datasette generates links with the backend request origin.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637395097, "label": "Incorrect URLs when served behind a proxy with base_url set"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/838#issuecomment-795893813", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/838", "id": 795893813, "node_id": "MDEyOklzc3VlQ29tbWVudDc5NTg5MzgxMw==", "user": {"value": 79913, "label": "tsibley"}, "created_at": "2021-03-10T18:43:39Z", "updated_at": "2021-03-10T18:43:39Z", "author_association": "NONE", "body": "@simonw Unfortunately this issue as I reported it is not actually solved in version 0.55.\r\n\r\nEvery link which is returned by the `Datasette.absolute_url` method is still wrong, because it uses the request URL as the base. This still includes the suggested facet links and pagination links.\r\n\r\nWhat I wrote originally still stands:\r\n\r\n> Although many of the URLs in the pages are correct (presumably because they either use absolute paths which include `base_url` or relative paths), the faceting and pagination links still use fully-qualified URLs pointing at `http://localhost:8001`.\r\n> \r\n> I looked into this a little in the source code, and it seems to be an issue anywhere `request.url` or `request.path` is used, as these contain the values for the request between the frontend (Apache) and backend (Datasette) server. 
Those properties are primarily used via the `path_with_\u2026` family of utility functions and the `Datasette.absolute_url` method.\r\n\r\n Would you prefer to re-open this issue or have me create a new one?\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637395097, "label": "Incorrect URLs when served behind a proxy with base_url set"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1256#issuecomment-795085921", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1256", "id": 795085921, "node_id": "MDEyOklzc3VlQ29tbWVudDc5NTA4NTkyMQ==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-03-10T08:35:17Z", "updated_at": "2021-03-10T08:35:17Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1256?src=pr&el=h1) Report\n> Merging [#1256](https://codecov.io/gh/simonw/datasette/pull/1256?src=pr&el=desc) (4eef524) into [main](https://codecov.io/gh/simonw/datasette/commit/d0fd833b8cdd97e1b91d0f97a69b494895d82bee?el=desc) (d0fd833) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1256/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1256?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1256 +/- ##\n=======================================\n Coverage 91.56% 91.56% \n=======================================\n Files 34 34 \n Lines 4244 4244 \n=======================================\n Hits 3886 3886 \n Misses 358 358 \n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1256?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1256?src=pr&el=footer). Last update [d0fd833...4eef524](https://codecov.io/gh/simonw/datasette/pull/1256?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 827341657, "label": "Minor type in IP adress"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1254#issuecomment-794518438", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1254", "id": 794518438, "node_id": "MDEyOklzc3VlQ29tbWVudDc5NDUxODQzOA==", "user": {"value": 3200608, "label": "durkie"}, "created_at": "2021-03-09T22:04:23Z", "updated_at": "2021-03-09T22:04:23Z", "author_association": "NONE", "body": "Dang, you're absolutely right. 
Spatialite 5.0 had been working fine for a plugin I was developing, but it apparently is broken in several other ways.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 826613352, "label": "Update Docker Spatialite version to 5.0.1 + add support for Spatialite topology functions"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1254#issuecomment-794441034", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1254", "id": 794441034, "node_id": "MDEyOklzc3VlQ29tbWVudDc5NDQ0MTAzNA==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-03-09T20:54:18Z", "updated_at": "2021-03-09T21:12:15Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1254?src=pr&el=h1) Report\n> Merging [#1254](https://codecov.io/gh/simonw/datasette/pull/1254?src=pr&el=desc) (b103204) into [main](https://codecov.io/gh/simonw/datasette/commit/d0fd833b8cdd97e1b91d0f97a69b494895d82bee?el=desc) (d0fd833) will **decrease** coverage by `0.04%`.\n> The diff coverage is `n/a`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1254/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1254?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1254 +/- ##\n==========================================\n- Coverage 91.56% 91.51% -0.05% \n==========================================\n Files 34 34 \n Lines 4244 4244 \n==========================================\n- Hits 3886 3884 -2 \n- Misses 358 360 +2 \n```\n\n\n| [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1254?src=pr&el=tree) | Coverage \u0394 | |\n|---|---|---|\n| [datasette/database.py](https://codecov.io/gh/simonw/datasette/pull/1254/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2RhdGFiYXNlLnB5) | `92.93% <0.00%> (-0.75%)` | :arrow_down: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1254?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1254?src=pr&el=footer). Last update [d0fd833...b103204](https://codecov.io/gh/simonw/datasette/pull/1254?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 826613352, "label": "Update Docker Spatialite version to 5.0.1 + add support for Spatialite topology functions"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1254#issuecomment-794443710", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1254", "id": 794443710, "node_id": "MDEyOklzc3VlQ29tbWVudDc5NDQ0MzcxMA==", "user": {"value": 3200608, "label": "durkie"}, "created_at": "2021-03-09T20:56:45Z", "updated_at": "2021-03-09T20:56:45Z", "author_association": "NONE", "body": "Oh wow I didn't even see that you had opened an issue about this so recently. 
I'll check on `/dbname` and report back.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 826613352, "label": "Update Docker Spatialite version to 5.0.1 + add support for Spatialite topology functions"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1252#issuecomment-793308483", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1252", "id": 793308483, "node_id": "MDEyOklzc3VlQ29tbWVudDc5MzMwODQ4Mw==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-03-09T03:06:10Z", "updated_at": "2021-03-09T03:06:10Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1252?src=pr&el=h1) Report\n> Merging [#1252](https://codecov.io/gh/simonw/datasette/pull/1252?src=pr&el=desc) (d22aa32) into [main](https://codecov.io/gh/simonw/datasette/commit/d0fd833b8cdd97e1b91d0f97a69b494895d82bee?el=desc) (d0fd833) will **decrease** coverage by `0.04%`.\n> The diff coverage is `n/a`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1252/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1252?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1252 +/- ##\n==========================================\n- Coverage 91.56% 91.51% -0.05% \n==========================================\n Files 34 34 \n Lines 4244 4244 \n==========================================\n- Hits 3886 3884 -2 \n- Misses 358 360 +2 \n```\n\n\n| [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1252?src=pr&el=tree) | Coverage \u0394 | |\n|---|---|---|\n| [datasette/database.py](https://codecov.io/gh/simonw/datasette/pull/1252/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2RhdGFiYXNlLnB5) | `92.93% <0.00%> (-0.75%)` | :arrow_down: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1252?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1252?src=pr&el=footer). Last update [d0fd833...d22aa32](https://codecov.io/gh/simonw/datasette/pull/1252?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 825217564, "label": "Add back styling to lists within table cells (fixes #1141)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/858#issuecomment-792308036", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/858", "id": 792308036, "node_id": "MDEyOklzc3VlQ29tbWVudDc5MjMwODAzNg==", "user": {"value": 1219001, "label": "robroc"}, "created_at": "2021-03-07T16:41:54Z", "updated_at": "2021-03-07T16:41:54Z", "author_association": "NONE", "body": "Apologies if I sound dense but I don't see where you would pass\n'shell=True'. I'm using the CLI installed via pip.\n\nOn Sun., Mar. 7, 2021, 2:15 a.m. David Smith, \nwrote:\n\n> To get it to work I had to:\n>\n> -\n>\n> add shell=true to the various commands in datasette\n> -\n>\n> use the name argument of the publish command. 
(\n> https://docs.datasette.io/en/stable/publish.html)\n>\n> \u2014\n> You are receiving this because you commented.\n> Reply to this email directly, view it on GitHub\n> ,\n> or unsubscribe\n> \n> .\n>\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642388564, "label": "publish heroku does not work on Windows 10"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/858#issuecomment-792230560", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/858", "id": 792230560, "node_id": "MDEyOklzc3VlQ29tbWVudDc5MjIzMDU2MA==", "user": {"value": 39445562, "label": "smithdc1"}, "created_at": "2021-03-07T07:14:58Z", "updated_at": "2021-03-07T07:14:58Z", "author_association": "NONE", "body": "To get it to work I had to:\n\n- add `shell=true` to the various commands in datasette \n\n- use the name argument of the publish command. (https://docs.datasette.io/en/stable/publish.html)\n\n\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642388564, "label": "publish heroku does not work on Windows 10"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/858#issuecomment-792129022", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/858", "id": 792129022, "node_id": "MDEyOklzc3VlQ29tbWVudDc5MjEyOTAyMg==", "user": {"value": 1219001, "label": "robroc"}, "created_at": "2021-03-07T00:23:34Z", "updated_at": "2021-03-07T00:23:34Z", "author_association": "NONE", "body": "@smithdc1 Can you tell us what you did to get it to publish in Windows? What commands did you pass?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 642388564, "label": "publish heroku does not work on Windows 10"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-791530093", "issue_url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5", "id": 791530093, "node_id": "MDEyOklzc3VlQ29tbWVudDc5MTUzMDA5Mw==", "user": {"value": 306240, "label": "UtahDave"}, "created_at": "2021-03-05T16:28:07Z", "updated_at": "2021-03-05T16:28:07Z", "author_association": "NONE", "body": "> I just tried to run this on a small VPS instance with 2GB of memory and it crashed out of memory while processing a 12GB mbox from Takeout.\r\n> \r\n> Is it possible to stream the emails to sqlite instead of loading it all into memory and upserting at once?\r\n\r\n@maxhawkins a limitation of the python mbox module is it loads the entire mbox into memory. I did find another approach to this problem that didn't use the builtin python mbox module and created a generator so that it didn't have to load the whole mbox into memory. I was hoping to use standard library modules, but this might be a good reason to investigate that approach a bit more. My worry is making sure a custom processor handles all the ins and outs of the mbox format correctly.\r\n\r\nHm. As I'm writing this, I thought of something. I think I can parse each message one at a time, and then use an mbox function to load each message using the python mbox module. 
That way the mbox module can still deal with the specifics of the mbox format, but I can use a generator.\r\n\r\nI'll give that a try. Thanks for the feedback @maxhawkins and @simonw. I'll give that a try.\r\n\r\n@simonw can we hold off on merging this until I can test this new approach?", "reactions": "{\"total_count\": 3, \"+1\": 3, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 813880401, "label": "WIP: Add Gmail takeout mbox import"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-791089881", "issue_url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5", "id": 791089881, "node_id": "MDEyOklzc3VlQ29tbWVudDc5MTA4OTg4MQ==", "user": {"value": 28565, "label": "maxhawkins"}, "created_at": "2021-03-05T02:03:19Z", "updated_at": "2021-03-05T02:03:19Z", "author_association": "NONE", "body": "I just tried to run this on a small VPS instance with 2GB of memory and it crashed out of memory while processing a 12GB mbox from Takeout.\r\n\r\nIs it possible to stream the emails to sqlite instead of loading it all into memory and upserting at once?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 813880401, "label": "WIP: Add Gmail takeout mbox import"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/32#issuecomment-791053721", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/32", "id": 791053721, "node_id": "MDEyOklzc3VlQ29tbWVudDc5MTA1MzcyMQ==", "user": {"value": 6213, "label": "dsisnero"}, "created_at": "2021-03-05T00:31:27Z", "updated_at": "2021-03-05T00:31:27Z", "author_association": "NONE", "body": "I am getting the same thing for US West (N. California) us-west-1", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 803333769, "label": "KeyError: 'Contents' on running upload"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/google-takeout-to-sqlite/issues/4#issuecomment-790934616", "issue_url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4", "id": 790934616, "node_id": "MDEyOklzc3VlQ29tbWVudDc5MDkzNDYxNg==", "user": {"value": 203343, "label": "Btibert3"}, "created_at": "2021-03-04T20:54:44Z", "updated_at": "2021-03-04T20:54:44Z", "author_association": "NONE", "body": "Sorry for the delay, I got sidetracked after class last night. 
I am getting the following error:\r\n\r\n```\r\n/content# google-takeout-to-sqlite mbox takeout.db Takeout/Mail/gmail.mbox \r\nUsage: google-takeout-to-sqlite [OPTIONS] COMMAND [ARGS]...Try 'google-takeout-to-sqlite --help' for help.\r\n\r\nError: No such command 'mbox'.\r\n```\r\n\r\nOn the box, I installed with pip after cloning: https://github.com/UtahDave/google-takeout-to-sqlite.git", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778380836, "label": "Feature Request: Gmail"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1238#issuecomment-790857004", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1238", "id": 790857004, "node_id": "MDEyOklzc3VlQ29tbWVudDc5MDg1NzAwNA==", "user": {"value": 79913, "label": "tsibley"}, "created_at": "2021-03-04T19:06:55Z", "updated_at": "2021-03-04T19:06:55Z", "author_association": "NONE", "body": "@rgieseke Ah, that's super helpful. Thank you for the workaround for now!", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 813899472, "label": "Custom pages don't work with base_url setting"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790391711", "issue_url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5", "id": 790391711, "node_id": "MDEyOklzc3VlQ29tbWVudDc5MDM5MTcxMQ==", "user": {"value": 306240, "label": "UtahDave"}, "created_at": "2021-03-04T07:36:24Z", "updated_at": "2021-03-04T07:36:24Z", "author_association": "NONE", "body": "> Looks like you're doing this:\r\n> \r\n> ```python\r\n> elif message.get_content_type() == \"text/plain\":\r\n> body = message.get_payload(decode=True)\r\n> ```\r\n> \r\n> So presumably that decodes to a unicode string?\r\n> \r\n> I imagine the reason the column is a `BLOB` for me is that `sqlite-utils` determines the column type based on the first batch of items - https://github.com/simonw/sqlite-utils/blob/09c3386f55f766b135b6a1c00295646c4ae29bec/sqlite_utils/db.py#L1927-L1928 - and I got unlucky and had something in my first batch that wasn't a unicode string.\r\n\r\nAh, that's good to know. I think explicitly creating the tables will be a great improvement. I'll add that.\r\n\r\nAlso, I noticed after I opened this PR that the `message.get_payload()` is being deprecated in favor of `message.get_content()` or something like that. I'll see if that handles the decoding better, too.\r\n\r\nThanks for the feedback. 
I should have time tomorrow to put together some improvements.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 813880401, "label": "WIP: Add Gmail takeout mbox import"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790389335", "issue_url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5", "id": 790389335, "node_id": "MDEyOklzc3VlQ29tbWVudDc5MDM4OTMzNQ==", "user": {"value": 306240, "label": "UtahDave"}, "created_at": "2021-03-04T07:32:04Z", "updated_at": "2021-03-04T07:32:04Z", "author_association": "NONE", "body": "> The command takes quite a while to start running, presumably because this line causes it to have to scan the WHOLE file in order to generate a count:\r\n> \r\n> https://github.com/dogsheep/google-takeout-to-sqlite/blob/a3de045eba0fae4b309da21aa3119102b0efc576/google_takeout_to_sqlite/utils.py#L66-L67\r\n> \r\n> I'm fine with waiting though. It's not like this is a command people run every day - and without that count we can't show a progress bar, which seems pretty important for a process that takes this long.\r\n\r\nThe wait is from python loading the mbox file. This happens regardless if you're getting the length of the mbox. The mbox module is on the slow side. It is possible to do one's own parsing of the mbox, but I kind of wanted to avoid doing that.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 813880401, "label": "WIP: Add Gmail takeout mbox import"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/268#issuecomment-790257263", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/268", "id": 790257263, "node_id": "MDEyOklzc3VlQ29tbWVudDc5MDI1NzI2Mw==", "user": {"value": 649467, "label": "mhalle"}, "created_at": "2021-03-04T03:20:23Z", "updated_at": "2021-03-04T03:20:23Z", "author_association": "NONE", "body": "It's kind of an ugly hack, but you can try out what using the fts5 table as an actual datasette-accessible table looks like without changing any datasette code by creating yet another view on top of the fts5 table:\r\n\r\n`create view proxyview as select *, rank, table_fts as fts from table_fts;`\r\n\r\nThat's now visible from datasette, just like any other view, but you can use `fts match escape_fts(search_string) order by rank`.\r\n\r\nThis is only good as a proof of concept because you're inefficiently going from view -> fts5 external content table -> view -> data table. However, it does show it works.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323718842, "label": "Mechanism for ranking results from SQLite full-text search"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/google-takeout-to-sqlite/issues/4#issuecomment-790198930", "issue_url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4", "id": 790198930, "node_id": "MDEyOklzc3VlQ29tbWVudDc5MDE5ODkzMA==", "user": {"value": 203343, "label": "Btibert3"}, "created_at": "2021-03-04T00:58:40Z", "updated_at": "2021-03-04T00:58:40Z", "author_association": "NONE", "body": "I am just seeing this sorry, yes! I will kick the tires later on tonight. 
My apologies for the delay.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778380836, "label": "Feature Request: Gmail"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/283#issuecomment-789680230", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/283", "id": 789680230, "node_id": "MDEyOklzc3VlQ29tbWVudDc4OTY4MDIzMA==", "user": {"value": 605492, "label": "justinpinkney"}, "created_at": "2021-03-03T12:28:42Z", "updated_at": "2021-03-03T12:28:42Z", "author_association": "NONE", "body": "One note on using this pragma I got an error on starting datasette `no such table: pragma_database_list`. \r\n\r\nI diagnosed this to an older version of sqlite3 (3.14.2) and upgrading to a newer version (3.34.2) fixed the issue.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 325958506, "label": "Support cross-database joins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/268#issuecomment-789409126", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/268", "id": 789409126, "node_id": "MDEyOklzc3VlQ29tbWVudDc4OTQwOTEyNg==", "user": {"value": 649467, "label": "mhalle"}, "created_at": "2021-03-03T03:57:15Z", "updated_at": "2021-03-03T03:58:40Z", "author_association": "NONE", "body": "In FTS5, I think doing an FTS search is actually much easier than doing a join against the main table like datasette does now. In fact, FTS5 external content tables provide a transparent interface back to the original table or view.\r\n\r\nHere's what I'm currently doing:\r\n* build a view that joins whatever tables I want and rename the columns to non-joiny names (e.g, `chapter.name AS chapter_name` in the view where needed)\r\n* Create an FTS5 table with `content=\"viewname\"`\r\n* As described in the \"external content tables\" section (https://www.sqlite.org/fts5.html#external_content_tables), sql queries can be made directly to the FTS table, which behind the covers makes select calls to the content table when the content of the original columns are needed.\r\n* In addition, you get \"rank\" and \"bm25()\" available to you when you select on the _fts table.\r\n\r\nUnfortunately, datasette doesn't currently seem happy being coerced into doing a real query on an fts5 table. This works:\r\n```select col1, col2, col3 from table_fts where coll1=\"value\" and table_fts match escape_fts(\"search term\") order by rank```\r\n\r\nBut this doesn't work in the datasette SQL query interface:\r\n```select col1, col2, col3 from table_fts where coll1=\"value\" and table_fts match escape_fts(:search) order by rank``` (the \"search\" input text field doesn't show up)\r\n\r\nFor what datasette is doing right now, I think you could just use contentless fts5 tables (`content=\"\"`), since all you care about is the rowid since all you're doing a subselect to get the rowid anyway. 
In fts5, that's just a contentless table.\r\n\r\nI guess if you want to follow this suggestion, you'd need a somewhat different code path for fts5.\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323718842, "label": "Mechanism for ranking results from SQLite full-text search"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/242#issuecomment-787150276", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/242", "id": 787150276, "node_id": "MDEyOklzc3VlQ29tbWVudDc4NzE1MDI3Ng==", "user": {"value": 37962604, "label": "polyrand"}, "created_at": "2021-02-27T21:27:26Z", "updated_at": "2021-02-27T21:27:26Z", "author_association": "NONE", "body": "I had this resource by Seth Michael Larson saved https://github.com/sethmlarson/pycon-async-sync-poster I haven't had a look at it, but it may contain useful info.\r\n\r\nOn twitter, I mentioned passing an aiosqlite connection during the `Database` creation. I'm not 100% familiar with the `sqlite-utils` codebase, so I may be wrong here, but maybe decorating internal functions could be an option? Then they are awaited or not inside the decorator depending on how they are called.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 817989436, "label": "Async support"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1243#issuecomment-785485597", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1243", "id": 785485597, "node_id": "MDEyOklzc3VlQ29tbWVudDc4NTQ4NTU5Nw==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-02-25T00:28:30Z", "updated_at": "2021-02-25T00:28:30Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1243?src=pr&el=h1) Report\n> Merging [#1243](https://codecov.io/gh/simonw/datasette/pull/1243?src=pr&el=desc) (887bfd2) into [main](https://codecov.io/gh/simonw/datasette/commit/726f781c50e88f557437f6490b8479c3d6fabfc2?el=desc) (726f781) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1243/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1243?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1243 +/- ##\n=======================================\n Coverage 91.56% 91.56% \n=======================================\n Files 34 34 \n Lines 4242 4242 \n=======================================\n Hits 3884 3884 \n Misses 358 358 \n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1243?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1243?src=pr&el=footer). Last update [726f781...32652d9](https://codecov.io/gh/simonw/datasette/pull/1243?src=pr&el=lastupdated). 
Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 815955014, "label": "fix small typo"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-784638394", "issue_url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5", "id": 784638394, "node_id": "MDEyOklzc3VlQ29tbWVudDc4NDYzODM5NA==", "user": {"value": 306240, "label": "UtahDave"}, "created_at": "2021-02-24T00:36:18Z", "updated_at": "2021-02-24T00:36:18Z", "author_association": "NONE", "body": "I noticed that @simonw is using black for formatting. I ran black on my additions in this PR.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 813880401, "label": "WIP: Add Gmail takeout mbox import"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1241#issuecomment-784347646", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1241", "id": 784347646, "node_id": "MDEyOklzc3VlQ29tbWVudDc4NDM0NzY0Ng==", "user": {"value": 7107523, "label": "Kabouik"}, "created_at": "2021-02-23T16:55:26Z", "updated_at": "2021-02-23T16:57:39Z", "author_association": "NONE", "body": "> I think it's possible that many users these days no longer assume they can paste a URL from the browser address bar (if they ever understood that at all) because to many apps are SPAs with broken URLs.\r\n\r\nAbsolutely, that's why I thought my corner case with `iframe` preventing access to the datasette URL could actually be relevant in more general situations.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 814595021, "label": "Share button for copying current URL"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1240#issuecomment-784312460", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1240", "id": 784312460, "node_id": "MDEyOklzc3VlQ29tbWVudDc4NDMxMjQ2MA==", "user": {"value": 7107523, "label": "Kabouik"}, "created_at": "2021-02-23T16:07:10Z", "updated_at": "2021-02-23T16:08:28Z", "author_association": "NONE", "body": "Likewise, while answering to another issue regarding the Vega plugin, I realized that there is no such way of linking rows after a custom query, I only get this \"Link\" column with individual URLs for the default SQL view:\r\n\r\n![ss-2021-02-23_170559](https://user-images.githubusercontent.com/7107523/108871491-1e3fd500-75f1-11eb-8f76-5d5a82cc14d7.png)\r\n\r\nOr is it there and I am just missing the option in my custom queries?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 814591962, "label": "Allow facetting on custom queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1218#issuecomment-784157345", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1218", "id": 784157345, "node_id": "MDEyOklzc3VlQ29tbWVudDc4NDE1NzM0NQ==", "user": {"value": 1244799, "label": "soobrosa"}, "created_at": "2021-02-23T12:12:17Z", "updated_at": "2021-02-23T12:12:17Z", 
"author_association": "NONE", "body": "Topline this fixed the same problem for me.\r\n```\r\nbrew install python@3.7\r\nln -s /usr/local/opt/python@3.7/bin/python3.7 /usr/local/opt/python/bin/python3.7\r\npip3 uninstall -y numpy\r\npip3 uninstall -y setuptools\r\npip3 install setuptools\r\npip3 install numpy\r\npip3 install datasette-publish-fly\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 803356942, "label": " /usr/local/opt/python3/bin/python3.6: bad interpreter: No such file or directory"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-783794520", "issue_url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5", "id": 783794520, "node_id": "MDEyOklzc3VlQ29tbWVudDc4Mzc5NDUyMA==", "user": {"value": 306240, "label": "UtahDave"}, "created_at": "2021-02-23T01:13:54Z", "updated_at": "2021-02-23T01:13:54Z", "author_association": "NONE", "body": "Also, @simonw I created a test based off the existing tests. I think it's working correctly", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 813880401, "label": "WIP: Add Gmail takeout mbox import"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/google-takeout-to-sqlite/issues/4#issuecomment-783688547", "issue_url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4", "id": 783688547, "node_id": "MDEyOklzc3VlQ29tbWVudDc4MzY4ODU0Nw==", "user": {"value": 306240, "label": "UtahDave"}, "created_at": "2021-02-22T21:31:28Z", "updated_at": "2021-02-22T21:31:28Z", "author_association": "NONE", "body": "@Btibert3 I've opened a PR with my initial attempt at this. Would you be willing to give this a try?\r\n\r\nhttps://github.com/dogsheep/google-takeout-to-sqlite/pull/5", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778380836, "label": "Feature Request: Gmail"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/220#issuecomment-783662968", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/220", "id": 783662968, "node_id": "MDEyOklzc3VlQ29tbWVudDc4MzY2Mjk2OA==", "user": {"value": 649467, "label": "mhalle"}, "created_at": "2021-02-22T20:44:51Z", "updated_at": "2021-02-22T20:44:51Z", "author_association": "NONE", "body": "Actually, coming back to this, I have a clearer use case for enabling fts generation for views: making it easier to bring in text from lookup tables and other joins. \r\n\r\nThe datasette documentation describes populating an fts table like so:\r\n```\r\nINSERT INTO \"items_fts\" (rowid, name, description, category_name)\r\n SELECT items. rowid,\r\n items.name,\r\n items.description,\r\n categories.name\r\n FROM items JOIN categories ON items.category_id=categories.id;\r\n```\r\nAlternatively if you have fts support in sqlite_utils for views (which sqlite and fts5 support), you can do the same thing just by creating a view that captures the above joins as columns, then creating an fts table from that view. Such an fts table can be created using sqlite_utils, where one created with your method can't. 
\r\n\r\nThe resulting fts table can then be used by a whole family of related tables and views in the manner you described earlier in this issue. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 783778672, "label": "Better error message for *_fts methods against views"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1166#issuecomment-783560017", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1166", "id": 783560017, "node_id": "MDEyOklzc3VlQ29tbWVudDc4MzU2MDAxNw==", "user": {"value": 94334, "label": "thorn0"}, "created_at": "2021-02-22T18:00:57Z", "updated_at": "2021-02-22T18:13:11Z", "author_association": "NONE", "body": "Hi! I don't think Prettier supports this syntax for globs: `datasette/static/*[!.min].js` Are you sure that works?\r\nPrettier uses https://github.com/mrmlnc/fast-glob, which in turn uses https://github.com/micromatch/micromatch, and the docs for these packages don't mention this syntax. As per the docs, square brackets should work as in regexes (`foo-[1-5].js`).\r\n\r\nTested it. Apparently, it works as a negated character class in regexes (like `[^.min]`). I wonder where this syntax comes from. Micromatch doesn't support that:\r\n\r\n```js\r\nmicromatch(['static/table.js', 'static/n.js'], ['static/*[!.min].js']);\r\n// result: [\"static/n.js\"] -- brackets are treated like [!.min] in regexes, without negation\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777140799, "label": "Adopt Prettier for JavaScript code formatting"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/782#issuecomment-783265830", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/782", "id": 783265830, "node_id": "MDEyOklzc3VlQ29tbWVudDc4MzI2NTgzMA==", "user": {"value": 30665, "label": "frankieroberto"}, "created_at": "2021-02-22T10:21:14Z", "updated_at": "2021-02-22T10:21:14Z", "author_association": "NONE", "body": "@simonw:\r\n\r\n> The problem there is that ?_size=x isn't actually doing the same thing as the SQL limit keyword.\r\n\r\nInteresting! Although I don't think it matters too much what the underlying implementation is - I more meant that `limit` is familiar to developers conceptually as \"up to and including this number, if they exist\", whereas \"size\" is potentially more ambiguous. 
However, it's probably no big deal either way.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 627794879, "label": "Redesign default .json format"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/782#issuecomment-782756398", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/782", "id": 782756398, "node_id": "MDEyOklzc3VlQ29tbWVudDc4Mjc1NjM5OA==", "user": {"value": 601316, "label": "simonrjones"}, "created_at": "2021-02-20T22:05:48Z", "updated_at": "2021-02-20T22:05:48Z", "author_association": "NONE", "body": "> I think it\u2019s a good idea if the top level item of the response JSON is always an object, rather than an array, at least as the default.\n\nI agree it is more predictable if the top level item is an object with a rows or data object that contains an array of data, which then allows for other top-level meta data. \n\nI can see the argument for removing this and just using an array for convenience - but I think that's OK as an option (as you have now).\n\nRather than have lots of top-level keys you could have a \"meta\" object to contain non-data stuff. You could use something like \"links\" for API endpoint URLs (or use a standard like HAL). Which would then leave the top level a bit cleaner - if that's what you what. \n\nHave you had much feedback from users who use the Datasette API a lot?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 627794879, "label": "Redesign default .json format"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/782#issuecomment-782746755", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/782", "id": 782746755, "node_id": "MDEyOklzc3VlQ29tbWVudDc4Mjc0Njc1NQ==", "user": {"value": 30665, "label": "frankieroberto"}, "created_at": "2021-02-20T20:44:05Z", "updated_at": "2021-02-20T20:44:05Z", "author_association": "NONE", "body": "Minor suggestion: rename `size` query param to `limit`, to better reflect that it\u2019s a maximum number of rows returned rather than a guarantee of getting that number, and also for consistency with the SQL keyword?\r\n\r\nI like the idea of specifying a limit of 0 if you don\u2019t want any rows data - and returning an empty array under the `rows` key seems fine.\r\n\r\nHave you given any thought as to whether to pretty print (format with spaces) the output or not? Can be useful for debugging/exploring in a browser or other basic tools which don\u2019t parse the JSON. Could be default (can\u2019t be much bigger with gzip?) 
or opt-in.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 627794879, "label": "Redesign default .json format"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/782#issuecomment-782745199", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/782", "id": 782745199, "node_id": "MDEyOklzc3VlQ29tbWVudDc4Mjc0NTE5OQ==", "user": {"value": 30665, "label": "frankieroberto"}, "created_at": "2021-02-20T20:32:03Z", "updated_at": "2021-02-20T20:32:03Z", "author_association": "NONE", "body": "I think it\u2019s a good idea if the top level item of the response JSON is always an object, rather than an array, at least as the default. Mainly because it allows you to add extra keys in a backwards-compatible way. Also just seems more expected somehow.\r\n\r\nThe API design guidance for the UK government also recommends this: https://www.gov.uk/guidance/gds-api-technical-and-data-standards#use-json\r\n\r\nI also strongly dislike having versioned APIs (eg with a `/v1/` path prefix, as it invariably means that old versions stop working at some point, even though the bit of the API you\u2019re using might not have changed at all.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 1}", "issue": {"value": 627794879, "label": "Redesign default .json format"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1232#issuecomment-781599929", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1232", "id": 781599929, "node_id": "MDEyOklzc3VlQ29tbWVudDc4MTU5OTkyOQ==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-02-18T19:59:54Z", "updated_at": "2021-02-18T22:06:42Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1232?src=pr&el=h1) Report\n> Merging [#1232](https://codecov.io/gh/simonw/datasette/pull/1232?src=pr&el=desc) (8876499) into [main](https://codecov.io/gh/simonw/datasette/commit/4df548e7668b5b21d64a267964951e67894f4712?el=desc) (4df548e) will **increase** coverage by `0.03%`.\n> The diff coverage is `100.00%`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1232/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1232?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1232 +/- ##\n==========================================\n+ Coverage 91.42% 91.46% +0.03% \n==========================================\n Files 32 32 \n Lines 3955 3970 +15 \n==========================================\n+ Hits 3616 3631 +15 \n Misses 339 339 \n```\n\n\n| [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1232?src=pr&el=tree) | Coverage \u0394 | |\n|---|---|---|\n| [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1232/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `95.68% <100.00%> (+0.06%)` | :arrow_up: |\n| [datasette/cli.py](https://codecov.io/gh/simonw/datasette/pull/1232/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2NsaS5weQ==) | `76.62% <100.00%> (+0.36%)` | :arrow_up: |\n| [datasette/views/database.py](https://codecov.io/gh/simonw/datasette/pull/1232/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL2RhdGFiYXNlLnB5) | `97.19% <100.00%> (+0.01%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at 
Codecov](https://codecov.io/gh/simonw/datasette/pull/1232?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1232?src=pr&el=footer). Last update [4df548e...8876499](https://codecov.io/gh/simonw/datasette/pull/1232?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 811407131, "label": "--crossdb option for joining across databases"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/google-takeout-to-sqlite/issues/4#issuecomment-781451701", "issue_url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4", "id": 781451701, "node_id": "MDEyOklzc3VlQ29tbWVudDc4MTQ1MTcwMQ==", "user": {"value": 203343, "label": "Btibert3"}, "created_at": "2021-02-18T16:06:21Z", "updated_at": "2021-02-18T16:06:21Z", "author_association": "NONE", "body": "Awesome!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778380836, "label": "Feature Request: Gmail"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1230#issuecomment-781330466", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1230", "id": 781330466, "node_id": "MDEyOklzc3VlQ29tbWVudDc4MTMzMDQ2Ng==", "user": {"value": 7107523, "label": "Kabouik"}, "created_at": "2021-02-18T13:06:22Z", "updated_at": "2021-02-18T15:22:15Z", "author_association": "NONE", "body": "[Edit] Oh, I just saw the \"Load all\" button under the cluster map as well as the [setting to alter the max number or results](https://docs.datasette.io/en/stable/settings.html#max-returned-rows). So I guess this issue only is about the Vega charts.\r\n\r\n
\r\nNote that datasette-cluster-map also seems to be limited to 998 displayed points: \r\n\r\n![ss-2021-02-18_140548](https://user-images.githubusercontent.com/7107523/108361225-15fb2a80-71ea-11eb-9a19-d885e8513f55.png)\r\n
", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 811054000, "label": "Vega charts are plotted only for rows on the visible page, cluster maps only for rows in the remaining pages"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/283#issuecomment-780991910", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/283", "id": 780991910, "node_id": "MDEyOklzc3VlQ29tbWVudDc4MDk5MTkxMA==", "user": {"value": 9308268, "label": "rayvoelker"}, "created_at": "2021-02-18T02:13:56Z", "updated_at": "2021-02-18T02:13:56Z", "author_association": "NONE", "body": "I was going ask you about this issue when we talk during your office-hours schedule this Friday, but was there any support ever added for doing this cross-database joining?\r\n\r\nI have a use-case where could be pretty neat to do analysis using this tool on time-specific databases from snapshots\r\n\r\nhttps://ilsweb.cincinnatilibrary.org/collection-analysis/\r\n\r\n![image](https://user-images.githubusercontent.com/9308268/108294883-ba3a8e00-7164-11eb-9206-fcd5a8cdd883.png)\r\n\r\nand thanks again for such an amazing tool!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 325958506, "label": "Support cross-database joins"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/google-takeout-to-sqlite/issues/4#issuecomment-780817596", "issue_url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4", "id": 780817596, "node_id": "MDEyOklzc3VlQ29tbWVudDc4MDgxNzU5Ng==", "user": {"value": 306240, "label": "UtahDave"}, "created_at": "2021-02-17T20:01:35Z", "updated_at": "2021-02-17T20:01:35Z", "author_association": "NONE", "body": "I've got this almost working. 
Just needs some polish", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778380836, "label": "Feature Request: Gmail"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/227#issuecomment-779785638", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/227", "id": 779785638, "node_id": "MDEyOklzc3VlQ29tbWVudDc3OTc4NTYzOA==", "user": {"value": 295329, "label": "camallen"}, "created_at": "2021-02-16T11:48:03Z", "updated_at": "2021-02-16T11:48:03Z", "author_association": "NONE", "body": "Thank you @simonw ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 807174161, "label": "Error reading csv files with large column data"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1220#issuecomment-778467759", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1220", "id": 778467759, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODQ2Nzc1OQ==", "user": {"value": 30607, "label": "aborruso"}, "created_at": "2021-02-12T21:35:17Z", "updated_at": "2021-02-12T21:35:17Z", "author_association": "NONE", "body": "Thank you", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 806743116, "label": "Installing datasette via docker: Path 'fixtures.db' does not exist"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778014990", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33", "id": 778014990, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODAxNDk5MA==", "user": {"value": 675335, "label": "leafgarland"}, "created_at": "2021-02-12T06:54:14Z", "updated_at": "2021-02-12T06:54:14Z", "author_association": "NONE", "body": "Ahh, that might be because macOS Big Sur has changed the structure of the photos db. Might need to wait for a later release, there is a PR which adds support for Big Sur. 
", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 803338729, "label": "photo-to-sqlite: command not found"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1220#issuecomment-778008752", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1220", "id": 778008752, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODAwODc1Mg==", "user": {"value": 30607, "label": "aborruso"}, "created_at": "2021-02-12T06:37:34Z", "updated_at": "2021-02-12T06:37:34Z", "author_association": "NONE", "body": "I have used my path, I'm running it from the folder in wich I have the db.\n\nDo I must an absolute path?\n\nDo I must create exactly that folder?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 806743116, "label": "Installing datasette via docker: Path 'fixtures.db' does not exist"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778002092", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33", "id": 778002092, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODAwMjA5Mg==", "user": {"value": 11855322, "label": "robmarkcole"}, "created_at": "2021-02-12T06:19:32Z", "updated_at": "2021-02-12T06:19:32Z", "author_association": "NONE", "body": "hi @leafgarland that results in a new error:\r\n```\r\n(venv) (base) Robins-MacBook:datasette robin$ dogsheep-photos apple-photos photos.db\r\nTraceback (most recent call last):\r\n File \"/Users/robin/datasette/venv/bin/dogsheep-photos\", line 8, in \r\n sys.exit(cli())\r\n File \"/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py\", line 829, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py\", line 1259, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py\", line 1066, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py\", line 610, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/Users/robin/datasette/venv/lib/python3.8/site-packages/dogsheep_photos/cli.py\", line 206, in apple_photos\r\n db.conn.execute(\r\nsqlite3.OperationalError: no such table: attached.ZGENERICASSET\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 803338729, "label": "photo-to-sqlite: command not found"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-777951854", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33", "id": 777951854, "node_id": "MDEyOklzc3VlQ29tbWVudDc3Nzk1MTg1NA==", "user": {"value": 675335, "label": "leafgarland"}, "created_at": "2021-02-12T03:54:39Z", "updated_at": "2021-02-12T03:54:39Z", "author_association": "NONE", "body": "I think that is a typo in the docs, you can use\r\n\r\n > dogsheep-photos apple-photos photos.db", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, 
\"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 803338729, "label": "photo-to-sqlite: command not found"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1223#issuecomment-777949755", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1223", "id": 777949755, "node_id": "MDEyOklzc3VlQ29tbWVudDc3Nzk0OTc1NQ==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-02-12T03:45:31Z", "updated_at": "2021-02-12T03:45:31Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=h1) Report\n> Merging [#1223](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=desc) (d1cd1f2) into [main](https://codecov.io/gh/simonw/datasette/commit/9603d893b9b72653895318c9104d754229fdb146?el=desc) (9603d89) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1223/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1223 +/- ##\n=======================================\n Coverage 91.42% 91.42% \n=======================================\n Files 32 32 \n Lines 3955 3955 \n=======================================\n Hits 3616 3616 \n Misses 339 339 \n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=footer). Last update [9603d89...d1cd1f2](https://codecov.io/gh/simonw/datasette/pull/1223?src=pr&el=lastupdated). 
Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 806918878, "label": "Add compile option to Dockerfile to fix failing test (fixes #696)"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/evernote-to-sqlite/issues/11#issuecomment-777690332", "issue_url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/11", "id": 777690332, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NzY5MDMzMg==", "user": {"value": 3613583, "label": "dskrad"}, "created_at": "2021-02-11T18:16:01Z", "updated_at": "2021-02-11T18:16:01Z", "author_association": "NONE", "body": "I solved this issue by modifying line 31 of utils.py\r\n\r\n content = ET.tostring(ET.fromstring(content_xml.strip())).decode(\"utf-8\")", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 792851444, "label": "XML parse error"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/pocket-to-sqlite/issues/9#issuecomment-774730656", "issue_url": "https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/9", "id": 774730656, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NDczMDY1Ng==", "user": {"value": 635179, "label": "merwok"}, "created_at": "2021-02-07T18:45:04Z", "updated_at": "2021-02-07T18:45:04Z", "author_association": "NONE", "body": "That URL uses TLS 1.3, but maybe only if the client supports it.\r\nIt could be your Python version or your SSL library that\u2019s not recent enough.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 801780625, "label": "SSL Error"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/pocket-to-sqlite/issues/9#issuecomment-774726123", "issue_url": "https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/9", "id": 774726123, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NDcyNjEyMw==", "user": {"value": 12669260, "label": "jfeiwell"}, "created_at": "2021-02-07T18:21:08Z", "updated_at": "2021-02-07T18:21:08Z", "author_association": "NONE", "body": "@simonw any ideas here?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 801780625, "label": "SSL Error"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1217#issuecomment-774528913", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1217", "id": 774528913, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NDUyODkxMw==", "user": {"value": 639730, "label": "virtadpt"}, "created_at": "2021-02-06T19:23:41Z", "updated_at": "2021-02-06T19:23:41Z", "author_association": "NONE", "body": "I've had a lot of success running it as an OpenFaaS lambda.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 802513359, "label": "Possible to deploy as a python app (for Rstudio connect server)?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1217#issuecomment-774385092", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1217", "id": 774385092, "node_id": 
"MDEyOklzc3VlQ29tbWVudDc3NDM4NTA5Mg==", "user": {"value": 6165713, "label": "plpxsk"}, "created_at": "2021-02-06T02:49:11Z", "updated_at": "2021-02-06T02:49:11Z", "author_association": "NONE", "body": "A good reference seems to be the note to run `datasette` as a module in https://github.com/simonw/datasette/pull/556\r\n", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 802513359, "label": "Possible to deploy as a python app (for Rstudio connect server)?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/203#issuecomment-774217792", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/203", "id": 774217792, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NDIxNzc5Mg==", "user": {"value": 1049910, "label": "drkane"}, "created_at": "2021-02-05T18:44:13Z", "updated_at": "2021-02-05T18:44:13Z", "author_association": "NONE", "body": "Thanks for looking at this - home schooling kids has prevented me from replying. \r\n\r\nI'd struggled with how to adapt the API for the foreign keys too - I definitely tried the String/Tuple approach. I hadn't considered the breaking changes that would introduce though. I can take a look at this and try and make the change - see which of your options works best.\r\n\r\nI've got a workaround for the use-case I was looking at this for, so it wouldn't be a problem for me if it was put on the back burner until a hypothetical v4.0 anyway.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 743384829, "label": "changes to allow for compound foreign keys"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1210#issuecomment-773977128", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1210", "id": 773977128, "node_id": "MDEyOklzc3VlQ29tbWVudDc3Mzk3NzEyOA==", "user": {"value": 525780, "label": "heyarne"}, "created_at": "2021-02-05T11:30:34Z", "updated_at": "2021-02-05T11:30:34Z", "author_association": "NONE", "body": "Thanks for your quick reply! Having changed my `metadata.yml`, queries AND database I can't really reproduce it anymore, sorry. But at least I'm happy to say that it works now! :) Thanks again for the super nifty tool, very appreciated.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 796234313, "label": "Immutable Database w/ Canned Queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/56#issuecomment-772408273", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56", "id": 772408273, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MjQwODI3Mw==", "user": {"value": 42315895, "label": "gsajko"}, "created_at": "2021-02-03T10:36:36Z", "updated_at": "2021-02-03T10:36:36Z", "author_association": "NONE", "body": "I figured it out.\r\nThose tweets are in database, because somebody quote tweeted them, or retweeted them.\r\nAnd if you grab quoted tweet or reweeted tweet from other tweet json, It doesn't grab all of the details.\r\n\r\nSo if someone quote tweeted a quote tweet, the second quote tweet won't have `quoted_status`. 
\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 796736607, "label": "Not all quoted statuses get fetched?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1159#issuecomment-770865698", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1159", "id": 770865698, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MDg2NTY5OA==", "user": {"value": 552629, "label": "lovasoa"}, "created_at": "2021-02-01T13:42:29Z", "updated_at": "2021-02-01T13:42:29Z", "author_association": "NONE", "body": "@simonw : Could you have a look at this ? I think this really improves readability.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 774332247, "label": "Improve the display of facets information"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1211#issuecomment-770343684", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1211", "id": 770343684, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MDM0MzY4NA==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-01-31T08:03:40Z", "updated_at": "2021-01-31T08:03:40Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=h1) Report\n> Merging [#1211](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=desc) (e33ccaa) into [main](https://codecov.io/gh/simonw/datasette/commit/dde3c500c73ace33529672f7d862b76753d309cc?el=desc) (dde3c50) will **decrease** coverage by `0.00%`.\n> The diff coverage is `92.85%`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1211/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1211 +/- ##\n==========================================\n- Coverage 91.54% 91.53% -0.01% \n==========================================\n Files 32 32 \n Lines 3948 3959 +11 \n==========================================\n+ Hits 3614 3624 +10 \n- Misses 334 335 +1 \n```\n\n\n| [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=tree) | Coverage \u0394 | |\n|---|---|---|\n| [datasette/cli.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2NsaS5weQ==) | `77.29% <66.66%> (-0.31%)` | :arrow_down: |\n| [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `95.62% <100.00%> (+<0.01%)` | :arrow_up: |\n| [datasette/publish/cloudrun.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3B1Ymxpc2gvY2xvdWRydW4ucHk=) | `96.96% <100.00%> (+0.09%)` | :arrow_up: |\n| [datasette/publish/heroku.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3B1Ymxpc2gvaGVyb2t1LnB5) | `87.73% <100.00%> (+0.60%)` | :arrow_up: |\n| [datasette/utils/\\_\\_init\\_\\_.py](https://codecov.io/gh/simonw/datasette/pull/1211/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL19faW5pdF9fLnB5) | `94.13% <100.00%> (+0.02%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=continue).\n> **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=footer). Last update [dde3c50...e33ccaa](https://codecov.io/gh/simonw/datasette/pull/1211?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 797649915, "label": "Use context manager instead of plain open"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/56#issuecomment-769973212", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56", "id": 769973212, "node_id": "MDEyOklzc3VlQ29tbWVudDc2OTk3MzIxMg==", "user": {"value": 42315895, "label": "gsajko"}, "created_at": "2021-01-29T18:29:02Z", "updated_at": "2021-01-29T18:31:55Z", "author_association": "NONE", "body": "I think it was with `twitter-to-sqlite home-timeline home.db -a auth.json --since`\r\nand Im using only this command to grab tweets \r\n\r\nfrom cron tab\r\n`2,7,12,17,22,27,32,37,42,47,52,57 * * * * run-one /home/gsajko/miniconda3/bin/twitter-to-sqlite home-timeline /home/gsajko/work/custom_twitter_feed/home.db -a /home/gsajko/work/custom_twitter_feed/auth/auth.json --since`\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 796736607, "label": "Not all quoted statuses get fetched?"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/54#issuecomment-767888743", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54", "id": 767888743, "node_id": "MDEyOklzc3VlQ29tbWVudDc2Nzg4ODc0Mw==", "user": {"value": 19328961, "label": "henry501"}, "created_at": "2021-01-26T23:07:41Z", "updated_at": "2021-01-26T23:07:41Z", "author_association": "NONE", "body": "My import got much further with the applied fixes than 0.21.3, but not 100%. I do appear to have all of the tweets imported at least. 
\r\nNot sure when I'll have a chance to look further to try to fix or see what didn't make it into the import.\r\n\r\nHere's my output:\r\n\r\n```\r\ndirect-messages-group: not yet implemented\r\nbranch-links: not yet implemented\r\nperiscope-expired-broadcasts: not yet implemented\r\ndirect-messages: not yet implemented\r\nmute: not yet implemented\r\nperiscope-comments-made-by-user: not yet implemented\r\nperiscope-ban-information: not yet implemented\r\nperiscope-profile-description: not yet implemented\r\nscreen-name-change: not yet implemented\r\nmanifest: not yet implemented\r\nfleet: not yet implemented\r\nuser-link-clicks: not yet implemented\r\nperiscope-broadcast-metadata: not yet implemented\r\ncontact: not yet implemented\r\nfleet-mute: not yet implemented\r\ndevice-token: not yet implemented\r\nprotected-history: not yet implemented\r\ndirect-message-mute: not yet implemented\r\nTraceback (most recent call last):\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/bin/twitter-to-sqlite\", line 33, in \r\n sys.exit(load_entry_point('twitter-to-sqlite==0.21.3', 'console_scripts', 'twitter-to-sqlite')())\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py\", line 829, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py\", line 1259, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py\", line 1066, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/click/core.py\", line 610, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/cli.py\", line 772, in import_\r\n archive.import_from_file(db, filepath.name, open(filepath, \"rb\").read())\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/archive.py\", line 233, in import_from_file\r\n to_insert = transformer(data)\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/archive.py\", line 21, in callback\r\n return {filename: [fn(item) for item in data]}\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/archive.py\", line 21, in \r\n return {filename: [fn(item) for item in data]}\r\n File \"/Users/henry/.local/share/virtualenvs/python-sqlite-testing-mF3G2xKl/lib/python3.9/site-packages/twitter_to_sqlite/archive.py\", line 88, in ageinfo\r\n return item[\"ageMeta\"][\"ageInfo\"]\r\nKeyError: 'ageInfo'\r\n\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 779088071, "label": "Archive import appears to be broken on recent exports"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/datasette/pull/1206#issuecomment-766589070", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1206", "id": 766589070, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NjU4OTA3MA==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-01-25T06:50:30Z", "updated_at": "2021-01-25T17:31:11Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=h1) Report\n> Merging [#1206](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=desc) (06480e1) into [main](https://codecov.io/gh/simonw/datasette/commit/a5ede3cdd455e2bb1a1fb2f4e1b5a9855caf5179?el=desc) (a5ede3c) will **not change** coverage.\n> The diff coverage is `100.00%`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1206/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1206 +/- ##\n=======================================\n Coverage 91.53% 91.53% \n=======================================\n Files 32 32 \n Lines 3947 3947 \n=======================================\n Hits 3613 3613 \n Misses 334 334 \n```\n\n\n| [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=tree) | Coverage \u0394 | |\n|---|---|---|\n| [datasette/version.py](https://codecov.io/gh/simonw/datasette/pull/1206/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZlcnNpb24ucHk=) | `100.00% <100.00%> (\u00f8)` | |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=footer). Last update [a5ede3c...571476d](https://codecov.io/gh/simonw/datasette/pull/1206?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 793086333, "label": "Release 0.54"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/224#issuecomment-765678057", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/224", "id": 765678057, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NTY3ODA1Nw==", "user": {"value": 37962604, "label": "polyrand"}, "created_at": "2021-01-22T20:53:06Z", "updated_at": "2021-01-23T20:13:27Z", "author_association": "NONE", "body": "I'm using the FTS methods in sqlite-utils for this website: [drwn.io](https://drwn.io/). 
I wanted to get pagination to have some kind of infinite scrolling in the landing page, and I ended up using that.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 792297010, "label": "Add fts offset docs."}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1196#issuecomment-765639968", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1196", "id": 765639968, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NTYzOTk2OA==", "user": {"value": 2826376, "label": "QAInsights"}, "created_at": "2021-01-22T19:37:15Z", "updated_at": "2021-01-22T19:37:15Z", "author_association": "NONE", "body": "I tried deployment in WSL. It is working fine https://jmeter.vercel.app/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 791237799, "label": "Access Denied Error in Windows"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765525338", "issue_url": "https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1", "id": 765525338, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NTUyNTMzOA==", "user": {"value": 25372415, "label": "cobiadigital"}, "created_at": "2021-01-22T16:22:44Z", "updated_at": "2021-01-22T16:22:44Z", "author_association": "NONE", "body": "rs1333049 associated with coronary artery disease\r\nhttps://www.snpedia.com/index.php/Rs1333049\r\n```\r\n\r\nselect rsid, genotype, case genotype\r\n when 'CC' then '1.9x increased risk for coronary artery disease'\r\n when 'CG' then '1.5x increased risk for CAD'\r\n when 'GG' then 'normal'\r\nend as interpretation from genome where rsid = 'rs1333049'\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 496415321, "label": "Figure out some interesting example SQL queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765523517", "issue_url": "https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1", "id": 765523517, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NTUyMzUxNw==", "user": {"value": 25372415, "label": "cobiadigital"}, "created_at": "2021-01-22T16:20:25Z", "updated_at": "2021-01-22T16:20:25Z", "author_association": "NONE", "body": "rs53576: the oxytocin receptor (OXTR) gene\r\n\r\n```\r\nselect rsid, genotype, case genotype\r\n when 'AA' then 'Lack of empathy?'\r\n when 'AG' then 'Lack of empathy?'\r\n when 'GG' then 'Optimistic and empathetic; handle stress well'\r\nend as interpretation from genome where rsid = 'rs53576'\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 496415321, "label": "Figure out some interesting example SQL queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765506901", "issue_url": "https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1", "id": 765506901, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NTUwNjkwMQ==", "user": {"value": 25372415, "label": "cobiadigital"}, "created_at": "2021-01-22T15:58:41Z", "updated_at": "2021-01-22T15:58:58Z", "author_association": "NONE", "body": "Both rs10757274 and 
rs2383206 can both indicate higher risks of heart disease\r\nhttps://www.snpedia.com/index.php/Rs2383206\r\n\r\n```\r\nselect rsid, genotype, case genotype\r\n when 'AA' then 'Normal'\r\n when 'AG' then '~1.2x increased risk for heart disease'\r\n when 'GG' then '~1.3x increased risk for heart disease'\r\nend as interpretation from genome where rsid = 'rs10757274'\r\n```\r\n\r\n```\r\nselect rsid, genotype, case genotype\r\n when 'AA' then 'Normal'\r\n when 'AG' then '1.4x increased risk for heart disease'\r\n when 'GG' then '1.7x increased risk for heart disease'\r\nend as interpretation from genome where rsid = 'rs2383206'\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 496415321, "label": "Figure out some interesting example SQL queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765502845", "issue_url": "https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1", "id": 765502845, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NTUwMjg0NQ==", "user": {"value": 25372415, "label": "cobiadigital"}, "created_at": "2021-01-22T15:53:19Z", "updated_at": "2021-01-22T15:53:19Z", "author_association": "NONE", "body": "rs7903146 Influences risk of Type-2 diabetes\r\nhttps://www.snpedia.com/index.php/Rs7903146\r\n```\r\nselect rsid, genotype, case genotype\r\n when 'CC' then 'Normal (lower) risk of Type 2 Diabetes and Gestational Diabetes.'\r\n when 'CT' then '1.4x increased risk for diabetes (and perhaps colon cancer).'\r\n when 'TT' then '2x increased risk for Type-2 diabetes'\r\nend as interpretation from genome where rsid = 'rs7903146'\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 496415321, "label": "Figure out some interesting example SQL queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765498984", "issue_url": "https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1", "id": 765498984, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NTQ5ODk4NA==", "user": {"value": 25372415, "label": "cobiadigital"}, "created_at": "2021-01-22T15:48:25Z", "updated_at": "2021-01-22T15:49:33Z", "author_association": "NONE", "body": "The \"Warrior Gene\" https://www.snpedia.com/index.php/Rs4680\r\n\r\n```\r\nselect rsid, genotype, case genotype\r\n when 'AA' then '(worrier) advantage in memory and attention tasks'\r\n when 'AG' then 'Intermediate dopamine levels, other effects'\r\n when 'GG' then '(warrior) multiple associations, see details'\r\nend as interpretation from genome where rsid = 'rs4680'\r\n```\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 496415321, "label": "Figure out some interesting example SQL queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765495861", "issue_url": "https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1", "id": 765495861, "node_id": "MDEyOklzc3VlQ29tbWVudDc2NTQ5NTg2MQ==", "user": {"value": 25372415, "label": "cobiadigital"}, "created_at": "2021-01-22T15:44:00Z", "updated_at": "2021-01-22T15:44:00Z", "author_association": "NONE", "body": "Risk of autoimmune disorders: 
https://www.snpedia.com/index.php/Genotype\r\n```\r\nselect rsid, genotype, case genotype\r\n when 'AA' then '2x risk of rheumatoid arthritis and other autoimmune diseases'\r\n when 'GG' then 'Normal risk for autoimmune disorders'\r\nend as interpretation from genome where rsid = 'rs2476601'\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 496415321, "label": "Figure out some interesting example SQL queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1175#issuecomment-762488336", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1175", "id": 762488336, "node_id": "MDEyOklzc3VlQ29tbWVudDc2MjQ4ODMzNg==", "user": {"value": 758858, "label": "hannseman"}, "created_at": "2021-01-18T21:59:28Z", "updated_at": "2021-01-18T22:00:31Z", "author_association": "NONE", "body": "I encountered your issue when trying to find a solution and came up with the following, maybe it can help.\r\n\r\n```python\r\nimport logging.config\r\nfrom typing import Tuple\r\n\r\nimport structlog\r\nimport uvicorn\r\n\r\nfrom example.config import settings\r\n\r\nshared_processors: Tuple[structlog.types.Processor, ...] = (\r\n structlog.contextvars.merge_contextvars,\r\n structlog.stdlib.add_logger_name,\r\n structlog.stdlib.add_log_level,\r\n structlog.processors.TimeStamper(fmt=\"iso\"),\r\n)\r\n\r\nlogging_config = {\r\n \"version\": 1,\r\n \"disable_existing_loggers\": False,\r\n \"formatters\": {\r\n \"json\": {\r\n \"()\": structlog.stdlib.ProcessorFormatter,\r\n \"processor\": structlog.processors.JSONRenderer(),\r\n \"foreign_pre_chain\": shared_processors,\r\n },\r\n \"console\": {\r\n \"()\": structlog.stdlib.ProcessorFormatter,\r\n \"processor\": structlog.dev.ConsoleRenderer(),\r\n \"foreign_pre_chain\": shared_processors,\r\n },\r\n **uvicorn.config.LOGGING_CONFIG[\"formatters\"],\r\n },\r\n \"handlers\": {\r\n \"default\": {\r\n \"level\": \"DEBUG\",\r\n \"class\": \"logging.StreamHandler\",\r\n \"formatter\": \"json\" if not settings.debug else \"console\",\r\n },\r\n \"uvicorn.access\": {\r\n \"level\": \"INFO\",\r\n \"class\": \"logging.StreamHandler\",\r\n \"formatter\": \"access\",\r\n },\r\n \"uvicorn.default\": {\r\n \"level\": \"INFO\",\r\n \"class\": \"logging.StreamHandler\",\r\n \"formatter\": \"default\",\r\n },\r\n },\r\n \"loggers\": {\r\n \"\": {\"handlers\": [\"default\"], \"level\": \"INFO\"},\r\n \"uvicorn.error\": {\r\n \"handlers\": [\"default\" if not settings.debug else \"uvicorn.default\"],\r\n \"level\": \"INFO\",\r\n \"propagate\": False,\r\n },\r\n \"uvicorn.access\": {\r\n \"handlers\": [\"default\" if not settings.debug else \"uvicorn.access\"],\r\n \"level\": \"INFO\",\r\n \"propagate\": False,\r\n },\r\n },\r\n}\r\n\r\n\r\ndef setup_logging() -> None:\r\n structlog.configure(\r\n processors=[\r\n structlog.stdlib.filter_by_level,\r\n *shared_processors,\r\n structlog.stdlib.PositionalArgumentsFormatter(),\r\n structlog.processors.StackInfoRenderer(),\r\n structlog.processors.format_exc_info,\r\n structlog.stdlib.ProcessorFormatter.wrap_for_formatter,\r\n ],\r\n context_class=dict,\r\n logger_factory=structlog.stdlib.LoggerFactory(),\r\n wrapper_class=structlog.stdlib.AsyncBoundLogger,\r\n cache_logger_on_first_use=True,\r\n )\r\n logging.config.dictConfig(logging_config)\r\n```\r\n\r\nAnd then it'll be run on the startup event:\r\n```python\r\n@app.on_event(\"startup\")\r\nasync def 
startup_event() -> None:\r\n setup_logging()\r\n```\r\n\r\nIt depends on a setting called `debug` which controls if it should output the regular uvicorn logging or json. ", "reactions": "{\"total_count\": 15, \"+1\": 7, \"-1\": 0, \"laugh\": 1, \"hooray\": 1, \"confused\": 0, \"heart\": 5, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 779156520, "label": "Use structlog for logging"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1036#issuecomment-762391426", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1036", "id": 762391426, "node_id": "MDEyOklzc3VlQ29tbWVudDc2MjM5MTQyNg==", "user": {"value": 4997607, "label": "philshem"}, "created_at": "2021-01-18T17:45:00Z", "updated_at": "2021-01-18T17:45:00Z", "author_association": "NONE", "body": "It might be possible with this library: https://docs.python.org/3/library/imghdr.html\r\n\r\nquick test of the downloaded blob:\r\n\r\n```\r\n>>> import imghdr\r\n>>> imghdr.what('material_culture-1-image.blob')\r\n'jpeg'\r\n```\r\n\r\nThe output plugin would be cool. I'll look into making my first datasette plugin. I'm also imagining displaying the image in the browser -- but that would be a step 2.\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 725996507, "label": "Make it possible to download BLOB data from the Datasette UI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1036#issuecomment-762385981", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1036", "id": 762385981, "node_id": "MDEyOklzc3VlQ29tbWVudDc2MjM4NTk4MQ==", "user": {"value": 4997607, "label": "philshem"}, "created_at": "2021-01-18T17:32:13Z", "updated_at": "2021-01-18T17:34:50Z", "author_association": "NONE", "body": "Hi Simon\r\n\r\nJust finding this old issue regarding downloading blobs. Nice work!\r\n\r\n\"image\"\r\n\r\nAs a feature request, maybe it would be possible to assign a blob column as a certain data type (e.g. `.jpg`) and then each blob could be downloaded as that type of file (perhaps if the file types were constrained to normal blobs that people store in sqlite databases, this could avoid the execution stuff mentioned above).\r\n\r\nI guess the column blob-type definition could fit into this dropdown selection:\r\n\r\n\"image\"\r\n\r\nLet me know if I should open a new issue with a feature request. 
(This could slowly go in the direction of displaying image blob-types in the browser.)\r\n\r\nThanks for the great tool!\r\n\r\n\r\n---\r\n\r\nedit: just reading the rest of the twitter thread: https://twitter.com/simonw/status/1318685933256855552\r\n\r\nperhaps this is already possible in some form with the plugin datasette-media: https://github.com/simonw/datasette-media", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 725996507, "label": "Make it possible to download BLOB data from the Datasette UI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/220#issuecomment-761015218", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/220", "id": 761015218, "node_id": "MDEyOklzc3VlQ29tbWVudDc2MTAxNTIxOA==", "user": {"value": 649467, "label": "mhalle"}, "created_at": "2021-01-15T15:40:08Z", "updated_at": "2021-01-15T15:40:08Z", "author_association": "NONE", "body": "Make sense. If you're coming from the sqlite3 side of things, rather than the datasette side, wanting the fts methods to work for views makes more sense. sqlite3 allows fts5 tables on views, so I was looking for CLI functionality to build the fts virtual tables. Ultimately, though, sharing fts virtual tables across tables and derivative views is likely more efficient. \r\n\r\nMaybe an explicit error message like, \"fts is not supported for views\" rather than just throwing an exception that the method doesn't exist\" might be helpful. Not critical though.\r\n\r\nThanks.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 783778672, "label": "Better error message for *_fts methods against views"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1159#issuecomment-759306228", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1159", "id": 759306228, "node_id": "MDEyOklzc3VlQ29tbWVudDc1OTMwNjIyOA==", "user": {"value": 552629, "label": "lovasoa"}, "created_at": "2021-01-13T08:59:31Z", "updated_at": "2021-01-13T08:59:31Z", "author_association": "NONE", "body": "@simonw : Did you have the time to take a look at this ?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 774332247, "label": "Improve the display of facets information"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1091#issuecomment-758668359", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1091", "id": 758668359, "node_id": "MDEyOklzc3VlQ29tbWVudDc1ODY2ODM1OQ==", "user": {"value": 6739646, "label": "tballison"}, "created_at": "2021-01-12T13:52:29Z", "updated_at": "2021-01-12T13:52:29Z", "author_association": "NONE", "body": "Y, thank you to both of you!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 742011049, "label": ".json and .csv exports fail to apply base_url"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1091#issuecomment-758448525", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1091", "id": 758448525, "node_id": "MDEyOklzc3VlQ29tbWVudDc1ODQ0ODUyNQ==", 
"user": {"value": 19328961, "label": "henry501"}, "created_at": "2021-01-12T06:55:08Z", "updated_at": "2021-01-12T06:55:08Z", "author_association": "NONE", "body": "Great, really happy I could help! Reverse proxies get tricky.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 742011049, "label": ".json and .csv exports fail to apply base_url"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1091#issuecomment-758280611", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1091", "id": 758280611, "node_id": "MDEyOklzc3VlQ29tbWVudDc1ODI4MDYxMQ==", "user": {"value": 6739646, "label": "tballison"}, "created_at": "2021-01-11T23:06:10Z", "updated_at": "2021-01-11T23:06:10Z", "author_association": "NONE", "body": "+1\r\n\r\nYep! Fixes it. If I navigate to https://corpora.tika.apache.org/datasette, I get a 404 (database not found: datasette), but if I navigate to https://corpora.tika.apache.org/datasette/file_profiles/, everything WORKS!\r\n\r\nThank you!", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 742011049, "label": ".json and .csv exports fail to apply base_url"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1091#issuecomment-756425587", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1091", "id": 756425587, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NjQyNTU4Nw==", "user": {"value": 19328961, "label": "henry501"}, "created_at": "2021-01-07T22:27:19Z", "updated_at": "2021-01-07T22:27:19Z", "author_association": "NONE", "body": "I found this issue while troubleshooting the same behavior with an nginx reverse proxy. The solution was to make sure I set:\r\n\r\n`proxy_pass http://server:8001/baseurl/ \r\n`\r\ninstead of just:\r\n\r\n`proxy_pass http://server:8001\r\n`\r\nThe custom SQL query and header links are now correct.\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 742011049, "label": ".json and .csv exports fail to apply base_url"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1171#issuecomment-754911290", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1171", "id": 754911290, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDkxMTI5MA==", "user": {"value": 59874, "label": "rcoup"}, "created_at": "2021-01-05T21:31:15Z", "updated_at": "2021-01-05T21:31:15Z", "author_association": "NONE", "body": "We did this for [Sno](https://sno.earth) under macOS \u2014 it's a PyInstaller binary/setup which uses [Packages](http://s.sudre.free.fr/Software/Packages/about.html) for packaging.\r\n\r\n* [Building & Signing](https://github.com/koordinates/sno/blob/master/platforms/Makefile#L67-L95)\r\n* [Packaging & Notarizing](https://github.com/koordinates/sno/blob/master/platforms/Makefile#L121-L215)\r\n* [Github Workflow](https://github.com/koordinates/sno/blob/master/.github/workflows/build.yml#L228-L269) has the CI side of it\r\n\r\nFYI (if you ever get to it) for Windows you need to get a code signing certificate. 
And if you want automated CI, you'll want to get an \"EV CodeSigning for HSM\" certificate from GlobalSign, which then lets you put the certificate into Azure Key Vault. Which you can use with [azuresigntool](https://github.com/vcsjones/AzureSignTool) to sign your code & installer. (Non-EV certificates are a waste of time, the user still gets big warnings at install time).\r\n", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778450486, "label": "GitHub Actions workflow to build and sign macOS binary executables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-754210356", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 754210356, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDIxMDM1Ng==", "user": {"value": 222245, "label": "carlmjohnson"}, "created_at": "2021-01-04T20:49:05Z", "updated_at": "2021-01-04T20:49:05Z", "author_association": "NONE", "body": "For reasons [I've written about elsewhere](https://blog.carlmjohnson.net/post/2020/time-to-kill-ie11/), I'm in favor of modules. It has several beneficial effects. One, old browsers just ignore it all together. Two, if you include the same plain script on the page more than once, it will be executed twice, but if you include the same module script on a page twice, it will only execute once. Three, you get a module local namespace, instead of having to use the global window namespace or a function private namespace.\r\n\r\nOTOH, if you are going to use an old style script, the code from before isn't ideal, because you wipe out your registry if the script it included more than once. Also you may as well use object methods and splat arguments.\r\n\r\nThe event based architecture probably makes more sense though. Just make up some event names prefixed with `datasette:` and listen for them on the root. The only concern with that approach is it can sometimes be tricky to make sure your plugins are run after datasette has run. Maybe \r\n\r\n```js\r\nfunction mycallback(){\r\n // whatever\r\n}\r\n\r\nif (window.datasette) {\r\n window.datasette.init(mycallback);\r\n} else {\r\n document.addEventListener('datasette:init', mycallback);\r\n}\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-754181647", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 754181647, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE4MTY0Nw==", "user": {"value": 11941245, "label": "jussiarpalahti"}, "created_at": "2021-01-04T19:52:40Z", "updated_at": "2021-01-04T19:52:40Z", "author_association": "NONE", "body": "I was thinking JavaScript plugins going with server side template extensions custom HTML. Attach my own widgets on there and listen for Datasette events to refresh when user interacts with main UI. Like a map view or table that updates according to selected column. There's certainly other ways to look at this. Perhaps you could list possible hooks or high level design doc on what would be possible with the plugin system?\n\nRe: modules. I would like to see modules supported at least in development. 
The developer experience is so much better than what JavaScript coding has been in the past. With large parts of NPM at your disposal I\u2019d imagine even less experienced coder can whisk a custom plugin in no time. Proper production build system (like one you get with Pika or Parcel) could package everything up into bundles that older browsers can understand. Though that does come with performance and size penalties alongside the added complexity. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1170#issuecomment-754002859", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1170", "id": 754002859, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDAwMjg1OQ==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-01-04T14:22:52Z", "updated_at": "2021-01-04T14:22:52Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=h1) Report\n> Merging [#1170](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=desc) (a5761cc) into [main](https://codecov.io/gh/simonw/datasette/commit/1e8fa3ac7cb2d6e516c47c306c86ed2334fc3dc0?el=desc) (1e8fa3a) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1170/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1170 +/- ##\n=======================================\n Coverage 91.55% 91.55% \n=======================================\n Files 32 32 \n Lines 3932 3932 \n=======================================\n Hits 3600 3600 \n Misses 332 332 \n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=footer). Last update [1e8fa3a...a5761cc](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778126516, "label": "Install Prettier via package.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-753600999", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 753600999, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzYwMDk5OQ==", "user": {"value": 475613, "label": "MarkusH"}, "created_at": "2021-01-03T11:11:21Z", "updated_at": "2021-01-03T11:11:21Z", "author_association": "NONE", "body": "With regards to JS/Browser events, given your example of menu items that plugins could add, I could imagine this code to work:\r\n\r\n```js\r\n// as part of datasette\r\ndatasette.events.AddMenuItem = 'DatasetteAddMenuItemEvent';\r\ndocument.addEventListener(datasette.events.AddMenuItem, (e) => {\r\n // do whatever is needed to add the menu item. 
Data comes from `e`\r\n alert(e.title + ' ' + e.link);\r\n});\r\n\r\n// as part of a plugin\r\nconst event = new Event(datasette.events.AddMenuItem, {link: '/foo/bar', title: 'Go somewhere'});\r\nDocument.dispatchEvent(event)\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-753587963", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 753587963, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU4Nzk2Mw==", "user": {"value": 154364, "label": "dracos"}, "created_at": "2021-01-03T09:02:50Z", "updated_at": "2021-01-03T10:00:05Z", "author_association": "NONE", "body": "> but I'm already commited to requiring support for () => {} arrow functions\r\n\r\nDon't think you are :) (e.g. gzipped, using arrow functions in my example saves 2 bytes over spelling out function). On FMS, past month, looking at popular browsers, looks like we'd have 95.41% arrow support, 94.19% module support, and 4.58% (mostly IE9/IE11/Safari 9) supporting neither.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-753224999", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 753224999, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzIyNDk5OQ==", "user": {"value": 11941245, "label": "jussiarpalahti"}, "created_at": "2020-12-31T23:29:36Z", "updated_at": "2020-12-31T23:29:36Z", "author_association": "NONE", "body": "I have yet to build Datasette plugin and am unfamiliar with Pluggy. Since browsers have event handling builtin Datasette could communicate with plugins through it. Handlers register as listeners for custom Datasette events and Datasette's JS can then trigger said events.\r\n\r\nI was also wondering if you had looked at Javascript Modules for JS plugins? With services like Skypack (https://www.skypack.dev) NPM libraries can be loaded directly into browser, no build step needed. Same goes for local JS if you adhere to ES Module spec. \r\n\r\nIf minification is required then tools such as Snowpack (https://www.snowpack.dev) could fit better. 
It uses https://github.com/evanw/esbuild for bundling and minification.\r\n\r\nOn plugins you'd simply:\r\n\r\n```javascript\r\nimport {register} from '/assets/js/datasette'\r\nregister.on({'click' : my_func})\r\n```\r\n\r\nIn Datasette HTML pages' head you'd merely import these files as modules one by one.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-753218817", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 753218817, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzIxODgxNw==", "user": {"value": 173848, "label": "yozlet"}, "created_at": "2020-12-31T22:32:25Z", "updated_at": "2020-12-31T22:32:25Z", "author_association": "NONE", "body": "Amazing work! And you've put in far more work than I'd expect to reduce the payload (which is admirable).\r\n\r\nSo, to add a plugin with the current design, it goes in (a) the template or (b) a bookmarklet, right?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1165#issuecomment-753033121", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1165", "id": 753033121, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzAzMzEyMQ==", "user": {"value": 154364, "label": "dracos"}, "created_at": "2020-12-31T19:33:47Z", "updated_at": "2020-12-31T19:33:47Z", "author_association": "NONE", "body": "Sorry to go on about it, but it's my only example ;) And thought it might be of interest/use. Here is FixMyStreet's Cypress workflow https://github.com/mysociety/fixmystreet/blob/master/.github/workflows/cypress.yml with the master script that sets up server etc at https://github.com/mysociety/fixmystreet/blob/master/bin/browser-tests (that has features such as working inside/outside Vagrant, and can do JS code coverage) and then the tests are at https://github.com/mysociety/fixmystreet/tree/master/.cypress/cypress/integration", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 776635426, "label": "Mechanism for executing JavaScript unit tests"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-752882797", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 752882797, "node_id": "MDEyOklzc3VlQ29tbWVudDc1Mjg4Mjc5Nw==", "user": {"value": 154364, "label": "dracos"}, "created_at": "2020-12-31T08:07:59Z", "updated_at": "2020-12-31T15:04:32Z", "author_association": "NONE", "body": "If you're using arrow functions, you can presumably use default parameters, not much difference in support. That would save you 9 bytes. But OTOH you need `\"use strict\";` to use arrow functions etc, and that's 13 bytes.\r\n\r\nYour latest 250-byte one, with use strict, gzips to 199 bytes. 
The following might be 292 bytes, but compresses to 204, basically the same, and works in any browser (well, IE9+) at all:\r\n\r\n`var datasette=datasette||{};datasette.plugins=function(){var d={};return{register:function(b,c,e){d[b]||(d[b]=[]);d[b].push([c,e])},call:function(b,c){c=c||{};var e=[];(d[b]||[]).forEach(function(a){a=a[0].apply(a[0],a[1].map(function(a){return c[a]}));void 0!==a&&e.push(a)});return e}}}();`\r\n\r\nSource for that is below; I replaced the [fn,parameters] because closure-compiler includes a polyfill for that, and I ran `closure-compiler --language_out ECMASCRIPT3`:\r\n\r\n```js\r\nvar datasette = datasette || {};\r\ndatasette.plugins = (() => {\r\n var registry = {};\r\n return {\r\n register: (hook, fn, parameters) => {\r\n if (!registry[hook]) {\r\n registry[hook] = [];\r\n }\r\n registry[hook].push([fn, parameters]);\r\n },\r\n call: (hook, args) => {\r\n args = args || {};\r\n var results = [];\r\n (registry[hook] || []).forEach((data) => {\r\n /* Call with the correct arguments */\r\n var result = data[0].apply(data[0], data[1].map(parameter => args[parameter]));\r\n if (result !== undefined) {\r\n results.push(result);\r\n }\r\n });\r\n return results;\r\n }\r\n };\r\n})();\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-752888552", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 752888552, "node_id": "MDEyOklzc3VlQ29tbWVudDc1Mjg4ODU1Mg==", "user": {"value": 154364, "label": "dracos"}, "created_at": "2020-12-31T08:33:11Z", "updated_at": "2020-12-31T08:34:27Z", "author_association": "NONE", "body": "If you could say that all hook functions had to accept one options parameter (and could use object destructuring if they wished to only see a subset), you could have this, which minifies (to all-browser-JS) to 200 bytes, gzips to 146, and works practically the same:\r\n\r\n```js\r\nvar datasette = datasette || {};\r\ndatasette.plugins = (() => {\r\n var registry = {};\r\n return {\r\n register: (hook, fn) => {\r\n registry[hook] = registry[hook] || [];\r\n registry[hook].push(fn);\r\n },\r\n call: (hook, args) => {\r\n var results = (registry[hook] || []).map(fn => fn(args||{}));\r\n return results;\r\n }\r\n };\r\n})();\r\n```\r\n\r\n`var datasette=datasette||{};datasette.plugins=function(){var b={};return{register:function(a,c){b[a]=b[a]||[];b[a].push(c)},call:function(a,c){return(b[a]||[]).map(function(a){return a(c||{})})}}}();`\r\n\r\nCalled the same, definitions tiny bit different:\r\n\r\n```js\r\ndatasette.plugins.register('numbers', ({a, b}) => a + b)\r\ndatasette.plugins.register('numbers', o => o.a * o.b)\r\ndatasette.plugins.call('numbers', {a: 4, b: 6})\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/417#issuecomment-751504136", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/417", "id": 751504136, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MTUwNDEzNg==", "user": {"value": 212369, "label": "drewda"}, "created_at": 
"2020-12-27T19:02:06Z", "updated_at": "2020-12-27T19:02:06Z", "author_association": "NONE", "body": "Very much looking forward to seeing this functionality come together. This is probably out-of-scope for an initial release, but in the future it could be useful to also think of how to run this is a container'ized context. For example, an immutable datasette container that points to an S3 bucket of SQLite DBs or CSVs. Or an immutable datasette container pointing to a NFS volume elsewhere on a Kubernetes cluster.", "reactions": "{\"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 421546944, "label": "Datasette Library"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1150#issuecomment-751476406", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1150", "id": 751476406, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MTQ3NjQwNg==", "user": {"value": 18221871, "label": "noklam"}, "created_at": "2020-12-27T14:51:39Z", "updated_at": "2020-12-27T14:51:39Z", "author_association": "NONE", "body": "I like the idea of _internal, it's a nice way to get a data catalog quickly. I wonder if this trick applies to db other than SQLite.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 770436876, "label": "Maintain an in-memory SQLite table of connected databases and their tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/417#issuecomment-751127384", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/417", "id": 751127384, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MTEyNzM4NA==", "user": {"value": 1279360, "label": "dyllan-to-you"}, "created_at": "2020-12-24T22:56:48Z", "updated_at": "2020-12-24T22:56:48Z", "author_association": "NONE", "body": "Instead of scanning the directory every 10s, have you considered listening for the native system events to notify you of updates?\r\n\r\nI think python has a nice module to do this for you called [watchdog](https://pypi.org/project/watchdog/)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 421546944, "label": "Datasette Library"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/28#issuecomment-751125270", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/28", "id": 751125270, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MTEyNTI3MA==", "user": {"value": 129786, "label": "jmelloy"}, "created_at": "2020-12-24T22:26:22Z", "updated_at": "2020-12-24T22:26:22Z", "author_association": "NONE", "body": "This comes around if you\u2019ve run the photo export without running an s3 upload. 
", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 624490929, "label": "Invalid SQL no such table: main.uploads"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1159#issuecomment-750849460", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1159", "id": 750849460, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MDg0OTQ2MA==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2020-12-24T11:07:35Z", "updated_at": "2020-12-24T11:29:21Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1159?src=pr&el=h1) Report\n> Merging [#1159](https://codecov.io/gh/simonw/datasette/pull/1159?src=pr&el=desc) (c820abd) into [main](https://codecov.io/gh/simonw/datasette/commit/a882d679626438ba0d809944f06f239bcba8ee96?el=desc) (a882d67) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1159/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1159?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1159 +/- ##\n=======================================\n Coverage 91.55% 91.55% \n=======================================\n Files 32 32 \n Lines 3930 3930 \n=======================================\n Hits 3598 3598 \n Misses 332 332 \n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1159?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1159?src=pr&el=footer). Last update [a882d67...c820abd](https://codecov.io/gh/simonw/datasette/pull/1159?src=pr&el=lastupdated). 
Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 774332247, "label": "Improve the display of facets information"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1158#issuecomment-750373496", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1158", "id": 750373496, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MDM3MzQ5Ng==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2020-12-23T16:26:06Z", "updated_at": "2020-12-23T16:26:06Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1158?src=pr&el=h1) Report\n> Merging [#1158](https://codecov.io/gh/simonw/datasette/pull/1158?src=pr&el=desc) (37ce72f) into [main](https://codecov.io/gh/simonw/datasette/commit/90eba4c3ca569c57e96bce314e7ac8caf67d884e?el=desc) (90eba4c) will **not change** coverage.\n> The diff coverage is `87.50%`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1158/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1158?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1158 +/- ##\n=======================================\n Coverage 91.55% 91.55% \n=======================================\n Files 32 32 \n Lines 3930 3930 \n=======================================\n Hits 3598 3598 \n Misses 332 332 \n```\n\n\n| [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1158?src=pr&el=tree) | Coverage \u0394 | |\n|---|---|---|\n| [datasette/cli.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2NsaS5weQ==) | `77.41% <\u00f8> (\u00f8)` | |\n| [datasette/facets.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2ZhY2V0cy5weQ==) | `89.04% <\u00f8> (\u00f8)` | |\n| [datasette/filters.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2ZpbHRlcnMucHk=) | `94.35% <\u00f8> (\u00f8)` | |\n| [datasette/hookspecs.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2hvb2tzcGVjcy5weQ==) | `100.00% <\u00f8> (\u00f8)` | |\n| [datasette/inspect.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2luc3BlY3QucHk=) | `36.11% <\u00f8> (\u00f8)` | |\n| [datasette/renderer.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3JlbmRlcmVyLnB5) | `94.02% <\u00f8> (\u00f8)` | |\n| [datasette/views/base.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL2Jhc2UucHk=) | `95.01% <50.00%> (\u00f8)` | |\n| [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `95.85% <100.00%> (\u00f8)` | |\n| [datasette/utils/\\_\\_init\\_\\_.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL19faW5pdF9fLnB5) | `94.11% <100.00%> (\u00f8)` | |\n| [datasette/utils/asgi.py](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL2FzZ2kucHk=) | `92.13% <100.00%> (\u00f8)` | |\n| ... 
and [1 more](https://codecov.io/gh/simonw/datasette/pull/1158/diff?src=pr&el=tree-more) | |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1158?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1158?src=pr&el=footer). Last update [90eba4c...37ce72f](https://codecov.io/gh/simonw/datasette/pull/1158?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 773913793, "label": "Modernize code to Python 3.6+"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-748436115", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15", "id": 748436115, "node_id": "MDEyOklzc3VlQ29tbWVudDc0ODQzNjExNQ==", "user": {"value": 8573886, "label": "nickvazz"}, "created_at": "2020-12-19T07:43:38Z", "updated_at": "2020-12-19T07:47:36Z", "author_association": "NONE", "body": "Hey Simon! I really enjoy datasette so far, just started trying it out today following your iPhone photos [example](https://simonwillison.net/2020/May/21/dogsheep-photos/). \r\n\r\nI am not sure if you had run into this or not, but it seems like they might have changed one of the column names from\r\n`ZGENERICASSET` to `ZASSET`. Should I open a PR? \r\n\r\nWould change:\r\n- [here](https://github.com/dogsheep/dogsheep-photos/blob/master/dogsheep_photos/cli.py#L209-L213)\r\n- [here](https://github.com/dogsheep/dogsheep-photos/blob/master/dogsheep_photos/cli.py#L238)\r\n- [here](https://github.com/dogsheep/dogsheep-photos/blob/master/dogsheep_photos/cli.py#L240)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 612151767, "label": "Expose scores from ZCOMPUTEDASSETATTRIBUTES"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/53#issuecomment-748436453", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/53", "id": 748436453, "node_id": "MDEyOklzc3VlQ29tbWVudDc0ODQzNjQ1Mw==", "user": {"value": 27, "label": "anotherjesse"}, "created_at": "2020-12-19T07:47:01Z", "updated_at": "2020-12-19T07:47:01Z", "author_association": "NONE", "body": "I think this should probably be closed as won't fix.\r\n\r\nAttempting to make a patch for this I realized that the since_id would limit to tweets posted since that since_id, not when it was favorited. 
So favoriting something older would be missed if you used `--since` with a cron script.\r\n\r\nBetter to just use `--stop_after` set to a couple hundred.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 771324837, "label": "--since support for favorites"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-748436195", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21", "id": 748436195, "node_id": "MDEyOklzc3VlQ29tbWVudDc0ODQzNjE5NQ==", "user": {"value": 8573886, "label": "nickvazz"}, "created_at": "2020-12-19T07:44:32Z", "updated_at": "2020-12-19T07:44:49Z", "author_association": "NONE", "body": "I have also run into this a bit. Would it be possible to post your `requirements.txt` so I can try and reproduce your [blog post](https://simonwillison.net/2020/May/21/dogsheep-photos/)?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 615474990, "label": "bpylist.archiver.CircularReference: archive has a cycle with uid(13)"}, "performed_via_github_app": null}