{"id": 1823352380, "node_id": "PR_kwDOBm6k_c5Wfgd9", "number": 2118, "title": "New JSON design for query views", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 9700784, "label": "Datasette 1.0a3"}, "comments": 11, "created_at": "2023-07-26T23:29:21Z", "updated_at": "2023-08-08T01:47:40Z", "closed_at": "2023-08-08T01:47:39Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/2118", "body": "WIP. Refs:\r\n- #2109 \r\n\r\n\r\n----\n:books: Documentation preview :books:: https://datasette--2118.org.readthedocs.build/en/2118/\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/2118/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 317001500, "node_id": "MDU6SXNzdWUzMTcwMDE1MDA=", "number": 236, "title": "datasette publish lambda plugin", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2018-04-23T22:10:30Z", "updated_at": "2023-03-12T14:04:15Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Refs #217 - create a publish plugin that can deploy to AWS Lambda.\r\n\r\nhttps://docs.aws.amazon.com/lambda/latest/dg/limits.html says lambda packages can be up to 50 MB, so this would only work with smaller databases (the command can check the filesize before attempting to package and deploy it).\r\n\r\nLambdas do get a 512 MB `/tmp` directory too, so for larger databases the function could start and then download up to 512MB from an S3 bucket - so the plugin could take an optional S3 bucket to write to and know how to upload the `.db` file there and then have the lambda download it on startup.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/236/reactions\", \"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1497909798, "node_id": "I_kwDOBm6k_c5ZSEom", "number": 1958, "title": "datasette --root running in Docker doesn't reliably show the magic URL", "user": {"value": 11729897, "label": "davidhaley"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2022-12-13T16:29:13Z", "updated_at": "2022-12-16T00:59:12Z", "closed_at": "2022-12-16T00:55:19Z", "author_association": "NONE", "pull_request": null, "body": "I followed these steps:\r\n\r\n`docker run datasetteproject/datasette pip install datasette-upload-csvs`\r\n\r\n`docker commit $(docker ps -lq) datasette-with-plugins`\r\n\r\n`docker run -p 8001:8001 -v $(pwd):/mnt datasette-with-plugins datasette --root -p 8001 -h 0.0.0.0`\r\n\r\nVisited: http://127.0.0.1:8001/-/plugins\r\n\r\n![image](https://user-images.githubusercontent.com/11729897/207392071-d939cd5e-1d96-4e11-b0be-dc06dd207866.png)\r\n\r\n\r\nVisited: http://localhost:8001/-/upload-csvs\r\n\r\n![image](https://user-images.githubusercontent.com/11729897/207389241-3e96ca66-ca74-4a16-8b7d-4427ee862c5e.png)\r\n\r\nI may have missed a step?\r\n\r\nThank 
you.\r\n\r\n---\r\n\r\nUbuntu 22.04.1 LTS", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1958/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1428630253, "node_id": "I_kwDOBm6k_c5VJyrt", "number": 1873, "title": "Ensure insert API has good tests for rowid and compound primary key tables", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 8755003, "label": "Datasette 1.0a-next"}, "comments": 11, "created_at": "2022-10-30T06:22:17Z", "updated_at": "2022-12-13T05:29:08Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Following:\r\n- #1866\r\n\r\nI need to design and implement tests for various edge-cases of primary keys:\r\n\r\n- Table without an auto-incrementing primary key\r\n- Table with compound primary keys\r\n- Table with just a `rowid`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1873/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "reopened"} {"id": 1452572348, "node_id": "I_kwDOBm6k_c5WlH68", "number": 1900, "title": "datasette package --spatialite throws error during build", "user": {"value": 419145, "label": "rdmurphy"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2022-11-17T02:03:28Z", "updated_at": "2022-11-18T08:00:38Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hello! 
Attempting to use `datasette package` to bundle up a SpatiaLite DB and I'm getting this error during the `docker build`:\r\n\r\n```\r\nsqlite3.OperationalError: /usr/lib/x86_64-linux-gnu/mod_spatialite.so.so: cannot open shared object file: No such file or directory\r\n```\r\n\r\nSeems to be thrown when this step is run:\r\n\r\n```\r\nERROR [6/6] RUN datasette inspect results.db --inspect-file inspect-data.json\r\n```\r\n\r\nThis is with `v0.63.1`.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1900/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1373595927, "node_id": "I_kwDOBm6k_c5R32kX", "number": 1809, "title": "`prepare_jinja2_environment()` hook should take `datasette` argument", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2022-09-14T21:15:46Z", "updated_at": "2022-09-17T03:39:05Z", "closed_at": "2022-09-17T03:38:33Z", "author_association": "OWNER", "pull_request": null, "body": "That plugin hook's current signature is:\r\n\r\nhttps://github.com/simonw/datasette/blob/610425460b519e9c16d386cb81aa081c9d730ef0/datasette/hookspecs.py#L28-L30\r\n\r\nAs a result, in the first alpha release of `datasette-edit-templates` I had to include this horrific hack: https://github.com/simonw/datasette-edit-templates/blob/087f6a6cabc20020f2b0524f11aa3a7836320848/datasette_edit_templates/__init__.py#L72-L75\r\n\r\n```python\r\n@hookimpl\r\ndef prepare_jinja2_environment(env):\r\n # TODO: This should ideally take datasette, but that's not an argument yet\r\n datasette = inspect.currentframe().f_back.f_back.f_back.f_back.f_locals[\"self\"]\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1809/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1363766973, "node_id": "I_kwDOCGYnMM5RSW69", "number": 484, "title": "Expose convert recipes to `sqlite-utils --functions`", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2022-09-06T20:15:08Z", "updated_at": "2022-09-07T19:09:52Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "`--functions` was added in:\r\n- #471 \r\n\r\nIt would be useful if the `r.jsonsplit()` and similar recipes for `sqlite-utils convert` could be used in these blocks of code too: https://sqlite-utils.datasette.io/en/stable/cli.html#sqlite-utils-convert-recipes", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/484/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 1173023272, "node_id": "I_kwDOCGYnMM5F6uoo", "number": 416, "title": "Options for how 
`r.parsedate()` should handle invalid dates", "user": {"value": 638427, "label": "mattkiefer"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2022-03-17T23:29:55Z", "updated_at": "2022-05-03T21:36:49Z", "closed_at": "2022-03-21T04:01:39Z", "author_association": "NONE", "pull_request": null, "body": "Exceptions are normal expected behavior when typecasting an invalid format. However, r.parsedate() is really just re-formatting strings and keeping the type as text. So it may be better to print-and-pass on exception so the user can see a complete list of invalid values -- while also allowing for the parser to reformat the remaining valid values. \r\n```\r\nsqlite-utils convert idfpr.db license \"Expiration Date\" \"r.parsedate(value)\"\r\n [#######-----------------------------] 21% 00:01:57Traceback (most recent call last):\r\n File \"/usr/local/lib/python3.9/dist-packages/sqlite_utils/db.py\", line 2336, in convert_value\r\n return fn(v)\r\n File \"<string>\", line 2, in fn\r\n File \"/usr/local/lib/python3.9/dist-packages/sqlite_utils/recipes.py\", line 8, in parsedate\r\n parser.parse(value, dayfirst=dayfirst, yearfirst=yearfirst).date().isoformat()\r\n File \"/usr/lib/python3/dist-packages/dateutil/parser/_parser.py\", line 1374, in parse\r\n return DEFAULTPARSER.parse(timestr, **kwargs)\r\n File \"/usr/lib/python3/dist-packages/dateutil/parser/_parser.py\", line 652, in parse\r\n raise ParserError(\"String does not contain a date: %s\", timestr)\r\ndateutil.parser._parser.ParserError: String does not contain a date: / / \r\n```\r\nIn this case, I had just one variation of an invalid date: ' / / '. But theoretically there could be many values that would have to be fixed one at a time with the current exception handling. ", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/416/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1175854982, "node_id": "I_kwDOBm6k_c5GFh-G", "number": 1679, "title": "Research: how much overhead does the n=1 time limit have?", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 11, "created_at": "2022-03-21T19:27:46Z", "updated_at": "2022-03-21T21:55:57Z", "closed_at": "2022-03-21T21:55:56Z", "author_association": "OWNER", "pull_request": null, "body": "https://github.com/simonw/datasette/blob/1a7750eb29fd15dd2eea3b9f6e33028ce441b143/datasette/utils/__init__.py#L181-L200\r\n\r\n```python\r\n@contextmanager\r\ndef sqlite_timelimit(conn, ms):\r\n deadline = time.perf_counter() + (ms / 1000)\r\n # n is the number of SQLite virtual machine instructions that will be\r\n # executed between each check. It's hard to know what to pick here.\r\n # After some experimentation, I've decided to go with 1000 by default and\r\n # 1 for time limits that are less than 50ms\r\n n = 1000\r\n if ms < 50:\r\n n = 1\r\n\r\n def handler():\r\n if time.perf_counter() >= deadline:\r\n return 1\r\n\r\n conn.set_progress_handler(handler, n)\r\n try:\r\n yield\r\n finally:\r\n conn.set_progress_handler(None, n)\r\n```\r\nHow often do I set a time limit of 50 or less? 
How much slower does it go thanks to this code?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1679/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1097091527, "node_id": "I_kwDOCGYnMM5BZEnH", "number": 369, "title": "Research how much of a difference analyze / sqlite_stat1 makes", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2022-01-09T03:03:36Z", "updated_at": "2022-02-03T21:07:41Z", "closed_at": "2022-02-03T21:07:35Z", "author_association": "OWNER", "pull_request": null, "body": "> Is there a downside to having a `sqlite_stat1` table if it has wildly incorrect statistics in it?\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/365#issuecomment-1008163050_\r\n\r\nMore generally: how much of a difference does the `sqlite_stat1` table created by `ANALYZE` make to queries?\r\n\r\nI'm particularly interested in `group by` / `count *` queries since Datasette uses those for faceting.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/369/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1099723916, "node_id": "I_kwDOBm6k_c5BjHSM", "number": 1590, "title": "Table+query JSON and CSV links broken when using `base_url` setting", "user": {"value": 1001306, "label": "eelkevdbos"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 7571612, "label": "Datasette 0.60"}, "comments": 11, "created_at": "2022-01-11T23:46:39Z", "updated_at": "2022-01-14T01:16:34Z", "closed_at": "2022-01-14T01:16:08Z", "author_association": "NONE", "pull_request": null, "body": "Datasette appends the prefix found in the `base_url` setting twice if a `base_url` is set.\r\n\r\nIn the following asgi example, I'm hosting a custom Datasette instance:\r\n\r\n```python\r\n# asgi.py\r\nimport pathlib\r\nfrom asgi_cors import asgi_cors\r\nfrom channels.routing import URLRouter\r\nfrom django.urls import re_path\r\nfrom datasette.app import Datasette\r\n\r\ndatasette_ = Datasette(\r\n files=[],\r\n settings={\r\n \"base_url\": \"/datasettes/\",\r\n \"plugins\": {}\r\n },\r\n config_dir=pathlib.Path('.'),\r\n)\r\napplication = URLRouter([\r\n re_path(r\"^datasettes/.*\", asgi_cors(datasette_.app(), allow_all=True)),\r\n])\r\n```\r\n\r\nRunning it with:\r\n```shell\r\n$ daphne -p 8002 asgi:application\r\n```\r\n\r\nUsing a simple query on the `_memory` table: \r\n```sql\r\nselect sqlite_version()\r\n```\r\n\r\nhttp://localhost:8002/datasettes/_memory?sql=select+sqlite_version%28%29\r\n\r\nIt renders the following upon inspection:\r\n![image](https://user-images.githubusercontent.com/1001306/149038851-aa842950-126a-467c-9a86-fae13bce6221.png)\r\n\r\nI am using datasette version `0.59.4`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": 
\"https://api.github.com/repos/simonw/datasette/issues/1590/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1077431957, "node_id": "I_kwDOCGYnMM5AOE6V", "number": 356, "title": "`sqlite-utils insert --convert` option", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2021-12-11T07:24:48Z", "updated_at": "2022-01-06T06:30:13Z", "closed_at": "2022-01-06T06:28:53Z", "author_association": "OWNER", "pull_request": null, "body": "Idea came to me while re-reading this: https://simonwillison.net/2021/Aug/6/sqlite-utils-convert/\r\n\r\nThis is a bit of a hack:\r\n```\r\ncat /tmp/log.txt | \\\r\n jq --raw-input '{line: .}' --compact-output | \\\r\n sqlite-utils insert /tmp/logs.db log - --nl\r\n```\r\nWould be great if you could pipe lines to `insert` and transform them on the way in.\r\n\r\nA `--convert python-code` option, modeled after `sqlite-utils convert`, could do this.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/356/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 1042569687, "node_id": "I_kwDOCGYnMM4-JFnX", "number": 335, "title": "sqlite-utils index-foreign-keys fails due to pre-existing index", "user": {"value": 596279, "label": "zaneselvans"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2021-11-02T16:22:11Z", "updated_at": "2021-11-14T22:55:56Z", "closed_at": "2021-11-14T22:55:56Z", "author_association": "NONE", "pull_request": null, "body": "While running the command:\r\n```sh\r\nsqlite-utils index-foreign-keys $SQLITE_DIR/pudl.sqlite\r\n```\r\n\r\nI got the following error:\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/home/zane/miniconda3/envs/pudl-dev/bin/sqlite-utils\", line 8, in <module>\r\n sys.exit(cli())\r\n File \"/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py\", line 829, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py\", line 1259, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py\", line 1066, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/click/core.py\", line 610, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/cli.py\", line 454, in index_foreign_keys\r\n db.index_foreign_keys()\r\n File \"/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py\", line 902, in index_foreign_keys\r\n table.create_index([fk.column])\r\n File \"/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py\", line 1563, in create_index\r\n self.db.execute(sql)\r\n File 
\"/home/zane/miniconda3/envs/pudl-dev/lib/python3.9/site-packages/sqlite_utils/db.py\", line 421, in execute\r\n return self.conn.execute(sql)\r\nsqlite3.OperationalError: index idx_generators_eia860_report_date already exists\r\n```\r\n\r\nThis DB was created with the foreign key constraint `PRAGMA` enabled and a bunch of column-level `CHECK` constraints. Is this an expected behavior? Should one not try to index foreign keys if FK constraints are already being enforced within the DB?\r\n\r\nI'm also noticing that the size of the DB after FK indexes have been added went from 483MB to 835MB, which seems like a much bigger jump than when I've done this previously.\r\n\r\nSoftware versions...\r\n* sqlite-utils 3.17.1\r\n* sqlite 3.36.0\r\n* SQLAlchemy 1.4.26 (used to create the DB)", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/335/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 963528457, "node_id": "MDU6SXNzdWU5NjM1Mjg0NTc=", "number": 1425, "title": "render_cell() hook should support returning an awaitable", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2021-08-08T22:32:29Z", "updated_at": "2021-08-09T07:14:35Z", "closed_at": "2021-08-09T03:00:37Z", "author_association": "OWNER", "pull_request": null, "body": "Many of the plugin hooks can return an awaitable - e.g. https://docs.datasette.io/en/stable/plugin_hooks.html#plugin-hook-extra-template-vars - but `render_cell()` doesn't support this.\r\n\r\nI recently found myself wanting to execute an additional SQL query from that hook, but it wasn't possible to do that since I couldn't use `await`.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1425/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 959999095, "node_id": "MDU6SXNzdWU5NTk5OTkwOTU=", "number": 1421, "title": "\"Query parameters\" form shows wrong input fields if query contains \"03:31\" style times", "user": {"value": 6988, "label": "j4mie"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2021-08-04T07:29:04Z", "updated_at": "2021-08-09T03:41:07Z", "closed_at": "2021-08-09T03:33:02Z", "author_association": "NONE", "pull_request": null, "body": "Datasette version `0.58.1`.\r\n\r\nI'm guessing this is a bug in the code that looks for `:param`-style query parameters..\r\n\r\n\"image\"\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1421/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 811367257, "node_id": "MDU6SXNzdWU4MTEzNjcyNTc=", "number": 1231, "title": "Race condition errors in new refresh_schemas() 
mechanism", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2021-02-18T18:49:54Z", "updated_at": "2021-07-16T19:44:59Z", "closed_at": "2021-07-16T19:44:59Z", "author_association": "OWNER", "pull_request": null, "body": "I tried running a Locust load test against Datasette and hit an error message about a failure to create tables because they already existed. I think this means there are race conditions in the new `refresh_schemas()` mechanism added in #1150.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1231/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 770448622, "node_id": "MDU6SXNzdWU3NzA0NDg2MjI=", "number": 1151, "title": "Database class mechanism for cross-connection in-memory databases", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6346396, "label": "Datasette 0.54"}, "comments": 11, "created_at": "2020-12-17T23:25:43Z", "updated_at": "2021-01-26T19:07:44Z", "closed_at": "2020-12-18T01:01:26Z", "author_association": "OWNER", "pull_request": null, "body": "> Next challenge: figure out how to use the `Database` class from https://github.com/simonw/datasette/blob/0.53/datasette/database.py for an in-memory database which persists data for the duration of the lifetime of the server, and allows access to that in-memory database from multiple threads in a way that lets them see each other's changes.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1150#issuecomment-747768112_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1151/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 615474990, "node_id": "MDU6SXNzdWU2MTU0NzQ5OTA=", "number": 21, "title": "bpylist.archiver.CircularReference: archive has a cycle with uid(13)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2020-05-10T20:58:06Z", "updated_at": "2020-12-19T07:44:49Z", "closed_at": "2020-05-10T21:57:13Z", "author_association": "MEMBER", "pull_request": null, "body": "```\r\n% python -i $(which photos-to-sqlite) apple-photos photos.db \r\nTraceback (most recent call last):\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/photoinfo.py\", line 611, in place\r\n return self._place # pylint: disable=access-member-before-definition\r\nAttributeError: 'PhotoInfo' object has no attribute '_place'\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/bin/photos-to-sqlite\", line 11, in <module>\r\n load_entry_point('photos-to-sqlite', 'console_scripts', 'photos-to-sqlite')()\r\n File 
\"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py\", line 829, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py\", line 1259, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py\", line 1066, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py\", line 610, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/Users/simon/Dropbox/Development/photos-to-sqlite/photos_to_sqlite/cli.py\", line 249, in apple_photos\r\n photo_row = osxphoto_to_row(sha256, photo)\r\n File \"/Users/simon/Dropbox/Development/photos-to-sqlite/photos_to_sqlite/utils.py\", line 91, in osxphoto_to_row\r\n place = photo.place\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/photoinfo.py\", line 614, in place\r\n self._place = PlaceInfo5(self._info[\"reverse_geolocation\"])\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py\", line 505, in __init__\r\n self._plrevgeoloc = archiver.unarchive(revgeoloc_bplist)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 16, in unarchive\r\n return Unarchive(plist).top_object()\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 256, in top_object\r\n return self.decode_object(self.top_uid)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 247, in decode_object\r\n obj = klass.decode_archive(ArchivedObject(raw_obj, self))\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py\", line 126, in decode_archive\r\n mapItem = archive.decode(\"mapItem\")\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 140, in decode\r\n return self._unarchiver.decode_key(self._object, key)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 216, in decode_key\r\n return self.decode_object(val)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 247, in decode_object\r\n obj = klass.decode_archive(ArchivedObject(raw_obj, self))\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py\", line 180, in decode_archive\r\n sortedPlaceInfos = archive.decode(\"sortedPlaceInfos\")\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 140, in decode\r\n return self._unarchiver.decode_key(self._object, key)\r\n File 
\"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 216, in decode_key\r\n return self.decode_object(val)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 247, in decode_object\r\n obj = klass.decode_archive(ArchivedObject(raw_obj, self))\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 112, in decode_archive\r\n return [archive._decode_index(index) for index in uids]\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 112, in <listcomp>\r\n return [archive._decode_index(index) for index in uids]\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 137, in _decode_index\r\n return self._unarchiver.decode_object(index)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 247, in decode_object\r\n obj = klass.decode_archive(ArchivedObject(raw_obj, self))\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py\", line 217, in decode_archive\r\n placeType = archive.decode(\"placeType\")\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 140, in decode\r\n return self._unarchiver.decode_key(self._object, key)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 216, in decode_key\r\n return self.decode_object(val)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 227, in decode_object\r\n raise CircularReference(index)\r\nbpylist.archiver.CircularReference: archive has a cycle with uid(13)\r\n```\r\nIn the debugger I traced this back to:\r\n```\r\n178 \t @staticmethod\r\n179 \t def decode_archive(archive):\r\n180 ->\t sortedPlaceInfos = archive.decode(\"sortedPlaceInfos\")\r\n181 \t finalPlaceInfos = archive.decode(\"finalPlaceInfos\")\r\n182 \t return PLRevGeoMapItem(sortedPlaceInfos, finalPlaceInfos)\r\n```", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 644582921, "node_id": "MDU6SXNzdWU2NDQ1ODI5MjE=", "number": 865, "title": "base_url doesn't seem to work when adding criteria and clicking \"apply\"", "user": {"value": 6739646, "label": "tballison"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 11, "created_at": "2020-06-24T12:39:57Z", "updated_at": "2020-11-12T23:49:24Z", "closed_at": "2020-10-20T05:22:59Z", "author_association": "NONE", "pull_request": null, "body": "Over on Apache Tika, we're using datasette to allow users to make sense of the metadata for our file regression testing corpus.\r\n\r\nThis could be user error in how I've set up the reverse proxy!\r\n\r\nI started datasette like 
so:\r\n`docker run -d -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/corpora-metadata.db --config sql_time_limit_ms:60000 --config base_url:/datasette/`\r\n\r\nI then reverse proxied like so:\r\n\r\n ProxyPreserveHost On\r\n ProxyPass /datasette http://x.y.z.q:xxxx\r\n ProxyPassReverse /datasette http://x.y.z.q:xxx\r\n\r\nRegular sql works perfectly:\r\nhttps://corpora.tika.apache.org/datasette/corpora-metadata?sql=select+mime_string%2C+count%281%29+as+cnt%0D%0Afrom+profiles+p%0D%0Ajoin+mimes+m+on+p.mime_id%3Dm.mime_id%0D%0Agroup+by+mime_string%0D%0Aorder+by+cnt+desc\r\n\r\n\r\nHowever, adding criteria and clicking 'Apply' \r\nhttps://corpora.tika.apache.org/datasette/corpora-metadata/tika_1_24_1_mimes?_sort=file&mime__exact=text%2Fplain\r\n\r\nbounces back to:\r\nhttps://corpora.tika.apache.org/corpora-metadata/tika_1_24_1_mimes?_sort=file&file__contains=bug&mime__exact=text%2Fplain", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/865/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 573755726, "node_id": "MDU6SXNzdWU1NzM3NTU3MjY=", "number": 690, "title": "Mechanism for plugins to add action menu items for various things", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 11, "created_at": "2020-03-02T06:48:36Z", "updated_at": "2020-10-30T05:20:43Z", "closed_at": "2020-10-30T05:20:42Z", "author_association": "OWNER", "pull_request": null, "body": "Now that we have support for plugins that can write, I'm seeing all sorts of places where a plugin might need to add UI to the table page.\r\n\r\nSome examples:\r\n\r\n- `datasette-configure-fts` needs to add a \"configure search for this table\" link\r\n- a plugin that lets you render or delete tables needs to add a link or button somewhere\r\n- existing plugins like `datasette-vega` and `datasette-cluster-map` already do this with JavaScript\r\n\r\nThe challenge here is that multiple plugins may want to do this, so simply overriding templates and populating named blocks doesn't entirely work as templates may override each other.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/690/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 663228985, "node_id": "MDU6SXNzdWU2NjMyMjg5ODU=", "number": 904, "title": "datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 11, "created_at": "2020-07-21T18:42:52Z", "updated_at": "2020-10-23T19:44:05Z", "closed_at": "2020-10-20T00:51:51Z", "author_association": "OWNER", "pull_request": null, "body": "I tried using this block of template in a plugin and got an error:\r\n```html\r\n{% block nav %}\r\n
<p class=\"crumbs\">\r\n <a href=\"{{ database_url(database) }}\">home</a> /\r\n {{ database }} /\r\n {{ table }}\r\n </p>\r\n {{ super() }}\r\n{% endblock %}\r\n```\r\nError: `'database_url' is undefined`\r\n\r\nThat's because `database_url` is only made available by the BaseView template here:\r\n\r\nhttps://github.com/simonw/datasette/blob/d6e03b04302a0852e7133dc030eab50177c37be7/datasette/views/base.py#L110-L125", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/904/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 702069429, "node_id": "MDU6SXNzdWU3MDIwNjk0Mjk=", "number": 967, "title": "Writable canned queries with magic parameters fail if POST body is empty", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2020-09-15T16:14:43Z", "updated_at": "2020-09-15T20:13:10Z", "closed_at": "2020-09-15T20:13:10Z", "author_association": "OWNER", "pull_request": null, "body": "When I try to use the new `?_json=1` feature from #880 with magic parameters from #842 I get this error:\r\n\r\n> Incorrect number of bindings supplied. The current statement uses 1, and there are 0 supplied", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/967/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 648637666, "node_id": "MDU6SXNzdWU2NDg2Mzc2NjY=", "number": 880, "title": "POST to /db/canned-query that returns JSON should be supported (for API clients)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5818042, "label": "Datasette 0.49"}, "comments": 11, "created_at": "2020-07-01T03:14:43Z", "updated_at": "2020-09-14T21:28:21Z", "closed_at": "2020-09-14T21:25:01Z", "author_association": "OWNER", "pull_request": null, "body": "Now that CSRF is solved for API requests (#835) it would be good to support API requests to the `.json` extension.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/880/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 632724154, "node_id": "MDU6SXNzdWU2MzI3MjQxNTQ=", "number": 805, "title": "Writable canned queries live demo on Glitch", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2020-06-06T20:52:13Z", "updated_at": "2020-07-01T22:44:01Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Needs to run somewhere with a mutable disk drive, so not Cloud Run or Heroku or Vercel.\r\n\r\nI think I'll put it on Glitch.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": 
\"https://api.github.com/repos/simonw/datasette/issues/805/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 572896293, "node_id": "MDU6SXNzdWU1NzI4OTYyOTM=", "number": 687, "title": "Expand plugins documentation to multiple pages", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5533512, "label": "Datasette 0.45"}, "comments": 11, "created_at": "2020-02-28T17:26:21Z", "updated_at": "2020-06-22T03:55:20Z", "closed_at": "2020-06-22T03:53:54Z", "author_association": "OWNER", "pull_request": null, "body": "I think the plugins docs need to extend beyond a single page now. I want to add a whole section on writing tests for plugins, showing how `httpx` can be used as seen in https://github.com/simonw/datasette-atom/issues/3 and suchlike.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/687/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 628025100, "node_id": "MDU6SXNzdWU2MjgwMjUxMDA=", "number": 785, "title": "Datasette secret mechanism - initially for signed cookies", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5512395, "label": "Datasette 0.44"}, "comments": 11, "created_at": "2020-05-31T19:14:52Z", "updated_at": "2020-06-06T00:43:40Z", "closed_at": "2020-06-01T00:18:40Z", "author_association": "OWNER", "pull_request": null, "body": "See comment in https://github.com/simonw/datasette/issues/784#issuecomment-636514974\r\n\r\nDatasette needs to be able to set signed cookies - which means it needs a mechanism for safely handling a signing secret.\r\n\r\nSince Datasette is a long-running process the default behaviour here can be to create a random secret on startup. This means that if the server restarts any signed cookies will be invalidated.\r\n\r\nIf the user wants a persistent secret they'll have to generate it themselves - maybe by setting an environment variable?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/785/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 590666760, "node_id": "MDU6SXNzdWU1OTA2NjY3NjA=", "number": 39, "title": "--since feature can be confused by retweets", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2020-03-30T23:25:33Z", "updated_at": "2020-04-01T03:45:16Z", "closed_at": "2020-04-01T03:45:16Z", "author_association": "MEMBER", "pull_request": null, "body": "If you run `twitter-to-sqlite user-timeline ... 
--since` it's supposed to fetch Tweets that those specific users tweeted since the last time the command was run.\r\n\r\nIt does this by seeking out the max ID of their previous tweets:\r\n\r\nhttps://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/cli.py#L305-L311\r\n\r\nBUT... this has a nasty flaw: if another account had retweeted one of their recent tweets, the retweeted-tweet will have been loaded into the database - so we may treat that as the most recent since ID and miss a bunch of their tweets!", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/39/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 565518772, "node_id": "MDU6SXNzdWU1NjU1MTg3NzI=", "number": 673, "title": "Mechanism for checking if a SQLite database file is safe to open", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2020-02-14T19:36:04Z", "updated_at": "2020-02-14T20:13:59Z", "closed_at": "2020-02-14T20:13:59Z", "author_association": "OWNER", "pull_request": null, "body": "Opening a SpatiaLite database file without SpatiaLite will result in errors later on. Same for database files which use custom extensions, like the Apple Photos database.\r\n\r\nI've figured out how to tell if a database is safe to open or not:\r\n```sql\r\nselect sql from sqlite_master where sql like 'CREATE VIRTUAL TABLE%';\r\n```\r\nThis returns the SQL definitions for virtual tables. 
The bit after `using` tells you what they need.\r\n\r\nRun this against a SpatiaLite database and you get the following:\r\n```sql\r\nCREATE VIRTUAL TABLE SpatialIndex USING VirtualSpatialIndex()\r\nCREATE VIRTUAL TABLE ElementaryGeometries USING VirtualElementary()\r\n```\r\nRun it against an Apple Photos `photos.db` file (found with `find ~/Library | grep photos.db`) and you get this (partial list):\r\n```sql\r\nCREATE VIRTUAL TABLE RidList_VirtualReader using RidList_VirtualReaderModule\r\nCREATE VIRTUAL TABLE Array_VirtualReader using Array_VirtualReaderModule\r\nCREATE VIRTUAL TABLE LiGlobals_VirtualBufferReader using VirtualBufferReaderModule\r\nCREATE VIRTUAL TABLE RKPlace_RTree using rtree (modelId,minLongitude,maxLongitude,minLatitude,maxLatitude)\r\n```\r\nFor a database with FTS4 you get:\r\n```sql\r\nCREATE VIRTUAL TABLE \"docs_fts\" USING FTS4 (\r\n [title], [content], content=\"docs\"\r\n)\r\n```\r\nFTS5:\r\n```sql\r\nCREATE VIRTUAL TABLE [FARA_All_Registrants_fts] USING FTS5 (\r\n [Name], [Address_1], [Address_2],\r\n content=[FARA_All_Registrants]\r\n )\r\n```\r\nSo I can use this to figure out all of the `using` pieces and then compare them to a list of known supported ones.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/pull/672#issuecomment-586441484_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/673/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 316444720, "node_id": "MDU6SXNzdWUzMTY0NDQ3MjA=", "number": 233, "title": "Option to expose expanded foreign keys in JSON/CSV", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2018-04-21T00:18:25Z", "updated_at": "2018-06-16T22:26:21Z", "closed_at": "2018-06-16T22:20:14Z", "author_association": "OWNER", "pull_request": null, "body": "https://datasette-cluster-map-demo.datasettes.com/sf-trees-02c8ef1/Street_Tree_List?qCareAssistant=1\r\n\r\n![f36b87c0-478e-4d55-9a5f-ad37df0b47cb](https://user-images.githubusercontent.com/9599/39078411-bb3e4f88-44be-11e8-9d0c-d22324793c77.png)\r\n\r\nIt would be nice if the info bubbles there could expose more than just the IDs, and if the title showed the expanded name of the selected qCareAssistant.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/233/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 268470572, "node_id": "MDU6SXNzdWUyNjg0NzA1NzI=", "number": 40, "title": "Implement command-line tool interface", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 2857392, "label": "Ship first public release"}, "comments": 11, "created_at": "2017-10-25T16:47:15Z", "updated_at": "2017-11-11T07:27:33Z", "closed_at": "2017-11-11T07:27:33Z", "author_association": "OWNER", "pull_request": null, "body": "The first version needs to take one or more file names or URLs, then generate and deploy an app to Now. 
It will assume you already have the now command installed and configured.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/40/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"}