{"id": 766494367, "node_id": "MDExOlB1bGxSZXF1ZXN0NTM5NDg5NTI1", "number": 1145, "title": "Update pytest requirement from <6.2.0,>=5.2.2 to >=5.2.2,<6.3.0", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6346396, "label": "Datasette 0.54"}, "comments": 1, "created_at": "2020-12-14T14:22:16Z", "updated_at": "2021-01-24T21:20:29Z", "closed_at": "2020-12-16T21:44:39Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/1145", "body": "Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.\n
\nRelease notes\nSourced from pytest's releases: pytest 6.2.0 (2020-12-12), covering Breaking Changes, Deprecations and Features.\n\nChangelog\nSourced from pytest's changelog.\n\nCommits\n
\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1145/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 742011049, "node_id": "MDU6SXNzdWU3NDIwMTEwNDk=", "number": 1091, "title": ".json and .csv exports fail to apply base_url", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6346396, "label": "Datasette 0.54"}, "comments": 22, "created_at": "2020-11-12T23:45:16Z", "updated_at": "2021-01-24T21:20:24Z", "closed_at": "2021-01-09T22:19:29Z", "author_association": "OWNER", "pull_request": null, "body": "> Just tested with the latest Docker image, and it works pretty much everywhere! THANK YOU!\r\n> \r\n> I did notice that if I try to export json or csv, the base is not applied. Not sure if I should reopen this issue or open a new one.\r\n> \r\n> To see this, go here: https://corpora.tika.apache.org/datasette/corpora-metadata/REF_PARSE_EXCEPTION_TYPES\r\n> \r\n> Click/hover over json or CSV and you'll see that the 'datasette' base is not included.\r\n\r\n_Originally posted by @tballison in https://github.com/simonw/datasette/issues/865#issuecomment-726385422_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1091/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 760605882, "node_id": "MDU6SXNzdWU3NjA2MDU4ODI=", "number": 1135, "title": "Feature: --create option to create database file if it does not yet exist", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-12-09T19:23:58Z", "updated_at": "2021-01-24T21:19:39Z", "closed_at": "2020-12-09T19:45:52Z", "author_association": "OWNER", "pull_request": null, "body": "I'd like to be able to tell people to run the following in the Datasette documentation to get started:\r\n\r\n brew install datasette\r\n datasette install datasette-upload-csvs\r\n datasette data.db --create --root --open\r\n\r\nThis would give them a local Datasette instance with the ability to drag-and-drop CSV files directly into it.\r\n\r\nJust one catch: I don't want to have to talk them through creating an empty SQLite database file. 
So I want to add a new `--create` option which means \"If any of the database files passed on the command-line do not yet exist, create them\".", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1135/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 791381623, "node_id": "MDU6SXNzdWU3OTEzODE2MjM=", "number": 1197, "title": "DB size limit for publishing with Heroku", "user": {"value": 1186275, "label": "mtdukes"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-01-21T18:08:43Z", "updated_at": "2021-01-24T20:53:44Z", "closed_at": "2021-01-24T20:53:44Z", "author_association": "NONE", "pull_request": null, "body": "Hello,\r\nI tried searching for this, but can't seem to get a great answer: Does anybody know the size limit for databases deploying to Heroku? The files I'm working with are pretty large, but I might be able to pare them down if I have a limit in mind.\r\n\r\n I'm getting the following error when running `datasette heroku publish`:\r\n\r\n`RangeError [ERR_INVALID_OPT_VALUE]: The value \"14504095744\" is invalid for option \"size\"`", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1197/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 792625812, "node_id": "MDU6SXNzdWU3OTI2MjU4MTI=", "number": 1198, "title": "Plugin testing documentation on using pytest-httpx", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-01-23T18:46:16Z", "updated_at": "2021-01-24T20:40:38Z", "closed_at": "2021-01-24T20:38:43Z", "author_association": "OWNER", "pull_request": null, "body": "I keep on having to figure this out: if you use the https://pypi.org/project/pytest-httpx/ fixture to write tests against mocked external APIs, they will fail because that module will break Datasette's own `datasette.client` testing mechanism.\r\n\r\nYou can fix this using:\r\n\r\n```python\r\n@pytest.fixture\r\ndef non_mocked_hosts():\r\n return [\"localhost\"]\r\n```\r\n\r\nSee https://github.com/simonw/datasette-indieauth/blob/1.2/tests/test_indieauth.py\r\n\r\nI can add this tip to the https://docs.datasette.io/en/stable/testing_plugins.html page.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1198/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 725996507, "node_id": "MDU6SXNzdWU3MjU5OTY1MDc=", "number": 1036, "title": "Make it possible to download BLOB data from the Datasette UI", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 16, "created_at": 
"2020-10-20T22:47:56Z", "updated_at": "2021-01-18T17:45:00Z", "closed_at": "2020-10-25T00:14:52Z", "author_association": "OWNER", "pull_request": null, "body": "Currently you can only extract binary BLOB data as base64-encoded JSON, which is not user friendly at all. It should always be possible for end-users to get the binary data out.\r\n\r\nI'm worried about XSS vulnerabilities here, but hopefully sending `Content-Type: application/octet-stream` helps there? Need to research that.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1036/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 743400216, "node_id": "MDU6SXNzdWU3NDM0MDAyMTY=", "number": 11, "title": "Error thrown: sqlite3.OperationalError: table users has no column named lastName", "user": {"value": 61791, "label": "beaugunderson"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-11-16T01:21:18Z", "updated_at": "2021-01-18T04:35:22Z", "closed_at": "2021-01-18T04:35:22Z", "author_association": "NONE", "pull_request": null, "body": "Just installed `swarm-to-sqlite-0.3.2` and tried according to the docs:\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/usr/local/bin/swarm-to-sqlite\", line 8, in \r\n sys.exit(cli())\r\n File \"/usr/local/lib/python3.9/site-packages/click/core.py\", line 829, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/usr/local/lib/python3.9/site-packages/click/core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"/usr/local/lib/python3.9/site-packages/click/core.py\", line 1066, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/usr/local/lib/python3.9/site-packages/click/core.py\", line 610, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/usr/local/lib/python3.9/site-packages/swarm_to_sqlite/cli.py\", line 73, in cli\r\n save_checkin(checkin, db)\r\n File \"/usr/local/lib/python3.9/site-packages/swarm_to_sqlite/utils.py\", line 82, in save_checkin\r\n checkins_table.m2m(\"users\", user, m2m_table=\"likes\", pk=\"id\")\r\n File \"/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py\", line 1914, in m2m\r\n id = other_table.insert(record, pk=pk, replace=True).last_pk\r\n File \"/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py\", line 1647, in insert\r\n return self.insert_all(\r\n File \"/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py\", line 1765, in insert_all\r\n self.insert_chunk(\r\n File \"/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py\", line 1575, in insert_chunk\r\n result = self.db.execute(query, params)\r\n File \"/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py\", line 200, in execute\r\n return self.conn.execute(sql, parameters)\r\nsqlite3.OperationalError: table users has no column named lastName\r\n```", "repo": {"value": 205429375, "label": "swarm-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/11/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 
787900412, "node_id": "MDU6SXNzdWU3ODc5MDA0MTI=", "number": 222, "title": ".m2m() should accept alter=True parameter", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-01-18T04:15:43Z", "updated_at": "2021-01-18T04:26:10Z", "closed_at": "2021-01-18T04:26:10Z", "author_association": "OWNER", "pull_request": null, "body": "Needed by https://github.com/dogsheep/swarm-to-sqlite/issues/11", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/222/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 548591089, "node_id": "MDU6SXNzdWU1NDg1OTEwODk=", "number": 657, "title": "Allow creation of virtual tables at startup", "user": {"value": 1055831, "label": "dazzag24"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-01-12T16:10:55Z", "updated_at": "2021-01-15T20:24:35Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hi,\r\n \r\nI've been experimenting with SQLite reading from huge datasets using this excellent Parquet extension from @cldellow.\r\nhttps://cldellow.com/2018/06/22/sqlite-parquet-vtable.html\r\nhttps://github.com/cldellow/sqlite-parquet-vtable\r\n\r\nThis works really well, but I was keen to see if I could combine datasette with this. Having previously experimented with the spatialite extension I knew that datasette supports loading extensions in the underlying sqlite instance. However I hit a blocker as the current design only allows SELECT statements to be executed and so I am unable to execute the crucial \r\n\r\nCREATE VIRTUAL TABLE .........\r\n\r\ncommand that is required to load the data from the parquet file into the table.\r\n\r\nIt seems like this would be a simple-ish change, but I don't know enough about the architecture of datasette to start implementing this myself? Could this be done as a datasette plugin? 
or would this require more fundamental changes at initialisation time?\r\n\r\nMy thoughts are that something at init time could detect that the user was loading a *.parquet file and then switch to a mode where it loads that via the \"CREATE VIRTUAL TABLE...\" rather than loading the *.db file in the default case??\r\n\r\nI'm happy to contribute code and testing, I just need some pointers on the best approach.\r\n\r\nThanks\r\nDarren", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/657/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 787104850, "node_id": "MDU6SXNzdWU3ODcxMDQ4NTA=", "number": 1192, "title": "Form Plugin for in-depth Datasette Querying", "user": {"value": 1024355, "label": "tomershvueli"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-01-15T18:24:50Z", "updated_at": "2021-01-15T18:24:50Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I envision a sort of easy-to-build form plugin that would be able to map a user's inputs to different fields/columns in a Datasette database. ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1192/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 783910901, "node_id": "MDU6SXNzdWU3ODM5MTA5MDE=", "number": 221, "title": ".add_missing_columns() does not take case insensitivity into account", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-01-12T05:01:00Z", "updated_at": "2021-01-12T23:17:33Z", "closed_at": "2021-01-12T23:17:33Z", "author_association": "OWNER", "pull_request": null, "body": "SQLite columns are case insensitive - but the `.add_missing_columns()` method doesn't know that. This means that it can crash if it identifies a column that is a case-insensitive duplicate of an existing column. https://github.com/simonw/sqlite-utils/blob/4cc82fd0bccc9d2eeb3510beb4e691d7da099f84/sqlite_utils/db.py#L1974-L1980", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/221/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 782708469, "node_id": "MDU6SXNzdWU3ODI3MDg0Njk=", "number": 1183, "title": "Take advantage of sqlite-utils cached table counts, if available", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-01-09T23:51:48Z", "updated_at": "2021-01-12T02:42:08Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "sqlite-utils 3.2 now has a mechanism for creating a `_counts` table with triggers to maintain counts for individual tables. 
Datasette could look for this table and use it for counts, dramatically speeding up places that currently suffer from slow counts. Refs #859.\r\n\r\nhttps://sqlite-utils.datasette.io/en/stable/python-api.html#cached-table-counts-using-triggers", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1183/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 778450486, "node_id": "MDU6SXNzdWU3Nzg0NTA0ODY=", "number": 1171, "title": "GitHub Actions workflow to build and sign macOS binary executables", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2021-01-04T23:36:59Z", "updated_at": "2021-01-07T19:36:00Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Using PyInstaller, as explored in #93 and https://til.simonwillison.net/python/packaging-pyinstaller\r\n\r\nThe bigger challenge will be the code signing bit. I'll need an Apple Developer account ($99/year) and some extensive CI fiddling.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1171/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 780767542, "node_id": "MDU6SXNzdWU3ODA3Njc1NDI=", "number": 1180, "title": "Lazily evaluated arguments for call_with_supported_arguments", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-01-06T18:43:34Z", "updated_at": "2021-01-07T18:56:24Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "While building https://github.com/simonw/datasette-export-notebook I thought it would be nice to be able to show a count of exported records on the page \"This will stream 10,422 records to your notebook\".\r\n\r\nNone of the documented arguments on https://docs.datasette.io/en/0.53/plugin_hooks.html#register-output-renderer-datasette expose the count. 
The closest is `sql` which could be executed as `select count(*) from ({sql})` but that's a bit inelegant.\r\n\r\nSo, idea: if your defined render function takes a `total` argument, the caller runs a `count(*)` query and passes you that total.\r\n\r\nTo implement this I would need to teach the `call_with_supported_arguments` that some arguments are lazy - they should execute a function (or an async function) but only if they are needed.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1180/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 497170355, "node_id": "MDU6SXNzdWU0OTcxNzAzNTU=", "number": 576, "title": "Documented internals API for use in plugins", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 10, "created_at": "2019-09-23T15:28:50Z", "updated_at": "2021-01-05T23:12:51Z", "closed_at": "2021-01-05T23:12:37Z", "author_association": "OWNER", "pull_request": null, "body": "Quite a few of the plugin hooks make a `datasette` instance of the Datasette class available to the plugins, so that they can look up configuration settings and execute database queries.\r\n\r\nThis means it should provide a documented, stable API so that plugin authors can rely on it.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/576/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 778682317, "node_id": "MDU6SXNzdWU3Nzg2ODIzMTc=", "number": 1173, "title": "GitHub Actions workflow to build manylinux binary", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-01-05T07:41:11Z", "updated_at": "2021-01-05T07:41:43Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Refs #1171 and #93", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1173/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 778530523, "node_id": "MDU6SXNzdWU3Nzg1MzA1MjM=", "number": 1172, "title": "/-/static should be excluded from auth and permission checks", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-01-05T02:53:41Z", "updated_at": "2021-01-05T02:53:41Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I want to set far future / immutable cache headers on everything served from `/-/static` and `/-/static-plugins`\r\n\r\nThis has security implications since it will be possible to see what plugins are installed by checking for known static URLs. 
I'm fine with that - performance is more important here.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1172/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 435819321, "node_id": "MDU6SXNzdWU0MzU4MTkzMjE=", "number": 436, "title": "400 Error when trying to register new user via https://publish.datasettes.com/", "user": {"value": 317694, "label": "nniiicc"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-04-22T17:55:00Z", "updated_at": "2021-01-04T20:15:42Z", "closed_at": "2021-01-04T20:15:41Z", "author_association": "NONE", "pull_request": null, "body": "Behavior: When registering a new user via Zeit - confirmation is sent and screen acknowledges registered user... When clicking grant access the next screen is a white 400 error message. \r\n\r\nReplicated: Chrome and Firefox; 2 different email accounts", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/436/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 459537047, "node_id": "MDU6SXNzdWU0NTk1MzcwNDc=", "number": 517, "title": "Add unit test for \"static\" mechanism in plugins", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-06-23T05:03:31Z", "updated_at": "2021-01-04T20:15:19Z", "closed_at": "2021-01-04T20:15:19Z", "author_association": "OWNER", "pull_request": null, "body": "Split out from #272 - this is actually quite tricky. 
Here's the relevant code:\r\n\r\nhttps://github.com/simonw/datasette/blob/35429f90894321eda7f2db31b9ea7976f31f73ac/datasette/utils.py#L602-L614", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/517/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 377156339, "node_id": "MDU6SXNzdWUzNzcxNTYzMzk=", "number": 371, "title": "datasette publish digitalocean plugin", "user": {"value": 82988, "label": "psychemedia"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-11-04T14:07:41Z", "updated_at": "2021-01-04T20:14:28Z", "closed_at": "2021-01-04T20:14:28Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Provide support for launching `datasette` on Digital Ocean.\r\n\r\nExample: [Deploy Docker containers into Digital Ocean](https://blog.machinebox.io/deploy-machine-box-in-digital-ocean-385265fbeafd).\r\n\r\nDigital Ocean also has a preconfigured VM running Docker that can be launched from the command line via the Digital Ocean API: [Docker One-Click Application](https://www.digitalocean.com/docs/one-clicks/docker/).\r\n\r\nRelated:\r\n- Launching containers in Digital Ocean servers running docker: [How To Provision and Manage Remote Docker Hosts with Docker Machine on Ubuntu 16.04](https://www.digitalocean.com/community/tutorials/how-to-provision-and-manage-remote-docker-hosts-with-docker-machine-on-ubuntu-16-04)\r\n- [How To Use Doctl, the Official DigitalOcean Command-Line Client](https://www.digitalocean.com/community/tutorials/how-to-use-doctl-the-official-digitalocean-command-line-client)", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/371/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 274264175, "node_id": "MDU6SXNzdWUyNzQyNjQxNzU=", "number": 102, "title": "datasette publish elasticbeanstalk", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2017-11-15T18:48:31Z", "updated_at": "2021-01-04T20:13:20Z", "closed_at": "2021-01-04T20:13:19Z", "author_association": "OWNER", "pull_request": null, "body": "It looks like Elastic Beanstalk is the most convenient way to deploy a docker container to AWS without first deploying a cluster.\r\n\r\nhttps://aws.amazon.com/blogs/devops/dockerizing-a-python-web-app/ looks helpful. 
We would need to automate the deployment with Boto: http://boto3.readthedocs.io/en/latest/reference/services/elasticbeanstalk.html", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/102/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 315142414, "node_id": "MDU6SXNzdWUzMTUxNDI0MTQ=", "number": 221, "title": "Allow plugins to add new cli sub commands ", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2018-04-17T16:40:13Z", "updated_at": "2021-01-04T20:12:14Z", "closed_at": "2021-01-04T20:12:14Z", "author_association": "OWNER", "pull_request": null, "body": "I could then test this out by having https://github.com/simonw/csvs-to-sqlite register itself as a plugin", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/221/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 267739593, "node_id": "MDU6SXNzdWUyNjc3Mzk1OTM=", "number": 18, "title": "See if I can get a websockets interface working", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2017-10-23T16:46:41Z", "updated_at": "2021-01-04T20:05:52Z", "closed_at": "2021-01-04T20:05:48Z", "author_association": "OWNER", "pull_request": null, "body": "Since I am already running on Sanic, how hard would it be to add a websocket endpoint that lets you talk to sqlite interactively?\r\n\r\nCould this be used to efficiently support streaming in answers to giant queries?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/18/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 274265878, "node_id": "MDU6SXNzdWUyNzQyNjU4Nzg=", "number": 103, "title": "datasette publish appengine", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2017-11-15T18:54:18Z", "updated_at": "2021-01-04T20:05:14Z", "closed_at": "2021-01-04T20:05:14Z", "author_association": "OWNER", "pull_request": null, "body": "Similar approach to Heroku, discussed in #90 \r\n\r\nLooks like this could be pretty easy: https://cloud.google.com/appengine/docs/flexible/python/quickstart", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/103/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 670209331, "node_id": 
"MDU6SXNzdWU2NzAyMDkzMzE=", "number": 913, "title": "Mechanism for passing additional options to `datasette my.db` that affect plugins", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2020-07-31T20:38:26Z", "updated_at": "2021-01-04T20:04:11Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "> It's a shame there's no obvious mechanism for passing additional options to `datasette my.db` that affect how plugins work.\r\n>\r\n>The only way I can think of at the moment is via environment variables:\r\n>\r\n> DATASETTE_INSERT_UNSAFE=1 datasette my.db\r\n>\r\n>This will have to do for the moment - it's ugly enough that people will at least know they are doing something unsafe, which is the goal here.\r\n_Originally posted by @simonw in https://github.com/simonw/datasette-insert/issues/15#issuecomment-667346438_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/913/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 751195017, "node_id": "MDU6SXNzdWU3NTExOTUwMTc=", "number": 1111, "title": "Accessing a database's `.json` is slow for very large SQLite files", "user": {"value": 15178711, "label": "asg017"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-11-26T00:27:27Z", "updated_at": "2021-01-04T19:57:53Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "I have a SQLite DB that's pretty large, 23GB and something like 300 million rows. I expect that most queries I run on it will be slow, which is fine, but there are some things that Datasette does that makes working with the DB very slow. 
Specifically, when I access the `.json` metadata for a table (which I believe comes from `datasette/views/database.py`), it takes 43 seconds for the request to come in:\r\n\r\n```bash\r\n$ time curl localhost:9999/out.json\r\n{\"database\": \"out\", \"size\": 24291454976, \"tables\": [{\"name\": \"PageviewsHour\", \"columns\": [\"file\", \"code\", \"page\", \"pageviews\"], \"primary_keys\": [], \"count\": null, \"hidden\": false, \"fts_table\": null, \"foreign_keys\": {\"incoming\": [], \"outgoing\": [{\"other_table\": \"PageviewsHourFiles\", \"column\": \"file\", \"other_column\": \"file_id\"}]}, \"private\": false}, {\"name\": \"PageviewsHourFiles\", \"columns\": [\"file_id\", \"filename\", \"sha256\", \"size\", \"day\", \"hour\"], \"primary_keys\": [\"file_id\"], \"count\": null, \"hidden\": false, \"fts_table\": null, \"foreign_keys\": {\"incoming\": [{\"other_table\": \"PageviewsHour\", \"column\": \"file_id\", \"other_column\": \"file\"}], \"outgoing\": []}, \"private\": false}, {\"name\": \"sqlite_sequence\", \"columns\": [\"name\", \"seq\"], \"primary_keys\": [], \"count\": 1, \"hidden\": false, \"fts_table\": null, \"foreign_keys\": {\"incoming\": [], \"outgoing\": []}, \"private\": false}], \"hidden_count\": 0, \"views\": [], \"queries\": [], \"private\": false, \"allow_execute_sql\": true, \"query_ms\": 43340.23213386536}\r\nreal\t0m43.417s\r\nuser\t0m0.006s\r\nsys\t0m0.016s\r\n```\r\n\r\nI suspect this is because a `COUNT(*)` is happening under the hood, which, when I run it through sqlite directly, does take around the same time:\r\n\r\n```bash\r\n$ time sqlite3 out.db < <(echo \"select count(*) from PageviewsHour;\")\r\n362794272\r\n\r\nreal\t0m44.523s\r\nuser\t0m2.497s\r\nsys\t0m6.703s\r\n```\r\n\r\nI'm using the `.json` request in the [Observable Datasette Client](https://observablehq.com/@asg017/datasette-client) to 1) verify that a link passed in is a reachable Datasette instance, and 2) a quick way to look at metadata for a db. A few different solutions I can think of:\r\n\r\n1. Have some other endpoint, like `/-/datasette.json` that the Observable Datasette client can fetch from to verify that the passed in URL is a valid Datasette (doesn't solve the slow problem, feel free to split this issue into 2)\r\n2. Have a way to turn off table counts when accessing a database's `.json` view, like `?no_count=1` or something\r\n3. Maybe have a timeout on the `table_counts()` function if it takes too long. 
which is odd, because it seems like it already does that (I think?), I can debug a little more if that's the case\r\n\r\nMore than happy to debug further, or send a PR if you like one of the proposals above!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1111/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 777677671, "node_id": "MDU6SXNzdWU3Nzc2Nzc2NzE=", "number": 1169, "title": "Prettier package not actually being cached", "user": {"value": 3637, "label": "benpickles"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2021-01-03T17:04:41Z", "updated_at": "2021-01-04T19:52:34Z", "closed_at": "2021-01-04T19:52:33Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "With the current configuration Prettier seems to be installed on every run - which can be [seen from the output](https://github.com/simonw/datasette/runs/1631686028?check_suite_focus=true#step:4:4):\r\n\r\n```\r\nnpx: installed 1 in 5.166s\r\n```\r\n\r\nPrettier isn't explicitly being installed (it's surprising that actually installing the dependencies isn't included in the [actions/cache docs](https://github.com/actions/cache/blob/main/examples.md#macos-and-ubuntu)) but it turns out that `npx` will automatically install the package for the specified command (it actually _guesses_ the package name from the name of the command). I'm not sure where Prettier ends up being installed but it doesn't appear to be in `~/.npm` according to the [post-cache output](https://github.com/simonw/datasette/runs/1631686028#step:7:2) (or `./node_modules` when I tested locally):\r\n\r\n```\r\nCache hit occurred on the primary key Linux-npm-565329898f77080e58b14d45cf816ab94877e6f2ece9d395c369c533548a7ee7, not saving cache.\r\n```\r\n\r\nI think there are a couple of approaches to tackling this, you could manually install/cache Prettier within the action, or add a `package.json` with Prettier. 
I would go with the latter because it's a more standard and maintainable approach and it will also ensure that, along with CI, anyone working on the project will run the same version of Prettier (you'll also get Dependabot JavaScript updates).\r\n\r\nI've tested the [`package.json` approach on a branch](https://github.com/simonw/datasette/compare/main...benpickles:cache-prettier) and am happy to turn it into a pull request if you fancy.\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1169/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 777707544, "node_id": "MDU6SXNzdWU3Nzc3MDc1NDQ=", "number": 219, "title": "reset_counts() method and command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2021-01-03T20:08:28Z", "updated_at": "2021-01-03T20:59:37Z", "closed_at": "2021-01-03T20:59:37Z", "author_association": "OWNER", "pull_request": null, "body": "> Thought: maybe there should be a `.reset_counts()` method too, for if the table gets out of date with the triggers.\r\n>\r\n> One way that could happen is if a table is dropped and recreated - the counts in the `_counts` table would likely no longer match the number of rows in that table.\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753545757_", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/219/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 777535402, "node_id": "MDU6SXNzdWU3Nzc1MzU0MDI=", "number": 215, "title": "Use _counts to speed up counts", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 9, "created_at": "2021-01-02T22:30:17Z", "updated_at": "2021-01-03T20:19:40Z", "closed_at": "2021-01-03T20:19:40Z", "author_association": "OWNER", "pull_request": null, "body": "Utility mechanism for taking advantage of the new `_counts` table from #212 would be nice.\r\n\r\nThese can trigger automatically if the `_counts` table exists, but since `sqlite-utils` needs to work against any existing database there should be a way of opting out of this optimization.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/215/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 738514367, "node_id": "MDU6SXNzdWU3Mzg1MTQzNjc=", "number": 202, "title": "sqlite-utils insert -f colname - for configuring full-text search", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-11-08T17:30:09Z", "updated_at": 
"2021-01-03T05:00:36Z", "closed_at": "2021-01-03T05:00:27Z", "author_association": "OWNER", "pull_request": null, "body": "A mechanism for specifying columns that should be configured for full-text search as part of the initial data import:\r\n\r\n sqlite-utils insert mydb.db articles articles.csv --csv -f title -f body", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/202/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 777543336, "node_id": "MDU6SXNzdWU3Nzc1NDMzMzY=", "number": 217, "title": "Rename .escape() to .quote()", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-01-02T23:40:52Z", "updated_at": "2021-01-03T04:27:38Z", "closed_at": "2021-01-03T04:15:23Z", "author_association": "OWNER", "pull_request": null, "body": "`.quote()` is a better name because it reflects that the method adds quotes around the value.\r\n\r\nThis method has never been documented so I'm going to rename it without a major version bump. ", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/217/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 777540352, "node_id": "MDU6SXNzdWU3Nzc1NDAzNTI=", "number": 216, "title": "database.triggers_dict introspection property", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-01-02T23:13:00Z", "updated_at": "2021-01-03T04:27:14Z", "closed_at": "2021-01-03T04:25:36Z", "author_association": "OWNER", "pull_request": null, "body": "Following #211", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/216/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 777530107, "node_id": "MDU6SXNzdWU3Nzc1MzAxMDc=", "number": 214, "title": "sqlite-utils enable-counts command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-01-02T21:45:48Z", "updated_at": "2021-01-03T04:26:44Z", "closed_at": "2021-01-03T04:26:44Z", "author_association": "OWNER", "pull_request": null, "body": "The CLI version of #212 and #213.\r\n\r\n # Enable counts for all tables:\r\n sqlite-utils enable-counts data.db\r\n\r\n # Enable counts for specific tables:\r\n sqlite-utils enable-counts data.db table1 table2", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/214/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, 
\"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 777560474, "node_id": "MDU6SXNzdWU3Nzc1NjA0NzQ=", "number": 218, "title": "\"sqlite-utils triggers\" command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-01-03T02:34:50Z", "updated_at": "2021-01-03T03:49:51Z", "closed_at": "2021-01-03T03:03:35Z", "author_association": "OWNER", "pull_request": null, "body": "A command to list the triggers in the database.\r\n\r\n sqlite-utils triggers my.db\r\n\r\nCan optionally take one or more tables:\r\n\r\n sqlite-utils triggers my.db table1 table2", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/218/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 777529979, "node_id": "MDU6SXNzdWU3Nzc1Mjk5Nzk=", "number": 213, "title": "db.enable_counts() method", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-01-02T21:44:55Z", "updated_at": "2021-01-02T22:04:02Z", "closed_at": "2021-01-02T22:04:02Z", "author_association": "OWNER", "pull_request": null, "body": "Following #212 it would be useful if there was a utility method for enabling counts for ALL tables in a database:\r\n\r\n```python\r\ndb.enable_counts()\r\n```\r\n\r\nOpen question: should this setup triggers for virtual tables such as FTS tables? Could that break things?\r\n\r\n", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/213/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 777392020, "node_id": "MDU6SXNzdWU3NzczOTIwMjA=", "number": 212, "title": "Mechanism for maintaining cache of table counts using triggers", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-01-02T02:58:53Z", "updated_at": "2021-01-02T21:40:27Z", "closed_at": "2021-01-02T21:40:27Z", "author_association": "OWNER", "pull_request": null, "body": "Counting all of the rows in a large table is expensive - this is one of the main causes of performance problems in Datasette when running against large databases.\r\n\r\nCarefully constructed SQL triggers could be used to maintain accurate cached counts for a table, by incrementing and decrementing a counter every time a row is inserted or deleted.\r\n\r\n`sqlite-utils` already has a mechanism for creating triggers for FTS - the `table.enable_fts(..., create_triggers=True)` method. 
How about a similar mechanism for setting up triggers to maintain a count of table rows?", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/212/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 777386465, "node_id": "MDU6SXNzdWU3NzczODY0NjU=", "number": 211, "title": "table.triggers_dict introspection property", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-01-02T02:04:00Z", "updated_at": "2021-01-02T02:10:10Z", "closed_at": "2021-01-02T02:10:10Z", "author_association": "OWNER", "pull_request": null, "body": "`table.triggers` currently returns a list of `Trigger` values. A `table.triggers_dict` property could behave like `columns_dict`, returning a dictionary mapping trigger names to their SQL definitions for that table.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/211/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 767685961, "node_id": "MDU6SXNzdWU3Njc2ODU5NjE=", "number": 210, "title": "Support of RData files", "user": {"value": 23739126, "label": "PeterBailey"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-12-15T15:04:14Z", "updated_at": "2021-01-02T00:02:40Z", "closed_at": "2021-01-02T00:02:40Z", "author_association": "NONE", "pull_request": null, "body": "Hi Simon,\r\n\r\nWould be great if you could ingest RData files!\r\n\r\nI could do this in a few lines of code but I am too lazy - sorry!\r\n\r\nPeter", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/210/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 766156875, "node_id": "MDU6SXNzdWU3NjYxNTY4NzU=", "number": 209, "title": "Test failure with sqlite 3.34 in test_cli.py::test_optimize", "user": {"value": 191622, "label": "meatcar"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-12-14T08:58:18Z", "updated_at": "2021-01-01T23:52:46Z", "closed_at": "2021-01-01T23:52:46Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "pytest output:\r\n```\r\n...\r\n============================== short test summary info ===============================\r\nFAILED tests/test_cli.py::test_optimize[tables0] - assert 1662976 < 1662976\r\nFAILED tests/test_cli.py::test_optimize[tables1] - assert 1667072 < 1662976\r\n===================== 2 failed, 538 passed, 3 skipped in 34.32s ======================\r\n```\r\n\r\nCame across this while packaging `sqlite-utils` for NixOS, but it can be recreated using the `alpine:edge` docker image as well as follows:\r\n\r\n```\r\ndocker run --rm -it 
alpine:edge /bin/sh\r\n# apk update && apk add git sqlite python3 gcc python3-dev musl-dev && python3 -m ensurepip\r\n# git clone https://github.com/simonw/sqlite-utils.git\r\n# cd sqlite-utils/\r\n# pip3 install -e .[test]\r\n# pytest\r\n```\r\n\r\nThis definitely works on sqlite v3.33.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/209/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 776101101, "node_id": "MDU6SXNzdWU3NzYxMDExMDE=", "number": 1161, "title": "Update a whole bunch of links to datasette.io instead of datasette.readthedocs.io", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-12-29T21:47:31Z", "updated_at": "2020-12-29T21:49:57Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "https://ripgrep.datasette.io/-/ripgrep?pattern=%28datasette%7Csqlite-utils%29%5C.readthedocs%5C.io", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1161/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 421546944, "node_id": "MDU6SXNzdWU0MjE1NDY5NDQ=", "number": 417, "title": "Datasette Library", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 12, "created_at": "2019-03-15T14:30:22Z", "updated_at": "2020-12-29T14:34:50Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "The ability to run Datasette in a mode where it automatically picks up new (or modified) files in a directory tree without needing to restart the server.\r\n\r\nSuggested command:\r\n\r\n datasette library /path/to/mydbs/", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/417/reactions\", \"total_count\": 8, \"+1\": 8, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 567902704, "node_id": "MDU6SXNzdWU1Njc5MDI3MDQ=", "number": 675, "title": "--cp option for datasette publish and datasette package for shipping additional files and directories", "user": {"value": 141844, "label": "aviflax"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 12, "created_at": "2020-02-19T22:55:56Z", "updated_at": "2020-12-28T18:49:21Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I\u2019m working on integrating Datasette into a documentation-oriented publishing workflow internally in my company, and in order to deploy the Docker image created by `datasette package` I need to add an additional file to the image \u2014 in my case, it\u2019s a sort of a deployment directive. 
I\u2019ve worked out a way to do this after the image has been created, but it\u2019s convoluted and brittle.\r\n\r\nSo it\u2019d be excellent if there was an additional option for this command, something like `--copy`.\r\n\r\nI\u2019d envision it looking something like:\r\n\r\n```shell\r\n$ datasette package --copy /the/source/path:/the/target/path data.db\r\n```\r\n\r\nI\u2019d be happy to help design, specify, implement, and test this feature, if you\u2019d be interested.\r\n\r\nThanks for the fantastic tools!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/675/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 770436876, "node_id": "MDU6SXNzdWU3NzA0MzY4NzY=", "number": 1150, "title": "Maintain an in-memory SQLite table of connected databases and their tables", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 32, "created_at": "2020-12-17T23:02:13Z", "updated_at": "2020-12-27T14:51:39Z", "closed_at": "2020-12-18T22:34:12Z", "author_association": "OWNER", "pull_request": null, "body": "I want Datasette to have its own internal metadata about connected tables, to power features like a paginated searchable homepage in #461. I want this to be a SQLite table.\r\n\r\nThis could also be part of the directory scanning mechanism prototyped in #672 - where Datasette can be set to continually scan a directory for new database files that it can serve.\r\n\r\nAlso relevant to the Datasette Library concept in #417.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1150/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 624490929, "node_id": "MDU6SXNzdWU2MjQ0OTA5Mjk=", "number": 28, "title": "Invalid SQL no such table: main.uploads", "user": {"value": 41439, "label": "dmd"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-05-25T21:25:39Z", "updated_at": "2020-12-24T22:26:22Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "http://127.0.0.1:8001/photos/photos_with_apple_metadata gives \"Invalid SQL no such table: main.uploads\"", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/28/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 456568880, "node_id": "MDU6SXNzdWU0NTY1Njg4ODA=", "number": 509, "title": "Support opening multiple databases with the same stem", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": {"value": 9599, "label": "simonw"}, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 4, "created_at": "2019-06-15T19:32:00Z", 
"updated_at": "2020-12-22T20:04:35Z", "closed_at": "2020-12-22T20:04:35Z", "author_association": "OWNER", "pull_request": null, "body": "e.g. I should be able to do this:\r\n\r\n datasette App/data.db Other_App/data.db\r\n\r\nThis currently errors because you can't have two databases taking the `/data` URL path.\r\n\r\nInstead, how about in this particular case assigning the second database `/data-1`?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/509/reactions\", \"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 771216293, "node_id": "MDU6SXNzdWU3NzEyMTYyOTM=", "number": 1155, "title": "Better internal database_name for _internal database", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 9, "created_at": "2020-12-18T22:47:27Z", "updated_at": "2020-12-22T20:04:35Z", "closed_at": "2020-12-22T20:04:35Z", "author_association": "OWNER", "pull_request": null, "body": "https://latest.datasette.io/login-as-root then https://latest.datasette.io/_internal\r\n\r\n\"_schemas__columns__132_rows\"\r\n\r\nThat `database_name` is ugly. It should just be `_internal`.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1155/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 612151767, "node_id": "MDU6SXNzdWU2MTIxNTE3Njc=", "number": 15, "title": "Expose scores from ZCOMPUTEDASSETATTRIBUTES", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2020-05-04T20:36:07Z", "updated_at": "2020-12-20T04:44:22Z", "closed_at": "2020-05-05T00:11:45Z", "author_association": "MEMBER", "pull_request": null, "body": "The Apple Photos database has a `ZCOMPUTEDASSETATTRIBUTES` that looks absurdly interesting... 
it has calculated scores for every photo:\r\n\r\n\"Photos__ZCOMPUTEDASSETATTRIBUTES\"\r\n", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 771324837, "node_id": "MDU6SXNzdWU3NzEzMjQ4Mzc=", "number": 53, "title": "--since support for favorites", "user": {"value": 27, "label": "anotherjesse"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-12-19T07:08:23Z", "updated_at": "2020-12-19T07:47:11Z", "closed_at": "2020-12-19T07:47:11Z", "author_association": "NONE", "pull_request": null, "body": "Having support for `--since` for updating your favorites would be ideal as the api is both slow and it only returns ~3k most recent favorites.\r\n\r\nhttps://twittercommunity.com/t/cant-get-all-favorite-tweets-by-rest-api/22007/3\r\n\r\nThe api seems to take an optional `since_id` parameter - https://developer.twitter.com/en/docs/twitter-api/v1/tweets/post-and-engage/api-reference/get-favorites-list", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/53/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 615474990, "node_id": "MDU6SXNzdWU2MTU0NzQ5OTA=", "number": 21, "title": "bpylist.archiver.CircularReference: archive has a cycle with uid(13)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2020-05-10T20:58:06Z", "updated_at": "2020-12-19T07:44:49Z", "closed_at": "2020-05-10T21:57:13Z", "author_association": "MEMBER", "pull_request": null, "body": "```\r\n% python -i $(which photos-to-sqlite) apple-photos photos.db \r\nTraceback (most recent call last):\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/photoinfo.py\", line 611, in place\r\n return self._place # pylint: disable=access-member-before-definition\r\nAttributeError: 'PhotoInfo' object has no attribute '_place'\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/bin/photos-to-sqlite\", line 11, in \r\n load_entry_point('photos-to-sqlite', 'console_scripts', 'photos-to-sqlite')()\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py\", line 829, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py\", line 1259, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File 
\"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py\", line 1066, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py\", line 610, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/Users/simon/Dropbox/Development/photos-to-sqlite/photos_to_sqlite/cli.py\", line 249, in apple_photos\r\n photo_row = osxphoto_to_row(sha256, photo)\r\n File \"/Users/simon/Dropbox/Development/photos-to-sqlite/photos_to_sqlite/utils.py\", line 91, in osxphoto_to_row\r\n place = photo.place\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/photoinfo.py\", line 614, in place\r\n self._place = PlaceInfo5(self._info[\"reverse_geolocation\"])\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py\", line 505, in __init__\r\n self._plrevgeoloc = archiver.unarchive(revgeoloc_bplist)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 16, in unarchive\r\n return Unarchive(plist).top_object()\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 256, in top_object\r\n return self.decode_object(self.top_uid)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 247, in decode_object\r\n obj = klass.decode_archive(ArchivedObject(raw_obj, self))\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py\", line 126, in decode_archive\r\n mapItem = archive.decode(\"mapItem\")\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 140, in decode\r\n return self._unarchiver.decode_key(self._object, key)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 216, in decode_key\r\n return self.decode_object(val)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 247, in decode_object\r\n obj = klass.decode_archive(ArchivedObject(raw_obj, self))\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py\", line 180, in decode_archive\r\n sortedPlaceInfos = archive.decode(\"sortedPlaceInfos\")\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 140, in decode\r\n return self._unarchiver.decode_key(self._object, key)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 216, in decode_key\r\n return self.decode_object(val)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 247, in decode_object\r\n obj = klass.decode_archive(ArchivedObject(raw_obj, self))\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 112, in decode_archive\r\n return [archive._decode_index(index) for index in uids]\r\n File 
\"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 112, in \r\n return [archive._decode_index(index) for index in uids]\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 137, in _decode_index\r\n return self._unarchiver.decode_object(index)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 247, in decode_object\r\n obj = klass.decode_archive(ArchivedObject(raw_obj, self))\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py\", line 217, in decode_archive\r\n placeType = archive.decode(\"placeType\")\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 140, in decode\r\n return self._unarchiver.decode_key(self._object, key)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 216, in decode_key\r\n return self.decode_object(val)\r\n File \"/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py\", line 227, in decode_object\r\n raise CircularReference(index)\r\nbpylist.archiver.CircularReference: archive has a cycle with uid(13)\r\n```\r\nIn the debugger I traced this back to:\r\n```\r\n178 \t @staticmethod\r\n179 \t def decode_archive(archive):\r\n180 ->\t sortedPlaceInfos = archive.decode(\"sortedPlaceInfos\")\r\n181 \t finalPlaceInfos = archive.decode(\"finalPlaceInfos\")\r\n182 \t return PLRevGeoMapItem(sortedPlaceInfos, finalPlaceInfos)\r\n```", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 771316301, "node_id": "MDU6SXNzdWU3NzEzMTYzMDE=", "number": 31, "title": "Searching for \"github-to-sqlite\" throws an error", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-12-19T06:07:20Z", "updated_at": "2020-12-19T06:18:07Z", "closed_at": "2020-12-19T06:18:07Z", "author_association": "MEMBER", "pull_request": null, "body": "https://datasette.io/-/beta?q=github-to-sqlite&sort=relevance&type=blog.db%2Fentries - \"no such column: to\"", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/31/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 443021509, "node_id": "MDU6SXNzdWU0NDMwMjE1MDk=", "number": 461, "title": "Paginate + search for databases/tables on the homepage", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 4, "created_at": "2019-05-11T18:05:34Z", "updated_at": "2020-12-17T22:14:46Z", 
"closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Split out from #460 - in order to support large numbers of connected databases the homepage needs to be paginated.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/461/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 750089847, "node_id": "MDU6SXNzdWU3NTAwODk4NDc=", "number": 1109, "title": "Deprecate --config in Datasette 1.0 (in favour of --setting)", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 0, "created_at": "2020-11-24T21:43:57Z", "updated_at": "2020-12-17T22:07:49Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I added a deprecation warning to this in #992.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1109/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 634663505, "node_id": "MDU6SXNzdWU2MzQ2NjM1MDU=", "number": 815, "title": "Group permission checks by request on /-/permissions debug page", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2020-06-08T14:25:23Z", "updated_at": "2020-12-17T22:06:48Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Now that we're making a LOT more permission checks (on the DB index page we do a check for every listed table for example) the `/-/permissions` page gets filled up pretty quickly.\r\n\r\nCan make this more readable by grouping permission checks by request. 
Have most recent request at the top of the page but the permission requests within that page sorted chronologically by most recent last.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/815/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 450032134, "node_id": "MDU6SXNzdWU0NTAwMzIxMzQ=", "number": 495, "title": "facet_m2m gets confused by multiple relationships", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-05-29T21:37:28Z", "updated_at": "2020-12-17T05:08:22Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I got this for a database I was playing with:\r\n\r\n\"hackathon__hacks_project__24_rows\"\r\n\r\nI think this is because of these three tables:\r\n\r\n\"hackathon__select___from_sqlite_master_\"\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/495/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 463492815, "node_id": "MDU6SXNzdWU0NjM0OTI4MTU=", "number": 534, "title": "500 error on m2m facet detection", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-07-03T00:42:42Z", "updated_at": "2020-12-17T05:08:22Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "This may help debug:\r\n```\r\ndiff --git a/datasette/facets.py b/datasette/facets.py\r\nindex 76d73e5..07a4034 100644\r\n--- a/datasette/facets.py\r\n+++ b/datasette/facets.py\r\n@@ -499,11 +499,14 @@ class ManyToManyFacet(Facet):\r\n \"outgoing\"\r\n ]\r\n if len(other_table_outgoing_foreign_keys) == 2:\r\n- destination_table = [\r\n- t\r\n- for t in other_table_outgoing_foreign_keys\r\n- if t[\"other_table\"] != self.table\r\n- ][0][\"other_table\"]\r\n+ try:\r\n+ destination_table = [\r\n+ t\r\n+ for t in other_table_outgoing_foreign_keys\r\n+ if t[\"other_table\"] != self.table\r\n+ ][0][\"other_table\"]\r\n+ except IndexError:\r\n+ import pdb; pdb.pm()\r\n # Only suggest if it's not selected already\r\n if (\"_facet_m2m\", destination_table) in args:\r\n continue\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/534/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 769397742, "node_id": "MDU6SXNzdWU3NjkzOTc3NDI=", "number": 3, "title": "sqlite-utils error on takeout import", "user": {"value": 231498, "label": "khimaros"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-12-17T01:18:48Z", "updated_at": "2020-12-17T01:19:04Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "```\r\n$ 
google-takeout-to-sqlite my-activity takeout.db /path/to/zip\r\n...\r\nsqlite3.OperationalError: no such table: main.my_activity\r\n```\r\n\r\nthere is no table create in `utils.py`, unlike other importers such as github-to-sqlite\r\n\r\nadditionally, this package and hackernews-to-sqlite have conflicting `sqlite-utils` dep with datasette and dogsheep-beta", "repo": {"value": 206649770, "label": "google-takeout-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/3/reactions\", \"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 769376447, "node_id": "MDU6SXNzdWU3NjkzNzY0NDc=", "number": 2, "title": "killed by oomkiller on large location-history", "user": {"value": 231498, "label": "khimaros"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-12-17T00:32:24Z", "updated_at": "2020-12-17T00:48:32Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "memory seems to grow unbounded and is oom-killed after about 20GB memory usage.\r\n\r\nthis is happening while loading a ~1GB uncompressed location history.", "repo": {"value": 206649770, "label": "google-takeout-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/2/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 767561886, "node_id": "MDU6SXNzdWU3Njc1NjE4ODY=", "number": 1148, "title": "Syntax error with + symbol when deployed to Vercel", "user": {"value": 765871, "label": "kaihendry"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-12-15T12:53:12Z", "updated_at": "2020-12-16T21:57:42Z", "closed_at": "2020-12-16T21:51:54Z", "author_association": "NONE", "pull_request": null, "body": "Works locally:\r\n```\r\n(ins)[hendry@t14s aws-partners]$ sqlite3 partners.db < test.sql\r\n5|{\"href\":\"https://partners.amazonaws.com/partners/001E000000NaBI0IAN\",\"label\":\"Slalom\"}\r\n6|{\"href\":\"https://partners.amazonaws.com/partners/001E000000VHBQIIA5\",\"label\":\"Accenture\"}\r\n7|{\"href\":\"https://partners.amazonaws.com/partners/001E000000Rp588IAB\",\"label\":\"Druva\"}\r\n8|{\"href\":\"https://partners.amazonaws.com/partners/001E0000013FeQXIA0\",\"label\":\"Palo Alto Networks\"}\r\n9|{\"href\":\"https://partners.amazonaws.com/partners/001E000000Rl12lIAB\",\"label\":\"New Relic\"}\r\n10|{\"href\":\"https://partners.amazonaws.com/partners/001E000000NaBHWIA3\",\"label\":\"Deloitte\"}\r\n11|{\"href\":\"https://partners.amazonaws.com/partners/001E000000Rp5GDIAZ\",\"label\":\"MegazoneCloud\"}\r\n12|{\"href\":\"https://partners.amazonaws.com/partners/001E000000NaBHMIA3\",\"label\":\"iret, Inc.\"}\r\n(ins)[hendry@t14s aws-partners]$ cat test.sql\r\nselect A.launch_rank, A.partner_info from summary A INNER JOIN summary B\r\n ON A.launch_rank>=B.launch_rank-3 AND A.launch_rank<=B.launch_rank+4\r\n WHERE B.\"partner_info\" LIKE '%Palo Alto%'\r\n```\r\n\r\nCopy the SQL into https://aws-partners-singapore.vercel.app/partners and get syntax 
error:\r\n\r\n![image](https://user-images.githubusercontent.com/765871/102217393-78e4f280-3f17-11eb-9e13-ca79ed62ec34.png)\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1148/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 769282206, "node_id": "MDU6SXNzdWU3NjkyODIyMDY=", "number": 30, "title": "Upgrade to sqlite-utils 3.0 (tests are failing)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-12-16T21:25:15Z", "updated_at": "2020-12-16T21:27:11Z", "closed_at": "2020-12-16T21:27:10Z", "author_association": "MEMBER", "pull_request": null, "body": "```\r\n results = beta_db[\"search_index\"].search(\"run\")\r\n if use_porter:\r\n assert results == [\r\n (\r\n \"dogs.db/dogs\",\r\n \"1\",\r\n \"Cleo\",\r\n \"2020-08-22 04:41:33\",\r\n 1,\r\n 0,\r\n \"running\",\r\n None,\r\n None,\r\n )\r\n ]\r\n else:\r\n> assert results == []\r\nE assert == []\r\nE +\r\nE -[]\r\nE Full diff:\r\nE - []\r\nE + \r\n```\r\nThis was caused by a backwards incompatible change in sqlite-utils 3.0: https://sqlite-utils.readthedocs.io/en/stable/changelog.html#v3-0", "repo": {"value": 197431109, "label": "dogsheep-beta"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-beta/issues/30/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 769150394, "node_id": "MDU6SXNzdWU3NjkxNTAzOTQ=", "number": 58, "title": "Readme HTML has broken internal links", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-12-16T17:58:11Z", "updated_at": "2020-12-16T19:20:14Z", "closed_at": "2020-12-16T19:20:14Z", "author_association": "MEMBER", "pull_request": null, "body": "From https://github.com/simonw/datasette.io/issues/46\r\n```html\r\n
  <!-- HTML sample garbled in extraction: a table-of-contents list item linking to the Filtering tables section, an ellipsis, and the corresponding Filtering tables heading markup it should point at -->
    \r\n```\r\nSo this is a bug in GitHub's API, but we need to work around it.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/58/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 758944006, "node_id": "MDU6SXNzdWU3NTg5NDQwMDY=", "number": 57, "title": "--readme throws 404 error if README does not exist in repo", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-12-07T23:58:49Z", "updated_at": "2020-12-16T18:17:54Z", "closed_at": "2020-12-16T18:17:54Z", "author_association": "MEMBER", "pull_request": null, "body": "It should fail silently (populate the column with a null) instead.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/57/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 718395987, "node_id": "MDExOlB1bGxSZXF1ZXN0NTAwNzk4MDkx", "number": 1008, "title": "Add json_loads and json_dumps jinja2 filters", "user": {"value": 649467, "label": "mhalle"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-10-09T20:11:34Z", "updated_at": "2020-12-15T02:30:28Z", "closed_at": null, "author_association": "FIRST_TIME_CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/1008", "body": "", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1008/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 761713079, "node_id": "MDU6SXNzdWU3NjE3MTMwNzk=", "number": 1138, "title": "\"Powered by Datasette\" should link to new datasette.io site", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-12-10T23:33:41Z", "updated_at": "2020-12-15T02:28:10Z", "closed_at": "2020-12-10T23:37:14Z", "author_association": "OWNER", "pull_request": null, "body": "https://datasette.io/", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1138/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 765637324, "node_id": "MDU6SXNzdWU3NjU2MzczMjQ=", "number": 1144, "title": "JavaScript to help plugins interact with the fragment part of the URL", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-12-13T20:36:06Z", "updated_at": "2020-12-14T14:47:11Z", "closed_at": null, 
"author_association": "OWNER", "pull_request": null, "body": "Suggested by Markus Holtermann on Twitter, who is building https://github.com/MarkusH/datasette-chartjs\r\n\r\n> I've been looking at datasette-vega for how you persist chart settings between form submissions. I've adopted that for datasette-chartjs. Any thoughts on adding a public JS API to #datasette itself, that plugins can rely on?\r\n>\r\n> I'm talking about functions like onFragmentChange, serialize, unserialize, ... That turn an object into a URL encoded string and put it into the location's hash. And also updating all links/forms automatically.\r\n>\r\n> Essentially, a plugins could do something like `document.datasette.setConfigValue('prefix', 'foo', 'bar')` and `.getConfigValue('prefix', 'foo')`. And the functions would take care of updating document.location.hash, all (necessary) a.href and form.action\r\n\r\nhttps://twitter.com/m_holtermann/status/1338183973311295492", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1144/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 763283616, "node_id": "MDU6SXNzdWU3NjMyODM2MTY=", "number": 207, "title": "sqlite-utils analyze-tables command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-12-12T04:33:12Z", "updated_at": "2020-12-13T07:25:23Z", "closed_at": "2020-12-13T07:20:13Z", "author_association": "OWNER", "pull_request": null, "body": "A command which analyzes a table (potentially taking quite a while if the table is large) and outputs information for each column - things like:\r\n\r\n- How many unique values does this column have?\r\n- How many null rows?\r\n- How many blank rows? 
(defined as empty string)\r\n- What are the 10 most common values?\r\n- What are the 10 least common values?\r\n\r\nThe command can output this information to the terminal, but it should also provide an option for writing the information to a database table so it can be explored later.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/207/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 763320133, "node_id": "MDExOlB1bGxSZXF1ZXN0NTM3NzkxNjc1", "number": 208, "title": "sqlite-utils analyze-tables command and table.analyze_column() method", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-12-12T05:27:49Z", "updated_at": "2020-12-13T07:20:16Z", "closed_at": "2020-12-13T07:20:12Z", "author_association": "OWNER", "pull_request": "simonw/sqlite-utils/pulls/208", "body": "Refs #207\r\n\r\n- [x] Improve design of CLI output\r\n- [x] Truncate long values in least/most common\r\n- [x] Add a `-c` column selection option\r\n- [x] Tests\r\n- [x] Documentation", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/208/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 717699884, "node_id": "MDU6SXNzdWU3MTc2OTk4ODQ=", "number": 998, "title": "Wide tables should scroll horizontally within the page", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 6026070, "label": "0.51"}, "comments": 8, "created_at": "2020-10-08T22:13:27Z", "updated_at": "2020-12-11T09:25:09Z", "closed_at": "2020-10-22T01:12:26Z", "author_association": "OWNER", "pull_request": null, "body": "Wrap the main table in `
    `", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/998/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 551834842, "node_id": "MDU6SXNzdWU1NTE4MzQ4NDI=", "number": 659, "title": "README information is obscured by feature history", "user": {"value": 55480210, "label": "labstersteve"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-01-18T22:34:51Z", "updated_at": "2020-12-10T23:28:51Z", "closed_at": "2020-12-10T23:28:51Z", "author_association": "NONE", "pull_request": null, "body": "While it's sometimes valuable to know how a project has developed, there is usually little justification for including this information in the README, and certainly not immediately after other key information such as \"what does this package do, and who might want to use it?\"\r\n\r\nMight I recommend that the feature history is migrated to an Appendix in the documentation?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/659/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 761706858, "node_id": "MDU6SXNzdWU3NjE3MDY4NTg=", "number": 1137, "title": "Update README to reflect new datasette.io site", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-12-10T23:22:06Z", "updated_at": "2020-12-10T23:28:50Z", "closed_at": "2020-12-10T23:28:50Z", "author_association": "OWNER", "pull_request": null, "body": "Can finally close #659.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1137/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 760960559, "node_id": "MDU6SXNzdWU3NjA5NjA1NTk=", "number": 205, "title": "sqlite3.OperationalError: near \"(\": syntax error", "user": {"value": 765871, "label": "kaihendry"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-12-10T06:44:40Z", "updated_at": "2020-12-10T19:18:22Z", "closed_at": "2020-12-10T07:24:23Z", "author_association": "NONE", "pull_request": null, "body": "The sqlite version is 3.22.0 2018-01-22 18:45:57 0c55d179733b46d8d0ba4d88e01a25e10677046ee3da1d5b1581e86726f2alt1\r\nsqlite-utils, version 3.0\r\n\r\nIt fails here: https://github.com/kaihendry/aws-partners-datasette/runs/1528432635?check_suite_focus=true\r\n\r\nI'm not sure where the problem is, since it works _fine locally_ on Archlinux system running 3.34.0 2020-12-01 16:14:00 a26b6597e3ae272231b96f9982c3bcc17ddec2f2b6eb4df06a224b91089fed5b\r\n\r\n\r\nhttps://github.com/kaihendry/aws-partners-datasette/blob/main/create-summary-view.sh\r\n\r\nMaybe I need to bump up from ubuntu-latest to ?\r\n", 
"repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/205/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 760312579, "node_id": "MDU6SXNzdWU3NjAzMTI1Nzk=", "number": 1134, "title": "\"_searchmode=raw\" throws an index out of range error when combined with \"_search_COLUMN\"", "user": {"value": 2181410, "label": "clausjuhl"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-12-09T13:05:37Z", "updated_at": "2020-12-10T05:57:17Z", "closed_at": "2020-12-09T19:56:55Z", "author_association": "NONE", "pull_request": null, "body": "Hi Simon!\r\nMaybe it's just me, but when [using _searchmode=raw (trying to enable wildcard-searching) in combination with the \"_search_COLUMN\"-table argument](https://byraadsarkivet.aarhus.dk/db/cases?_searchmode=raw&_search_title=sundhedsfrem*), I get a list index out of range error. [When combining with the simpler \"_search\"-argument everything works, including wildcard-seaches.](https://byraadsarkivet.aarhus.dk/db/cases?_search=sundhedsfrem*&_searchmode=raw). Here's the traceback:\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/utils/asgi.py\", line 122, in route_path\r\n return await view(new_scope, receive, send)\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/utils/asgi.py\", line 196, in view\r\n request, **scope[\"url_route\"][\"kwargs\"]\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/base.py\", line 204, in get\r\n request, database, hash, correct_hash_provided, **kwargs\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/base.py\", line 342, in view_get\r\n request, database, hash, **kwargs\r\n File \"/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/table.py\", line 393, in data\r\n search_col = key.split(\"_search_\", 1)[1]\r\nIndexError: list index out of range\r\n\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1134/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 760621356, "node_id": "MDU6SXNzdWU3NjA2MjEzNTY=", "number": 1136, "title": "Establish pattern for release branches to support bug fixes", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2020-12-09T19:48:18Z", "updated_at": "2020-12-09T20:17:02Z", "closed_at": "2020-12-09T20:14:41Z", "author_association": "OWNER", "pull_request": null, "body": "I want to fix the bug in #1134 and ship it as Datasette 0.52.5 - but the `main` branch now has a feature in it (4c25b035b2370983c8dd5e0c8762e9154e379774 added `arraynotcontains`, #1132).\r\n\r\nI'm not ready for a feature release, so instead I want to release 0.52.5 with 
just that bug fix.\r\n\r\nThis is the first time I will have shipped a release from a branch. I need to establish that pattern and add it to the documentation in https://docs.datasette.io/en/stable/contributing.html", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1136/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 752888228, "node_id": "MDExOlB1bGxSZXF1ZXN0NTI5MDkwNTYw", "number": 204, "title": "use jsonify_if_need for sql updates", "user": {"value": 78035, "label": "mfa"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-11-29T10:49:00Z", "updated_at": "2020-12-08T17:49:42Z", "closed_at": "2020-12-08T17:49:42Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/sqlite-utils/pulls/204", "body": "", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/204/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 758899581, "node_id": "MDU6SXNzdWU3NTg4OTk1ODE=", "number": 1132, "title": "New filter: array does not contain", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-12-07T22:28:20Z", "updated_at": "2020-12-07T22:50:36Z", "closed_at": "2020-12-07T22:41:09Z", "author_association": "OWNER", "pull_request": null, "body": "I want to see all of my GitHub repos that are tagged `datasette-plugin` but are NOT tagged `datasette-io`:\r\n\r\nhttps://github-to-sqlite.dogsheep.net/github/repos?_facet=owner&owner=9599&_facet_array=topics&topics__arraycontains=datasette-plugin&topics__arraycontains=datasette-io#facet-topics", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1132/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 756867924, "node_id": "MDExOlB1bGxSZXF1ZXN0NTMyMzQyMDI1", "number": 1128, "title": "Fix startup error on windows", "user": {"value": 3243482, "label": "abdusco"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-12-04T07:12:26Z", "updated_at": "2020-12-06T08:41:45Z", "closed_at": "2020-12-05T19:35:04Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/1128", "body": "Fixes https://github.com/simonw/datasette/issues/1094\r\n\r\nThis import isn't used at all, and causes error on startup on Windows.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1128/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, 
\"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 757481949, "node_id": "MDU6SXNzdWU3NTc0ODE5NDk=", "number": 1131, "title": "\"datasette inspect\" outputs invalid JSON if an error is logged", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-12-05T00:00:45Z", "updated_at": "2020-12-05T20:48:34Z", "closed_at": "2020-12-05T05:21:19Z", "author_association": "OWNER", "pull_request": null, "body": "See https://github.com/simonw/register-of-members-interests/issues/6:\r\n```\r\n% datasette inspect regmem.db \r\nERROR: conn=, sql = 'select count(*) from [items_fts]', params = None: SQL logic error\r\n{\r\n \"regmem\": {\r\n \"hash\": \"6fde27e3dea80d6b65f2ac7f89cd8448980fee8c91b505ba29c311ba0393317f\",\r\n \"size\": 936198144,\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1131/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 398011658, "node_id": "MDU6SXNzdWUzOTgwMTE2NTg=", "number": 398, "title": "Ensure downloading a 100+MB SQLite database file works", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 3, "created_at": "2019-01-10T20:57:52Z", "updated_at": "2020-12-05T19:36:27Z", "closed_at": "2020-12-05T19:36:27Z", "author_association": "OWNER", "pull_request": null, "body": "I've seen attempted downloads of large files fail after about ten seconds.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/398/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 743011397, "node_id": "MDU6SXNzdWU3NDMwMTEzOTc=", "number": 1094, "title": "import EX_CANTCREAT means datasette fails to work on Windows", "user": {"value": 1049910, "label": "drkane"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-11-14T14:17:11Z", "updated_at": "2020-12-05T19:35:04Z", "closed_at": "2020-12-05T19:35:04Z", "author_association": "NONE", "pull_request": null, "body": "Trying to use datasette 0.51.1 gives the following error:\r\n\r\n```\r\nImportError: cannot import name 'EX_CANTCREAT' from 'os' (C:\\Users\\drkan\\AppData\\Local\\Programs\\Python\\Python39\\lib\\os.py)\r\n```\r\n\r\nLooks like that code is only available on unix: https://docs.python.org/3/library/os.html#os.EX_CANTCREAT\r\n\r\nRemoving the line makes it work fine (`EX_CANTCREAT` doesn't seem to be used anywhere?)", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1094/reactions\", \"total_count\": 3, \"+1\": 3, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 638375985, "node_id": "MDExOlB1bGxSZXF1ZXN0NDM0MTYyMzE2", 
"number": 29, "title": "Fixed bug in SQL query for photo scores", "user": {"value": 41546558, "label": "RhetTbull"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-06-14T15:39:22Z", "updated_at": "2020-12-04T22:32:36Z", "closed_at": "2020-12-04T22:32:27Z", "author_association": "CONTRIBUTOR", "pull_request": "dogsheep/dogsheep-photos/pulls/29", "body": "The join on ZCOMPUTEDASSETATTRIBUTES used the wrong columns. In most of the Photos database tables, table.ZASSET joins with ZGENERICASSET.Z_PK", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/29/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 309047460, "node_id": "MDU6SXNzdWUzMDkwNDc0NjA=", "number": 188, "title": "Ability to bundle metadata and templates inside the SQLite file", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2018-03-27T16:42:07Z", "updated_at": "2020-12-04T17:18:34Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "One of the nicest qualities of SQLite as a data format is that you get a single file which you can then backup or share with other people. \r\n\r\nDatasette breaks this a little once you start including custom metadata.json or template files and CSS.\r\n\r\nIt would be cool if there was an optional mechanism for baking that extra configuration into the SQLite file itself. That way entire datasette mini-applications (including canned queries and custom HTML and CSS) could be constructed as single .db files.\r\n\r\nSince datasette configuration is all file-based, one way to achieve that would be to support a \"datasette_files\" table which, if present is used to search for file contents by path.\r\n\r\nThis is inline with the philosophy described by https://www.sqlite.org/appfileformat.html\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/188/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 756875827, "node_id": "MDU6SXNzdWU3NTY4NzU4Mjc=", "number": 1129, "title": "Fix footer to the bottom of the page", "user": {"value": 3243482, "label": "abdusco"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-12-04T07:28:07Z", "updated_at": "2020-12-04T16:04:29Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Footer doesn't stick to the bottom if the body content isn't long enough to reach the end of viewport.\r\n\r\n![before & after](https://user-images.githubusercontent.com/3243482/101134785-f6595a80-361b-11eb-81ce-b8b5cb9c5bc2.png)\r\n\r\n\r\nThis can be fixed using flexbox.\r\n\r\n```css\r\nbody {\r\n min-height: 100vh;\r\n display: flex;\r\n flex-direction: column;\r\n}\r\n\r\n.content {\r\n flex-grow: 1;\r\n}\r\n```\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, 
"performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1129/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 756622648, "node_id": "MDU6SXNzdWU3NTY2MjI2NDg=", "number": 1125, "title": "Show pysqlite3 version on /-/versions", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2020-12-03T21:57:23Z", "updated_at": "2020-12-04T04:16:57Z", "closed_at": "2020-12-04T04:16:57Z", "author_association": "OWNER", "pull_request": null, "body": "This code can use `pysqlite3` if available. The version should be exposed on `/-/versions`.\r\n\r\nhttps://github.com/simonw/datasette/blob/4cce5516661b24afeddaf35bee84b00fbf5c7f89/datasette/utils/sqlite.py#L1-L4", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1125/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 756761963, "node_id": "MDU6SXNzdWU3NTY3NjE5NjM=", "number": 1126, "title": "Switch to google-github-actions/setup-gcloud for demo deploy", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-12-04T03:11:24Z", "updated_at": "2020-12-04T03:51:38Z", "closed_at": "2020-12-04T03:51:38Z", "author_association": "OWNER", "pull_request": null, "body": "That workflow is showing warnings: https://github.com/simonw/datasette/actions/runs/399400077\r\n\r\n> Thank you for using setup-gcloud Action. GoogleCloudPlatform/github-actions/setup-gcloud has been deprecated, please switch to google-github-actions/setup-gcloud.\r\n\r\nhttps://github.com/google-github-actions/setup-gcloud has a Cloud Run example here: https://github.com/google-github-actions/setup-gcloud/blob/master/example-workflows/cloud-run/.github/workflows/cloud-run.yml", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1126/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 756439516, "node_id": "MDU6SXNzdWU3NTY0Mzk1MTY=", "number": 1124, "title": "Datasette on Amazon Linux on ARM returns 404 for static assets", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 9, "created_at": "2020-12-03T18:20:37Z", "updated_at": "2020-12-03T21:42:02Z", "closed_at": "2020-12-03T21:10:54Z", "author_association": "OWNER", "pull_request": null, "body": "Very weird bug this one. 
Steps to reproduce:\r\n```\r\n# I started a amzn2-ami-hvm-2.0.20201126.0-arm64-gp2 t4g.micro instance\r\nec2 % ssh -i simonw-ec2.pem ec2-user@ec2-18-219-238-192.us-east-2.compute.amazonaws.com\r\n[ec2-user@ip-172-31-30-7 ~]$ sudo yum install python3\r\nLoaded plugins: extras_suggestions, langpacks, priorities, update-motd\r\n...\r\nIs this ok [y/d/N]: y\r\nDownloading packages:\r\n(1/4): python3-3.7.9-1.amzn2.0.1.aarch64.rpm | 72 kB 00:00:00 \r\n(2/4): python3-pip-9.0.3-1.amzn2.0.2.noarch.rpm | 1.9 MB 00:00:00 \r\n(3/4): python3-setuptools-38.4.0-3.amzn2.0.6.noarch.rpm | 617 kB 00:00:00 \r\n(4/4): python3-libs-3.7.9-1.amzn2.0.1.aarch64.rpm | 9.1 MB 00:00:00 \r\n-----------------------------------------------------------------------------------------------------------------------------------------------------------\r\nTotal 68 MB/s | 12 MB 00:00:00 \r\nRunning transaction check\r\nRunning transaction test\r\nTransaction test succeeded\r\nRunning transaction\r\n Installing : python3-pip-9.0.3-1.amzn2.0.2.noarch 1/4 \r\n Installing : python3-3.7.9-1.amzn2.0.1.aarch64 2/4 \r\n Installing : python3-setuptools-38.4.0-3.amzn2.0.6.noarch 3/4 \r\n Installing : python3-libs-3.7.9-1.amzn2.0.1.aarch64 4/4 \r\n Verifying : python3-setuptools-38.4.0-3.amzn2.0.6.noarch 1/4 \r\n Verifying : python3-pip-9.0.3-1.amzn2.0.2.noarch 2/4 \r\n Verifying : python3-3.7.9-1.amzn2.0.1.aarch64 3/4 \r\n Verifying : python3-libs-3.7.9-1.amzn2.0.1.aarch64 4/4 \r\n\r\nInstalled:\r\n python3.aarch64 0:3.7.9-1.amzn2.0.1 \r\n\r\nDependency Installed:\r\n python3-libs.aarch64 0:3.7.9-1.amzn2.0.1 python3-pip.noarch 0:9.0.3-1.amzn2.0.2 python3-setuptools.noarch 0:38.4.0-3.amzn2.0.6 \r\n\r\nComplete!\r\n[ec2-user@ip-172-31-30-7 ~]$ python3 -m pip install --user pipx\r\nCollecting pipx\r\n Downloading https://files.pythonhosted.org/packages/42/8a/8447ec14562c5d97afbee54ec390864718bccce9dfb0506c8c77852489d3/pipx-0.15.6.0-py3-none-any.whl (43kB)\r\n 100% |\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588| 51kB 3.8MB/s \r\nCollecting packaging>=20.0 (from pipx)\r\n Downloading https://files.pythonhosted.org/packages/28/87/8edcf555adaf60d053ead881bc056079e29319b643ca710339ce84413136/packaging-20.7-py2.py3-none-any.whl\r\nCollecting argcomplete<2.0,>=1.9.4 (from pipx)\r\n Downloading https://files.pythonhosted.org/packages/e3/d0/ee7fc6ceac8957196def9bfa3b955d02163058defd3edd51ef7b1ff2769e/argcomplete-1.12.2-py2.py3-none-any.whl\r\nCollecting userpath>=1.4.1 (from pipx)\r\n Downloading https://files.pythonhosted.org/packages/fb/f1/faca8a3cc86bd2223aaf72edd222bc31d6ae2eea5feaf17a144634a92e6d/userpath-1.4.1-py2.py3-none-any.whl\r\nCollecting pyparsing>=2.0.2 (from packaging>=20.0->pipx)\r\n Downloading https://files.pythonhosted.org/packages/8a/bb/488841f56197b13700afd5658fc279a2025a39e22449b7cf29864669b15d/pyparsing-2.4.7-py2.py3-none-any.whl (67kB)\r\n 100% |\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588| 71kB 8.8MB/s \r\nCollecting importlib-metadata<4,>=0.23; python_version == \"3.7\" (from argcomplete<2.0,>=1.9.4->pipx)\r\n Downloading https://files.pythonhosted.org/packages/99/c7/4ccf2baa455613aa9e61372365aba8594ff2806c82189c31a6c65e7c679e/importlib_metadata-3.1.1-py3-none-any.whl\r\nCollecting click (from userpath>=1.4.1->pipx)\r\n Downloading 
https://files.pythonhosted.org/packages/d2/3d/fa76db83bf75c4f8d338c2fd15c8d33fdd7ad23a9b5e57eb6c5de26b430e/click-7.1.2-py2.py3-none-any.whl (82kB)\r\n 100% |\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588| 92kB 10.5MB/s \r\nCollecting distro; platform_system == \"Linux\" (from userpath>=1.4.1->pipx)\r\n Downloading https://files.pythonhosted.org/packages/25/b7/b3c4270a11414cb22c6352ebc7a83aaa3712043be29daa05018fd5a5c956/distro-1.5.0-py2.py3-none-any.whl\r\nCollecting zipp>=0.5 (from importlib-metadata<4,>=0.23; python_version == \"3.7\"->argcomplete<2.0,>=1.9.4->pipx)\r\n Downloading https://files.pythonhosted.org/packages/41/ad/6a4f1a124b325618a7fb758b885b68ff7b058eec47d9220a12ab38d90b1f/zipp-3.4.0-py3-none-any.whl\r\nInstalling collected packages: pyparsing, packaging, zipp, importlib-metadata, argcomplete, click, distro, userpath, pipx\r\nSuccessfully installed argcomplete-1.12.2 click-7.1.2 distro-1.5.0 importlib-metadata-3.1.1 packaging-20.7 pipx-0.15.6.0 pyparsing-2.4.7 userpath-1.4.1 zipp-3.4.0\r\n[ec2-user@ip-172-31-30-7 ~]$ python3 -m pipx ensurepath\r\n/home/ec2-user/.local/bin is already in PATH.\r\n/home/ec2-user/.local/bin is already in PATH.\r\n\r\nAll pipx binary directories have been added to PATH. If you are sure\r\nyou want to proceed, try again with the '--force' flag.\r\n\r\nOtherwise pipx is ready to go! \u2728 \ud83c\udf1f \u2728\r\n[ec2-user@ip-172-31-30-7 ~]$ pipx install datasette\r\n installed package datasette 0.52.2, Python 3.7.9\r\n These apps are now globally available\r\n - datasette\r\ndone! \u2728 \ud83c\udf1f \u2728\r\n[ec2-user@ip-172-31-30-7 ~]$ datasette --version\r\ndatasette, version 0.52.2\r\n[ec2-user@ip-172-31-30-7 ~]$ datasette --get /-/static/app.css\r\n404\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1124/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 754179035, "node_id": "MDExOlB1bGxSZXF1ZXN0NTMwMTI1Njk1", "number": 1122, "title": "Fix misaligned table actions cog", "user": {"value": 3243482, "label": "abdusco"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-12-01T08:41:46Z", "updated_at": "2020-12-03T10:56:40Z", "closed_at": "2020-12-03T00:33:37Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/1122", "body": "Fixes https://github.com/simonw/datasette/issues/1121", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1122/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 754178780, "node_id": "MDU6SXNzdWU3NTQxNzg3ODA=", "number": 1121, "title": "Table actions cog is misaligned", "user": {"value": 3243482, "label": "abdusco"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-12-01T08:41:25Z", "updated_at": "2020-12-03T01:03:19Z", "closed_at": 
"2020-12-03T00:33:36Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "At the moment it looks like this\r\nhttps://datasette-graphql-demo.datasette.io/github/repos\r\n\r\n![image](https://user-images.githubusercontent.com/3243482/100716533-e6e2d300-33c9-11eb-866e-1e83ba228bf5.png)\r\n\r\nAdding a few flex statements fixes the alignment and centers `h1` text and the cog icon vertically.\r\n![image](https://user-images.githubusercontent.com/3243482/100716605-f8c47600-33c9-11eb-8d69-0e37499cf641.png)\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1121/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 747702144, "node_id": "MDU6SXNzdWU3NDc3MDIxNDQ=", "number": 1100, "title": "Error on OPTIONS request to database", "user": {"value": 1319404, "label": "akehrer"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-11-20T18:16:43Z", "updated_at": "2020-12-03T00:57:35Z", "closed_at": "2020-12-03T00:50:17Z", "author_association": "NONE", "pull_request": null, "body": "When I perform an OPTIONS request against a database or table datasette fails with an internal error.\r\n\r\nAll these tests result in the traceback below.\r\n\r\n```\r\ncurl -XOPTIONS http://127.0.0.1:8001/test-db\r\ncurl -XOPTIONS http://127.0.0.1:8001/test-db/table1\r\ncurl -XOPTIONS http://127.0.0.1:8001/test-db/table1\\?_search\\=test\r\n```\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"[path-to-python]/site-packages/datasette/app.py\", line 1033, in route_path\r\n response = await view(request, send)\r\n File \"[path-to-python]/site-packages/datasette/views/base.py\", line 146, in view\r\n request, **request.scope[\"url_route\"][\"kwargs\"]\r\n File \"[path-to-python]/site-packages/datasette/views/base.py\", line 118, in dispatch_request\r\n return await handler(request, *args, **kwargs)\r\nTypeError: object Response can't be used in 'await' expression\r\n```\r\n\r\nMaking the `options` function in the `DataView` class async fixed it for me.\r\n\r\n```python\r\n async def options(self, request, *args, **kwargs):\r\n r = Response.text(\"ok\")\r\n if self.ds.cors:\r\n r.headers[\"Access-Control-Allow-Origin\"] = \"*\"\r\n return r\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1100/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 755721275, "node_id": "MDU6SXNzdWU3NTU3MjEyNzU=", "number": 1123, "title": "Table actions hook are order dependent, should not be", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-12-03T00:47:12Z", "updated_at": "2020-12-03T00:50:17Z", "closed_at": "2020-12-03T00:50:17Z", "author_association": "OWNER", "pull_request": null, "body": "Got this error: https://github.com/simonw/datasette/runs/1489770800\r\n```\r\n> assert get_table_actions_links(response_2.text) == [\r\n {\"label\": \"From async\", 
\"href\": \"/\"},\r\n {\"label\": \"Database: fixtures\", \"href\": \"/\"},\r\n {\"label\": f\"Table: {table_or_view}\", \"href\": \"/\"},\r\n ]\r\nE AssertionError: assert [{'href': '/'...'From async'}] == [{'href': '/'...: facetable'}]\r\nE At index 0 diff: {'label': 'Database: fixtures', 'href': '/'} != {'label': 'From async', 'href': '/'}\r\nE Use -v to get the full diff\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1123/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 610829227, "node_id": "MDU6SXNzdWU2MTA4MjkyMjc=", "number": 749, "title": "Cloud Run fails to serve database files larger than 32MB", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-05-01T16:06:46Z", "updated_at": "2020-12-03T00:31:15Z", "closed_at": "2020-12-03T00:31:14Z", "author_association": "OWNER", "pull_request": null, "body": "https://cloud.google.com/run/quotas lists the maximum response size as 32MB.\r\n\r\nI spotted a bug where attempting to download a database file larger than that from a Cloud Run deployment (in this case it was https://github-to-sqlite.dogsheep.net/github.db after I [accidentally increased the size of that database](https://github.com/dogsheep/github-to-sqlite/commit/630bdba68a23c0ac453e015518ef0bf41107a952)) returned a 500 error because of this.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/749/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 753876808, "node_id": "MDU6SXNzdWU3NTM4NzY4MDg=", "number": 1119, "title": "Include generated columns in fixtures.db, if SQLite version supports it", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-11-30T23:25:31Z", "updated_at": "2020-12-01T00:41:14Z", "closed_at": "2020-12-01T00:28:04Z", "author_association": "OWNER", "pull_request": null, "body": "In #1117 I added a one-off test that creates a DB with generated columns in it:\r\n\r\nhttps://github.com/simonw/datasette/blob/461670a0b87efa953141b449a9a261919864ceb3/tests/test_api.py#L1933-L1964\r\n\r\nIf this table was conditionally added to `fixtures.db` (if SQLite supports the feature) it would be available in the demo at https://latest.datasette.io/", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1119/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 753898359, "node_id": "MDExOlB1bGxSZXF1ZXN0NTI5ODg3ODYx", "number": 1120, "title": "generated_columns table in fixtures.py", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, 
"milestone": null, "comments": 1, "created_at": "2020-12-01T00:17:19Z", "updated_at": "2020-12-01T00:28:03Z", "closed_at": "2020-12-01T00:28:02Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/1120", "body": "Refs #1119", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1120/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 753767911, "node_id": "MDExOlB1bGxSZXF1ZXN0NTI5NzgzMjc1", "number": 1117, "title": "Support for generated columns", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-11-30T20:10:46Z", "updated_at": "2020-11-30T22:23:19Z", "closed_at": "2020-11-30T21:29:58Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/1117", "body": "Refs #1116. My first attempt at this worked on my laptop but broke in CI, so I'm going to iterate on it in a pull request instead.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1117/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 753668177, "node_id": "MDU6SXNzdWU3NTM2NjgxNzc=", "number": 1116, "title": "GENERATED column support", "user": {"value": 2789593, "label": "nattaylor"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 9, "created_at": "2020-11-30T17:33:47Z", "updated_at": "2020-11-30T21:29:59Z", "closed_at": "2020-11-30T21:29:59Z", "author_association": "NONE", "pull_request": null, "body": "I think this is a feature request... 
perhaps I should just try to contribute it myself, but thought I'd check in case support is planned already.\r\n\r\nFor a table with the following schema, datasette 0.51.1 doesn't pick up the GENERATED columns and the column list only contains `(rowid, body)`. If I edit the SQL and select the generated columns, it will happily show them.\r\n\r\nAt first glance it appears that [`def table_column_details(conn, table):`](https://github.com/simonw/datasette/blob/4777362bf2692bc72b221ec47c3e6216151d1b89/datasette/utils/__init__.py#L575) would have to be refactored to use a different methodology to get the columns, since `PRAGMA table_info(deeds);` returns just `0|body|TEXT|0||0`, so maybe it wouldn't be worth it.\r\n\r\n```\r\nCREATE TABLE deeds (\r\n body TEXT,\r\n id INT GENERATED ALWAYS AS (json_extract(body, '$.id')) STORED,\r\n consideration INT GENERATED ALWAYS AS (json_extract(body, '$.consideration')) STORED\r\n);\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1116/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 753788261, "node_id": "MDU6SXNzdWU3NTM3ODgyNjE=", "number": 1118, "title": "messagge_is_html typo", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-11-30T20:43:22Z", "updated_at": "2020-11-30T21:24:28Z", "closed_at": "2020-11-30T21:24:28Z", "author_association": "OWNER", "pull_request": null, "body": "https://ripgrep.datasette.io/-/ripgrep?pattern=messagge_is_html", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1118/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 753122082, "node_id": "MDU6SXNzdWU3NTMxMjIwODI=", "number": 56, "title": "Link to example tables from the README", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-11-30T04:01:51Z", "updated_at": "2020-11-30T04:10:27Z", "closed_at": "2020-11-30T04:10:27Z", "author_association": "MEMBER", "pull_request": null, "body": "Would help demonstrate how the tool works.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/56/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 753000405, "node_id": "MDU6SXNzdWU3NTMwMDA0MDU=", "number": 53, "title": "Command for fetching file contents", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-11-29T20:31:04Z", "updated_at": "2020-11-30T00:36:09Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "Something like this:\r\n\r\n 
github-to-sqlite files github.db simonw/datasette\r\n\r\nThis would fetch all files from the `main` branch into a `files` table.\r\n\r\nAdditional options could handle things like pulling files from a branch or tag, or just pulling files that match a specific glob or that exist in a specific directory.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/53/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 681228542, "node_id": "MDExOlB1bGxSZXF1ZXN0NDY5NjUxNzMy", "number": 48, "title": "Add pull requests", "user": {"value": 755825, "label": "adamjonas"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-08-18T17:58:44Z", "updated_at": "2020-11-29T23:51:09Z", "closed_at": "2020-11-29T23:51:09Z", "author_association": "CONTRIBUTOR", "pull_request": "dogsheep/github-to-sqlite/pulls/48", "body": "ref #46 \r\n\r\nIssues don't have merge information on them, which means that PRs need to be pulled separately.\r\n\r\nDid my best to mimic the API of issues.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/48/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 753026388, "node_id": "MDU6SXNzdWU3NTMwMjYzODg=", "number": 55, "title": "github-to-sqlite workflows does not correctly replace existing records", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-11-29T21:58:43Z", "updated_at": "2020-11-29T23:48:50Z", "closed_at": "2020-11-29T23:48:50Z", "author_association": "MEMBER", "pull_request": null, "body": "Following #54 - see this TODO: https://github.com/dogsheep/github-to-sqlite/blob/1b23ce11953f9f59c0161ea1f99188b55b5ea11c/github_to_sqlite/utils.py#L700", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/55/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 753026003, "node_id": "MDU6SXNzdWU3NTMwMjYwMDM=", "number": 54, "title": "github-to-sqlite workflows command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-11-29T21:56:42Z", "updated_at": "2020-11-29T22:08:46Z", "closed_at": "2020-11-29T21:57:17Z", "author_association": "MEMBER", "pull_request": null, "body": "A command that fetches the YAML workflows for different repos, parses them and stores them in relational tables would be really useful for maintaining larger numbers of workflows.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": 
\"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/54/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 743370900, "node_id": "MDU6SXNzdWU3NDMzNzA5MDA=", "number": 1098, "title": "Foreign key links break for compound foreign keys", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2020-11-15T23:22:14Z", "updated_at": "2020-11-29T19:50:31Z", "closed_at": "2020-11-29T19:30:23Z", "author_association": "OWNER", "pull_request": null, "body": "Reported on Twitter here: https://twitter.com/ZaneSelvans/status/1328093641395548161\r\n\r\n> Maybe I'm doing something wrong here but the automatically generated links based foreign key relationships seem to be working here for utility_id_eia, but not for plant_id_eia & generator_id which seems odd: https://pudl-datasette-xl7xwcpe2a-uc.a.run.app/pudl/generators_eia860\r\n>\r\n> Right now it seems like they're trying to, but with only one of the two keys, so it gives \"Error 500. You did not supply a value for binding 2.\" Maybe only create the links when it's a simple foreign key?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1098/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"}