{"id": 870125126, "node_id": "MDU6SXNzdWU4NzAxMjUxMjY=", "number": 1310, "title": "I'm creating a plugin to export a spreadsheet file (.ods or .xlsx)", "user": {"value": 3747136, "label": "ColinMaudry"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-04-28T16:20:11Z", "updated_at": "2021-04-30T07:26:11Z", "closed_at": "2021-04-30T06:58:46Z", "author_association": "NONE", "pull_request": null, "body": "Hi,\r\n\r\nI have started developing a plugin to export records as a spreadsheet file. It could be ods or xlsx, whatever is easier.\r\n\r\nI have spotted the following packages:\r\n\r\n- ods files: https://pypi.org/project/odswriter/\r\n- xlsx files: https://openpyxl.readthedocs.io/en/stable/index.html (quite powerful) or https://xlsxwriter.readthedocs.io/ (faster)\r\n\r\nThis is the code I have so far, I test it with the `--plugins-dir` option:\r\n\r\n```python\r\nfrom datasette import hookimpl\r\nfrom datasette.utils.asgi import Response\r\nimport odswriter as ods\r\n\r\ndef render_spreadsheet(rows):\r\n with ods.writer(open(\"test.ods\",\"wb\")) as odsfile:\r\n for row in rows:\r\n odsfile.writerow([\"String\", \"ABCDEF123456\", \"123456\"])\r\n return Response(odsfile, content_type=\"application/vnd.oasis.opendocument.spreadsheet\", status=200)\r\n\r\n\r\n@hookimpl\r\ndef register_output_renderer():\r\n return {\"extension\": \"ods\", \"render\": render_spreadsheet}\r\n\r\n``` \r\n\r\nI get the following error:\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/home/colin/.local/lib/python3.8/site-packages/datasette/app.py\", line 1128, in route_path\r\n await response.asgi_send(send)\r\n File \"/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py\", line 339, in asgi_send\r\n body = body.encode(\"utf-8\")\r\nAttributeError: 'ODSWriter' object has no attribute 'encode'\r\nERROR: Exception in ASGI application\r\nTraceback (most recent call last):\r\n File \"/home/colin/.local/lib/python3.8/site-packages/datasette/app.py\", line 1128, in route_path\r\n await response.asgi_send(send)\r\n File \"/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py\", line 339, in asgi_send\r\n body = body.encode(\"utf-8\")\r\nAttributeError: 'ODSWriter' object has no attribute 'encode'\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"/home/colin/.local/lib/python3.8/site-packages/uvicorn/protocols/http/h11_impl.py\", line 396, in run_asgi\r\n result = await app(self.scope, self.receive, self.send)\r\n File \"/home/colin/.local/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py\", line 45, in __call__\r\n return await self.app(scope, receive, send)\r\n File \"/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py\", line 161, in __call__\r\n await self.app(scope, receive, send)\r\n File \"/home/colin/.local/lib/python3.8/site-packages/datasette/tracer.py\", line 75, in __call__\r\n await self.app(scope, receive, send)\r\n File \"/home/colin/.local/lib/python3.8/site-packages/asgi_csrf.py\", line 107, in app_wrapped_with_csrf\r\n await app(scope, receive, wrapped_send)\r\n File \"/home/colin/.local/lib/python3.8/site-packages/datasette/app.py\", line 1086, in __call__\r\n return await self.route_path(scope, receive, send, path)\r\n File \"/home/colin/.local/lib/python3.8/site-packages/datasette/app.py\", line 1133, in route_path\r\n return await self.handle_500(request, send, exception)\r\n File 
\"/home/colin/.local/lib/python3.8/site-packages/datasette/app.py\", line 1267, in handle_500\r\n await asgi_send_html(\r\n File \"/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py\", line 217, in asgi_send_html\r\n await asgi_send(\r\n File \"/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py\", line 237, in asgi_send\r\n await asgi_start(send, status, headers, content_type)\r\n File \"/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py\", line 246, in asgi_start\r\n await send(\r\n File \"/home/colin/.local/lib/python3.8/site-packages/asgi_csrf.py\", line 103, in wrapped_send\r\n await send(event)\r\n File \"/home/colin/.local/lib/python3.8/site-packages/uvicorn/protocols/http/h11_impl.py\", line 482, in send\r\n raise RuntimeError(msg % message_type)\r\nRuntimeError: Expected ASGI message 'http.response.body', but got 'http.response.start'.\r\n```\r\n\r\nI tried with `AsgiFileDownload` like in [DatabaseDownload](https://github.com/simonw/datasette/blob/main/datasette/views/database.py#L150) to deal with the binary nature of the ods file, but the renderer expects a Response:\r\n\r\n> should be dict or Response\r\n\r\nHowever, the `Response` class only supports the following methods, not binary:\r\n\r\n- html\r\n- text\r\n- json\r\n- redirect\r\n\r\nHow would you suggest me to proceed to have my ods file downloaded?\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1310/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 871304967, "node_id": "MDU6SXNzdWU4NzEzMDQ5Njc=", "number": 1315, "title": "settings.json should be picked up by \"datasette publish cloudrun\"", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-04-29T18:16:41Z", "updated_at": "2021-04-29T18:16:41Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1315/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 871046111, "node_id": "MDExOlB1bGxSZXF1ZXN0NjI2MTMwMTM1", "number": 1313, "title": "Bump black from 20.8b1 to 21.4b2", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-04-29T13:58:06Z", "updated_at": "2021-04-29T15:47:50Z", "closed_at": "2021-04-29T15:47:49Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/1313", "body": "Bumps [black](https://github.com/psf/black) from 20.8b1 to 21.4b2.\n
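For issue 1310 above, the immediate blocker is that `render_spreadsheet` hands the `ODSWriter` object to `Response` instead of the file's bytes. A minimal sketch (the `build_ods_bytes` helper name and the use of `io.BytesIO` are illustrative assumptions, not the project's answer) of assembling the spreadsheet in memory so the renderer ends up with raw bytes; whether those bytes can then be returned through `Response` is exactly the open question in the issue:

```python
import io

import odswriter as ods


def build_ods_bytes(rows):
    # Illustrative helper: write the .ods into an in-memory buffer instead of
    # a test.ods file on disk, so the renderer holds bytes rather than an
    # ODSWriter object.
    buffer = io.BytesIO()
    with ods.writer(buffer) as odsfile:
        for row in rows:
            odsfile.writerow(list(row))
    return buffer.getvalue()
```

The resulting bytes would then need to be served with the application/vnd.oasis.opendocument.spreadsheet content type once a binary-capable response path is available.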
Release notes

Sourced from black's releases.

21.4b2

Black

- Fix crash if the user configuration directory is inaccessible. (#2158)
- Clarify circumstances in which Black may change the AST (#2159)

Packaging

- Install primer.json (used by black-primer by default) with black. (#2154)

21.4b1

Black

- Fix crash on docstrings ending with "\\ ". (#2142)
- Fix crash when atypical whitespace is cleaned out of dostrings (#2120)
- Reflect the --skip-magic-trailing-comma and --experimental-string-processing flags in the name of the cache file. Without this fix, changes in these flags would not take effect if the cache had already been populated. (#2131)
- Don't remove necessary parentheses from assignment expression containing assert / return statements. (#2143)

Packaging

- Bump pathspec to >= 0.8.1 to solve invalid .gitignore exclusion handling

21.4b0

Black

- Fixed a rare but annoying formatting instability created by the combination of optional trailing commas inserted by Black and optional parentheses looking at pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many duplicates. (#2126)
- Black now processes one-line docstrings by stripping leading and trailing spaces, and adding a padding space when needed to break up """". (#1740)
- Black now cleans up leading non-breaking spaces in comments (#2092)
- Black now respects --skip-string-normalization when normalizing multiline docstring quotes (#1637)
- Black no longer removes all empty lines between non-function code and decorators when formatting typing stubs. Now Black enforces a single empty line. (#1646)

... (truncated)

Changelog

Sourced from black's changelog (same entries as the release notes above).

Commits
\n\n\n[![Dependabot compatibility score](https://api.dependabot.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b2)](https://dependabot.com/compatibility-score/?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b2)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1313/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 871157602, "node_id": "MDExOlB1bGxSZXF1ZXN0NjI2MjIyNjc2", "number": 1314, "title": "Upgrade to GitHub-native Dependabot", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-04-29T15:36:41Z", "updated_at": "2021-04-29T15:47:22Z", "closed_at": "2021-04-29T15:47:21Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/1314", "body": "_Dependabot Preview will be shut down on August 3rd, 2021. In order to keep getting Dependabot updates, please merge this PR and migrate to GitHub-native Dependabot before then._\n\nDependabot has been fully integrated into GitHub, so you no longer have to install and manage a separate app. This pull request migrates your configuration from Dependabot.com to a config file, using the [new syntax][new_syntax]. When merged, we'll swap out `dependabot-preview` (me) for a new `dependabot` app, and you'll be all set!\n\nWith this change, you'll now use the [Dependabot page in GitHub][dependabot_page], rather than the [Dependabot dashboard][dashboard], to monitor your version updates, and you'll configure Dependabot through the new config file rather than a UI.\n\n\n\n\n\n\n\nIf you've got any questions or feedback for us, please let us know by creating an issue in the [dependabot/dependabot-core][issues] repository.\n\n[Learn more about migrating to GitHub-native Dependabot][learn]\n\nPlease note that regular `@dependabot` commands do not work on this pull request.\n\n[dashboard]: https://app.dependabot.com/\n[dependabot_page]: https://github.com/simonw/datasette/network/updates\n[issues]: https://github.com/dependabot/dependabot-core/issues/new?assignees=%40dependabot%2Fpreview-migration-reviewers&labels=E%3A+preview-migration&template=migration-issue.md\n[learn]: http://docs.github.com/code-security/supply-chain-security/upgrading-from-dependabotcom-to-github-native-dependabot\n[new_syntax]: https://help.github.com/en/github/administering-a-repository/configuration-options-for-dependency-updates\n[org_secrets_url]: https://github.com/settings/secrets/dependabot\n[repo_secrets_url]: https://github.com/simonw/datasette/settings/secrets/dependabot\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1314/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 870227815, "node_id": "MDExOlB1bGxSZXF1ZXN0NjI1NDU3NTc5", "number": 1311, "title": "Bump black from 20.8b1 to 21.4b1", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-04-28T18:25:58Z", "updated_at": "2021-04-29T13:58:11Z", "closed_at": "2021-04-29T13:58:09Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/1311", "body": "Bumps 
[black](https://github.com/psf/black) from 20.8b1 to 21.4b1.\n
Release notes

Sourced from black's releases.

21.4b1

Black

- Fix crash on docstrings ending with "\\ ". (#2142)
- Fix crash when atypical whitespace is cleaned out of dostrings (#2120)
- Reflect the --skip-magic-trailing-comma and --experimental-string-processing flags in the name of the cache file. Without this fix, changes in these flags would not take effect if the cache had already been populated. (#2131)
- Don't remove necessary parentheses from assignment expression containing assert / return statements. (#2143)

Packaging

- Bump pathspec to >= 0.8.1 to solve invalid .gitignore exclusion handling

21.4b0

Black

- Fixed a rare but annoying formatting instability created by the combination of optional trailing commas inserted by Black and optional parentheses looking at pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many duplicates. (#2126)
- Black now processes one-line docstrings by stripping leading and trailing spaces, and adding a padding space when needed to break up """". (#1740)
- Black now cleans up leading non-breaking spaces in comments (#2092)
- Black now respects --skip-string-normalization when normalizing multiline docstring quotes (#1637)
- Black no longer removes all empty lines between non-function code and decorators when formatting typing stubs. Now Black enforces a single empty line. (#1646)
- Black no longer adds an incorrect space after a parenthesized assignment expression in if/while statements (#1655)
- Added --skip-magic-trailing-comma / -C to avoid using trailing commas as a reason to split lines (#1824)
- fixed a crash when PWD=/ on POSIX (#1631)
- fixed "I/O operation on closed file" when using --diff (#1664)
- Prevent coloured diff output being interleaved with multiple files (#1673)
- Added support for PEP 614 relaxed decorator syntax on python 3.9 (#1711)

... (truncated)

Changelog

Sourced from black's changelog (same entries as the release notes above).

Commits
\n\n\n[![Dependabot compatibility score](https://api.dependabot.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b1)](https://dependabot.com/compatibility-score/?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b1)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1311/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 870946764, "node_id": "MDU6SXNzdWU4NzA5NDY3NjQ=", "number": 1312, "title": "how to query many-to-many relationship via json API?", "user": {"value": 5268174, "label": "bram2000"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-04-29T12:09:49Z", "updated_at": "2021-04-29T12:09:49Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hi,\r\n\r\nFirstly thanks for Datasette, it's great!\r\n\r\nI'm trying to use the JSON API to query data from a Datasette instance. I have a simple 3 table many-to-many relationship, like so:\r\n\r\n`category` - list of categories\r\n`document` - list of documents\r\n`document_category` - join table (a category contains many documents, and a document can be a member of multiple categories)\r\n\r\nthe `document_category` table foreign keys to the other two using their respective row_ids.\r\n\r\nNow I want to return \"all documents within category X\" but I cannot see a way to do this without executing two queries; the first to lookup the row_id of category X, and the second to join `document` with `document_category` where category ID is .\r\n\r\nI could easily write this in SQL, but this makes programmatic handling of pagination much more difficult (we'd have to dynamically modify the SQL to select the row_id and include the correct where and limit clauses).\r\n\r\nIs there a way to achieve this using the JSON API?\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1312/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 869237023, "node_id": "MDExOlB1bGxSZXF1ZXN0NjI0NjM1NDQw", "number": 1309, "title": "Bump black from 20.8b1 to 21.4b0", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-04-27T20:28:11Z", "updated_at": "2021-04-28T18:26:06Z", "closed_at": "2021-04-28T18:26:04Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/1309", "body": "Bumps [black](https://github.com/psf/black) from 20.8b1 to 21.4b0.\n
Release notes

Sourced from black's releases.

21.4b0

Black

- Fixed a rare but annoying formatting instability created by the combination of optional trailing commas inserted by Black and optional parentheses looking at pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many duplicates. (#2126)
- Black now processes one-line docstrings by stripping leading and trailing spaces, and adding a padding space when needed to break up """". (#1740)
- Black now cleans up leading non-breaking spaces in comments (#2092)
- Black now respects --skip-string-normalization when normalizing multiline docstring quotes (#1637)
- Black no longer removes all empty lines between non-function code and decorators when formatting typing stubs. Now Black enforces a single empty line. (#1646)
- Black no longer adds an incorrect space after a parenthesized assignment expression in if/while statements (#1655)
- Added --skip-magic-trailing-comma / -C to avoid using trailing commas as a reason to split lines (#1824)
- fixed a crash when PWD=/ on POSIX (#1631)
- fixed "I/O operation on closed file" when using --diff (#1664)
- Prevent coloured diff output being interleaved with multiple files (#1673)
- Added support for PEP 614 relaxed decorator syntax on python 3.9 (#1711)
- Added parsing support for unparenthesized tuples and yield expressions in annotated assignments (#1835)
- use lowercase hex strings (#1692)
- added --extend-exclude argument (PR #2005)
- speed up caching by avoiding pathlib (#1950)
- --diff correctly indicates when a file doesn't end in a newline (#1662)
- Added --stdin-filename argument to allow stdin to respect --force-exclude rules (#1780)
- Lines ending with fmt: skip will now be not formatted (#1800)
- PR #2053: Black no longer relies on typed-ast for Python 3.8 and higher

... (truncated)

Changelog

Sourced from black's changelog (same entries as the release notes above).

Commits
\n\n\n[![Dependabot compatibility score](https://api.dependabot.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b0)](https://dependabot.com/compatibility-score/?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b0)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1309/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 281110295, "node_id": "MDU6SXNzdWUyODExMTAyOTU=", "number": 173, "title": "I18n and L10n support", "user": {"value": 50138, "label": "janimo"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2017-12-11T17:49:58Z", "updated_at": "2021-04-26T12:10:01Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "It would be less geeky and more user friendly if the display strings in the filter menu and possibly other parts could be localized.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/173/reactions\", \"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 866668415, "node_id": "MDU6SXNzdWU4NjY2Njg0MTU=", "number": 1308, "title": "Columns named \"link\" display in bold", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2021-04-24T05:58:11Z", "updated_at": "2021-04-24T06:07:49Z", "closed_at": "2021-04-24T06:07:49Z", "author_association": "OWNER", "pull_request": null, "body": "Reported in office hours today.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1308/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 860734722, "node_id": "MDU6SXNzdWU4NjA3MzQ3MjI=", "number": 1302, "title": "Fix disappearing facets", "user": {"value": 192568, "label": "mroswell"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-04-18T18:42:33Z", "updated_at": "2021-04-20T07:40:15Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "1. Clone https://github.com/mroswell/list-N\r\n2. Run `datasette disinfectants.db -o`\r\n3. Select the `Safer_or_Toxic` facet.\r\n4. Select `Toxic`.\r\n5. Close out the `Safer_or_Toxic` facet.\r\n6. Examine `Suggested facets` list. `Safer_or_Toxic` is GONE.\r\n7. Try some other facets. When you select an element, and then close the list, in some cases, the facet properly returns to the `Suggested facet` list... Arrays and dates properly return to the list, but fields with strings don't return to the list. \r\n\r\nSince my site is devoted to whether disinfectants are Safer or Toxic, having the suggested facet disappear from the suggested facet list is very confusing* to end-users. This, along with a few other issues, unfortunately proved beyond my own programming ability to address. So I hired a Senior-level developer to address a number of issues, including this disappearing act.\r\n\r\n8. Open a new terminal. 
Run `datasette disinfectants.db -m metadata.json --static static:static/ --template-dir templates/ --plugins-dir plugins/ -p 8001 -o`\r\n9. Repeat steps 3-6, but this time, the Safer_or_Toxic facet returns to the list (and the related URL parameters are removed).\r\n\r\nI'm not sure how to do a pull request for this, because the plugin contains other functionality that goes beyond this bug. I wanted the facets sorted in a certain order (both in the suggested facet list, and the detail lists) (... the detail lists were hopping around all over the place before...) I wanted the duplicate facets removed (leaving only the one where you can facet by individual item in an array.) I wanted the arrays to be presented in a prettier fashion (I did that in the template... That could be moved over to the plugin at some point)\r\n\r\nI'm thinking it'll be very helpful if applicable parts of my project's plugin (sort_suggested_facets_plugin.py) will be able to be incorporated back into datasette, but I leave that to you to consider.\r\n\r\n(* The disappearing facet bug was especially confusing because I'm removing the filters and sql from the table page, at the request of the organization. The filters and sql detail created a lot of confusion for end users who try to find disinfectants used by Hospitals, for instance, as an '=' won't find them, since they are part of the Use_site array.) My disappearing-facet confusion was documented in my own issue: https://github.com/mroswell/list-N/issues/57 (addressed by the plugin). Other facet-related issues here: https://github.com/mroswell/list-N/issues/54 (addressed by the plugin); https://github.com/mroswell/list-N/issues/15 (addressed by template); https://github.com/mroswell/list-N/issues/53 (not yet addressed). \r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1302/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 861331159, "node_id": "MDExOlB1bGxSZXF1ZXN0NjE4MDExOTc3", "number": 1303, "title": "Update pytest-asyncio requirement from <0.15,>=0.10 to >=0.10,<0.16", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-04-19T13:49:12Z", "updated_at": "2021-04-19T18:18:17Z", "closed_at": "2021-04-19T18:18:17Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/1303", "body": "Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.\n
Commits
\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1303/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 791237799, "node_id": "MDU6SXNzdWU3OTEyMzc3OTk=", "number": 1196, "title": "Access Denied Error in Windows", "user": {"value": 2826376, "label": "QAInsights"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-01-21T15:40:40Z", "updated_at": "2021-04-14T19:28:38Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "I am trying to publish a db to vercel. But while issuing the below command throwing `Access Denied` error which is leading to `RecursionError: maximum recursion depth exceeded while calling a Python object`.\r\n\r\nI am using PyCharm and Python 3.9. I have reinstalled both and launched PyCharm as Admin in Windows 10. But still the issue persists.\r\n\r\nIssued command `datasette publish vercel jmeter.db --project jmeter --install datasette-vega`\r\n\r\nPS: localhost is working fine.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1196/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 856895291, "node_id": "MDU6SXNzdWU4NTY4OTUyOTE=", "number": 1299, "title": "Design better empty states", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-04-13T12:06:12Z", "updated_at": "2021-04-13T12:06:12Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Inspiration here: https://emptystat.es/", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1299/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 855451460, "node_id": "MDU6SXNzdWU4NTU0NTE0NjA=", "number": 1297, "title": "Documentation: json1, and introspection endpoints", "user": {"value": 192568, "label": "mroswell"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-04-12T00:38:00Z", "updated_at": "2021-04-12T01:29:33Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "https://docs.datasette.io/en/stable/facets.html notes that:\r\n> If your SQLite installation provides the json1 extension (you can check using /-/versions) Datasette will automatically detect columns that contain JSON arrays...\r\n\r\nWhen I check -/versions I see two sections relevant to json1:\r\n```\r\n \"extensions\": {\r\n \"json1\": null\r\n },\r\n \"compile_options\": [\r\n ...\r\n \"ENABLE_JSON1\",\r\n```\r\n \r\n The ENABLE_JSON1 makes me think json1 is likely available. But the `\"json1\": null` made me think it wasn't available (because of the `null`). 
It would help if the documentation provided clarity about how to know if json1 is installed. It would also be helpful if the `/-/versions` information signalled somehow that that is to be appended to the hostname or domain name (or whatever you want to call it, or simply show it, using `example.com/-/versions` instead of `/-/versions`. Likewise on that last point, for https://docs.datasette.io/en/stable/introspection.html#introspection , at least at some point on that page detailing where those introspection endpoints go. (Sometimes documentation can be so abbreviated that it's hard for new users to figure out what's going on.)\r\n ", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1297/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 853672224, "node_id": "MDU6SXNzdWU4NTM2NzIyMjQ=", "number": 1294, "title": "\"You can check out any time you like. But you can never leave!\"", "user": {"value": 192568, "label": "mroswell"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-04-08T17:02:15Z", "updated_at": "2021-04-08T18:35:50Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "(Feel free to rename this one.)\r\n\r\n- The column gear lets you \"Show not-blank rows.\" Then it places a parameter in the URL, which a web developer would notice, but a lot of users won't notice, or know to delete it. Would be good to toggle \"Show not-blank rows\" with \"Show all rows.\" (Also would be quite helpful to have a \"Show blank rows | Show all rows\" option)\r\n- The column gear lets you \"Sort ascending\" and \"Sort descending\" but then you're stuck with some sort of sorted version thereafter, unless you know to sort the ID column, or to remove the full _sort parameter and its value in the URL. Would be good to offer a \"Remove sort\" option in the gear.\r\n- These requests are in the same camp as: https://github.com/simonw/datasette-vega/issues/36\r\n- I suspect there are other url parameter instances where similar analysis would be helpful, but the three above are the use cases I've run across. 
\r\n\r\nUPDATE:\r\n- It would be helpful to have a \"Previous page\" available for all but the first table page.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1294/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 520667773, "node_id": "MDU6SXNzdWU1MjA2Njc3NzM=", "number": 620, "title": "Mechanism for indicating foreign key relationships in the table and query page URLs", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2019-11-10T22:26:27Z", "updated_at": "2021-04-05T03:57:22Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Datasette currently only inflates foreign keys (into names hyperlinks) if it detects them as foreign key constraints in the underlying database.\r\n\r\nIt would be useful if you could specify additional \"foreign keys\" using both `metadata.json` and the querystring - similar time how you can pass `?_fts_table=x` https://datasette.readthedocs.io/en/stable/full_text_search.html#configuring-full-text-search-for-a-table-or-view", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/620/reactions\", \"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 1}", "draft": null, "state_reason": null} {"id": 672421411, "node_id": "MDU6SXNzdWU2NzI0MjE0MTE=", "number": 916, "title": "Support reverse pagination (previous page, has-previous-items)", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2020-08-04T00:32:06Z", "updated_at": "2021-04-03T23:43:11Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "I need this for `datasette-graphql` for full compatibility with the way Relay likes to paginate - using cursors for paginating backwards as well as for paginating forwards.\r\n\r\n> This may be the kick I need to get Datasette pagination to work in reverse too.\r\n_Originally posted by @simonw in https://github.com/simonw/datasette-graphql/issues/2#issuecomment-668305853_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/916/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 849396758, "node_id": "MDU6SXNzdWU4NDkzOTY3NTg=", "number": 1287, "title": "Upgrade to Python 3.9.4", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2021-04-02T18:43:15Z", "updated_at": "2021-04-03T22:38:39Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Has some security fixes https://pythoninsider.blogspot.com/2021/04/python-393-and-389-are-now-available.html", "repo": {"value": 107914493, "label": "datasette"}, 
"type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1287/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 453131917, "node_id": "MDU6SXNzdWU0NTMxMzE5MTc=", "number": 502, "title": "Exporting sqlite database(s)?", "user": {"value": 7936571, "label": "chrismp"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-06-06T16:39:53Z", "updated_at": "2021-04-03T05:16:54Z", "closed_at": "2019-06-11T18:50:42Z", "author_association": "NONE", "pull_request": null, "body": "I'm working on datasette from one computer. But if I want to work on it from another computer and want to copy the SQLite database(s) already on the Heroku datasette instance, how to I copy the database(s) to the second computer so that I can then update it and push to online via datasette's command line code that pushes code to Heroku?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/502/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 849568079, "node_id": "MDExOlB1bGxSZXF1ZXN0NjA4MzIzMDI4", "number": 1290, "title": "Use pytest-xdist to speed up tests", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-04-03T03:34:36Z", "updated_at": "2021-04-03T03:42:29Z", "closed_at": "2021-04-03T03:42:28Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/1290", "body": "Closes #1289, refs #1212.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1290/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 849543502, "node_id": "MDU6SXNzdWU4NDk1NDM1MDI=", "number": 1289, "title": "Speed up tests with pytest-xdist", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2021-04-03T00:47:39Z", "updated_at": "2021-04-03T03:42:28Z", "closed_at": "2021-04-03T03:42:28Z", "author_association": "OWNER", "pull_request": null, "body": "I think I can get this working for almost every test, then use the pattern in https://github.com/pytest-dev/pytest-xdist/issues/385#issuecomment-444545641 to opt specific tests out of being run in parallel.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1289/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 849512840, "node_id": "MDU6SXNzdWU4NDk1MTI4NDA=", "number": 1288, "title": "Facets: show counts for null", "user": {"value": 
1111743, "label": "jungle-boogie"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-04-02T22:33:44Z", "updated_at": "2021-04-02T22:33:44Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Hi,\r\n\r\nThank you for Datasette and being a fan of SQLite!\r\n\r\nNot all rows in a record will always contain data.\r\nSo when using a facet on a column where some records have data and others don't, you don't get an accurate count of the results.\r\n\r\nPlease consider also counting and showing null records with facets.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1288/reactions\", \"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 817544251, "node_id": "MDU6SXNzdWU4MTc1NDQyNTE=", "number": 1245, "title": "Sticky table column headers would be useful, especially on the query page", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2021-02-26T17:42:51Z", "updated_at": "2021-04-02T20:53:35Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "Suggestion from office hours.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1245/reactions\", \"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 826700095, "node_id": "MDU6SXNzdWU4MjY3MDAwOTU=", "number": 1255, "title": "Facets timing out but work when filtering", "user": {"value": 1219001, "label": "robroc"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-03-09T22:01:39Z", "updated_at": "2021-04-02T20:50:08Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "System info:\r\n\r\nWindows 10\r\nDatasette 0.55 installed via pip\r\nPython 3.8.5 in a conda environment\r\n\r\nI'm getting the message `These facets timed out` on any faceting operation. However, when I apply a filter, the facets appear in the filtered view. The error returns when the filter is removed. My data only has 38,450 rows.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1255/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 847700726, "node_id": "MDU6SXNzdWU4NDc3MDA3MjY=", "number": 1285, "title": "Feature Request or Plugin Request: Numeric Range Facets", "user": {"value": 192568, "label": "mroswell"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-04-01T01:50:20Z", "updated_at": "2021-04-01T02:28:19Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "It would be great to offer facets for numeric data ranges. 
\r\n\r\nThe ranges could pull from typical GIS methods of creating choropleth maps. \r\nhttps://gisgeography.com/choropleth-maps-data-classification/\r\nOf the following, for mapping, I've always preferred a Jenks Natural Breaks, or a cross between Jenks and Pretty breaks.\r\n\r\n- Equal Intervals \r\n- Quantile (equal count) \r\n- Standard Deviation\r\n- Natural Breaks (Jenks) Classification \r\n- Pretty Breaks\r\n- Some sort of Aggregate Jenks Classification (this isn't standard, but it would be nice to be able to set classification ranges that work across tables.)\r\n\r\nHere are some links for Natural Breaks, in case this method is unfamiliar.\r\n\r\n- https://en.wikipedia.org/wiki/Jenks_natural_breaks_optimization\r\n- http://wiki.gis.com/wiki/index.php/Jenks_Natural_Breaks_Classification\r\n- https://medium.com/analytics-vidhya/jenks-natural-breaks-best-range-finder-algorithm-8d1907192051\r\n\r\nPer that last link, there is a Jenks Python module... They also describe it as data-intensive for larger datasets. Maybe this is a good plugin idea.\r\n\r\nAn example of equal Intervals would be \r\n0 \u2013 < 10\r\n10 \u2013 < 20\r\n20 \u2013 < 30\r\n30 \u2013 < 40\r\n\r\nIt's kind of confusing to have that less-than sign in there. it could also be displayed as:\r\n0 \u2013 10\r\n10 \u2013 20\r\n20 \u2013 30\r\n30 \u2013 40\r\n\r\nBut then it's not completely clear which category 10 is in, for instance.\r\n\r\n(Best to right-justify.. and use an \"en dash\" between numbers.)\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1285/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 841456306, "node_id": "MDU6SXNzdWU4NDE0NTYzMDY=", "number": 1276, "title": "Invalid SQL: \"no such table: pragma_database_list\" on database page", "user": {"value": 1314318, "label": "justinallen"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2021-03-26T00:03:53Z", "updated_at": "2021-03-31T16:27:27Z", "closed_at": "2021-03-28T23:52:31Z", "author_association": "NONE", "pull_request": null, "body": "Don't think this has been covered here yet. I'm a little stumped with this one and can't tell if it's a bug or I have something misconfigured. \r\n\r\nOddly, when running locally the usual list of tables populates (i.e. at /charts a list of tables in charts.db). But when on the web server it throws an Invalid SQL error and \"no such table: pragma_database_list\" below. \r\n\r\nAll the url endpoints seem to work fine aside from this - individual tables (/charts/chart_one), as well as stored queries (/charts/query_one). \r\n\r\nNot sure if this has anything to do with upgrading to Datasette 0.55, or something to do with our setup, which uses a metadata build script similar to [the one for the 538 server](https://github.com/simonw/fivethirtyeight-datasette/blob/main/make_metadata.py), or something else. 
\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1276/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 843884745, "node_id": "MDU6SXNzdWU4NDM4ODQ3NDU=", "number": 1283, "title": "advanced #export causes unexpected scrolling", "user": {"value": 192568, "label": "mroswell"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-03-29T22:46:57Z", "updated_at": "2021-03-29T22:46:57Z", "closed_at": null, "author_association": "CONTRIBUTOR", "pull_request": null, "body": "1. Visit a datasette table page\r\n2. Click on the \"(advanced)\" link. This adds a fragment identifier \"#export\" to the URL, and scrolls down to the \"Advanced export\" div with the \"export\" id.\r\n3. Manually scroll back up, and click on a suggested facet. The fragment identifier is still present, and the app scrolls back down to the \"Advanced export\" div. I think this is unwanted behavior.\r\n\r\nThe user remedy seems to be to manually remove the \"#export\" from the URL.\r\n\r\nThis behavior happens in my project, and in:\r\nhttps://covid-19.datasettes.com/covid/economist_excess_deaths (for instance) \r\nbut not in this table: \r\nhttps://global-power-plants.datasettes.com/global-power-plants/global-power-plants", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1283/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 843739658, "node_id": "MDExOlB1bGxSZXF1ZXN0NjAzMDgyMjgw", "number": 1282, "title": "Fix little typo", "user": {"value": 192568, "label": "mroswell"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-03-29T19:45:28Z", "updated_at": "2021-03-29T19:57:34Z", "closed_at": "2021-03-29T19:57:34Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/1282", "body": "", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1282/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 576722115, "node_id": "MDU6SXNzdWU1NzY3MjIxMTU=", "number": 696, "title": "Single failing unit test when run inside the Docker image", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 3268330, "label": "Datasette 1.0"}, "comments": 2, "created_at": "2020-03-06T06:16:36Z", "updated_at": "2021-03-29T17:04:19Z", "closed_at": "2021-03-07T07:41:18Z", "author_association": "OWNER", "pull_request": null, "body": "```\r\ndocker run -it -v `pwd`:/mnt datasetteproject/datasette:latest /bin/bash\r\nroot@0e1928cfdf79:/# cd /mnt\r\nroot@0e1928cfdf79:/mnt# pip install -e .[test]\r\nroot@0e1928cfdf79:/mnt# pytest\r\n```\r\nI get one failure!\r\n\r\nIt was for 
`test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]`\r\n```\r\n def test_searchable(app_client, path, expected_rows):\r\n response = app_client.get(path)\r\n> assert expected_rows == response.json[\"rows\"]\r\nE AssertionError: assert [[1, 'barry c...sel', 'puma']] == []\r\nE Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther']\r\nE Full diff:\r\nE + []\r\nE - [[1, 'barry cat', 'terry dog', 'panther'],\r\nE - [2, 'terry dog', 'sara weasel', 'puma']]\r\n```\r\n\r\n_Originally posted by @simonw in https://github.com/simonw/datasette/issues/695#issuecomment-595614469_", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/696/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 724369025, "node_id": "MDExOlB1bGxSZXF1ZXN0NTA1NzY5NDYy", "number": 1031, "title": "Fallback to databases in inspect-data.json when no -i options are passed", "user": {"value": 299380, "label": "frankier"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-10-19T07:51:06Z", "updated_at": "2021-03-29T01:46:45Z", "closed_at": "2021-03-29T00:23:41Z", "author_association": "FIRST_TIME_CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/1031", "body": "Currenlty `Datasette.__init__` checks immutables against None to decide whether to fallback to inspect-data.json. This patch modifies the serve command to pass None when no -i options are passed so this fallback works correctly.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1031/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 842881221, "node_id": "MDU6SXNzdWU4NDI4ODEyMjE=", "number": 1281, "title": "Latest Datasette tags missing from Docker Hub", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 7, "created_at": "2021-03-29T00:58:30Z", "updated_at": "2021-03-29T01:41:48Z", "closed_at": "2021-03-29T01:41:48Z", "author_association": "OWNER", "pull_request": null, "body": "Spotted this while testing https://github.com/simonw/datasette/issues/1249#issuecomment-808998719_\r\n\r\nhttps://hub.docker.com/r/datasetteproject/datasette/tags?page=1&ordering=last_updated isn't showing the tags for any version more recent than 0.54.1 - we are up to 0.56 now.\r\n\r\nBut the `:latest` tag is for the new 0.56 release.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1281/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 831163537, "node_id": "MDExOlB1bGxSZXF1ZXN0NTkyNTQ4MTAz", "number": 1260, "title": "Fix: code quality issues", "user": {"value": 25361949, "label": "withshubh"}, "state": 
"closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-03-14T13:56:10Z", "updated_at": "2021-03-29T00:22:41Z", "closed_at": "2021-03-29T00:22:41Z", "author_association": "NONE", "pull_request": "simonw/datasette/pulls/1260", "body": "### Description\r\nHi :wave: I work at [DeepSource](https://deepsource.io), I ran DeepSource analysis on the forked copy of this repo and found some interesting [code quality issues](https://deepsource.io/gh/withshubh/datasette/issues/?category=recommended) in the codebase, opening this PR so you can assess if our platform is right and helpful for you.\r\n\r\n### Summary of changes\r\n\r\n- Replaced ternary syntax with if expression\r\n- Removed redundant `None` default\r\n- Used `is` to compare type of objects\r\n- Iterated dictionary directly\r\n- Removed unnecessary lambda expression\r\n- Refactored unnecessary `else` / `elif` when `if` block has a `return` statement\r\n- Refactored unnecessary `else` / `elif` when `if` block has a `raise` statement\r\n- Added .deepsource.toml to continuously analyze and detect code quality issues", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1260/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 807433181, "node_id": "MDU6SXNzdWU4MDc0MzMxODE=", "number": 1224, "title": "can't start immutable databases from configuration dir mode", "user": {"value": 295329, "label": "camallen"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2021-02-12T17:50:13Z", "updated_at": "2021-03-29T00:17:31Z", "closed_at": "2021-03-29T00:17:31Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Say I have a `/databases/` directory with multiple sqlite db files in that dir (`1.db` & `2.db`) and an `inspect-data.json` file.\r\n\r\nIf I start datasette via `datasette -h 0.0.0.0 /databases/` then the resulting databases are set to `is_mutable: true` as inspected via http://127.0.0.1:8001/-/databases.json\r\n\r\nI don't want to have to list out the databases by name, e.g. 
`datasette -i /databases/1.db -i /databases/2.db` as i want the system to autodetect the sqlite dbs i have in the configuration directory \r\n\r\nAccording to the docs outlined in https://docs.datasette.io/en/latest/settings.html?highlight=immutable#configuration-directory-mode this should be possible\r\n> `inspect-data.json` the result of running datasette inspect - any database files listed here will be treated as immutable, so they should not be changed while Datasette is running\r\n \r\nI believe that if the `inspect-json.json` file present, then in theory the databases will be automatically set to immutable via this code https://github.com/simonw/datasette/blob/9603d893b9b72653895318c9104d754229fdb146/datasette/app.py#L211-L216\r\n\r\nHowever it appears the Click Multiple Options will return a tuple via https://github.com/simonw/datasette/blob/9603d893b9b72653895318c9104d754229fdb146/datasette/cli.py#L311-L317\r\n\r\nThe resulting tuple is passed to the Datasette app via `kwargs` and overrides the behaviour to set the databases to immutable via this arg https://github.com/simonw/datasette/blob/9603d893b9b72653895318c9104d754229fdb146/datasette/app.py#L182\r\n\r\nIf you think this is a bug and needs fixing, I am willing to make a PR to check for the empty `immutable` tuple before calling the Datasette class initializer as I think leaving that class interface alone is the best path here.\r\n\r\nThoughts?\r\n\r\nAlso - i'm loving Datasette, it truly is a wonderful tool, thank you :)", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1224/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 763207948, "node_id": "MDU6SXNzdWU3NjMyMDc5NDg=", "number": 1141, "title": "Default styling for bullet point lists", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-12-12T02:49:33Z", "updated_at": "2021-03-29T00:14:05Z", "closed_at": "2021-03-29T00:14:05Z", "author_association": "OWNER", "pull_request": null, "body": "I just noticed that https://datasette.io/content/recent_releases (which uses `datasette-render-markdown`) is missing its bullet points:\r\n\r\n\"content__recent_releases__399_rows\"\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1141/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 825217564, "node_id": "MDExOlB1bGxSZXF1ZXN0NTg3MzMyNDcz", "number": 1252, "title": "Add back styling to lists within table cells (fixes #1141)", "user": {"value": 7476523, "label": "bobwhitelock"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-03-09T03:00:57Z", "updated_at": "2021-03-29T00:14:04Z", "closed_at": "2021-03-29T00:14:04Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/1252", "body": "This overrides the Datasette reset - see 
https://github.com/simonw/datasette/blob/d0fd833b8cdd97e1b91d0f97a69b494895d82bee/datasette/static/app.css#L35-L38 - to add back the default styling of list items displayed within Datasette table cells.\r\n\r\nFollowing this change, the same content as in the original issue looks like this:\r\n\r\n![2021-03-09_02:57:32](https://user-images.githubusercontent.com/7476523/110411982-63e5ae80-8083-11eb-9b5c-e5dc825073e2.png)\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1252/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 842556944, "node_id": "MDExOlB1bGxSZXF1ZXN0NjAyMTA3OTM1", "number": 1279, "title": "Minor Docs Update. Added `--app` to fly install command.", "user": {"value": 1019791, "label": "koaning"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2021-03-27T16:58:08Z", "updated_at": "2021-03-29T00:11:55Z", "closed_at": "2021-03-29T00:11:55Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/1279", "body": "Without this flag, there's an error locally. \r\n\r\n```\r\n> datasette publish fly bigmac.db\r\n\r\nUsage: datasette publish fly [OPTIONS] [FILES]...\r\nTry 'datasette publish fly --help' for help.\r\n\r\nError: Missing option '-a' / '--app'.\r\n```\r\n\r\nI also got an error message which later turned out to be because I hadn't added my credit card information yet to `fly`. I wasn't sure if I should add that mention to the docs here, or to submit a bug-report over at https://github.com/simonw/datasette-publish-fly. ", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/1279/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 741862364, "node_id": "MDU6SXNzdWU3NDE4NjIzNjQ=", "number": 1090, "title": "Custom widgets for canned query forms", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-11-12T19:21:07Z", "updated_at": "2021-03-27T16:25:25Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "This is an idea that was cut from the first version of writable canned queries:\r\n\r\n> I really want the option to use a `