{"id": 611835285, "node_id": "MDU6SXNzdWU2MTE4MzUyODU=", "number": 752, "title": "Non-utf8 encoding in exceptionhandlers and custom-pages", "user": {"value": 2181410, "label": "clausjuhl"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-05-04T12:24:42Z", "updated_at": "2020-05-04T17:42:20Z", "closed_at": "2020-05-04T17:42:20Z", "author_association": "NONE", "pull_request": null, "body": "Hi Simon.\r\n\r\nWhenever a response is not piped through a router-view, the template is encoded in latin-1 (I think). This is especially a problem (for me) with the new custom_pages-functionality, but also problematic with the 404- and 500-handlers. Thanks!", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/752/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 611874514, "node_id": "MDExOlB1bGxSZXF1ZXN0NDEyOTUxMTkx", "number": 753, "title": "Update pytest-asyncio requirement from ~=0.10.0 to >=0.10,<0.13", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-05-04T13:27:19Z", "updated_at": "2020-05-04T17:41:01Z", "closed_at": "2020-05-04T17:40:49Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/753", "body": "Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.\n
\nCommits\n... (truncated)\n
\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/753/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 596245802, "node_id": "MDExOlB1bGxSZXF1ZXN0NDAwNTc4OTc5", "number": 720, "title": "Update beautifulsoup4 requirement from ~=4.8.1 to >=4.8.1,<4.10.0", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-08T01:24:38Z", "updated_at": "2020-05-04T17:14:51Z", "closed_at": "2020-05-04T17:14:46Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/720", "body": "Updates the requirements on [beautifulsoup4](http://www.crummy.com/software/BeautifulSoup/bs4/) to permit the latest version.\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n**Note:** This repo was added to Dependabot recently, so you'll receive a maximum of 5 PRs for your first few update runs. Once an update run creates fewer than 5 PRs we'll remove that limit.\n\nYou can always request more updates by clicking `Bump now` in your [Dependabot dashboard](https://app.dependabot.com).\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/720/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 594553553, "node_id": "MDExOlB1bGxSZXF1ZXN0Mzk5MTY2NDMz", "number": 719, "title": "asgi: check raw_path is not None", "user": {"value": 193185, "label": "cldellow"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-05T16:53:58Z", "updated_at": "2020-05-04T17:14:26Z", "closed_at": "2020-05-04T17:14:26Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/719", "body": "The ASGI spec\r\n(https://asgi.readthedocs.io/en/latest/specs/www.html#http) seems to imply that `None` is a valid value, so we need to check the value itself, not just whether the key is present.\r\n\r\nIn particular, the [mangum](https://github.com/erm/mangum) adapter passes `None` for this key's value. This change permits mangum to be used to front datasette in Amazon API Gateway + AWS Lambda deployments.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/719/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 596245923, "node_id": "MDExOlB1bGxSZXF1ZXN0NDAwNTc5MDc3", "number": 721, "title": "Update pytest requirement from ~=5.2.2 to >=5.2.2,<5.5.0", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-08T01:25:04Z", "updated_at": "2020-05-04T17:13:49Z", "closed_at": "2020-05-04T17:13:41Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/721", "body": "Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.\n
\nRelease notes\nSourced from pytest's releases.\n\n5.4.1\n\npytest 5.4.1 (2020-03-13)\n\nBug Fixes\n\n... (truncated)\n\nChangelog\nSourced from pytest's changelog.\n\n... (truncated)\n\nCommits\n... (truncated)\n
\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n**Note:** This repo was added to Dependabot recently, so you'll receive a maximum of 5 PRs for your first few update runs. Once an update run creates fewer than 5 PRs we'll remove that limit.\n\nYou can always request more updates by clicking `Bump now` in your [Dependabot dashboard](https://app.dependabot.com).\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/721/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 596246006, "node_id": "MDExOlB1bGxSZXF1ZXN0NDAwNTc5MTM2", "number": 722, "title": "Update jinja2 requirement from ~=2.10.3 to >=2.10.3,<2.12.0", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-08T01:25:24Z", "updated_at": "2020-05-04T17:13:26Z", "closed_at": "2020-05-04T17:13:16Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/722", "body": "Updates the requirements on [jinja2](https://github.com/pallets/jinja) to permit the latest version.\n
\nRelease notes\nSourced from jinja2's releases.\n\n2.11.1\n\nThis fixes an issue in async environment when indexing the result of an attribute lookup, like {{ data.items[1:] }}.\n\nChangelog\nSourced from jinja2's changelog.\n\nVersion 2.11.1\nReleased 2020-01-30\n\nVersion 2.11.0\nReleased 2020-01-27\n\n... (truncated)\n\nCommits\n... (truncated)\n
\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n**Note:** This repo was added to Dependabot recently, so you'll receive a maximum of 5 PRs for your first few update runs. Once an update run creates fewer than 5 PRs we'll remove that limit.\n\nYou can always request more updates by clicking `Bump now` in your [Dependabot dashboard](https://app.dependabot.com).\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/722/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 605546606, "node_id": "MDExOlB1bGxSZXF1ZXN0NDA3OTI5MTI4", "number": 734, "title": "Update janus requirement from ~=0.4.0 to >=0.4,<0.6", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-23T13:43:45Z", "updated_at": "2020-05-04T16:48:14Z", "closed_at": "2020-05-04T16:48:04Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/734", "body": "Updates the requirements on [janus](https://github.com/aio-libs/janus) to permit the latest version.\n
\nChangelog\nSourced from janus's changelog.\n\n0.5.0 (2020-04-23)\n\n0.4.0 (2018-07-28)\n\n0.3.2 (2018-07-06)\n\n0.3.1 (2018-01-30)\n\n0.3.0 (2017-02-21)\n\n0.2.4 (2016-12-05)\n\n0.2.3 (2016-07-12)\n\n0.2.2 (2016-07-11)\n\n0.2.1 (2016-03-24)\n\n0.2.0 (2015-09-20)\n\n... (truncated)\n\nCommits\n... (truncated)\n
\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/734/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 603242257, "node_id": "MDExOlB1bGxSZXF1ZXN0NDA2MDY3MDE5", "number": 728, "title": "Update mergedeep requirement from ~=1.1.1 to >=1.1.1,<1.4.0", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-20T13:33:23Z", "updated_at": "2020-05-04T16:45:58Z", "closed_at": "2020-05-04T16:45:49Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/728", "body": "Updates the requirements on [mergedeep](https://github.com/clarketm/mergedeep) to permit the latest version.\n
\nCommits\n... (truncated)\n
\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/728/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 611997130, "node_id": "MDU6SXNzdWU2MTE5OTcxMzA=", "number": 754, "title": "Clean up aiofiles warnings on 3.8", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-05-04T16:14:59Z", "updated_at": "2020-05-04T16:22:30Z", "closed_at": "2020-05-04T16:22:30Z", "author_association": "OWNER", "pull_request": null, "body": "https://travis-ci.org/github/simonw/datasette/jobs/682624476\r\n\r\nLots of warnings like this:\r\n```\r\n/home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/utils.py:33\r\n\r\n/home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/utils.py:33\r\n\r\n /home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/utils.py:33: DeprecationWarning: \"@coroutine\" decorator is deprecated since Python 3.8, use \"async def\" instead\r\n\r\n def method(self, *args, **kwargs):\r\n\r\n/home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/__init__.py:27\r\n\r\n /home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/__init__.py:27: DeprecationWarning: \"@coroutine\" decorator is deprecated since Python 3.8, use \"async def\" instead\r\n\r\n def _open(file, mode='r', buffering=-1, encoding=None, errors=None, newline=None,\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/754/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 604001627, "node_id": "MDExOlB1bGxSZXF1ZXN0NDA2Njc3MjA1", "number": 730, "title": "Update pytest-asyncio requirement from ~=0.10.0 to >=0.10,<0.12", "user": {"value": 27856297, "label": "dependabot-preview[bot]"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-21T13:32:35Z", "updated_at": "2020-05-04T13:27:24Z", "closed_at": "2020-05-04T13:27:23Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/datasette/pulls/730", "body": "Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.\n
\nCommits\n... (truncated)\n
\n\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language\n- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language\n- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language\n- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language\n- `@dependabot badge me` will comment on this PR with code to add a \"Dependabot enabled\" badge to your readme\n\nAdditionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):\n- Update frequency (including time of day and day of week)\n- Pull request limits (per update run and/or open at any time)\n- Out-of-range updates (receive only lockfile updates, if desired)\n- Security updates (receive only security updates, if desired)\n\n\n\n
", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/730/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 602533300, "node_id": "MDU6SXNzdWU2MDI1MzMzMDA=", "number": 1, "title": "Import photo metadata from Apple Photos into SQLite", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": {"value": 5324096, "label": "Apple Photos online and securely browsable"}, "comments": 8, "created_at": "2020-04-18T19:23:26Z", "updated_at": "2020-05-04T02:41:40Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "Faces, albums, locations, that kind of thing.", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/1/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 610408908, "node_id": "MDU6SXNzdWU2MTA0MDg5MDg=", "number": 34, "title": "Command for retrieving dependents for a repo", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-04-30T21:47:51Z", "updated_at": "2020-05-03T15:53:01Z", "closed_at": "2020-05-03T15:53:01Z", "author_association": "MEMBER", "pull_request": null, "body": "I really, really want to start grabbing this data: https://github.com/simonw/datasette/network/dependents", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/34/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 611222968, "node_id": "MDU6SXNzdWU2MTEyMjI5Njg=", "number": 107, "title": "sqlite-utils create-view CLI command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-05-02T16:15:13Z", "updated_at": "2020-05-03T15:36:58Z", "closed_at": "2020-05-03T15:36:37Z", "author_association": "OWNER", "pull_request": null, "body": "Can go with #27 - `sqlite-utils create-table`.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/107/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 455496504, "node_id": "MDU6SXNzdWU0NTU0OTY1MDQ=", "number": 27, "title": "sqlite-utils create-table command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2019-06-13T01:43:30Z", "updated_at": "2020-05-03T15:26:15Z", "closed_at": "2020-05-03T15:26:15Z", 
"author_association": "OWNER", "pull_request": null, "body": "Spun off from #24 - it would be useful if CLI users could create new tables (with explicit column types, not null rules and defaults) without having to insert an example record.\r\n\r\n- [x] Get it working\r\n- [x] Support `--pk`\r\n- [x] Support `--not-null`\r\n- [x] Support `--default`\r\n- [x] Support `--fk colname othertable othercol`\r\n- [x] Support `--replace` and `--ignore`\r\n- [x] Documentation", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/27/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 611326701, "node_id": "MDU6SXNzdWU2MTEzMjY3MDE=", "number": 108, "title": "Documentation unit tests for CLI commands", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-05-03T03:58:42Z", "updated_at": "2020-05-03T04:13:57Z", "closed_at": "2020-05-03T04:13:57Z", "author_association": "OWNER", "pull_request": null, "body": "Have a test that ensures all CLI commands are documented.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/108/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 611284481, "node_id": "MDU6SXNzdWU2MTEyODQ0ODE=", "number": 38, "title": "[Feature Request] Support Repo Name in Search \ud83e\udd7a", "user": {"value": 5779832, "label": "zzeleznick"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-05-02T22:08:51Z", "updated_at": "2020-05-03T02:34:32Z", "closed_at": "2020-05-02T23:15:11Z", "author_association": "NONE", "pull_request": null, "body": "## Description\r\nPer your [v2.2 release tweet](https://twitter.com/simonw/status/1256700238099693568) I played with the demo, but the output did not match my expectations.\r\n\r\n## Expected Behavior\r\nExpected a search query for \"twitter\" contained within the `repo` column to return non-zero results.\r\n\r\n## Actual Behavior\r\n\ud83d\ude2d [0 rows where repo contains \"twitter\" sorted by starred_at descending](https://github-to-sqlite.dogsheep.net/github/stars?repo__contains=twitter&_sort_desc=starred_at) \r\n\r\n## Best Explanation\r\nPer the table schema (see appendix) `repo` is of type `INTEGER` which built from `repo_id` and does not expose the repo name in search.\r\n\r\n## Desired Behavior\r\nGiven that searching for \"206156866\" is less intuitive than \"twitter\", it would be great to support this via extending the search capabilities or by adding an additional column.\r\n\r\n\u2705 104 rows where repo contains \"twitter\"\r\n\u274c [104 rows where repo contains \"206156866\" sorted by starred_at descending](https://github-to-sqlite.dogsheep.net/github/stars?repo__contains=206156866&_sort_desc=starred_at) \r\n\r\n## Appendix\r\n```\r\nCREATE TABLE [stars] (\r\n [user] INTEGER REFERENCES [users]([id]),\r\n [repo] INTEGER REFERENCES [repos]([id]),\r\n [starred_at] TEXT,\r\n PRIMARY KEY 
([user], [repo])\r\n);\r\nCREATE INDEX [idx_stars_repo]\r\n ON [stars] ([repo]);\r\nCREATE INDEX [idx_stars_user]\r\n ON [stars] ([user]);\r\n```", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/38/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 493670730, "node_id": "MDU6SXNzdWU0OTM2NzA3MzA=", "number": 4, "title": "Command to fetch stargazers for one or more repos", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2019-09-14T21:58:22Z", "updated_at": "2020-05-02T21:30:27Z", "closed_at": "2020-05-02T21:30:27Z", "author_association": "MEMBER", "pull_request": null, "body": "Maybe this:\r\n\r\n $ github-to-sqlite stargazers github.db simonw/datasette\r\n\r\nIt could accept more than one repos.\r\n\r\nMaybe have options similar to `--sql` in [twitter-to-sqlite](https://github.com/dogsheep/twitter-to-sqlite) so you can e.g. fetch all stargazers for all of the repos you have fetched into the database already (or all of the repos belonging to owner X)", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 516763727, "node_id": "MDExOlB1bGxSZXF1ZXN0MzM1OTgwMjQ2", "number": 8, "title": "stargazers command, refs #4", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2019-11-03T00:37:36Z", "updated_at": "2020-05-02T20:00:27Z", "closed_at": "2020-05-02T20:00:26Z", "author_association": "MEMBER", "pull_request": "dogsheep/github-to-sqlite/pulls/8", "body": "Needs tests. 
Refs #4.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 611252244, "node_id": "MDU6SXNzdWU2MTEyNTIyNDQ=", "number": 750, "title": "Add notlike table filter", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-05-02T18:54:36Z", "updated_at": "2020-05-02T19:10:44Z", "closed_at": "2020-05-02T19:10:44Z", "author_association": "OWNER", "pull_request": null, "body": "I found myself wanting that for applying the opposite of this: https://github-to-sqlite.dogsheep.net/github/dependent_repos?dependent__like=%25simonw%2F%25&_sort_desc=dependent_stars\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/750/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 520756546, "node_id": "MDU6SXNzdWU1MjA3NTY1NDY=", "number": 12, "title": "Add this view for seeing new releases", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2019-11-11T06:00:12Z", "updated_at": "2020-05-02T18:58:18Z", "closed_at": "2020-05-02T18:58:17Z", "author_association": "MEMBER", "pull_request": null, "body": "```sql\r\nCREATE VIEW recent_releases AS select\r\n json_object(\"label\", repos.full_name, \"href\", repos.html_url) as repo,\r\n json_object(\r\n \"href\",\r\n releases.html_url,\r\n \"label\",\r\n releases.name\r\n ) as release,\r\n substr(releases.published_at, 0, 11) as date,\r\n releases.body as body_markdown,\r\n releases.published_at\r\nfrom\r\n releases\r\n join repos on repos.id = releases.repo\r\norder by\r\n releases.published_at desc\r\n```", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/12/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 609950090, "node_id": "MDU6SXNzdWU2MDk5NTAwOTA=", "number": 33, "title": "Fall back to authentication via ENV", "user": {"value": 2029, "label": "garethr"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-04-30T12:58:14Z", "updated_at": "2020-05-02T18:46:10Z", "closed_at": "2020-05-02T18:45:37Z", "author_association": "NONE", "pull_request": null, "body": "Would you accept a PR that falls back to looking for an environment variable for the GitHub token? 
Specifically a change here:\r\nhttps://github.com/dogsheep/github-to-sqlite/blob/c34d5a18bfc41fa08755ba3d5cf9fe09ff204238/github_to_sqlite/cli.py#L271\r\n\r\nI'd like to use `github-to-sqlite` in a GitHub Action workflow and this would be simpler than trying to fill out the prompt or generate a file with sensitive content.\r\n\r\nWanted to check first, I'm happy to submit a PR with tests and updates to the docs. ", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/33/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 610511450, "node_id": "MDU6SXNzdWU2MTA1MTE0NTA=", "number": 35, "title": "Create index on issue_comments(user) and other foreign keys", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-05-01T02:06:56Z", "updated_at": "2020-05-02T18:26:24Z", "closed_at": "2020-05-02T18:26:24Z", "author_association": "MEMBER", "pull_request": null, "body": "```\r\ncreate index issue_comments_user on issue_comments(user)\r\n```\r\nI'm sure there are other user columns that could benefit from an index.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/35/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 610842926, "node_id": "MDU6SXNzdWU2MTA4NDI5MjY=", "number": 36, "title": "Add view for better display of dependent repos", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-05-01T16:33:44Z", "updated_at": "2020-05-02T16:50:31Z", "closed_at": "2020-05-02T16:30:11Z", "author_association": "MEMBER", "pull_request": null, "body": "```sql\r\nselect\r\n repos.full_name as repo,\r\n 'https://github.com/' || repos2.full_name as dependent,\r\n repos2.created_at as dependent_repo_created,\r\n repos2.updated_at as dependent_repo_updated,\r\n repos2.stargazers_count as dependent_repo_stars,\r\n repos2.watchers_count as dependent_repo_watchers\r\nfrom\r\n dependents\r\n join repos as repos2 on dependents.dependent = repos2.id\r\n join repos on dependents.repo = repos.id\r\norder by\r\n repos2.created_at desc\r\n```\r\nhttps://dogsheep.simonwillison.net/github?sql=select%0D%0A++repos.full_name+as+repo%2C%0D%0A++%27https%3A%2F%2Fgithub.com%2F%27+%7C%7C+repos2.full_name+as+dependent%2C%0D%0A++repos2.created_at+as+dependent_repo_created%2C%0D%0A++repos2.updated_at+as+dependent_repo_updated%2C%0D%0A++repos2.stargazers_count+as+dependent_repo_stars%2C%0D%0A++repos2.watchers_count+as+dependent_repo_watchers%0D%0Afrom%0D%0A++dependents%0D%0A++join+repos+as+repos2+on+dependents.dependent+%3D+repos2.id%0D%0A++join+repos+on+dependents.repo+%3D+repos.id%0D%0Aorder+by%0D%0A++repos2.created_at+desc", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": 
\"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/36/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 516967682, "node_id": "MDU6SXNzdWU1MTY5Njc2ODI=", "number": 10, "title": "Add this repos_starred view", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-11-04T05:44:38Z", "updated_at": "2020-05-02T16:37:36Z", "closed_at": "2020-05-02T16:37:36Z", "author_association": "MEMBER", "pull_request": null, "body": "```sql\r\ncreate view repos_starred as select\r\n stars.starred_at,\r\n users.login,\r\n repos.*\r\nfrom\r\n repos\r\n join stars on repos.id = stars.repo\r\n join users on repos.owner = users.id\r\norder by\r\n starred_at desc;\r\n```", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/10/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 610843136, "node_id": "MDU6SXNzdWU2MTA4NDMxMzY=", "number": 37, "title": "Mechanism for creating views if they don't yet exist", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-05-01T16:34:10Z", "updated_at": "2020-05-02T16:19:47Z", "closed_at": "2020-05-02T16:19:31Z", "author_association": "MEMBER", "pull_request": null, "body": "Needed for #36 #10 #12 ", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/37/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 611216862, "node_id": "MDU6SXNzdWU2MTEyMTY4NjI=", "number": 106, "title": "create_view(..., ignore=True, replace=True) parameters", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-05-02T15:45:21Z", "updated_at": "2020-05-02T16:04:51Z", "closed_at": "2020-05-02T16:02:10Z", "author_association": "OWNER", "pull_request": null, "body": "Two new parameters which specify what should happen if the view already exists. 
I want this for https://github.com/dogsheep/github-to-sqlite/issues/37\r\n\r\nHere's the current `create_view()` implementation:\r\n\r\nhttps://github.com/simonw/sqlite-utils/blob/b4d953d3ccef28bb81cea40ca165a647b59971fa/sqlite_utils/db.py#L325-L332\r\n\r\n`ignore=True` will not do anything if the view exists already.\r\n\r\n`replace=True` will drop and redefine the view - but only if its SQL definition differs, otherwise it will be left alone.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/106/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 602569315, "node_id": "MDU6SXNzdWU2MDI1NjkzMTU=", "number": 102, "title": "Can't store an array or dictionary containing a bytes value", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-18T22:49:21Z", "updated_at": "2020-05-01T20:45:45Z", "closed_at": "2020-05-01T20:45:45Z", "author_association": "OWNER", "pull_request": null, "body": "```\r\nIn [1]: import sqlite_utils \r\n\r\nIn [2]: db = sqlite_utils.Database(memory=True) \r\n\r\nIn [3]: db[\"t\"].insert({\"id\": 1, \"data\": {\"foo\": b\"bytes\"}}) \r\n---------------------------------------------------------------------------\r\nTypeError Traceback (most recent call last)\r\n in \r\n----> 1 db[\"t\"].insert({\"id\": 1, \"data\": {\"foo\": b\"bytes\"}})\r\n\r\n~/Dropbox/Development/sqlite-utils/sqlite_utils/db.py in insert(self, record, pk, foreign_keys, column_order, not_null, defaults, hash_id, alter, ignore, replace, extracts, conversions, columns)\r\n 950 extracts=extracts,\r\n 951 conversions=conversions,\r\n--> 952 columns=columns,\r\n 953 )\r\n 954 \r\n\r\n~/Dropbox/Development/sqlite-utils/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, ignore, replace, extracts, conversions, columns, upsert)\r\n 1052 for key in all_columns:\r\n 1053 value = jsonify_if_needed(\r\n-> 1054 record.get(key, None if key != hash_id else _hash(record))\r\n 1055 )\r\n 1056 if key in extracts:\r\n\r\n~/Dropbox/Development/sqlite-utils/sqlite_utils/db.py in jsonify_if_needed(value)\r\n 1318 def jsonify_if_needed(value):\r\n 1319 if isinstance(value, (dict, list, tuple)):\r\n-> 1320 return json.dumps(value)\r\n 1321 elif isinstance(value, (datetime.time, datetime.date, datetime.datetime)):\r\n 1322 return value.isoformat()\r\n\r\n/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/__init__.py in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw)\r\n 229 cls is None and indent is None and separators is None and\r\n 230 default is None and not sort_keys and not kw):\r\n--> 231 return _default_encoder.encode(obj)\r\n 232 if cls is None:\r\n 233 cls = JSONEncoder\r\n\r\n/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/encoder.py in encode(self, o)\r\n 197 # exceptions aren't as detailed. 
The list call should be roughly\r\n 198 # equivalent to the PySequence_Fast that ''.join() would do.\r\n--> 199 chunks = self.iterencode(o, _one_shot=True)\r\n 200 if not isinstance(chunks, (list, tuple)):\r\n 201 chunks = list(chunks)\r\n\r\n/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/encoder.py in iterencode(self, o, _one_shot)\r\n 255 self.key_separator, self.item_separator, self.sort_keys,\r\n 256 self.skipkeys, _one_shot)\r\n--> 257 return _iterencode(o, 0)\r\n 258 \r\n 259 def _make_iterencode(markers, _default, _encoder, _indent, _floatstr,\r\n\r\n/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/encoder.py in default(self, o)\r\n 177 \r\n 178 \"\"\"\r\n--> 179 raise TypeError(f'Object of type {o.__class__.__name__} '\r\n 180 f'is not JSON serializable')\r\n 181 \r\n\r\nTypeError: Object of type bytes is not JSON serializable\r\n```", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/102/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 610853576, "node_id": "MDU6SXNzdWU2MTA4NTM1NzY=", "number": 105, "title": "\"sqlite-utils views\" command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-05-01T16:56:11Z", "updated_at": "2020-05-01T20:40:07Z", "closed_at": "2020-05-01T20:38:36Z", "author_association": "OWNER", "pull_request": null, "body": "Similar to `sqlite-utils tables`. 
See also #104.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/105/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 610853393, "node_id": "MDU6SXNzdWU2MTA4NTMzOTM=", "number": 104, "title": "--schema option to \"sqlite-utils tables\"", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-05-01T16:55:49Z", "updated_at": "2020-05-01T17:12:37Z", "closed_at": "2020-05-01T17:12:37Z", "author_association": "OWNER", "pull_request": null, "body": "Adds output showing the table schema.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/104/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 610342575, "node_id": "MDU6SXNzdWU2MTAzNDI1NzU=", "number": 748, "title": "?_searchmode=raw should be documented on full-text search page", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-30T19:50:06Z", "updated_at": "2020-04-30T21:06:12Z", "closed_at": "2020-04-30T21:06:12Z", "author_association": "OWNER", "pull_request": null, "body": "It's currently documented here: https://datasette.readthedocs.io/en/stable/json_api.html#special-table-arguments\r\n\r\nBut it should also be described here: https://datasette.readthedocs.io/en/stable/full_text_search.html#the-table-view-api", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/748/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 610192152, "node_id": "MDU6SXNzdWU2MTAxOTIxNTI=", "number": 747, "title": "Directory configuration mode should support metadata.yaml", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-04-30T16:05:30Z", "updated_at": "2020-04-30T19:04:19Z", "closed_at": "2020-04-30T19:04:19Z", "author_association": "OWNER", "pull_request": null, "body": "Refs #739 - `metadata.yml` or `metadata.yaml` should be detected in the same way as `metadata.json` is.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/747/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 610284471, "node_id": "MDU6SXNzdWU2MTAyODQ0NzE=", "number": 46, "title": "Error running 'search' for the first time", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 
0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-30T18:11:20Z", "updated_at": "2020-04-30T18:11:58Z", "closed_at": "2020-04-30T18:11:58Z", "author_association": "MEMBER", "pull_request": null, "body": "```\r\n% twitter-to-sqlite search infodemic.db '#infodemic'\r\nTraceback (most recent call last):\r\n File \"/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/bin/twitter-to-sqlite\", line 11, in \r\n load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')()\r\n File \"/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py\", line 829, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py\", line 782, in main\r\n rv = self.invoke(ctx)\r\n File \"/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py\", line 1259, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py\", line 1066, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py\", line 610, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/Users/simon/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py\", line 867, in search\r\n for tweet in tweets:\r\n File \"/Users/simon/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py\", line 165, in fetch_timeline\r\n [since_type_id, since_key],\r\nsqlite3.OperationalError: no such table: since_ids\r\n```", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/46/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 608752766, "node_id": "MDExOlB1bGxSZXF1ZXN0NDEwNDY5Mjcy", "number": 746, "title": "shutil.Error, not OSError", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-29T03:30:51Z", "updated_at": "2020-04-29T07:07:24Z", "closed_at": "2020-04-29T07:07:23Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/746", "body": "Refs #744", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/746/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 275814941, "node_id": "MDU6SXNzdWUyNzU4MTQ5NDE=", "number": 141, "title": "datasette publish can fail if /tmp is on a different device", "user": {"value": 21148, "label": "jacobian"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 2949431, "label": "Custom templates edition"}, "comments": 5, "created_at": "2017-11-21T18:28:05Z", "updated_at": "2020-04-29T03:27:54Z", "closed_at": "2017-12-08T16:06:36Z", "author_association": "CONTRIBUTOR", 
"pull_request": null, "body": "`datasette publish` uses hard links to avoid copying the db into a tmp directory. This can fail if `/tmp` is on another device, because hardlinks can't cross devices. You'll see something like this:\r\n\r\n```\r\n$ datasette publish heroku whatever.db\r\n...\r\nOSError: [Errno 18] Invalid cross-device link: '/mnt/c/Users/jacob/c/datasette/whatever.db' -> '/tmp/tmpvxq2yof6/whatever.db'\r\n```\r\n[In my case this is failing because I'm on a Windows machine, using WSL, so my code's on a different virtual filesystem from the Linux subsystem, Because Reasons.]\r\n\r\nI'm not sure if it's possible to detect this (can you figure out which device `/tmp` is on?), or what the fallback should be (soft link? copy?).", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/141/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 522334771, "node_id": "MDU6SXNzdWU1MjIzMzQ3NzE=", "number": 633, "title": "Publish to Heroku is broken: \"WARNING: You must pass the application as an import string to enable 'reload' or 'workers\"", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-11-13T16:32:11Z", "updated_at": "2020-04-28T20:37:50Z", "closed_at": "2019-11-13T16:43:23Z", "author_association": "OWNER", "pull_request": null, "body": "```\r\n2019-11-13T16:27:59.821483+00:00 heroku[web.1]: Starting process with command `datasette serve --host 0.0.0.0 -i fixtures.db --cors --port 36817 --inspect-file inspect-data.json`\r\n2019-11-13T16:28:01.856471+00:00 heroku[web.1]: State changed from starting to crashed\r\n2019-11-13T16:28:01.750253+00:00 app[web.1]: Serve! files=() (immutables=('fixtures.db',)) on port 36817\r\n2019-11-13T16:28:01.771524+00:00 app[web.1]: WARNING: You must pass the application as an import string to enable 'reload' or 'workers'.\r\n2019-11-13T16:28:01.837839+00:00 heroku[web.1]: Process exited with status 1\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/633/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 608512747, "node_id": "MDU6SXNzdWU2MDg1MTI3NDc=", "number": 14, "title": "Annotate photos using the Google Cloud Vision API", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2020-04-28T18:09:03Z", "updated_at": "2020-04-28T18:19:06Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "It can detect faces, run OCR, do image labeling (it knows what a lemur is!) 
and do object localization where it identifies objects and returns bounding polygons for them.", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/14/reactions\", \"total_count\": 3, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 602533352, "node_id": "MDU6SXNzdWU2MDI1MzMzNTI=", "number": 2, "title": "Ability to convert HEIC images to JPEG", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5324096, "label": "Apple Photos online and securely browsable"}, "comments": 1, "created_at": "2020-04-18T19:23:43Z", "updated_at": "2020-04-28T16:47:21Z", "closed_at": "2020-04-28T16:47:21Z", "author_association": "MEMBER", "pull_request": null, "body": "", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/2/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 605806386, "node_id": "MDU6SXNzdWU2MDU4MDYzODY=", "number": 735, "title": "Error when I click on \"View and edit SQL\"", "user": {"value": 30607, "label": "aborruso"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-04-23T19:31:32Z", "updated_at": "2020-04-28T06:10:20Z", "closed_at": "2020-04-27T19:00:30Z", "author_association": "NONE", "pull_request": null, "body": "Hi,\r\nwhen I do it [here](https://my-database.now.sh/commissioniComunePalermo/youtube), I have \"unrecognized token: \"[\"\" error.\r\n\r\nIs it normal?\r\n\r\nThank you", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/735/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 606720674, "node_id": "MDU6SXNzdWU2MDY3MjA2NzQ=", "number": 736, "title": "strange behavior using accented characters", "user": {"value": 30607, "label": "aborruso"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-25T08:34:51Z", "updated_at": "2020-04-28T06:09:28Z", "closed_at": "2020-04-27T18:59:16Z", "author_association": "NONE", "pull_request": null, "body": "Hi,\r\nwhen I search `incompatibilit\u00e0` [here](https://my-database.now.sh/commissioniComunePalermo/youtube), using full text search, it becomes `incompatibilit\u00c3\u0083\u00c2\u00a0` and I have no result.\r\n\r\nIf I encode the `\u00e0` char in the URL (`incompatibilit%C3%A0`) I have the right result.\r\n\r\n![image](https://user-images.githubusercontent.com/30607/80275201-00a79380-86e0-11ea-865e-f7e1474e8098.png)\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/736/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 
0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 607888367, "node_id": "MDU6SXNzdWU2MDc4ODgzNjc=", "number": 13, "title": "Also upload movie files", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-04-27T22:11:25Z", "updated_at": "2020-04-28T00:39:45Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "The `upload` command currently only handles static images:\r\n\r\nhttps://github.com/dogsheep/photos-to-sqlite/blob/d939455af00e07866686457ee2fcb9b2d1b7194e/photos_to_sqlite/utils.py#L26-L33\r\n\r\nNeed to cover movies taken by my phone and DSLR too.", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/13/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 600120439, "node_id": "MDU6SXNzdWU2MDAxMjA0Mzk=", "number": 726, "title": "Foreign key : case of a link to the associated row not displayed", "user": {"value": 6371750, "label": "JBPressac"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-15T08:31:27Z", "updated_at": "2020-04-27T22:05:47Z", "closed_at": "2020-04-27T22:05:46Z", "author_association": "CONTRIBUTOR", "pull_request": null, "body": "Hello,\r\nI use Datasette to publish tsv files linked together by foreign keys declared thanks to sqlite-utils. In one table, [prelib_personne](http://crbc-dataset.huma-num.fr/prelib/prelib_personne), the foreign keys are properly noticed by a link to the associated row (for instance ville_naissance_id is properly linked to prelib_ville). But every link to the foreign key prelib_oeuvre.id fails. For instance, [prelib_ecritoeuvre](http://crbc-dataset.huma-num.fr/prelib/prelib_ecritoeuvre) has links to prelib_personne but none to prelib_oeuvre. In despite of the schema:\r\n\r\nCREATE TABLE \"prelib_ecritoeuvre\" (\r\n\"id\" INTEGER,\r\n \"fonction_id\" INTEGER,\r\n \"oeuvre_id\" INTEGER,\r\n \"personne_id\" INTEGER\r\n ,PRIMARY KEY ([id]),\r\n FOREIGN KEY(fonction_id) REFERENCES prelib_fonctionecritoeuvre(id),\r\n FOREIGN KEY(personne_id) REFERENCES prelib_personne(id),\r\n FOREIGN KEY(oeuvre_id) REFERENCES prelib_oeuvre(id)\r\n); \r\n\r\nWould you have any clue to investigate the reason of this problem?\r\nThanks,", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/726/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 607770595, "node_id": "MDU6SXNzdWU2MDc3NzA1OTU=", "number": 743, "title": "escape_fts() does not correctly escape * wildcards", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-04-27T18:48:53Z", "updated_at": "2020-04-27T19:11:30Z", "closed_at": "2020-04-27T19:11:01Z", "author_association": "OWNER", "pull_request": null, "body": "Spotted in #732. 
This should not return any results... but it does:\r\n\r\nhttps://latest.datasette.io/fixtures/searchable?_search=bar%2A&_trace=1\r\n\r\n\"fixtures__searchable__1_row_where_where_search_matches__bar__\"\r\n\r\nThe query from trace is:\r\n```\r\n \"sql\": \"select count(*) from searchable where rowid in (select rowid from searchable_fts where searchable_fts match escape_fts(:search))\",\r\n \"params\": {\r\n \"search\": \"bar*\"\r\n }\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/743/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 607211058, "node_id": "MDU6SXNzdWU2MDcyMTEwNTg=", "number": 740, "title": "Don't throw 500 error on attempted directory browse", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-27T03:50:11Z", "updated_at": "2020-04-27T18:29:15Z", "closed_at": "2020-04-27T18:29:15Z", "author_association": "OWNER", "pull_request": null, "body": "\"Error_500\"\r\n\r\nThis should be a 403 error instead, because the `--static` mechanism doesn't allow directory browsing.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/740/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 607243940, "node_id": "MDU6SXNzdWU2MDcyNDM5NDA=", "number": 742, "title": "Speed up tests with scope=\"session\"?", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-27T05:23:54Z", "updated_at": "2020-04-27T18:24:53Z", "closed_at": "2020-04-27T18:24:53Z", "author_association": "OWNER", "pull_request": null, "body": "Tests are pretty slow - could I speed them up with pytest `scope=\"session\"` on some of the fixtures?\r\n\r\nEg https://travis-ci.org/github/simonw/datasette/jobs/679940036 ran 452 tests in 3m53s - the `test_html` ones seem particularly slow.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/742/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 605110015, "node_id": "MDU6SXNzdWU2MDUxMTAwMTU=", "number": 731, "title": "Option to automatically configure based on directory layout", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 9, "created_at": "2020-04-22T22:17:47Z", "updated_at": "2020-04-27T16:32:44Z", "closed_at": "2020-04-27T16:30:26Z", "author_association": "OWNER", "pull_request": null, "body": "My Datasette projects increasingly take on the following structure:\r\n\r\n- `metadata.json` with the metadata\r\n- One or more `something.db` database files\r\n- A `templates/` folder with some custom 
templates\r\n- A `plugins/` folder with some custom plugins\r\n\r\nThen I have to run Datasette like this:\r\n\r\n datasette *.db -m metadata.json --template-dir=templates --plugins-dir=plugins\r\n\r\nIt would be really interesting if Datasette had a special mode where you could point it at a directory with the above layout and it would automatically configure itself based on the contents.\r\n\r\nMaybe even allow `datasette serve` to detect if it was passed a single argument that's a directory, not a file, and kick in to \"directory layout configuration mode\" in that case:\r\n\r\n datasette .\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/731/reactions\", \"total_count\": 2, \"+1\": 2, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 607107849, "node_id": "MDExOlB1bGxSZXF1ZXN0NDA5MTUzODcw", "number": 739, "title": "Configuration directory mode", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-26T20:37:46Z", "updated_at": "2020-04-27T16:30:25Z", "closed_at": "2020-04-27T16:30:25Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/739", "body": "Refs #731\r\n\r\nTODO:\r\n\r\n- [x] Decide how to combine explicit command-line options with items detected from the directory structure\r\n- [x] Add unit tests\r\n- [x] Implement `inspect-data.json` mechanism for populating `immutables`\r\n- [x] Add documentation", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/739/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 607086780, "node_id": "MDU6SXNzdWU2MDcwODY3ODA=", "number": 738, "title": "Pass a request object to custom page templates", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-26T18:57:48Z", "updated_at": "2020-04-26T19:01:54Z", "closed_at": "2020-04-26T19:01:54Z", "author_association": "OWNER", "pull_request": null, "body": "Follow-up to #648. I'm not passing a request object to `.render_template()` at the moment, which breaks any other custom plugins using e.g. 
`extra_template_vars()` that were expecting to be able to access the request.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/738/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 607067303, "node_id": "MDExOlB1bGxSZXF1ZXN0NDA5MTIzODk3", "number": 737, "title": "Custom pages mechanism, refs #648", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-04-26T17:31:41Z", "updated_at": "2020-04-26T18:46:43Z", "closed_at": "2020-04-26T18:46:43Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/737", "body": "Refs #648. TODO:\r\n- [x] Pass a `view_name` to `render_template()`\r\n- [x] Mechanism for custom status code / headers / redirect\r\n- [x] Documentation", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/737/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 606032950, "node_id": "MDU6SXNzdWU2MDYwMzI5NTA=", "number": 11, "title": "Try running S3 uploads in a thread pool", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-24T04:34:31Z", "updated_at": "2020-04-24T16:45:41Z", "closed_at": "2020-04-24T16:45:41Z", "author_association": "MEMBER", "pull_request": null, "body": "Since #10 provided such a speedup, can the same thing be done for the actual uploads?\r\n\r\nhttp://ls.pwd.io/2013/06/parallel-s3-uploads-using-boto-and-threads-in-python/ suggests it can really help performance.", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/11/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 606033104, "node_id": "MDU6SXNzdWU2MDYwMzMxMDQ=", "number": 12, "title": "If less than 500MB, show size in MB not GB", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-24T04:35:01Z", "updated_at": "2020-04-24T04:35:25Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "Just saw this:\r\n```\r\nUploading 0.05 GB\r\n```", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/12/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 606028272, "node_id": "MDU6SXNzdWU2MDYwMjgyNzI=", "number": 10, "title": "Speed up hashing step using threads", "user": {"value": 
9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-24T04:20:08Z", "updated_at": "2020-04-24T04:32:35Z", "closed_at": "2020-04-24T04:32:35Z", "author_association": "MEMBER", "pull_request": null, "body": "This TODO from the code:\r\n\r\nhttps://github.com/dogsheep/photos-to-sqlite/blob/2e7f2c67cc18b02c75bb64992a05b0196e507252/photos_to_sqlite/cli.py#L82-L90", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/10/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 605938063, "node_id": "MDU6SXNzdWU2MDU5MzgwNjM=", "number": 9, "title": "upload command should be resumable, should only upload photos not already uploaded", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-04-23T23:31:08Z", "updated_at": "2020-04-23T23:39:14Z", "closed_at": "2020-04-23T23:39:14Z", "author_association": "MEMBER", "pull_request": null, "body": "Follow on from #4. ", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/9/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 605147638, "node_id": "MDU6SXNzdWU2MDUxNDc2Mzg=", "number": 8, "title": "Should I have used MD5 instead of SHA256?", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-04-23T00:02:08Z", "updated_at": "2020-04-23T00:03:35Z", "closed_at": "2020-04-23T00:03:35Z", "author_association": "MEMBER", "pull_request": null, "body": "https://docs.aws.amazon.com/AmazonS3/latest/API/RESTCommonResponseHeaders.html\r\n\r\n> Objects created by the PUT Object, POST Object, or Copy operation, or through the AWS Management Console, and are encrypted by SSE-S3 or plaintext, have ETags that are an MD5 digest of their object data.\r\n", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/8/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 603624862, "node_id": "MDU6SXNzdWU2MDM2MjQ4NjI=", "number": 31, "title": "Issue and milestone should have foreign key to repo", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-21T00:46:24Z", "updated_at": "2020-04-22T01:20:19Z", "closed_at": "2020-04-22T01:20:19Z", "author_association": "MEMBER", "pull_request": null, "body": "Currently the `repo` column on those tables is a string `simonw/datasette` rather than an ID referencing a row in `repos`.\r\n\r\n_Originally posted by @simonw in 
https://github.com/dogsheep/github-to-sqlite/issues/29#issuecomment-616883275_", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/31/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 604222295, "node_id": "MDU6SXNzdWU2MDQyMjIyOTU=", "number": 32, "title": "Issue comments don't appear to populate issues foreign key", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-21T19:17:32Z", "updated_at": "2020-04-22T01:17:44Z", "closed_at": "2020-04-22T01:17:44Z", "author_association": "MEMBER", "pull_request": null, "body": "https://github-to-sqlite.dogsheep.net/github?sql=select+html_url%2C+id%2C+issue+from+issue_comments+order+by+updated_at+desc+limit+101\r\n\r\n\"Screen\r\n", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/32/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 285168503, "node_id": "MDU6SXNzdWUyODUxNjg1MDM=", "number": 176, "title": "Add GraphQL endpoint", "user": {"value": 173848, "label": "yozlet"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2017-12-29T23:21:01Z", "updated_at": "2020-04-21T14:16:24Z", "closed_at": null, "author_association": "NONE", "pull_request": null, "body": "Would make it much easier to build React & similar frontends. 
Maybe with https://github.com/graphql-python/sanic-graphql ?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/176/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 603618244, "node_id": "MDU6SXNzdWU2MDM2MTgyNDQ=", "number": 30, "title": "Issues milestone column is the wrong type", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-04-21T00:24:34Z", "updated_at": "2020-04-21T00:45:23Z", "closed_at": "2020-04-21T00:36:22Z", "author_association": "MEMBER", "pull_request": null, "body": "https://github-to-sqlite.dogsheep.net/github/issues?milestone=2857392\r\n\r\n![2A4C1185-2434-4F29-9EA0-3246E2F03F77](https://user-images.githubusercontent.com/9599/79811760-b7e08b00-832b-11ea-9ad7-684a6ae097a6.jpeg)\r\n\r\nIt is TEXT when it should be an INTEGER - which is why the foreign key label is not correctly displayed.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/30/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 603617013, "node_id": "MDU6SXNzdWU2MDM2MTcwMTM=", "number": 29, "title": "Milestones should have foreign key to creator and repo", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-21T00:20:44Z", "updated_at": "2020-04-21T00:43:58Z", "closed_at": "2020-04-21T00:43:58Z", "author_association": "MEMBER", "pull_request": null, "body": "https://github-to-sqlite.dogsheep.net/github/milestones\r\n\r\nCreator is an integer but not a foreign key to users\r\n\r\nRepo is missing entirely!", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/29/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 602619330, "node_id": "MDU6SXNzdWU2MDI2MTkzMzA=", "number": 45, "title": "Use raise_for_status() everywhere", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-19T04:38:28Z", "updated_at": "2020-04-19T04:39:22Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "I keep seeing errors which I think are caused by authentication or rate limit problems but which appear to be unexpected JSON responses - presumably because they are actually an error message.\r\n\r\nRecent example: https://github.com/simonw/jsk-fellows-on-twitter/runs/598892575\r\n\r\nUsing `response.raise_for_status()` everywhere will make these errors less confusing.", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, 
"performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/45/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 602575575, "node_id": "MDU6SXNzdWU2MDI1NzU1NzU=", "number": 6, "title": "Add progress bar to upload command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-04-18T23:32:41Z", "updated_at": "2020-04-19T00:15:24Z", "closed_at": "2020-04-19T00:15:24Z", "author_association": "MEMBER", "pull_request": null, "body": "Upload was added in #4 ", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/6/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 602533539, "node_id": "MDU6SXNzdWU2MDI1MzM1Mzk=", "number": 4, "title": "Upload all my photos to a secure S3 bucket", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5324096, "label": "Apple Photos online and securely browsable"}, "comments": 14, "created_at": "2020-04-18T19:24:50Z", "updated_at": "2020-04-18T21:58:11Z", "closed_at": "2020-04-18T21:57:13Z", "author_association": "MEMBER", "pull_request": null, "body": "- [x] Create a bucket with bucket credentials\r\n- [x] Programmatically upload some recent photos to it (from a notebook)\r\n- [x] Turn this into a script", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/4/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 602551638, "node_id": "MDU6SXNzdWU2MDI1NTE2Mzg=", "number": 5, "title": "photos-to-sqlite s3-auth command", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-18T21:05:25Z", "updated_at": "2020-04-18T21:08:44Z", "closed_at": "2020-04-18T21:08:44Z", "author_association": "MEMBER", "pull_request": null, "body": "Modeled on `github-to-sqlite auth` - prompts the user for their S3 credentials and saves them to `auth.json`.", "repo": {"value": 256834907, "label": "dogsheep-photos"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/5/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 601333634, "node_id": "MDU6SXNzdWU2MDEzMzM2MzQ=", "number": 28, "title": "Pull repository contributors", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-16T18:46:40Z", "updated_at": "2020-04-18T15:05:10Z", "closed_at": "2020-04-18T15:05:10Z", 
"author_association": "MEMBER", "pull_request": null, "body": "https://developer.github.com/v3/repos/#list-contributors\r\n\r\n`GET /repos/:owner/:repo/contributors`\r\n\r\nNot sure if this should be a separate command or should be part of the existing `repos` command. I'm leaning towards a new `contributors` command.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/28/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 601330277, "node_id": "MDU6SXNzdWU2MDEzMzAyNzc=", "number": 27, "title": "Repos have a big blob of JSON in the organization column", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2020-04-16T18:43:14Z", "updated_at": "2020-04-18T00:19:16Z", "closed_at": "2020-04-18T00:18:52Z", "author_association": "MEMBER", "pull_request": null, "body": "e.g. https://github-to-sqlite.dogsheep.net/github/repos\r\n\r\n![github__repos__11_rows_where_sorted_by_updated_at_descending](https://user-images.githubusercontent.com/9599/79494124-5640b980-7fd7-11ea-99a2-17ffbd82f9ce.png)\r\n\r\nThis appears to be obsolete because the `owner` column already links to that record, albeit in the `users` table with `type` set to `Organization`: https://github-to-sqlite.dogsheep.net/github/users/53015001", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/27/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 601392318, "node_id": "MDU6SXNzdWU2MDEzOTIzMTg=", "number": 101, "title": "README should include an example of CLI data insertion", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-16T19:45:37Z", "updated_at": "2020-04-17T23:59:49Z", "closed_at": "2020-04-17T23:59:49Z", "author_association": "OWNER", "pull_request": null, "body": "Maybe using `curl` from the GitHub API.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/101/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 601358649, "node_id": "MDU6SXNzdWU2MDEzNTg2NDk=", "number": 100, "title": "Mechanism for forcing column-type, over-riding auto-detection", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-16T19:12:52Z", "updated_at": "2020-04-17T23:53:32Z", "closed_at": "2020-04-17T23:53:32Z", "author_association": "OWNER", "pull_request": null, "body": "As seen in https://github.com/dogsheep/github-to-sqlite/issues/27#issuecomment-614843406 - there's a problem where you insert a record with a `None` value for a column 
and that column is created as `TEXT` - but actually you intended it to be an `INT` (as later examples will demonstrate).\r\n\r\nSome kind of mechanism for over-riding the detected types of columns would be useful here.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/100/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 602176870, "node_id": "MDU6SXNzdWU2MDIxNzY4NzA=", "number": 43, "title": "\"twitter-to-sqlite lists\" command for retrieving a user's owned lists", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-17T19:08:59Z", "updated_at": "2020-04-17T23:48:28Z", "closed_at": "2020-04-17T23:30:39Z", "author_association": "MEMBER", "pull_request": null, "body": "https://developer.twitter.com/en/docs/accounts-and-users/create-manage-lists/api-reference/get-lists-ownerships\r\n\r\n`https://api.twitter.com/1.1/lists/ownerships.json `", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/43/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 585353598, "node_id": "MDU6SXNzdWU1ODUzNTM1OTg=", "number": 37, "title": "Handle \"User not found\" error", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-03-20T22:14:32Z", "updated_at": "2020-04-17T23:43:46Z", "closed_at": "2020-04-17T23:43:46Z", "author_association": "MEMBER", "pull_request": null, "body": "While running `user-timeline` I got this bug (because a screen name I asked for didn't exist):\r\n```\r\n File \"/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py\", line 185, in transform_user\r\n user[\"created_at\"] = parser.parse(user[\"created_at\"])\r\nKeyError: 'created_at'\r\n>>> import pdb\r\n>>> pdb.pm()\r\n> /Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(185)transform_user()\r\n-> user[\"created_at\"] = parser.parse(user[\"created_at\"])\r\n(Pdb) user\r\n{'errors': [{'code': 50, 'message': 'User not found.'}]}\r\n```", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/37/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 602173589, "node_id": "MDU6SXNzdWU2MDIxNzM1ODk=", "number": 42, "title": "Error running user-timeline with --sql and --ids together", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-17T19:02:06Z", "updated_at": "2020-04-17T23:34:40Z", "closed_at": "2020-04-17T23:34:40Z", "author_association": "MEMBER", "pull_request": null, 
"body": "```\r\n$ twitter-to-sqlite user-timeline tweets.db --sql='select id from users' --ids\r\nTraceback (most recent call last):\r\n File \"/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite\", line 11, in \r\n load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')()\r\n File \"/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py\", line 764, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py\", line 717, in main\r\n rv = self.invoke(ctx)\r\n File \"/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py\", line 1137, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py\", line 956, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py\", line 555, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py\", line 284, in user_timeline\r\n \"@{:\" + str(max(len(identifier) for identifier in identifiers)) + \"}\"\r\n File \"/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py\", line 284, in \r\n \"@{:\" + str(max(len(identifier) for identifier in identifiers)) + \"}\"\r\nTypeError: object of type 'int' has no len()\r\n```\r\nBut this DID work - casting to strings:\r\n```\r\n$ twitter-to-sqlite user-timeline tweets.db --sql='select \"\" || id from users' --ids\r\n... 
this worked ...\r\n```", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/42/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 602181581, "node_id": "MDU6SXNzdWU2MDIxODE1ODE=", "number": 44, "title": "tweet[\"source\"] can be an empty string", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-17T19:18:26Z", "updated_at": "2020-04-17T22:01:44Z", "closed_at": "2020-04-17T22:01:44Z", "author_association": "MEMBER", "pull_request": null, "body": "Got this excepion:\r\n```\r\n File \"/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py\", line 641, in extract_and_save_source\r\n details = m.groupdict()\r\nAttributeError: 'NoneType' object has no attribute 'groupdict'\r\n```\r\nI traced it back to this tweet: https://twitter.com/osder/status/578712651393576960\r\n```\r\n(Pdb) source_re\r\nre.compile('.*?)\".*?>(?P.*?)')\r\n(Pdb) locals()['source']\r\n''\r\n(Pdb) u\r\n> /Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(393)save_tweets()\r\n-> tweet[\"source\"] = extract_and_save_source(db, tweet[\"source\"])\r\n(Pdb) tweet\r\n{'created_at': '2015-03-20T00:20:22+00:00', 'id': 578712651393576960, 'full_text': '@osder', 'truncated': False, 'display_text_range': [0, 6], 'source': '', 'in_reply_to_status_id': 578712521382715392, 'in_reply_to_user_id': 1545741, 'in_reply_to_screen_name': 'osder', 'geo': None, 'coordinates': None, 'place': None, 'contributors': None, 'is_quote_status': False, 'retweet_count': 0, 'favorite_count': 0, 'favorited': False, 'retweeted': False, 'lang': 'und', 'user': 1545741}\r\n```", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/44/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 601265023, "node_id": "MDU6SXNzdWU2MDEyNjUwMjM=", "number": 25, "title": "Improvements to demo instance", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-16T17:26:55Z", "updated_at": "2020-04-16T18:07:12Z", "closed_at": "2020-04-16T18:07:12Z", "author_association": "MEMBER", "pull_request": null, "body": "- [x] Demo should pull issue-comments as well", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/25/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 601271612, "node_id": "MDU6SXNzdWU2MDEyNzE2MTI=", "number": 26, "title": "Topics are missing from repositories", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 
2, "created_at": "2020-04-16T17:36:32Z", "updated_at": "2020-04-16T17:41:11Z", "closed_at": "2020-04-16T17:41:11Z", "author_association": "MEMBER", "pull_request": null, "body": "I'm sure this used to work, but right now repositories are fetched without their topics.\r\n\r\nhttps://developer.github.com/v3/repos/ says you need to send a custom `Accept` header of `application/vnd.github.mercy-preview+json` to get topics.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/26/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 549287310, "node_id": "MDU6SXNzdWU1NDkyODczMTA=", "number": 76, "title": "order_by mechanism", "user": {"value": 10501166, "label": "metab0t"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-01-14T02:06:03Z", "updated_at": "2020-04-16T06:23:29Z", "closed_at": "2020-04-16T03:13:06Z", "author_association": "NONE", "pull_request": null, "body": "In some cases, I want to iterate rows in a table with `ORDER BY` clause. It would be nice to have a `rows_order_by` function similar to `rows_where`.\r\nIn a more general case, `rows_filter` function might be added to allow more customized filtering to iterate rows.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/76/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 598013965, "node_id": "MDU6SXNzdWU1OTgwMTM5NjU=", "number": 724, "title": "--plugin-secret over-rides existing metadata.json plugin config", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-04-10T17:56:30Z", "updated_at": "2020-04-16T04:58:12Z", "closed_at": "2020-04-10T18:34:21Z", "author_association": "OWNER", "pull_request": null, "body": "This means if you use `--plugin-secret` at all (with e.g. 
`publish cloudrun`) any existing plugin configuration in your `metadata.json` will be ignored.\r\n\r\nhttps://github.com/simonw/datasette/blob/af9cd4ca64652fae262e6f7b5d201f6e0adc989b/datasette/publish/cloudrun.py#L98-L109\r\n\r\n", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/724/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 600583271, "node_id": "MDU6SXNzdWU2MDA1ODMyNzE=", "number": 727, "title": "Custom CSS class on body for styling canned queries", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2020-04-15T20:57:32Z", "updated_at": "2020-04-15T21:14:58Z", "closed_at": "2020-04-15T21:07:50Z", "author_association": "OWNER", "pull_request": null, "body": "https://latest.datasette.io/fixtures/neighborhood_search is a canned query page.\r\n\r\nOne of the templates scanned is `query-fixtures-neighborhood_search.html`\r\n\r\nBUT... the body CSS class just looks like this:\r\n```html\r\n\r\n```\r\nI would be useful if that included a class that can be used to style that specific canned query page.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/727/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 593751293, "node_id": "MDU6SXNzdWU1OTM3NTEyOTM=", "number": 97, "title": "Adding a \"recreate\" flag to the `Database` constructor", "user": {"value": 1448859, "label": "betatim"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-04-04T05:41:10Z", "updated_at": "2020-04-15T14:29:31Z", "closed_at": "2020-04-13T03:52:29Z", "author_association": "NONE", "pull_request": null, "body": "I have a [script](https://github.com/betatim/binder-datasette/blob/master/create-db.ipynb) that imports data into a sqlite DB. When I re-run that script I'd like to remove the existing sqlite DB, instead of adding to it. The pragmatic answer is to add the check and file deletion to my script.\r\n\r\nHowever I thought it would be easy and useful for others to add a `recreate=True` flag to `db = sqlite_utils.Database(\"binder-launches.db\")`. After taking a look at the code for it I am not so sure any more. This is because the connection string could be a URL (or \"connection string\") like `\"file:///tmp/foo.db\"`. 
I don't know what the equivalent of `os.path.exists()` is for a connection string or how to detect that something is a connection string and raise an error \"can't use recreate=True and conn_string at the same time\".\r\n\r\nDoes anyone have an idea/suggestion where to start investigating?", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/97/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 530491074, "node_id": "MDU6SXNzdWU1MzA0OTEwNzQ=", "number": 14, "title": "Command for importing events", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-11-29T21:28:58Z", "updated_at": "2020-04-14T19:38:34Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "Eg from https://api.github.com/users/simonw/events\r\n\r\nDocs here: https://developer.github.com/v3/activity/events/#list-events-performed-by-a-user", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 546051181, "node_id": "MDU6SXNzdWU1NDYwNTExODE=", "number": 16, "title": "Exception running first command: IndexError: list index out of range", "user": {"value": 15092, "label": "jayvdb"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2020-01-07T03:01:58Z", "updated_at": "2020-04-14T18:37:21Z", "closed_at": "2020-04-14T18:37:21Z", "author_association": "NONE", "pull_request": null, "body": "Exception running first command without an existing db or auth.\r\n\r\n```py\r\n> mkdir ~/.github/coala\r\n> /usr/bin/github-to-sqlite repos ~/.github/coala coala\r\nTraceback (most recent call last):\r\n File \"/usr/bin/github-to-sqlite\", line 11, in \r\n load_entry_point('github-to-sqlite==0.6', 'console_scripts', 'github-to-sqlite')()\r\n File \"/usr/lib/python3.7/site-packages/click/core.py\", line 764, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/usr/lib/python3.7/site-packages/click/core.py\", line 717, in main\r\n rv = self.invoke(ctx)\r\n File \"/usr/lib/python3.7/site-packages/click/core.py\", line 1137, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/usr/lib/python3.7/site-packages/click/core.py\", line 956, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/usr/lib/python3.7/site-packages/click/core.py\", line 555, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/usr/lib/python3.7/site-packages/github_to_sqlite/cli.py\", line 163, in repos\r\n utils.save_repo(db, repo)\r\n File \"/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py\", line 120, in save_repo\r\n to_save[\"owner\"] = save_user(db, to_save[\"owner\"])\r\n File \"/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py\", line 61, in save_user\r\n return db[\"users\"].upsert(to_save, pk=\"id\", alter=True).last_pk\r\n File 
\"/usr/lib/python3.7/site-packages/sqlite_utils/db.py\", line 1135, in upsert\r\n extracts=extracts,\r\n File \"/usr/lib/python3.7/site-packages/sqlite_utils/db.py\", line 1162, in upsert_all\r\n upsert=True,\r\n File \"/usr/lib/python3.7/site-packages/sqlite_utils/db.py\", line 1105, in insert_all\r\n row = list(self.rows_where(\"rowid = ?\", [self.last_rowid]))[0]\r\nIndexError: list index out of range\r\n```", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/16/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 599776345, "node_id": "MDU6SXNzdWU1OTk3NzYzNDU=", "number": 24, "title": "Feature idea: github-to-sqlite everything ...", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-14T18:34:00Z", "updated_at": "2020-04-14T18:34:00Z", "closed_at": null, "author_association": "MEMBER", "pull_request": null, "body": "At the moment if you want to pull all your repos, issues, issues comments etc you have to do it with a sequence of separate commands.\r\n\r\nConsider adding a `everything` or `all` command which fetches everything that the tool knows how to fetch, and is designed to be run on a cron in a way that fetches just new stuff each time.", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/24/reactions\", \"total_count\": 7, \"+1\": 7, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 598640234, "node_id": "MDU6SXNzdWU1OTg2NDAyMzQ=", "number": 99, "title": ".upsert_all() should maybe error if dictionaries passed to it do not have the same keys", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-04-13T03:02:25Z", "updated_at": "2020-04-13T03:05:20Z", "closed_at": "2020-04-13T03:05:04Z", "author_association": "OWNER", "pull_request": null, "body": "While investigating #98 I stumbled across this:\r\n```\r\n def test_upsert_compound_primary_key(fresh_db):\r\n table = fresh_db[\"table\"]\r\n table.upsert_all(\r\n [\r\n {\"species\": \"dog\", \"id\": 1, \"name\": \"Cleo\", \"age\": 4},\r\n {\"species\": \"cat\", \"id\": 1, \"name\": \"Catbag\"},\r\n ],\r\n pk=(\"species\", \"id\"),\r\n )\r\n table.upsert_all(\r\n [\r\n {\"species\": \"dog\", \"id\": 1, \"age\": 5},\r\n {\"species\": \"dog\", \"id\": 2, \"name\": \"New Dog\", \"age\": 1},\r\n ],\r\n pk=(\"species\", \"id\"),\r\n )\r\n> assert [\r\n {\"species\": \"dog\", \"id\": 1, \"name\": \"Cleo\", \"age\": 5},\r\n {\"species\": \"cat\", \"id\": 1, \"name\": \"Catbag\", \"age\": None},\r\n {\"species\": \"dog\", \"id\": 2, \"name\": \"New Dog\", \"age\": 1},\r\n ] == list(table.rows)\r\nE AssertionError: assert [{'age': 5, '...cies': 'dog'}] == [{'age': 5, '...cies': 'dog'}]\r\nE At index 0 diff: {'species': 'dog', 'id': 1, 'name': 'Cleo', 'age': 5} != {'species': 'dog', 'id': 1, 'name': None, 'age': 5}\r\nE Full diff:\r\nE - [{'age': 5, 'id': 1, 'name': 'Cleo', 
'species': 'dog'},\r\nE ? ^^^ --\r\nE + [{'age': 5, 'id': 1, 'name': None, 'species': 'dog'},\r\nE ? ^^^\r\nE {'age': None, 'id': 1, 'name': 'Catbag', 'species': 'cat'},\r\nE {'age': 1, 'id': 2, 'name': 'New Dog', 'species': 'dog'}]\r\n```\r\nIf you run `.upsert_all()` with multiple dictionaries it doesn't quite have the effect you might expect.", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/99/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 594189527, "node_id": "MDU6SXNzdWU1OTQxODk1Mjc=", "number": 717, "title": "See if I can get Datasette working on Zeit Now v2", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 10, "created_at": "2020-04-05T00:56:48Z", "updated_at": "2020-04-06T22:47:22Z", "closed_at": "2020-04-06T22:47:21Z", "author_association": "OWNER", "pull_request": null, "body": "I thought this was impossible because AWS Lambda doesn't ship the `sqlite3` standard library module... but apparenttly that's not the case on Now v2 any more!\r\n\r\nhttps://now-2-python-versions-ks69olzpi.now.sh/api\r\n\r\n```\r\n _________________________________________________________________________________________________________________________________________________________________ \r\n/ Hello from Python from a ZEIT Now Serverless Function! Version is 3.6.10 (default, Mar 10 2020, 22:54:43) \\\r\n\\ [GCC 4.8.3 20140911 (Red Hat 4.8.3-9)], sqlite3 module = , sqlite3 version = [('3.7.17',)] /\r\n ----------------------------------------------------------------------------------------------------------------------------------------------------------------- \r\n \\ ^__^\r\n \\ (oo)\\_______\r\n (__)\\ )\\/\\\r\n ||----w |\r\n || ||\r\n```\r\nThat's from shipping this code as `api/index.py`:\r\n```python\r\nfrom http.server import BaseHTTPRequestHandler\r\nfrom cowpy import cow\r\nimport sys\r\n\r\n\r\ntry:\r\n import sqlite3\r\nexcept ImportError:\r\n sqlite3 = None\r\n\r\n\r\nclass handler(BaseHTTPRequestHandler):\r\n def do_GET(self):\r\n self.send_response(200)\r\n self.send_header(\"Content-type\", \"text/plain\")\r\n self.end_headers()\r\n message = cow.Cowacter().milk(\r\n \"Hello from Python from a ZEIT Now Serverless Function! 
Version is {}, sqlite3 module = {}, sqlite3 version = {}\".format(\r\n sys.version, sqlite3, sqlite3.connect(\":memory:\").execute(\"select sqlite_version()\").fetchall()\r\n )\r\n )\r\n self.wfile.write(message.encode())\r\n return\r\n```\r\nNow v2 supports ASGI so this might be possible without too much work: https://zeit.co/docs/runtimes#advanced-usage/advanced-python-usage/asynchronous-server-gateway-interface", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/717/reactions\", \"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 408376825, "node_id": "MDU6SXNzdWU0MDgzNzY4MjU=", "number": 409, "title": "Zeit API v1 does not work for new users - need to migrate to v2", "user": {"value": 209967, "label": "michaelmcandrew"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-02-09T00:50:33Z", "updated_at": "2020-04-06T15:44:46Z", "closed_at": "2020-04-06T15:44:46Z", "author_association": "NONE", "pull_request": null, "body": "Hello there,\r\n\r\nThis looks like a great tool. Thanks. \r\n\r\nUnfortunately, I hit the following error:\r\n\r\n```\r\nmichael@hazel ~/src/cc-datasette/data/out datasette publish now cc-datasette.db\r\n> WARN! You are using an old version of the Now Platform. More: https://zeit.co/docs/v1-upgrade\r\n> Deploying /tmp/tmpjtrxwsyf/datasette under michaelmcandrew\r\n> Using project datasette\r\n> Error! You tried to create a Now 1.0 deployment. Please use Now 2.0 instead: https://zeit.co/upgrade\r\n```\r\nI'm guessing you might not hit this because you are not a 'new user' of Zeit (https://github.com/zeit/now-cli/issues/1805#issuecomment-452470953).\r\n\r\nWould it be a lot of work to upgrade to the new Zeit API, do you think?", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/409/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 594168758, "node_id": "MDU6SXNzdWU1OTQxNjg3NTg=", "number": 716, "title": "extra_template_vars() sending wrong view_name for index", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 8, "created_at": "2020-04-04T23:57:09Z", "updated_at": "2020-04-05T20:04:08Z", "closed_at": "2020-04-05T18:28:48Z", "author_association": "OWNER", "pull_request": null, "body": "See https://github.com/simonw/museums/issues/20#issuecomment-609103663 - at some point between 286ed286b68793532c2a38436a08343b45cfbc91 and current master (e0e7a0facfc935a835cd73c720bc46661462f0b1 today) a bug was introduced where the `extra_template_vars(request, view_name)` plugin hook started being passed `None` instead of `index` for the `view_name` parameter on the site index page.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/716/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 
0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 573583971, "node_id": "MDU6SXNzdWU1NzM1ODM5NzE=", "number": 689, "title": "\"Templates considered\" comment broken in >=0.35", "user": {"value": 35075, "label": "chrishas35"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-03-01T17:31:21Z", "updated_at": "2020-04-05T19:39:44Z", "closed_at": "2020-04-05T19:39:44Z", "author_association": "NONE", "pull_request": null, "body": "Noticed that the \"Templates Considered\" comment is missing in 0.37. Believe I traced it back to #664 as you can see it in https://v0-34.datasette.io/ but not https://v0-35.datasette.io/. Looking at the template context debug between the two you can see what is missing from 0.35 vs. 0.34:\r\n\r\n```diff\r\n< \"datasette_version\": \"0.34\",\r\n< \"app_css_hash\": \"ffa51a\",\r\n< \"select_templates\": [\r\n< \"*index.html\"\r\n< ],\r\n< \"zip\": \"\",\r\n< \"body_scripts\": [],\r\n< \"extra_css_urls\": \"\",\r\n< \"extra_js_urls\": \"\",\r\n< \"format_bytes\": \"\",\r\n< \"database_url\": \">\",\r\n< \"database_color\": \">\"\r\n---\r\n> \"datasette_version\": \"0.35\",\r\n> \"database_url\": \">\",\r\n> \"database_color\": \">\"\r\n```", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/689/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 574043218, "node_id": "MDU6SXNzdWU1NzQwNDMyMTg=", "number": 693, "title": "Variables from extra_template_vars() not exposed in _context=1", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2020-03-02T15:14:51Z", "updated_at": "2020-04-05T19:12:48Z", "closed_at": "2020-04-05T19:12:48Z", "author_association": "OWNER", "pull_request": null, "body": "The `_context=1` debugging mode does not show variables that should have been added to the context by the `extra_template_vars()` plugin hook.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/693/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 521323012, "node_id": "MDExOlB1bGxSZXF1ZXN0MzM5NzIyNzkw", "number": 627, "title": "Support Python 3.8, stop supporting Python 3.5", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2019-11-12T04:36:33Z", "updated_at": "2020-04-05T10:23:58Z", "closed_at": "2019-11-12T05:09:12Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/627", "body": "Refs #622", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/627/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, 
"state_reason": null} {"id": 587322443, "node_id": "MDU6SXNzdWU1ODczMjI0NDM=", "number": 710, "title": "Remove Zeit Now v1 support", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-03-24T22:39:49Z", "updated_at": "2020-04-04T23:05:12Z", "closed_at": "2020-04-04T23:05:12Z", "author_association": "OWNER", "pull_request": null, "body": "It will remain supported as a plugin but since no-one can sign up for Docker hosting any more (for over a year now) there's no point including it in Datasette core.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/710/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 593006814, "node_id": "MDU6SXNzdWU1OTMwMDY4MTQ=", "number": 715, "title": "Refactor duplicate cell display logic", "user": {"value": 9599, "label": "simonw"}, "state": "open", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-03T00:58:11Z", "updated_at": "2020-04-03T00:58:11Z", "closed_at": null, "author_association": "OWNER", "pull_request": null, "body": "The logic for rendering cells in table view and in database (or canned query) view is currently very similar:\r\n\r\nhttps://github.com/simonw/datasette/blob/7656fd64d8b6a32ebc34d89c1b8711cc5ea240f7/datasette/views/base.py#L514-L539\r\n\r\nCompared with:\r\n\r\nhttps://github.com/simonw/datasette/blob/7656fd64d8b6a32ebc34d89c1b8711cc5ea240f7/datasette/views/table.py#L104-L195\r\n\r\nI'll be changing this a bit in #698 but I should still try to clean this up more further in the future.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/715/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": null} {"id": 592829135, "node_id": "MDU6SXNzdWU1OTI4MjkxMzU=", "number": 713, "title": "Support YAML in metadata - metadata.yaml", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 6, "created_at": "2020-04-02T18:10:05Z", "updated_at": "2020-04-02T19:36:17Z", "closed_at": "2020-04-02T19:30:55Z", "author_association": "OWNER", "pull_request": null, "body": "I was originally going to do this with a plugin - see #357 - but the more I work with `metadata.json` the more I want it to just accept YAML as an optional alternative to JSON.\r\n\r\nThe best example why is still this one: https://github.com/simonw/russian-ira-facebook-ads-datasette/blob/master/russian-ads-metadata.yaml\r\n\r\nYAML is just SO much better than JSON for multi-line strings - in particular HTML and SQL, both of which are common in `metadata.json` files.", "repo": {"value": 107914493, "label": "datasette"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/713/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 
0}", "draft": null, "state_reason": "completed"} {"id": 592844348, "node_id": "MDExOlB1bGxSZXF1ZXN0Mzk3NzQ5NjUz", "number": 714, "title": "--metadata accepts YAML as well as JSON", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2020-04-02T18:36:02Z", "updated_at": "2020-04-02T19:30:54Z", "closed_at": "2020-04-02T19:30:54Z", "author_association": "OWNER", "pull_request": "simonw/datasette/pulls/714", "body": "Refs #713. Still needs tests and documentation.", "repo": {"value": 107914493, "label": "datasette"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/datasette/issues/714/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 591613579, "node_id": "MDU6SXNzdWU1OTE2MTM1Nzk=", "number": 41, "title": "Bug: recorded a since_id for None, None", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-04-01T04:29:43Z", "updated_at": "2020-04-01T04:31:11Z", "closed_at": "2020-04-01T04:31:11Z", "author_association": "MEMBER", "pull_request": null, "body": "This shouldn't happen in the `since_ids` table (relates to #39):\r\n\r\n\"twitter__since_ids__2_rows\"\r\n", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/41/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 590669793, "node_id": "MDU6SXNzdWU1OTA2Njk3OTM=", "number": 40, "title": "Feature: record history of follower counts", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2020-03-30T23:32:28Z", "updated_at": "2020-04-01T04:13:05Z", "closed_at": "2020-04-01T04:13:05Z", "author_association": "MEMBER", "pull_request": null, "body": "We currently over-write the follower count every time we import a tweet (when we import that user profile again):\r\n\r\nhttps://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/utils.py#L293-L294\r\n\r\nIt would be neat if we noticed if that user's follower count (and maybe other counts?) had changed since we last saved them and recorded that change in a separate history table. 
This would be an inexpensive way of building up rough charts of follower count over time.", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/40/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 492297930, "node_id": "MDU6SXNzdWU0OTIyOTc5MzA=", "number": 10, "title": "Rethink progress bars for various commands", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 5, "created_at": "2019-09-11T15:06:47Z", "updated_at": "2020-04-01T03:45:48Z", "closed_at": "2020-04-01T03:45:48Z", "author_association": "MEMBER", "pull_request": null, "body": "Progress bars and the `--silent` option are implemented inconsistently across commands at the moment.\r\n\r\nThis is made more challenging by the fact that for many operations the total length is not known.\r\n\r\nhttps://click.palletsprojects.com/en/7.x/api/#click.progressbar", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 590666760, "node_id": "MDU6SXNzdWU1OTA2NjY3NjA=", "number": 39, "title": "--since feature can be confused by retweets", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 11, "created_at": "2020-03-30T23:25:33Z", "updated_at": "2020-04-01T03:45:16Z", "closed_at": "2020-04-01T03:45:16Z", "author_association": "MEMBER", "pull_request": null, "body": "If you run `twitter-to-sqlite user-timeline ... --since` it's supposed to fetch Tweets those specific users tweeted since last time the command was run.\r\n\r\nIt does this by seeking out the max ID of their previous tweets:\r\n\r\nhttps://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/cli.py#L305-L311\r\n\r\nBUT... 
this has a nasty flaw: if another account had retweeted one of their recent tweets the retweeted-tweet will have been loaded into the database - so we may treat that as the most recent since ID and miss a bunch of their tweets!", "repo": {"value": 206156866, "label": "twitter-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/39/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 589801352, "node_id": "MDExOlB1bGxSZXF1ZXN0Mzk1MjU4Njg3", "number": 96, "title": "Add type conversion for Panda's Timestamp", "user": {"value": 32605365, "label": "b0b5h4rp13"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 2, "created_at": "2020-03-29T14:13:09Z", "updated_at": "2020-03-31T04:40:49Z", "closed_at": "2020-03-31T04:40:48Z", "author_association": "CONTRIBUTOR", "pull_request": "simonw/sqlite-utils/pulls/96", "body": "Add type conversion for Panda's Timestamp, if Panda library is present in system\r\n(thanks for this project, I was about to do the same thing from scratch)", "repo": {"value": 140912432, "label": "sqlite-utils"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/simonw/sqlite-utils/issues/96/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 544571092, "node_id": "MDU6SXNzdWU1NDQ1NzEwOTI=", "number": 15, "title": "Assets table with downloads", "user": {"value": 2029, "label": "garethr"}, "state": "closed", "locked": 0, "assignee": null, "milestone": {"value": 5225818, "label": "1.0"}, "comments": 4, "created_at": "2020-01-02T13:05:28Z", "updated_at": "2020-03-28T12:17:01Z", "closed_at": "2020-03-23T19:17:32Z", "author_association": "NONE", "pull_request": null, "body": "The `releases` command extracts the releases table, but data about the individual assets are locked up in the JSON document in the `assets` field. My main interest is in individual and aggregate download counts. I was wondering if creating a new table with a record per asset may be useful?\r\nIf so I'm happy to send a PR when I get a moment. Do you have opinions about that simply being part of the `releases` command or would you prefer a separate command as well?", "repo": {"value": 207052882, "label": "github-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/github-to-sqlite/issues/15/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 543355051, "node_id": "MDExOlB1bGxSZXF1ZXN0MzU3NjQwMTg2", "number": 6, "title": "don't break if source is missing", "user": {"value": 78035, "label": "mfa"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 1, "created_at": "2019-12-29T10:46:47Z", "updated_at": "2020-03-28T02:28:11Z", "closed_at": "2020-03-28T02:28:11Z", "author_association": "CONTRIBUTOR", "pull_request": "dogsheep/swarm-to-sqlite/pulls/6", "body": "broke for me. 
very old checkins in 2010 had no source set.", "repo": {"value": 205429375, "label": "swarm-to-sqlite"}, "type": "pull", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/6/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": 0, "state_reason": null} {"id": 589491711, "node_id": "MDU6SXNzdWU1ODk0OTE3MTE=", "number": 7, "title": "Upgrade to sqlite-utils 2.x", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 0, "created_at": "2020-03-28T02:24:51Z", "updated_at": "2020-03-28T02:25:03Z", "closed_at": "2020-03-28T02:25:03Z", "author_association": "MEMBER", "pull_request": null, "body": "", "repo": {"value": 205429375, "label": "swarm-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/7/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 503234169, "node_id": "MDU6SXNzdWU1MDMyMzQxNjk=", "number": 2, "title": "Track and use the 'since' value", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 3, "created_at": "2019-10-07T05:02:59Z", "updated_at": "2020-03-27T22:22:30Z", "closed_at": "2020-03-27T22:22:30Z", "author_association": "MEMBER", "pull_request": null, "body": "Pocket says:\r\n\r\n> Whenever possible, you should use the since parameter, or count and and offset parameters when retrieving a user's list. After retrieving the list, you should store the current time (which is provided along with the list response) and pass that in the next request for the list. 
This way the server only needs to return a small set (changes since that time) instead of the user's entire list every time.\r\n\r\nAt the bottom of https://getpocket.com/developer/docs/v3/retrieve", "repo": {"value": 213286752, "label": "pocket-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/2/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"} {"id": 503233021, "node_id": "MDU6SXNzdWU1MDMyMzMwMjE=", "number": 1, "title": "Use better pagination (and implement progress bar)", "user": {"value": 9599, "label": "simonw"}, "state": "closed", "locked": 0, "assignee": null, "milestone": null, "comments": 4, "created_at": "2019-10-07T04:58:11Z", "updated_at": "2020-03-27T22:13:57Z", "closed_at": "2020-03-27T22:13:57Z", "author_association": "MEMBER", "pull_request": null, "body": "Right now we attempt to load everything at once - which caps out at 5,000 items and is really slow.\r\n\r\nWe can do better by implementing pagination using count and offset.", "repo": {"value": 213286752, "label": "pocket-to-sqlite"}, "type": "issue", "active_lock_reason": null, "performed_via_github_app": null, "reactions": "{\"url\": \"https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/1/reactions\", \"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "draft": null, "state_reason": "completed"}