id,node_id,number,title,user,user_label,state,locked,assignee,assignee_label,milestone,milestone_label,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,repo_label,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason
1570375808,I_kwDODFdgUs5dmgiA,79,Deploy demo job is failing due to rate limit,9599,simonw,open,0,,,,,2,2023-02-03T20:05:01Z,2023-12-08T14:50:15Z,,MEMBER,,https://github.com/dogsheep/github-to-sqlite/actions/runs/4080058087/jobs/7032116511,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
2029161033,I_kwDOCGYnMM548opJ,606,str and int as aliases for text and integer,9599,simonw,closed,0,,,,,2,2023-12-06T18:35:49Z,2023-12-06T19:44:04Z,2023-12-06T18:49:32Z,OWNER,,"I keep making this mistake:
```bash
sqlite-utils add-column content.db assets _since int
```
```
Usage: sqlite-utils add-column [OPTIONS] PATH TABLE COL_NAME [[integer|float|blob|text|INTEGER|FLOAT|BLOB|TEXT]]
Try 'sqlite-utils add-column -h' for help.
Error: Invalid value for '[[integer|float|blob|text|INTEGER|FLOAT|BLOB|TEXT]]':
'int' is not one of 'integer', 'float', 'blob', 'text', 'INTEGER', 'FLOAT', 'BLOB', 'TEXT'.
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1959278971,PR_kwDOBm6k_c5dpF-F,2202,Bump the python-packages group with 1 update,49699333,dependabot[bot],closed,0,,,,,2,2023-10-24T13:40:21Z,2023-11-08T13:19:03Z,2023-11-08T13:19:01Z,CONTRIBUTOR,simonw/datasette/pulls/2202,"Bumps the python-packages group with 1 update: [black](https://github.com/psf/black).
Release notes
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=23.9.1&new-version=23.10.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
- `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will remove the ignore condition of the specified dependency and ignore conditions
----
:books: Documentation preview :books:: https://datasette--2202.org.readthedocs.build/en/2202/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2202/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1926729132,PR_kwDOCGYnMM5b7Z_y,598,Fixed issue #433 - CLI eats cursor,62745,spookylukey,closed,0,,,,,2,2023-10-04T18:06:58Z,2023-11-04T00:46:55Z,2023-11-04T00:40:30Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/598,"The issue is that underlying iterator is not fully consumed within the body of the `with file_progress()` block. Instead, that block creates generator expressions like `docs = (dict(zip(headers, row)) for row in reader)`
These iterables are consumed later, outside the `with file_progress()` block, which consumes the underlying iterator, and in turn updates the progress bar.
This means that the `ProgressBar.__exit__` method gets called before the last time the `ProgressBar.update` method gets called. The result is that the code to make the cursor invisible (inside the `update()` method) is called after the cleanup code to make it visible (in the `__exit__` method).
The fix is to move consumption of the `docs` iterators inside the `with file_progress()` block.
(An additional fix, to make ProgressBar more robust against this kind of misuse, would be to make it refuse to update after its `__exit__` method has been called, just as files cannot be `read()` after they are closed. That would require a change in the click library.)
Note that the GitHub diff obscures the simplicity of this change: it just indents a block of code.
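To make the failure mode concrete, here is a minimal stand-alone illustration using `click.progressbar` directly (not the actual sqlite-utils code; `rows()` is just a stand-in for the CSV-reading generator expressions):
```python
import click

def rows(bar):
    # Stand-in for the docs generator: updates the bar as each row is consumed
    for n in range(3):
        bar.update(1)
        yield n

# Buggy pattern: the generator is created inside the block but only consumed
# after __exit__ has already run and restored the cursor.
with click.progressbar(length=3) as bar:
    docs = rows(bar)
list(docs)

# Fixed pattern: consume the generator while the progress bar is still active.
with click.progressbar(length=3) as bar:
    docs = rows(bar)
    list(docs)
```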
----
:books: Documentation preview :books:: https://sqlite-utils--598.org.readthedocs.build/en/598/
",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/598/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 1, ""eyes"": 0}",0,
1910269679,I_kwDOBm6k_c5x3Gbv,2196,Discord invite link returns 401,1892194,Olshansk,closed,0,,,,,2,2023-09-24T15:16:54Z,2023-10-13T00:07:08Z,2023-10-12T21:54:54Z,NONE,,"I found the link to the datasette discord channel via [this query](https://github.com/search?q=repo%3Asimonw%2Fdatasette%20discord&type=code).
The following video should be self explanatory:
https://github.com/simonw/datasette/assets/1892194/8cd33e88-bcaa-41f3-9818-ab4d589c3f02
Link for reference: https://discord.com/invite/ktd74dm5mw",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1865572575,PR_kwDOBm6k_c5Yt2eO,2155,Fix hupper.start_reloader entry point,79087,cadeef,open,0,,,,,2,2023-08-24T17:14:08Z,2023-09-27T18:44:02Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2155,"Update hupper's entry point so that click commands are processed properly.
Fixes #2123
----
:books: Documentation preview :books:: https://datasette--2155.org.readthedocs.build/en/2155/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2155/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 2, ""eyes"": 0}",0,
1825007061,I_kwDOBm6k_c5sx2XV,2123,datasette serve when invoked with --reload interprets the serve command as a file,79087,cadeef,open,0,,,,,2,2023-07-27T19:07:22Z,2023-09-18T13:02:46Z,,NONE,,"When running `datasette serve` with the `--reload` flag, the serve command is picked up as a file argument:
```
$ datasette serve --reload test_db
Starting monitor for PID 13574.
Error: Invalid value for '[FILES]...': Path 'serve' does not exist.
Press ENTER or change a file to reload.
```
If a 'serve' file is created it launches properly (albeit with an empty database called serve):
```
$ touch serve; datasette serve --reload test_db
Starting monitor for PID 13628.
INFO: Started server process [13628]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
```
Version (running from HEAD on main):
```
$ datasette --version
datasette, version 1.0a2
```
This issue appears to have existed for a while, as https://github.com/simonw/datasette/issues/1380#issuecomment-953366110 mentions the error in a different context.
I'm happy to debug and land a patch if it's welcome.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2123/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1900026059,I_kwDOBm6k_c5xQBjL,2188,"Plugin Hooks for ""compile to SQL"" languages",15178711,asg017,open,0,,,,,2,2023-09-18T01:37:15Z,2023-09-18T06:58:53Z,,CONTRIBUTOR,,"There's a ton of tools/languages that compile to SQL, which may be nice in Datasette. Some examples:
- Logica https://logica.dev
- PRQL https://prql-lang.org
- Malloy, but not sure if it works with SQLite? https://github.com/malloydata/malloy
It would be cool if plugins could extend Datasette to use these languages, in both the code editor and API usage.
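To make the idea concrete, here is a purely speculative sketch of what such a plugin could look like; the `register_query_language` hook does not exist in Datasette today, and the `prql_python.compile()` call assumes the separately published PRQL Python bindings:
```python
from datasette import hookimpl  # Datasette's standard plugin decorator

import prql_python  # assumed PRQL -> SQL compiler bindings


@hookimpl
def register_query_language(datasette):
    # Hypothetical hook: map a query-string parameter name to a
    # compile-to-SQL function that Datasette would call before executing
    return {'prql': prql_python.compile}
```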
A few things I'd imagine a `datasette-prql` or `datasette-logica` plugin would do:
- `prql=` instead of `sql=`
- Code editor support (syntax highlighting, autocomplete)
- Hide/show SQL",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2188/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1898927976,I_kwDOBm6k_c5xL1do,2186,Mechanism for register_output_renderer hooks to access full count,9599,simonw,open,0,,,3268330,Datasette 1.0,2,2023-09-15T18:57:54Z,2023-09-15T19:27:59Z,,OWNER,,"The cause of this bug:
- https://github.com/simonw/datasette-export-notebook/issues/17
Is that `datasette-export-notebook` was consulting `data[""filtered_table_rows_count""]` in the render output plugin function in order to show the total number of rows that would be exported.
That field is no longer available by default - the `""count""` field is only available if `?_extra=count` was passed.
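In the meantime a renderer can recover the number itself at the cost of an extra query; here is a rough sketch, assuming the `datasette`, `database` and `sql` arguments that render callbacks already receive (and ignoring bound parameters for brevity):
```python
async def get_total_count(datasette, database, sql):
    # Wrap the renderer's own SQL in a count(*) and re-run it. This duplicates
    # work, which is why an official mechanism would be better.
    # Note: if the query carries its own LIMIT the count reflects that limit.
    db = datasette.get_database(database)
    result = await db.execute(f'select count(*) from ({sql})')
    return result.single_value()
```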
It would be useful if plugins like this could access the total count on demand, should they need to.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2186/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1891614971,I_kwDOCGYnMM5wv8D7,594,Represent compound foreign keys in table.foreign_keys output,9599,simonw,open,0,,,,,2,2023-09-12T03:48:24Z,2023-09-12T03:51:13Z,,OWNER,,"Given this schema:
```sql
CREATE TABLE departments (
campus_name TEXT NOT NULL,
dept_code TEXT NOT NULL,
dept_name TEXT,
PRIMARY KEY (campus_name, dept_code)
);
CREATE TABLE courses (
course_code TEXT PRIMARY KEY,
course_name TEXT,
campus_name TEXT NOT NULL,
dept_code TEXT NOT NULL,
FOREIGN KEY (campus_name, dept_code) REFERENCES departments(campus_name, dept_code)
);
```
The output of `db[""courses""].foreign_keys` right now is:
```
[ForeignKey(table='courses', column='campus_name', other_table='departments', other_column='campus_name'),
ForeignKey(table='courses', column='dept_code', other_table='departments', other_column='dept_code')]
```
Which suggests two normal foreign keys, not one compound foreign key.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1861812208,PR_kwDOBm6k_c5YhH-W,2149,"Start a new `datasette.yaml` configuration file, with settings support",15178711,asg017,closed,0,,,,,2,2023-08-22T16:24:16Z,2023-08-23T01:26:11Z,2023-08-23T01:26:11Z,CONTRIBUTOR,simonw/datasette/pulls/2149,"refs #2093 #2143
This is the first step to implementing the new `datasette.yaml`/`datasette.json` configuration file.
- The old `--config` argument is now back, and is the path to a `datasette.yaml` file. Acts like the `--metadata` flag.
- The old `settings.json` behavior has been removed.
- The `""settings""` key inside `datasette.yaml` defines the same `--settings` flags
- Values passed in `--settings` will overwrite values in `datasette.yaml`
Docs for the config file are pretty light; there's not much to add until we add more config to the file.
----
:books: Documentation preview :books:: https://datasette--2149.org.readthedocs.build/en/2149/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2149/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1856760386,PR_kwDOBm6k_c5YQGcc,2144,Bump the python-packages group with 3 updates,49699333,dependabot[bot],closed,0,,,,,2,2023-08-18T13:49:37Z,2023-08-21T13:48:18Z,2023-08-21T13:48:16Z,CONTRIBUTOR,simonw/datasette/pulls/2144,"Bumps the python-packages group with 3 updates: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo) and [blacken-docs](https://github.com/asottile/blacken-docs).
Updates `sphinx` from 7.1.2 to 7.2.2
Release notes
Fix the signature of the StateMachine.insert_input() patch,
for when calling with keyword arguments.
Fixed membership testing (in) for the :py:class:str interface
of the asset classes (_CascadingStyleSheet and _JavaScript),
which several extensions relied upon.
Fixed a type error in SingleFileHTMLBuilder._get_local_toctree,
includehidden may be passed as a string or a boolean.
Fix :noindex: for PyModule and JSModule.
Release 7.2.1 (released Aug 17, 2023)
Bugs fixed
Restored the :py:class:str interface of the asset classes
(_CascadingStyleSheet and _JavaScript), which several extensions relied upon.
This will be removed in Sphinx 9.
Restored calls to Builder.add_{css,js}_file(),
which several extensions relied upon.
Restored the private API TocTree.get_toctree_ancestors(),
which several extensions relied upon.
Thanks to initial work from Matthew Anderson in PR [#246](https://github.com/asottile/blacken-docs/issues/246) <https://github.com/adamchainz/blacken-docs/pull/246>__.
Expand Markdown detection to all Python language names from Pygments: py, sage, python3, py3, and numpy.
Preserve leading whitespace lines in reStructuredText code blocks.
Thanks to Julianus Pfeuffer for the report in Issue [#217](https://github.com/asottile/blacken-docs/issues/217) <https://github.com/adamchainz/blacken-docs/issues/217>__.
Use exit code 2 to indicate errors from Black, whilst exit code 1 remains for “files have been formatted”.
Thanks to Julianus Pfeuffer for the report in Issue [#218](https://github.com/asottile/blacken-docs/issues/218) <https://github.com/adamchainz/blacken-docs/issues/218>__.
Support passing the --preview option through to Black, to select the future style.
Remove language_version from .pre-commit-hooks.yaml.
This change allows default_language_version in `.pre-commit-config.yaml` to take precedence.
Thanks to Aneesh Agrawal in PR [#258](https://github.com/asottile/blacken-docs/issues/258) <https://github.com/adamchainz/blacken-docs/pull/258>__.
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
- `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
- `@dependabot ignore dependency` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore dependency` will remove all of the ignore conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will remove the ignore condition of the specified dependency and ignore conditions
----
:books: Documentation preview :books:: https://datasette--2144.org.readthedocs.build/en/2144/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2144/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1854970601,PR_kwDOBm6k_c5YKAZ4,2142,Bump the python-packages group with 2 updates,49699333,dependabot[bot],closed,0,,,,,2,2023-08-17T13:07:53Z,2023-08-18T13:49:29Z,2023-08-18T13:49:26Z,CONTRIBUTOR,simonw/datasette/pulls/2142,"Bumps the python-packages group with 2 updates: [sphinx](https://github.com/sphinx-doc/sphinx) and [blacken-docs](https://github.com/asottile/blacken-docs).
Updates `sphinx` from 7.1.2 to 7.2.0
Release notes
#11512: Deprecate sphinx.util.md5 and sphinx.util.sha1.
Use hashlib instead.
#11526: Deprecate sphinx.testing.path.
Use os.path or pathlib instead.
#11528: Deprecate sphinx.util.split_index_msg and sphinx.util.split_into.
Use sphinx.util.index_entries.split_index_msg instead.
Deprecate sphinx.builders.html.Stylesheet
and sphinx.builders.html.Javascript.
Use sphinx.application.Sphinx.add_css_file()
and sphinx.application.Sphinx.add_js_file() instead.
#11582: Deprecate sphinx.builders.html.StandaloneHTMLBuilder.css_files and
sphinx.builders.html.StandaloneHTMLBuilder.script_files.
Use sphinx.application.Sphinx.add_css_file()
and sphinx.application.Sphinx.add_js_file() instead.
#11459: Deprecate sphinx.ext.autodoc.preserve_defaults.get_function_def().
Patch by Bénédikt Tran.
Features added
#11526: Support os.PathLike types and pathlib.Path objects
in many more places.
#5474: coverage: Print summary statistics tables.
Patch by Jorge Leitao.
#6319: viewcode: Add :confval:viewcode_line_numbers to control
whether line numbers are added to rendered source code.
Patch by Ben Krikler.
#9662: Add the :no-typesetting: option to suppress textual output
and only create a linkable anchor.
Patch by Latosha Maltba.
#11221: C++: Support domain objects in the table of contents.
Patch by Rouslan Korneychuk.
#10938: doctest: Add :confval:doctest_show_successes option.
Patch by Trey Hunner.
#11533: Add :no-index:, :no-index-entry:, and :no-contents-entry:.
#11572: Improve debug logging of reasons why files are detected as out of
date.
Patch by Eric Larson.
Thanks to initial work from Matthew Anderson in PR [#246](https://github.com/asottile/blacken-docs/issues/246) <https://github.com/adamchainz/blacken-docs/pull/246>__.
Expand Markdown detection to all Python language names from Pygments: py, sage, python3, py3, and numpy.
Preserve leading whitespace lines in reStructuredText code blocks.
Thanks to Julianus Pfeuffer for the report in Issue [#217](https://github.com/asottile/blacken-docs/issues/217) <https://github.com/adamchainz/blacken-docs/issues/217>__.
Use exit code 2 to indicate errors from Black, whilst exit code 1 remains for “files have been formatted”.
Thanks to Julianus Pfeuffer for the report in Issue [#218](https://github.com/asottile/blacken-docs/issues/218) <https://github.com/adamchainz/blacken-docs/issues/218>__.
Support passing the --preview option through to Black, to select the future style.
Remove language_version from .pre-commit-hooks.yaml.
This change allows default_language_version in `.pre-commit-config.yaml` to take precedence.
Thanks to Aneesh Agrawal in PR [#258](https://github.com/asottile/blacken-docs/issues/258) <https://github.com/adamchainz/blacken-docs/pull/258>__.
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
- `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
- `@dependabot ignore dependency` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore dependency` will remove all of the ignore conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will remove the ignore condition of the specified dependency and ignore conditions
----
:books: Documentation preview :books:: https://datasette--2142.org.readthedocs.build/en/2142/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2142/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1853289039,PR_kwDOBm6k_c5YEUBK,2141,Bump the python-packages group with 1 update,49699333,dependabot[bot],closed,0,,,,,2,2023-08-16T13:47:35Z,2023-08-17T13:07:48Z,2023-08-17T13:07:45Z,CONTRIBUTOR,simonw/datasette/pulls/2141,"Bumps the python-packages group with 1 update: [blacken-docs](https://github.com/asottile/blacken-docs).
Changelog
Thanks to initial work from Matthew Anderson in PR [#246](https://github.com/asottile/blacken-docs/issues/246) <https://github.com/adamchainz/blacken-docs/pull/246>__.
Expand Markdown detection to all Python language names from Pygments: py, sage, python3, py3, and numpy.
Preserve leading whitespace lines in reStructuredText code blocks.
Thanks to Julianus Pfeuffer for the report in Issue [#217](https://github.com/asottile/blacken-docs/issues/217) <https://github.com/adamchainz/blacken-docs/issues/217>__.
Use exit code 2 to indicate errors from Black, whilst exit code 1 remains for “files have been formatted”.
Thanks to Julianus Pfeuffer for the report in Issue [#218](https://github.com/asottile/blacken-docs/issues/218) <https://github.com/adamchainz/blacken-docs/issues/218>__.
Support passing the --preview option through to Black, to select the future style.
Remove language_version from .pre-commit-hooks.yaml.
This change allows default_language_version in `.pre-commit-config.yaml` to take precedence.
Thanks to Aneesh Agrawal in PR [#258](https://github.com/asottile/blacken-docs/issues/258) <https://github.com/adamchainz/blacken-docs/pull/258>__.
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=blacken-docs&package-manager=pip&previous-version=1.15.0&new-version=1.16.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
- `@dependabot ignore minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
- `@dependabot ignore dependency` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore dependency` will remove all of the ignore conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will remove the ignore condition of the specified dependency and ignore conditions
----
:books: Documentation preview :books:: https://datasette--2141.org.readthedocs.build/en/2141/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2141/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1846076261,I_kwDOBm6k_c5uCONl,2139,border-color: ##ff0000 bug - two hashes,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-08-11T01:22:58Z,2023-08-11T05:16:24Z,2023-08-11T05:16:24Z,OWNER,,"Spotted this on https://latest.datasette.io/extra_database
```html
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2139/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1844213115,I_kwDOBm6k_c5t7HV7,2138,on_success_message_sql option for writable canned queries,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-08-10T00:20:14Z,2023-08-10T00:39:40Z,2023-08-10T00:34:26Z,OWNER,,"> Or... how about if the `on_success_message` option could define a SQL query to be executed to generate that message? Maybe `on_success_message_sql`.
- https://github.com/simonw/datasette/issues/2134",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2138/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1840329615,I_kwDOBm6k_c5tsTOP,2130,Render plugin mechanism needs `error` and `truncated` fields,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,2,2023-08-07T23:19:19Z,2023-08-08T01:51:54Z,2023-08-08T01:47:42Z,OWNER,,"While working on:
- https://github.com/simonw/datasette/pull/2118
It became clear that the `render` callback function documented here: https://docs.datasette.io/en/0.64.3/plugin_hooks.html#register-output-renderer-datasette
Needs to grow the ability to be told if an error occurred (an `error` string) and if the results were truncated (a `truncated` boolean).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2130/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1833193570,PR_kwDOBm6k_c5XArm3,2125,Bump sphinx from 6.1.3 to 7.1.2,49699333,dependabot[bot],closed,0,,,,,2,2023-08-02T13:28:39Z,2023-08-07T16:20:30Z,2023-08-07T16:20:27Z,CONTRIBUTOR,simonw/datasette/pulls/2125,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.3 to 7.1.2.
Release notes
#11412: Emit warnings on using a deprecated Python-specific index entry type
(namely, module, keyword, operator, object, exception,
statement, and builtin) in the :rst:dir:index directive, and
set the removal version to Sphinx 9. Patch by Adam Turner.
Features added
#11415: Add a checksum to JavaScript and CSS asset URIs included within
generated HTML, using the CRC32 algorithm.
:meth:~sphinx.application.Sphinx.require_sphinx now allows the version
requirement to be specified as (major, minor).
#11011: Allow configuring a line-length limit for object signatures, via
:confval:maximum_signature_line_length and the domain-specific variants.
If the length of the signature (in characters) is greater than the configured
limit, each parameter in the signature will be split to its own logical line.
This behaviour may also be controlled by options on object description
directives, for example :rst:dir:py:function:single-line-parameter-list.
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=6.1.3&new-version=7.1.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
----
:books: Documentation preview :books:: https://datasette--2125.org.readthedocs.build/en/2125/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2125/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1824399610,PR_kwDOBm6k_c5WjCS8,2121,Bump furo from 2023.3.27 to 2023.7.26,49699333,dependabot[bot],closed,0,,,,,2,2023-07-27T13:40:48Z,2023-08-07T16:20:23Z,2023-08-07T16:20:20Z,CONTRIBUTOR,simonw/datasette/pulls/2121,"Bumps [furo](https://github.com/pradyunsg/furo) from 2023.3.27 to 2023.7.26.
Changelog
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=furo&package-manager=pip&previous-version=2023.3.27&new-version=2023.7.26)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
----
:books: Documentation preview :books:: https://datasette--2121.org.readthedocs.build/en/2121/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2121/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1796830110,PR_kwDOBm6k_c5VFw3j,2098,Bump blacken-docs from 1.14.0 to 1.15.0,49699333,dependabot[bot],closed,0,,,,,2,2023-07-10T13:49:12Z,2023-08-07T16:20:22Z,2023-08-07T16:20:20Z,CONTRIBUTOR,simonw/datasette/pulls/2098,"Bumps [blacken-docs](https://github.com/asottile/blacken-docs) from 1.14.0 to 1.15.0.
Changelog
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=blacken-docs&package-manager=pip&previous-version=1.14.0&new-version=1.15.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
----
:books: Documentation preview :books:: https://datasette--2098.org.readthedocs.build/en/2098/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2098/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1826424151,PR_kwDOBm6k_c5Wp6Hs,2124,Bump sphinx from 6.1.3 to 7.1.1,49699333,dependabot[bot],closed,0,,,,,2,2023-07-28T13:23:11Z,2023-08-02T13:28:47Z,2023-08-02T13:28:44Z,CONTRIBUTOR,simonw/datasette/pulls/2124,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.3 to 7.1.1.
Release notes
#11412: Emit warnings on using a deprecated Python-specific index entry type
(namely, module, keyword, operator, object, exception,
statement, and builtin) in the :rst:dir:index directive, and
set the removal version to Sphinx 9. Patch by Adam Turner.
Features added
#11415: Add a checksum to JavaScript and CSS asset URIs included within
generated HTML, using the CRC32 algorithm.
:meth:~sphinx.application.Sphinx.require_sphinx now allows the version
requirement to be specified as (major, minor).
#11011: Allow configuring a line-length limit for object signatures, via
:confval:maximum_signature_line_length and the domain-specific variants.
If the length of the signature (in characters) is greater than the configured
limit, each parameter in the signature will be split to its own logical line.
This behaviour may also be controlled by options on object description
directives, for example :rst:dir:py:function:single-line-parameter-list.
Patch by Thomas Louf, Adam Turner, and Jean-François B.
#10983: Support for multiline copyright statements in the footer block.
Patch by Stefanie Molin
sphinx.util.display.status_iterator now clears the current line
with ANSI control codes, rather than overprinting with space characters.
#11431: linkcheck: Treat SSL failures as broken links.
Patch by Bénédikt Tran
#11157: Keep the translated attribute on translated nodes.
#11451: Improve the traceback displayed when using :option:sphinx-build -T
in parallel builds. Patch by Bénédikt Tran
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=6.1.3&new-version=7.1.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
----
:books: Documentation preview :books:: https://datasette--2124.org.readthedocs.build/en/2124/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2124/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1827427757,PR_kwDOD079W85WtUKG,38,photos-to-sql not found?,319473,coldclimate,closed,0,,,,,2,2023-07-29T09:59:42Z,2023-07-29T10:01:27Z,2023-07-29T10:01:23Z,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/38,"I wonder if `photos-to-sql` is an old name for `dogsheep-photos`, because I can't find it anywhere.
I can't actually get this command to work (`sqlite3.OperationalError: no such table: attached.ZGENERICASSET` thrown) but I don't think that's related",256834907,dogsheep-photos,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1820346348,PR_kwDOBm6k_c5WVYor,2107,Bump sphinx from 6.1.3 to 7.1.0,49699333,dependabot[bot],closed,0,,,,,2,2023-07-25T13:28:30Z,2023-07-28T13:23:19Z,2023-07-28T13:23:17Z,CONTRIBUTOR,simonw/datasette/pulls/2107,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.3 to 7.1.0.
Release notes
#11412: Emit warnings on using a deprecated Python-specific index entry type
(namely, module, keyword, operator, object, exception,
statement, and builtin) in the :rst:dir:index directive, and
set the removal version to Sphinx 9. Patch by Adam Turner.
Features added
#11415: Add a checksum to JavaScript and CSS asset URIs included within
generated HTML, using the CRC32 algorithm.
:meth:~sphinx.application.Sphinx.require_sphinx now allows the version
requirement to be specified as (major, minor).
#11011: Allow configuring a line-length limit for object signatures, via
:confval:maximum_signature_line_length and the domain-specific variants.
If the length of the signature (in characters) is greater than the configured
limit, each parameter in the signature will be split to its own logical line.
This behaviour may also be controlled by options on object description
directives, for example :rst:dir:py:function:single-line-parameter-list.
Patch by Thomas Louf, Adam Turner, and Jean-François B.
#10983: Support for multiline copyright statements in the footer block.
Patch by Stefanie Molin
sphinx.util.display.status_iterator now clears the current line
with ANSI control codes, rather than overprinting with space characters.
#11431: linkcheck: Treat SSL failures as broken links.
Patch by Bénédikt Tran
#11157: Keep the translated attribute on translated nodes.
#11451: Improve the traceback displayed when using :option:sphinx-build -T
in parallel builds. Patch by Bénédikt Tran
#11324: linkcheck: Use session-based HTTP requests.
#11438: Add support for the :rst:dir:py:class and :rst:dir:py:function
directives for PEP 695 (generic classes and functions declarations) and
PEP 696 (default type parameters). Multi-line support (#11011) is enabled
for type parameters list and can be locally controlled on object description
directives, e.g., :rst:dir:py:function:single-line-type-parameter-list.
Patch by Bénédikt Tran.
#11484: linkcheck: Allow HTML anchors to be ignored on a per-URL basis
via :confval:linkcheck_anchors_ignore_for_url while
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=6.1.3&new-version=7.1.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
----
:books: Documentation preview :books:: https://datasette--2107.org.readthedocs.build/en/2107/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2107/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1710164693,PR_kwDOBm6k_c5QhIL2,2075,Bump sphinx from 6.1.3 to 7.0.1,49699333,dependabot[bot],closed,0,,,,,2,2023-05-15T13:59:31Z,2023-07-25T13:28:39Z,2023-07-25T13:28:36Z,CONTRIBUTOR,simonw/datasette/pulls/2075,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.3 to 7.0.1.
Release notes
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=6.1.3&new-version=7.0.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
You can trigger a rebase of this PR by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
----
:books: Documentation preview :books:: https://datasette--2075.org.readthedocs.build/en/2075/
> **Note**
> Automatic rebases have been disabled on this pull request as it has been open for over 30 days.
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2075/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1816997390,I_kwDOCGYnMM5sTS4O,576,Backfill the release notes prior to 0.4,9599,simonw,closed,0,,,,,2,2023-07-23T05:41:42Z,2023-07-23T05:49:51Z,2023-07-23T05:48:21Z,OWNER,,"Currently the changelog starts at 0.4:
https://sqlite-utils.datasette.io/en/3.34/changelog.html#id115
I want the other releases - according to https://pypi.org/project/sqlite-utils/#history there are three missing:
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/576/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1816919568,I_kwDOCGYnMM5sS_4Q,575,Python API ability to opt-out of connection plugins,9599,simonw,closed,0,,,,,2,2023-07-22T23:01:13Z,2023-07-22T23:17:22Z,2023-07-22T23:08:22Z,OWNER,,"Plugins affecting the CLI by default makes sense to me.
I'm less confident about them _always_ affecting users of the Python API.
I'm going to have them apply by default, but I'm going to add a mechanism to opt out on an individual database basis. Basically this:
```python
from sqlite_utils import Database
db = Database(memory=True, execute_plugins=False)
# Anything using db from here on will not execute plugins
```
cc @asg017
Refs:
- #567
- #574 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/575/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1816877910,I_kwDOCGYnMM5sS1tW,572,Don't test Python 3.7 against textual,9599,simonw,closed,0,,,,,2,2023-07-22T19:57:03Z,2023-07-22T22:16:50Z,2023-07-22T22:16:50Z,OWNER,,"Spotted this in the GitHub Actions logs:
![IMG_5046](https://github.com/simonw/sqlite-utils/assets/9599/81fb1093-cd8a-4019-a612-2e49b500c933)
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/572/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1808215339,I_kwDOBm6k_c5rxy0r,2104,Tables starting with an underscore should be treated as hidden,9599,simonw,open,0,,,,,2,2023-07-17T17:13:53Z,2023-07-18T22:41:37Z,,OWNER,,"Plugins can then take advantage of this pattern, for example:
- https://github.com/simonw/datasette-auth-tokens/pull/8",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1765870617,I_kwDOBm6k_c5pQQwZ,2087,`--settings settings.json` option,9599,simonw,open,0,,,,,2,2023-06-20T17:48:45Z,2023-07-14T17:02:03Z,,OWNER,,"https://discord.com/channels/823971286308356157/823971286941302908/1120705940728066080
> May I add a request to the whole metadata / settings ? Allow to pass `--settings path/to/settings.json` instead of having to rely exclusively on directory mode to centralize settings (this would reflect the behavior of providing metadata)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2087/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1795219865,I_kwDOCGYnMM5rAOGZ,566,`--no-headers` doesn't work on most formats,33625,zellyn,open,0,,,,,2,2023-07-09T03:43:36Z,2023-07-09T04:13:35Z,,NONE,,"Version 3.33
```
sqlite-utils query library.db 'select asin from audible' --fmt plain --no-headers | head -3
asin
0062804006
0062891421
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/566/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1795187493,I_kwDODLZ_YM5rAGMl,12,Switch to pyproject.toml,9599,simonw,closed,0,,,,,2,2023-07-09T01:06:56Z,2023-07-09T01:19:43Z,2023-07-09T01:19:42Z,MEMBER,,First of my CLI tools to use https://til.simonwillison.net/python/pyproject,213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1781047747,I_kwDOBm6k_c5qKKHD,2092,test_homepage intermittent failure,9599,simonw,closed,0,,,,,2,2023-06-29T15:20:37Z,2023-06-29T15:26:28Z,2023-06-29T15:24:13Z,OWNER,,"e.g. in https://github.com/simonw/datasette/actions/runs/5413590227/jobs/9839373852
```
=================================== FAILURES ===================================
________________________________ test_homepage _________________________________
[gw0] linux -- Python 3.7.17 /opt/hostedtoolcache/Python/3.7.17/x64/bin/python
ds_client =
@pytest.mark.asyncio
async def test_homepage(ds_client):
response = await ds_client.get(""/.json"")
assert response.status_code == 200
assert ""application/json; charset=utf-8"" == response.headers[""content-type""]
data = response.json()
assert data.keys() == {""fixtures"": 0}.keys()
d = data[""fixtures""]
assert d[""name""] == ""fixtures""
assert d[""tables_count""] == 24
assert len(d[""tables_and_views_truncated""]) == 5
assert d[""tables_and_views_more""] is True
# 4 hidden FTS tables + no_primary_key (hidden in metadata)
assert d[""hidden_tables_count""] == 6
# 201 in no_primary_key, plus 6 in other hidden tables:
> assert d[""hidden_table_rows_sum""] == 207, data
E AssertionError: {'fixtures': {'color': '9403e5', 'hash': None, 'hidden_table_rows_sum': 0, 'hidden_tables_count': 6, ...}}
E assert 0 == 207
```
My guess is that this is a timing error, where very occasionally the ""count rows but stop counting if it exceeds a time limit"" thing fails.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2092/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1781005740,I_kwDOBm6k_c5qJ_2s,2090,Adopt ruff for linting,9599,simonw,open,0,,,,,2,2023-06-29T14:56:43Z,2023-06-29T15:05:04Z,,OWNER,,https://beta.ruff.rs/docs/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2090/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1655860104,I_kwDOCGYnMM5ismuI,535,rows: --transpose or psql extended view-like functionality,7908073,chapmanjacobd,closed,0,,,,,2,2023-04-05T15:37:33Z,2023-06-15T08:39:49Z,2023-06-14T22:05:28Z,CONTRIBUTOR,,"It would be nice if the rows subcommand had a flag, perhaps called `--transpose` which would print in long form instead of wide. Similar to extended display mode in psql (`\x`)
In other words instead of this:
```
sqlite-utils rows --limit 5 --fmt github track_metadata.db songs
```
| track_id | title | song_id | release | artist_id | artist_mbid | artist_name | duration | artist_familiarity | artist_hotttnesss | year | track_7digitalid | shs_perf | shs_work |
|--------------------|-------------------|--------------------|--------------------------------------|--------------------|--------------------------------------|------------------|------------|----------------------|---------------------|--------|--------------------|------------|------------|
| TRMMMYQ128F932D901 | Silent Night | SOQMMHC12AB0180CB8 | Monster Ballads X-Mas | ARYZTJS1187B98C555 | 357ff05d-848a-44cf-b608-cb34b5701ae5 | Faster Pussy cat | 252.055 | 0.649822 | 0.394032 | 2003 | 7032331 | -1 | 0 |
| TRMMMKD128F425225D | Tanssi vaan | SOVFVAK12A8C1350D9 | Karkuteillä | ARMVN3U1187FB3A1EB | 8d7ef530-a6fd-4f8f-b2e2-74aec765e0f9 | Karkkiautomaatti | 156.551 | 0.439604 | 0.356992 | 1995 | 1514808 | -1 | 0 |
| TRMMMRX128F93187D9 | No One Could Ever | SOGTUKN12AB017F4F1 | Butter | ARGEKB01187FB50750 | 3d403d44-36ce-465c-ad43-ae877e65adc4 | Hudson Mohawke | 138.971 | 0.643681 | 0.437504 | 2006 | 6945353 | -1 | 0 |
| TRMMMCH128F425532C | Si Vos Querés | SOBNYVR12A8C13558C | De Culo | ARNWYLR1187B9B2F9C | 12be7648-7094-495f-90e6-df4189d68615 | Yerba Brava | 145.058 | 0.448501 | 0.372349 | 2003 | 2168257 | -1 | 0 |
| TRMMMWA128F426B589 | Tangle Of Aspens | SOHSBXH12A8C13B0DF | Rene Ablaze Presents Winter Sessions | AREQDTE1269FB37231 | | Der Mystic | 514.298 | 0 | 0 | 0 | 2264873 | -1 | 0 |
The output would look something like this:
```
$ for col in (sqlite-columns track_metadata.db songs)
sqlite-utils --fmt github track_metadata.db ""select $col from songs order by rowid desc limit 5""
end
```
| track_id |
|--------------------|
| TRYYYVU12903CD01E3 |
| TRYYYDJ128F9310A21 |
| TRYYYMG128F4260ECA |
| TRYYYJO128F426DA37 |
| TRYYYUS12903CD2DF0 |
| title |
|-------------------------------------|
| Fernweh feat. Sektion Kuchikäschtli |
| Faraday |
| Novemba |
| Jago Chhadeo |
| O Samba Da Vida |
| song_id |
|--------------------|
| SOWXJXQ12AB0189F43 |
| SOLXGOR12A81C21EB7 |
| SOHODZI12A8C137BB3 |
| SOXQYIQ12A8C137FBB |
| SOTXAME12AB018F136 |
| release |
|---------------------------------|
| So Oder So |
| The Trance Collection Vol. 2 |
| Dub_Connected: electronic music |
| Naale Baba Lassi Pee Gya |
| Pacha V.I.P. |
| artist_id |
|--------------------|
| AR7PLM21187B990D08 |
| ARCMCOK1187B9B1073 |
| ARZ3R6M1187B9AF750 |
| ART5FZD1187B9A7FCF |
| AR7Z4J81187FB3FC59 |
| artist_mbid |
|--------------------------------------|
| 3af2b07e-c91c-4160-9bda-f0b9e3144ed3 |
| 4ac5f3de-c5ad-475e-ad50-41f1ef9dba20 |
| 8b97e9c8-61f5-4615-9a96-276f24204e34 |
| 2357c400-9109-42b6-b3fe-9e2d9f8e3872 |
| 9d50cb20-7e42-45cc-b0dd-154c3e92a577 |
| artist_name |
|----------------|
| Texta |
| Elude |
| Gabriel Le Mar |
| Kuldeep Manak |
| Kiko Navarro |
| duration |
|------------|
| 295.079 |
| 484.519 |
| 553.038 |
| 244.166 |
| 217.443 |
| artist_familiarity |
|----------------------|
| 0.552977 |
| 0.403668 |
| 0.556918 |
| 0.4015 |
| 0.528617 |
| artist_hotttnesss |
|---------------------|
| 0.454869 |
| 0.256935 |
| 0.336914 |
| 0.374866 |
| 0.411595 |
| year |
|--------|
| 2004 |
| 0 |
| 0 |
| 0 |
| 0 |
| track_7digitalid |
|--------------------|
| 8486723 |
| 5472456 |
| 2219291 |
| 1632096 |
| 7522478 |
| shs_perf |
|------------|
| -1 |
| -1 |
| -1 |
| -1 |
| -1 |
| shs_work |
|------------|
| 0 |
| 0 |
| 0 |
| 0 |
| 0 |
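For what it's worth, here is a rough Python sketch that produces this kind of long-form output with sqlite-utils today (database and table names as in the example above):
```python
import sqlite_utils

# Print each row in long form: one "column: value" line per field.
db = sqlite_utils.Database('track_metadata.db')
for row in db.query('select * from songs order by rowid desc limit 5'):
    for column, value in row.items():
        print(f'{column}: {value}')
    print()
```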
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/535/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1740150327,I_kwDOCGYnMM5nuJY3,557,Aliased ROWID option for tables created from alter=True commands,7908073,chapmanjacobd,closed,0,,,,,2,2023-06-04T05:29:28Z,2023-06-14T06:09:21Z,2023-06-05T19:26:26Z,CONTRIBUTOR,,"> If you use INTEGER PRIMARY KEY column, the VACUUM does not change the values of that column. However, if you use unaliased rowid, the VACUUM command will reset the rowid values.
ROWID should never be used with foreign keys but the simple act of aliasing rowid to id (which is what happens when one does `id integer primary key` DDL) makes it OK.
It would be convenient if there were more options to use a string column (eg. filepath) as the PK, and be able to use it during upserts, but when creating a foreign key, to create an integer column which aliases rowid
I made an attempt to switch to integer primary keys here but it is not going well... In my usecase the path column is a business key. Yes, it should be as simple as including the `id` column in any select statement where I plan on using `upsert` but it would be nice if this could be abstracted away somehow https://github.com/chapmanjacobd/library/commit/788cd125be01d76f0fe2153335d9f6b21db1343c
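For reference, a minimal sqlite-utils sketch of the rowid aliasing described above (table and column names are made up):
```python
import sqlite_utils

# An 'id integer primary key' column is an alias for rowid, so its values
# are stable across VACUUM, unlike a bare rowid.
db = sqlite_utils.Database(memory=True)
db['files'].create({'id': int, 'path': str}, pk='id')
db['files'].insert_all([{'path': 'a'}, {'path': 'b'}])
print(list(db.query('select rowid, id, path from files')))
# rowid and id come back as the same values, because id aliases rowid
```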
https://github.com/chapmanjacobd/library/actions/runs/5173602136/jobs/9319024777",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1718635018,PR_kwDOCGYnMM5Q9lY4,553,Reformatted CLI examples in docs,9599,simonw,closed,0,,,,,2,2023-05-21T20:44:34Z,2023-05-21T20:57:27Z,2023-05-21T20:57:23Z,OWNER,simonw/sqlite-utils/pulls/553,"Refs:
- #551
----
:books: Documentation preview :books:: https://sqlite-utils--553.org.readthedocs.build/en/553/
",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/553/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1718515590,I_kwDOCGYnMM5mbneG,544,New options for analyze-tables --common-limit --no-most and --no-least,9599,simonw,closed,0,,,,,2,2023-05-21T14:03:19Z,2023-05-21T17:03:06Z,2023-05-21T16:19:31Z,OWNER,,"The ""least common"" section is frequently uninteresting, especially for huge tables with a large number of repeated-once values.
sqlite-utils analyze-tables content.db repos --common-limit 20 --no-least",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/544/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1718550688,PR_kwDOCGYnMM5Q9VH0,546,"Analyze tables options: --common-limit, --no-most, --no-least",9599,simonw,closed,0,,,,,2,2023-05-21T15:54:39Z,2023-05-21T16:19:30Z,2023-05-21T16:19:30Z,OWNER,simonw/sqlite-utils/pulls/546,"Refs #544
- [x] Documentation for CLI options
- [x] Documentation for new Python API parameters: `most_common: bool` and `least_common: bool`
- [x] Tests for CLI
- [x] Tests for Python API",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/546/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1708030220,I_kwDOBm6k_c5lznkM,2073,Faceting doesn't work against integer columns in views,9599,simonw,open,0,,,,,2,2023-05-12T18:20:10Z,2023-05-12T18:24:07Z,,OWNER,,"Spotted this issue here: https://til.simonwillison.net/datasette/baseline
I had to do this workaround:
```sql
create view baseline as select
_key,
spec,
'' || json_extract(status, '$.is_baseline') as is_baseline,
json_extract(status, '$.since') as baseline_since,
json_extract(status, '$.support.chrome') as baseline_chrome,
json_extract(status, '$.support.edge') as baseline_edge,
json_extract(status, '$.support.firefox') as baseline_firefox,
json_extract(status, '$.support.safari') as baseline_safari,
compat_features,
caniuse,
usage_stats,
status
from
[index]
```
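To illustrate the type-conversion mechanism discussed below, here is a minimal `sqlite3` sketch (table, view and column names are made up; the real case involved `json_extract` expressions):
```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('create table t (n integer)')
conn.execute('insert into t values (1)')
# An expression column in a view has no declared affinity
conn.execute('create view v as select n + 0 as n from t')
# Against the table, '1' is coerced via the column's INTEGER affinity and matches:
print(conn.execute(""select count(*) from t where n = '1'"").fetchone()[0])  # 1
# Against the view's expression column there is no affinity, so nothing matches:
print(conn.execute(""select count(*) from v where n = '1'"").fetchone()[0])  # 0
```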
I think the core issue here is that, against a table, `select * from x where integer_column = '1'` works correctly, due to some kind of column type conversion mechanism... but this mechanism doesn't work against views.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2073/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1702354223,I_kwDOBm6k_c5ld90v,2070,Mechanism for deploying a preview of a branch using Vercel,9599,simonw,closed,0,,,,,2,2023-05-09T16:21:45Z,2023-05-09T16:25:00Z,2023-05-09T16:24:31Z,OWNER,,"I prototyped that here: https://github.com/simonw/one-off-actions/blob/main/.github/workflows/deploy-datasette-branch-preview.yml
It deployed the `json-extras-query` branch here: https://datasette-preview-json-extras-query.vercel.app/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2070/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1595340692,I_kwDOCGYnMM5fFveU,530,"add ability to configure ""on delete"" and ""on update"" attributes of foreign keys:",536941,fgregg,open,0,,,,,2,2023-02-22T15:44:14Z,2023-05-08T20:39:01Z,,CONTRIBUTOR,,"sqlite supports these, and it would be quite nice to be able to add them with sqlite-utils.
https://www.sqlite.org/foreignkeys.html#fk_actions",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/530/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1620254998,I_kwDOCGYnMM5gkyEW,532,Show more information when JSON can't be imported with sqlite-utils insert,83080728,voltagex,closed,0,,,,,2,2023-03-12T06:41:44Z,2023-05-08T20:32:16Z,2023-05-08T20:32:02Z,NONE,,"I am currently trying to import the [JSON export of my data from Discord](https://support.discord.com/hc/en-us/articles/360004027692-Requesting-a-Copy-of-your-Data), specifically `activity/reporting/events-*.json`
```
sqlite-utils.exe insert test.db reporting events-2023-00000-of-00001.json
[###################################-] 99% 00:00:00
Error: Invalid JSON - use --csv for CSV or --tsv for TSV files
```
Please show more information as to *why* this is invalid, if possible.
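In the meantime, a quick way to surface the underlying parse error with just the standard library (same filename as above):
```python
import json

# Reports the reason and position of the first JSON syntax error in the file
with open('events-2023-00000-of-00001.json', encoding='utf-8') as f:
    try:
        json.load(f)
    except json.JSONDecodeError as e:
        print(e.msg, 'at line', e.lineno, 'column', e.colno)
```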
I am using version 3.30 with Python 3.10 on Windows 11.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/532/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1617602868,I_kwDOJHON9s5gaqk0,6,Character encoding problem,9599,simonw,open,0,,,,,2,2023-03-09T16:44:34Z,2023-04-14T15:22:09Z,,MEMBER,,"I ran against a recent note with this in it:
> Or just ""Actions ⚙️ ""
And got back:
> `Actions ⚙️`
Pasting that into https://ftfy.vercel.app/?s=Actions+%E2%80%9A%C3%B6%C3%B4%C3%94%E2%88%8F%C3%A8+ gives this:
```python
s = 'Actions â\x80\x9aöôÃ\x94â\x88\x8fè'
s = s.encode('latin-1')
s = s.decode('utf-8')
s = s.encode('macroman')
s = s.decode('utf-8')
print(s)
```
",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1515883470,I_kwDOC8tyDs5aWovO,24,DOC: xml.etree.ElementTree.ParseError due to healthkit version 12 ,6231413,mmngreco,open,0,,,,,2,2023-01-01T23:00:38Z,2023-03-30T10:17:31Z,,NONE,,"Hi @simonw
I hope you find this issue ok, the idea is to provide some documentation to other users like me about how to solve this problem and save some time.
Following the instructions from the `README.md` I've faced this error:
```bash
(venv) mgreco@pop-os apple-health master* (23:44|0s)
$ healthkit-to-sqlite apple_health_export/export.xml healthkit.db --xml
Importing from HealthKit [------------------------------------] 0%
Traceback (most recent call last):
File ""/home/mgreco/github/mmngreco/apple-health/venv/bin/healthkit-to-sqlite"", line 33, in
sys.exit(load_entry_point('healthkit-to-sqlite', 'console_scripts', 'healthkit-to-sqlite')())
File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1130, in __call__
return self.main(*args, **kwargs)
File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1055, in main
rv = self.invoke(ctx)
File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1404, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 760, in invoke
return __callback(*args, **kwargs)
File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/cli.py"", line 57, in cli
convert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf)
File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/utils.py"", line 25, in convert_xml_to_sqlite
for tag, el in find_all_tags(
File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/utils.py"", line 12, in find_all_tags
for event, el in parser.read_events():
File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/xml/etree/ElementTree.py"", line 1324, in read_events
raise event
File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/xml/etree/ElementTree.py"", line 1296, in feed
self._parser.feed(data)
xml.etree.ElementTree.ParseError: syntax error: line 156, column 0
```
So, after debugging and searching on the internet I found this useful link: https://discussions.apple.com/thread/254202523 (etresoft, the real hero), which basically says that the xml given by the health app (healthkit version 12) has some bugs but fortunately, they can be solved with a couple of commands:
1. Uncompress the zip and move the new folder where `export.xml` is.
1. Create a `patch.txt` with the following content
```diff
--- export.xml 2022-09-18 15:17:09.000000000 -0400
+++ export-fixed.xml 2022-09-18 16:37:08.000000000 -0400
@@ -15,6 +15,7 @@
HKCharacteristicTypeIdentifierBiologicalSex CDATA #REQUIRED
HKCharacteristicTypeIdentifierBloodType CDATA #REQUIRED
HKCharacteristicTypeIdentifierFitzpatrickSkinType CDATA #REQUIRED
+ HKCharacteristicTypeIdentifierCardioFitnessMedicationsUse CDATA #IMPLIED
>
-
+
-
+
- device CDATA #IMPLIED
-
-
->
]>
```
1. Apply the patch with the command: `patch < patch.txt`
1. Fix endDates with the command `sed 's/startDate/endDate/2' export.xml > export-fixed.xml`
1. Try again `healthkit-to-sqlite export-fixed.xml healthkit.db --xml`",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1613974869,PR_kwDOBm6k_c5LgPS-,2034,remove an unused `app` var in cli.py,4370201,wenhoujx,open,0,,,,,2,2023-03-07T18:19:05Z,2023-03-29T20:56:20Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2034,"this var `app` isn't actually used? Unless initializing it has some side effect outside of the event loop, I don't think it's necessary.
Feel free to ignore this PR if the deleted line actually does something.
----
:books: Documentation preview :books:: https://datasette--2034.org.readthedocs.build/en/2034/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2034/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1566081801,PR_kwDOBm6k_c5JAcGy,2014,Bump black from 22.12.0 to 23.1.0,49699333,dependabot[bot],closed,0,,,,,2,2023-02-01T13:06:16Z,2023-03-29T06:09:14Z,2023-03-29T06:09:12Z,CONTRIBUTOR,simonw/datasette/pulls/2014,"Bumps [black](https://github.com/psf/black) from 22.12.0 to 23.1.0.
Release notes
This is the first release of 2023, and following our stability policy, it comes with a number of improvements to our stable style, notably improvements to empty line handling and the removal of redundant parentheses in several contexts.
There are also many changes to the preview style; try out black --preview and give us feedback to help us set the stable style for next year.
In addition to style changes, Black now automatically infers the supported Python versions from your pyproject.toml file, removing the need to set Black's target versions separately.
Stable style
Introduce the 2023 stable style, which incorporates most aspects of last year's preview style (#3418). Specific changes:
Enforce empty lines before classes and functions with sticky leading comments (#3302) (22.12.0)
Reformat empty and whitespace-only files as either an empty file (if no newline is present) or as a single newline character (if a newline is present) (#3348) (22.12.0)
Correctly handle trailing commas that are inside a line's leading non-nested parens (#3370) (22.12.0)
--skip-string-normalization / -S now prevents docstring prefixes from being normalized as expected (#3168) (since 22.8.0)
When using --skip-magic-trailing-comma or -C, trailing commas are stripped from subscript expressions with more than 1 element (#3209) (22.8.0)
Fix a string merging/split issue when a comment is present in the middle of implicitly concatenated strings on its own line (#3227) (22.8.0)
Docstring quotes are no longer moved if it would violate the line length limit (#3044, #3430) (22.6.0)
Parentheses around return annotations are now managed (#2990) (22.6.0)
Remove unnecessary parentheses around awaited objects (#2991) (22.6.0)
Remove unnecessary parentheses in with statements (#2926) (22.6.0)
Remove trailing newlines after code block open (#3035) (22.6.0)
Code cell separators #%% are now standardised to # %% (#2919) (22.3.0)
Remove unnecessary parentheses from except statements (#2939) (22.3.0)
Remove unnecessary parentheses from tuple unpacking in for loops (#2945) (22.3.0)
Avoid magic-trailing-comma in single-element subscripts (#2942) (22.3.0)
Fix a crash when a colon line is marked between # fmt: off and # fmt: on (#3439)
Preview style
Format hex codes in unicode escape sequences in string literals (#2916)
Add parentheses around if-else expressions (#2278)
Improve performance on large expressions that contain many strings (#3467)
Fix a crash in preview style with assert + parenthesized string (#3415)
Fix crashes in preview style with walrus operators used in function return annotations and except clauses (#3423)
Fix a crash in preview advanced string processing where mixed implicitly concatenated regular and f-strings start with an empty span (#3463)
Fix a crash in preview advanced string processing where a standalone comment is placed before a dict's value (#3469)
Fix an issue where extra empty lines are added when a decorator has # fmt: skip applied or there is a standalone comment between decorators (#3470)
Do not put the closing quotes in a docstring on a separate line, even if the line is too long (#3430)
Long values in dict literals are now wrapped in parentheses; correspondingly unnecessary parentheses around short values in dict literals are now removed; long string lambda values are now wrapped in parentheses (#3440)
Fix two crashes in preview style involving edge cases with docstrings (#3451)
Exclude string type annotations from improved string processing; fix crash when the return type annotation is stringified and spans across multiple lines (#3462)
Wrap multiple context managers in parentheses when targeting Python 3.9+ (#3489)
Fix several crashes in preview style with walrus operators used in with statements or tuples (#3473)
Fix an invalid quote escaping bug in f-string expressions where it produced invalid code. Implicitly concatenated f-strings with different quotes can now be merged or quote-normalized by changing the quotes used in expressions. (#3509)
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=22.12.0&new-version=23.1.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
----
:books: Documentation preview :books:: https://datasette--2014.org.readthedocs.build/en/2014/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2014/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1639446870,PR_kwDOBm6k_c5M1izI,2043,Bump furo from 2022.12.7 to 2023.3.23,49699333,dependabot[bot],closed,0,,,,,2,2023-03-24T13:58:08Z,2023-03-28T13:58:24Z,2023-03-28T13:58:21Z,CONTRIBUTOR,simonw/datasette/pulls/2043,"Bumps [furo](https://github.com/pradyunsg/furo) from 2022.12.7 to 2023.3.23.
Changelog
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=furo&package-manager=pip&previous-version=2022.12.7&new-version=2023.3.23)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
----
:books: Documentation preview :books:: https://datasette--2043.org.readthedocs.build/en/2043/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2043/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1633077183,I_kwDOBm6k_c5hVse_,2041,Remove obsolete table POST code,9599,simonw,closed,0,,,8755003,Datasette 1.0a-next,2,2023-03-21T01:01:40Z,2023-03-21T01:17:44Z,2023-03-21T01:17:43Z,OWNER,,"Spotted this in:
- #1999
`POST /db/table` currently executes obsolete code for inserting a row - I replaced that with `/db/table/-/insert` in
https://github.com/simonw/datasette/commit/6e788b49edf4f842c0817f006eb9d865778eea5e but forgot to remove the old code.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2041/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1620516340,I_kwDOCGYnMM5glx30,533,ReadTheDocs error: not all arguments converted during string formatting,9599,simonw,closed,0,,,,,2,2023-03-12T21:21:05Z,2023-03-12T21:25:33Z,2023-03-12T21:25:33Z,OWNER,,"This came up as a failure running tests for:
- #531
Traceback on https://readthedocs.org/projects/sqlite-utils/builds/19749348/
```
File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/docutils/parsers/rst/states.py"", line 889, in interpreted
nodes, messages2 = role_fn(role, rawsource, text, lineno, self)
File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/sphinx/ext/extlinks.py"", line 103, in role
title = caption % part
TypeError: not all arguments converted during string formatting
Exception occurred:
File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/sphinx/ext/extlinks.py"", line 103, in role
title = caption % part
TypeError: not all arguments converted during string formatting
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/533/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1618249044,I_kwDOBm6k_c5gdIVU,2038,Consider a `strict_templates` setting,9599,simonw,open,0,,,,,2,2023-03-10T02:09:13Z,2023-03-10T02:11:06Z,,OWNER,,"A setting which turns on Jinja strict mode, so any templates that access undefined variables raise a hard error.
Prototype here:
```diff
diff --git a/datasette/app.py b/datasette/app.py
index 40416713..1428a3f0 100644
--- a/datasette/app.py
+++ b/datasette/app.py
@@ -200,6 +200,7 @@ SETTINGS = (
""Allow display of SQL trace debug information with ?_trace=1"",
),
Setting(""base_url"", ""/"", ""Datasette URLs should use this base path""),
+ Setting(""strict_templates"", False, ""Raise errors for undefined template variables""),
)
_HASH_URLS_REMOVED = ""The hash_urls setting has been removed, try the datasette-hashed-urls plugin instead""
OBSOLETE_SETTINGS = {
@@ -399,11 +400,14 @@ class Datasette:
),
]
)
+ env_extras = {}
+ if self.setting(""strict_templates""):
+ env_extras[""undefined""] = StrictUndefined
self.jinja_env = Environment(
loader=template_loader,
autoescape=True,
enable_async=True,
- undefined=StrictUndefined,
+ **env_extras,
)
self.jinja_env.filters[""escape_css_string""] = escape_css_string
self.jinja_env.filters[""quote_plus""] = urllib.parse.quote_plus
```
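For context, a minimal Jinja2 example of the behaviour `StrictUndefined` would turn on:
```python
from jinja2 import Environment, StrictUndefined

# Default behaviour: missing variables silently render as empty strings
lenient = Environment()
print(lenient.from_string('Hello {{ missing }}!').render())  # 'Hello !'

# Strict mode: accessing a missing variable raises a hard error
strict = Environment(undefined=StrictUndefined)
strict.from_string('Hello {{ missing }}!').render()  # raises jinja2.exceptions.UndefinedError
```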
Explored this idea a bit in:
- #1999",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2038/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1616429236,I_kwDOJHON9s5gWMC0,4,Support incremental updates,9599,simonw,open,0,,,,,2,2023-03-09T05:14:00Z,2023-03-09T18:20:56Z,,MEMBER,,"Running this script can take several hours against a large notes database.
Would be neat if it could run against just the notes that have been modified since it last ran. Could pull the max `updated` date and then keep on looping until it finds one modified before then.
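A rough sketch of that idea (the table and column names, and the note-extraction step, are assumptions about the real schema):
```python
import sqlite_utils

db = sqlite_utils.Database('notes.db')
last_seen = db.execute('select max(updated) from notes').fetchone()[0]

for note in extract_notes():  # hypothetical stand-in for the existing full extraction loop
    if last_seen is not None and note['updated'] <= last_seen:
        continue  # already stored; skipping (rather than breaking) avoids depending on iteration order
    db['notes'].upsert(note, pk='id', alter=True)
```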
Problem is I don't actually know what order it iterates over the notes in.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1612296210,I_kwDOBm6k_c5gGbAS,2033,`datasette install -r requirements.txt`,9599,simonw,closed,0,,,,,2,2023-03-06T22:17:17Z,2023-03-06T22:54:52Z,2023-03-06T22:27:34Z,OWNER,,"Would be useful for cases where you want to install a whole set of plugins in one go, e.g. when running tutorials in GitHub Codespaces.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2033/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1590839187,PR_kwDOBm6k_c5KSs9T,2028,add Python 3.11 classifier,614233,dtrodrigues,closed,0,,,,,2,2023-02-19T20:16:03Z,2023-03-06T21:01:20Z,2023-03-06T21:01:19Z,CONTRIBUTOR,simonw/datasette/pulls/2028,"Python 3.11 is tested in CI and is used in the docker image, so add the Python 3.11 Trove classifier.
----
:books: Documentation preview :books:: https://datasette--2028.org.readthedocs.build/en/2028/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2028/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1594383280,I_kwDOBm6k_c5fCFuw,2030,How to use Datasette with apache webserver on GCP?,19700859,gk7279,closed,0,,,,,2,2023-02-22T03:08:49Z,2023-02-22T21:54:39Z,2023-02-22T21:54:39Z,NONE,,"Hi Simon and Datasette team-
I have installed apache2 webserver inside GCP VM using apt.
I can see my ""Hello World"" index.html if I use the external IP of this GCP in a browser.
However, when I try to run datasette with different combinations of -h and -p, I am still unable to access the webpage.
I cannot install Docker on this VM.
Any pointers on using datasette with an already existing apache2 webserver on GCP are appreciated.
Thanks.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2030/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1579695809,I_kwDOBm6k_c5eKD7B,2023,Error: Invalid setting 'hash_urls' in settings.json in 0.64.1,80409402,mlaparie,closed,0,,,,,2,2023-02-10T13:35:01Z,2023-02-10T15:40:00Z,2023-02-10T15:39:59Z,NONE,,"On a Debian machine, using datasette 0.64.1 installed with `pip3`, I am getting a `datasette[114272]: Error: Invalid setting 'hash_urls' in settings.json` in `journalctl -xe`. The same settings work on 0.54.1 on another Debian server.
This is my `settings.json`:
```json
{
""default_page_size"": 200,
""max_returned_rows"": 8000,
""num_sql_threads"": 3,
""sql_time_limit_ms"": 1000,
""default_facet_size"": 30,
""facet_time_limit_ms"": 200,
""facet_suggest_time_limit_ms"": 50,
""hash_urls"": false,
""allow_facet"": true,
""allow_download"": true,
""suggest_facets"": true,
""default_cache_ttl"": 5,
""default_cache_ttl_hashed"": 31536000,
""cache_size_kb"": 0,
""allow_csv_stream"": true,
""max_csv_mb"": 100,
""truncate_cells_html"": 2048,
""force_https_urls"": false,
""template_debug"": false,
""base_url"": ""/pclim/db/""
}
```
This looks ok to me. Would you have any ideas?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2023/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1186696202,I_kwDOBm6k_c5Gu4wK,1696,Show foreign key label when filtering,9599,simonw,open,0,,,,,2,2022-03-30T16:18:54Z,2023-01-29T20:56:20Z,,OWNER,,"For example here:
3 corresponds to ""Human Related: Other"" - it would be neat to display this in this area of the page somehow.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1696/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1558644003,I_kwDOBm6k_c5c5wUj,2006,Teach `datasette publish` to pin to `datasette<1.0` in a 0.x release,9599,simonw,open,0,,,3268330,Datasette 1.0,2,2023-01-26T19:17:40Z,2023-01-26T19:20:53Z,,OWNER,,"I just realized that when I ship Datasette 1.0 there may be automated deployments out there which could deploy the 1.0 version by accident, potentially breaking any customizations that aren't compatible with the 1.0 changes.
I can hopefully help avoid that by shipping one last entry in the `0.x` series that ensures `datasette publish` pins to `<1.0` when it installs Datasette itself.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2006/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1551113681,I_kwDOBm6k_c5cdB3R,1998,`datasette --version` should also show the SQLite version,9599,simonw,open,0,,,,,2,2023-01-20T16:11:30Z,2023-01-20T18:19:06Z,,OWNER,,Idea came up here: https://discord.com/channels/823971286308356157/823971286941302908/1066026473003159783,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1998/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1522778923,I_kwDOBm6k_c5aw8Mr,1978,Document datasette.urls.row and row_blob,25778,eyeseast,closed,0,,,,,2,2023-01-06T15:45:51Z,2023-01-09T14:30:00Z,2023-01-09T14:30:00Z,CONTRIBUTOR,,"These are in the codebase but not in documentation. I think everything else in this class is documented.
```python
class Urls:
    ...

    def row(self, database, table, row_path, format=None):
        path = f""{self.table(database, table)}/{row_path}""
        if format is not None:
            path = path_with_format(path=path, format=format)
        return PrefixedUrlString(path)

    def row_blob(self, database, table, row_path, column):
        return self.table(database, table) + ""/{}.blob?_blob_column={}"".format(
            row_path, urllib.parse.quote_plus(column)
        )
```
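Based on those method bodies, usage presumably looks like this (values are illustrative):
```python
from datasette.app import Datasette

ds = Datasette(memory=True)
print(ds.urls.row('fixtures', 'facetable', '1'))
# /fixtures/facetable/1
print(ds.urls.row_blob('fixtures', 'facetable', '1', 'data'))
# /fixtures/facetable/1.blob?_blob_column=data
```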
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1978/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,not_planned
1522552817,PR_kwDOBm6k_c5G0XxH,1977,Bump sphinx from 5.3.0 to 6.1.1,49699333,dependabot[bot],closed,0,,,,,2,2023-01-06T13:02:12Z,2023-01-09T13:06:17Z,2023-01-09T13:06:14Z,CONTRIBUTOR,simonw/datasette/pulls/1977,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 5.3.0 to 6.1.1.
Release notes
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=5.3.0&new-version=6.1.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
----
:books: Documentation preview :books:: https://datasette--1977.org.readthedocs.build/en/1977/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1977/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1524867951,I_kwDOBm6k_c5a46Nv,1980,"""Cannot sort table by id"" when sortable_columns is used",9599,simonw,open,0,,,,,2,2023-01-09T03:21:33Z,2023-01-09T03:23:53Z,,OWNER,,"I had an instance with this in `metadata.yml`:
```yaml
databases:
timezones:
tables:
timezones:
sortable_columns:
- tzid
```
When I clicked on the ""Apply"" button here:
It sent me to `/timezones/timezones?_sort=id&id__exact=133` with the error message:
> 500: Cannot sort table by id",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1980/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1520712722,PR_kwDOBm6k_c5GuDBN,1976,Bump sphinx from 5.3.0 to 6.1.0,49699333,dependabot[bot],closed,0,,,,,2,2023-01-05T13:02:37Z,2023-01-06T13:02:17Z,2023-01-06T13:02:15Z,CONTRIBUTOR,simonw/datasette/pulls/1976,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 5.3.0 to 6.1.0.
Release notes
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=5.3.0&new-version=6.1.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
----
:books: Documentation preview :books:: https://datasette--1976.org.readthedocs.build/en/1976/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1976/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1516376583,PR_kwDOBm6k_c5GfPJL,1974,Bump sphinx from 5.3.0 to 6.0.0,49699333,dependabot[bot],closed,0,,,,,2,2023-01-02T13:04:26Z,2023-01-05T13:02:42Z,2023-01-05T13:02:40Z,CONTRIBUTOR,simonw/datasette/pulls/1974,"Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 5.3.0 to 6.0.0.
Release notes
#10470: Drop Python 3.7, Docutils 0.14, Docutils 0.15, Docutils 0.16, and
Docutils 0.17 support. Patch by Adam Turner
Incompatible changes
#7405: Removed the jQuery and underscore.js JavaScript frameworks.
These frameworks are no longer be automatically injected into themes from
Sphinx 6.0. If you develop a theme or extension that uses the
jQuery, $, or $u global objects, you need to update your
JavaScript to modern standards, or use the mitigation below.
The first option is to use the sphinxcontrib.jquery_ extension, which has been
developed by the Sphinx team and contributors. To use this, add
sphinxcontrib.jquery to the extensions list in conf.py, or call
app.setup_extension("sphinxcontrib.jquery") if you develop a Sphinx theme
or extension.
The second option is to manually ensure that the frameworks are present.
To re-add jQuery and underscore.js, you will need to copy jquery.js and
underscore.js from the Sphinx repository_ to your static directory,
and add the following to your layout.html:
#10471, #10565: Removed deprecated APIs scheduled for removal in Sphinx 6.0. See
:ref:dev-deprecated-apis for details. Patch by Adam Turner.
#10901: C Domain: Remove support for parsing pre-v3 style type directives and
roles. Also remove associated configuration variables c_allow_pre_v3 and
c_warn_on_allowed_pre_v3. Patch by Adam Turner.
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sphinx&package-manager=pip&previous-version=5.3.0&new-version=6.0.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
----
:books: Documentation preview :books:: https://datasette--1974.org.readthedocs.build/en/1974/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1974/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1515182998,I_kwDOBm6k_c5aT9uW,1970,"Path ""None"" in _internal database table",9599,simonw,closed,0,,,,,2,2022-12-31T18:51:05Z,2022-12-31T19:22:58Z,2022-12-31T18:52:49Z,OWNER,,"See https://latest.datasette.io/_internal/databases (after https://latest.datasette.io/login-as-root)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1970/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1503010009,PR_kwDOBm6k_c5FyT3c,1967,Add favicon to documentation,1839645,choldgraf,closed,0,,,,,2,2022-12-19T14:01:04Z,2022-12-31T19:15:51Z,2022-12-31T19:00:31Z,CONTRIBUTOR,simonw/datasette/pulls/1967,"I've been browsing the datasette documentation and found it hard to quickly locate tabs with many of them open, because it does not ship a favicon. So this PR:
- Grabs the favicon `.png` from datasette itself[^1]
- Adds it to the `_static/` folder
- Sets `html_favicon` to load it in the docs
[^1]: I also learned that Chrome can fetch favicons as an internal service! See `chrome://favicon/https://datasette.io/tools/github-to-sqlite`.
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1967/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1501843596,PR_kwDOBm6k_c5FuaJm,1965,Detect server start/stop more reliably.,11321,janl,closed,0,,,,,2,2022-12-18T10:03:42Z,2022-12-20T19:08:26Z,2022-12-18T16:01:51Z,CONTRIBUTOR,simonw/datasette/pulls/1965,"This is useful, especially in testing, since your test hosts might not reliably start the server within two seconds, so we do a definite check before progressing.
By the same token, after `kill $server_pid` wait for the pid to be gone from the process list.
Since now the script can end prematurely, I also added a cleanup function to make sure the temporary certs are removed in any case.
n.b. this could also be done with the use of `trap 'fn' ERR` but that felt like a bit too much magic for this short a script.
----
:books: Documentation preview :books:: https://datasette--1965.org.readthedocs.build/en/1965/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1965/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1501713288,I_kwDOBm6k_c5ZglOI,1963,0.63.3 bugfix release,9599,simonw,closed,0,,,,,2,2022-12-18T02:48:15Z,2022-12-18T03:26:55Z,2022-12-18T03:26:55Z,OWNER,,"I'm going to ship a release which back-ports these two fixes:
- https://github.com/simonw/datasette/issues/1958
- https://github.com/simonw/datasette/issues/1955",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1963/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1495821607,I_kwDOBm6k_c5ZKG0n,1953,Release notes for Datasette 1.0a2,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,2,2022-12-14T06:26:40Z,2022-12-15T02:02:15Z,2022-12-15T02:01:08Z,OWNER,,"https://github.com/simonw/datasette/milestone/27?closed=1
https://github.com/simonw/datasette/compare/1.0a1...9ad76d279e2c3874ca5070626a25458ce129f126",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1953/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
855296937,MDU6SXNzdWU4NTUyOTY5Mzc=,1295,Errors should have links to further information,9599,simonw,open,0,,,,,2,2021-04-11T12:39:12Z,2022-12-14T23:28:49Z,,OWNER,,"Inspired by this tweet:
https://twitter.com/willmcgugan/status/1381186384510255104
> While I am thinking about faqs. I’d also like to add short URLs to Rich exceptions.
>
> I loath cryptic error messages, and I’ve created a fair few myself. In Rich I’ve tried to make them as plain English as possible. But...
>
> would be great if every error message linked to a page that explains the error in detail and offers fixes.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1295/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1483320357,I_kwDOBm6k_c5Yaawl,1937,/db/-/create API should require insert-rows permission to use row: or rows: option,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,2,2022-12-08T01:33:09Z,2022-12-14T20:21:26Z,2022-12-14T20:21:26Z,OWNER,,Otherwise someone with `create-table` but no `insert-rows` permission could abuse it to insert data.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1937/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1497288666,I_kwDOBm6k_c5ZPs_a,1956,Handle abbreviations properly in permission_allowed_actor_restrictions,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,2,2022-12-14T19:54:21Z,2022-12-14T20:04:29Z,2022-12-14T20:04:28Z,OWNER,,"This code currently assumes abbreviations are:
```python
action_initials = """".join([word[0] for word in action.split(""-"")])
```
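A quick illustration of what that expression produces:
```python
# Each action name is abbreviated to the first letter of each hyphen-separated word
for action in ('insert-rows', 'create-table', 'view-instance'):
    print(action, '->', ''.join(word[0] for word in action.split('-')))
# insert-rows -> ir
# create-table -> ct
# view-instance -> vi
```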
https://github.com/simonw/datasette/blob/1a3dcf494376e32f7cff110c86a88e5b0a3f3924/datasette/default_permissions.py#L182-L208
That's no longer correct, they are now registered by the new plugin hook:
- #1939 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1956/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1495241162,I_kwDOBm6k_c5ZH5HK,1950,"Bad ?_sort returns a 500 error, should be a 400",9599,simonw,closed,0,,,,,2,2022-12-13T22:08:16Z,2022-12-13T22:23:22Z,2022-12-13T22:23:22Z,OWNER,,"https://latest.datasette.io/fixtures/facetable?_sort=bad
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1950/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1200649124,I_kwDOBm6k_c5HkHOk,1708,Datasette 1.0 alpha upcoming release notes,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,2,2022-04-11T22:57:12Z,2022-12-13T05:29:06Z,,OWNER,,"I'm going to try writing the release notes first, to see if that helps unblock me.
# ⚠️ Any release notes in this issue are a draft, and should not be treated as the real thing ⚠️ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1708/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1493339206,I_kwDOBm6k_c5ZAoxG,1946,`datasette --get` mechanism for sending tokens,9599,simonw,closed,0,,,8711695, Datasette 1.0a2,2,2022-12-13T04:25:05Z,2022-12-13T04:36:57Z,2022-12-13T04:36:57Z,OWNER,,"> For the tests for `datasette create-token` it would be useful if `datasette --get` had a mechanism for sending an `Authorization: Bearer X` header.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1855#issuecomment-1347731288_
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1946/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1473664029,PR_kwDOBm6k_c5ELz0u,1930,Typo in JSON API `Updating a row` documentation,3556,davidbgk,closed,0,,,,,2,2022-12-03T02:22:31Z,2022-12-08T21:12:35Z,2022-12-08T21:12:35Z,CONTRIBUTOR,simonw/datasette/pulls/1930,"
----
:books: Documentation preview :books:: https://datasette--1930.org.readthedocs.build/en/1930/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1930/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1483250004,I_kwDOBm6k_c5YaJlU,1936,Fix /db/table/-/upsert in the API explorer,9599,simonw,open,0,,,3268330,Datasette 1.0,2,2022-12-08T00:59:34Z,2022-12-08T01:36:02Z,,OWNER,,"Split from:
- #1931
- #1878
This is a bit tricky because the code needs to figure out what the primary keys are for an item, and whether or not `rowid` should be included.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1936/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1469043836,I_kwDOBm6k_c5Xj9R8,1917,Don't allow writable API to edit the `_memory` database,9599,simonw,closed,0,,,7867486,Datasette 1.0a1,2,2022-11-30T04:51:59Z,2022-11-30T05:07:56Z,2022-11-30T05:07:55Z,OWNER,,"It shows up on https://latest.datasette.io/-/api (once you are signed in as root) - but there's no point in creating tables in it because they likely won't persist from one request to the next, as it's not a shared named database.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1917/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1468519699,I_kwDOBm6k_c5Xh9UT,1911,`/db/-/create` should support creating tables with compound primary keys,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,2,2022-11-29T18:30:47Z,2022-11-29T18:50:58Z,2022-11-29T18:48:05Z,OWNER,,"Found myself needing this to write the tests for:
- #1864 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1911/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1455928469,I_kwDOBm6k_c5Wx7SV,1903,Refactor all error classes into a datasette.exceptions module,9599,simonw,open,0,,,3268330,Datasette 1.0,2,2022-11-18T22:44:45Z,2022-11-20T22:35:01Z,,OWNER,,"While working on this issue:
- #1896
I realized that Datasette has error classes scattered around a fair bit, including some in the `datasette.utils.asgi` module for some reason.
I should clean these up.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1903/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1452485922,PR_kwDOBm6k_c5DEh-E,1898,Use DOMContentLoaded instead of load event for CodeMirror initialization,95570,bgrins,closed,0,,,,,2,2022-11-17T00:19:21Z,2022-11-18T07:29:01Z,2022-11-18T07:29:01Z,CONTRIBUTOR,simonw/datasette/pulls/1898," Closes #1894
----
:books: Documentation preview :books:: https://datasette--1898.org.readthedocs.build/en/1898/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1898/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1430563092,PR_kwDOCGYnMM5B6_6K,508,Allow surrogates in parameters,7908073,chapmanjacobd,closed,0,,,,,2,2022-10-31T22:11:49Z,2022-11-17T15:11:16Z,2022-10-31T22:55:36Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/508,"closes #507
https://dwheeler.com/essays/fixing-unix-linux-filenames.html
----
:books: Documentation preview :books:: https://sqlite-utils--508.org.readthedocs.build/en/508/
",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/508/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1447388809,I_kwDOBm6k_c5WRWaJ,1887,Add a confirm step to the drop table API,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,2,2022-11-14T04:59:53Z,2022-11-15T19:59:59Z,2022-11-14T05:18:51Z,OWNER,,"> In playing with the API explorer just now I realized it's way too easy to accidentally drop a table using it.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1871#issuecomment-1313097057_
Added drop table API in:
- #1874",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1887/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1436539554,I_kwDOCGYnMM5Vn9qi,511,"[insert_all, upsert_all] IntegrityError: constraint failed",7908073,chapmanjacobd,closed,0,,,,,2,2022-11-04T19:21:48Z,2022-11-04T22:59:54Z,2022-11-04T22:54:09Z,CONTRIBUTOR,,"My understanding is that `INSERT OR IGNORE` will ignore inserts that would cause duplicate keys, so I'm not sure exactly why the error is raised from `sqlite3`.
```
import argparse
from pathlib import Path
from xklb import db, utils
from xklb.utils import log
def parse_args() -> argparse.Namespace:
parser = argparse.ArgumentParser()
parser.add_argument(""database"")
parser.add_argument(""dbs"", nargs=""*"")
parser.add_argument(""--upsert"")
parser.add_argument(""--db"", ""-db"", help=argparse.SUPPRESS)
parser.add_argument(""--verbose"", ""-v"", action=""count"", default=0)
args = parser.parse_args()
if args.db:
args.database = args.db
Path(args.database).touch()
args.db = db.connect(args)
log.info(utils.dict_filter_bool(args.__dict__))
return args
def merge_db(args, source_db):
source_db = str(Path(source_db).resolve())
s_db = db.connect(argparse.Namespace(database=source_db, verbose=args.verbose))
for table in [s for s in s_db.table_names() if not ""_fts"" in s and not s.startswith(""sqlite_"")]:
log.info(""[%s]: %s"", source_db, table)
with s_db.conn:
data = s_db[table].rows
with args.db.conn:
if args.upsert:
args.db[table].upsert_all(data, pk=args.upsert.split("",""), alter=True)
else:
args.db[table].insert_all(data, alter=True, replace=True)
def merge_dbs():
args = parse_args()
for s_db in args.dbs:
merge_db(args, s_db)
if __name__ == ""__main__"":
merge_dbs()
```
```
$ lb-dev merge video.db tube_71.db --upsert path -vv
SQL: INSERT OR IGNORE INTO [media]([path]) VALUES(?); - params: ['https://archive.org/details/088ghostofachanceroygetssackedrevengeofthelivinglunchdvdripxvidphz']
...
File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:3122, in Table.insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, hash_id_columns, alter, ignore, replace, truncate, extracts, conversions, columns, upsert, analyze)
3116 all_columns += [
3117 column for column in record if column not in all_columns
3118 ]
3120 first = False
-> 3122 self.insert_chunk(
3123 alter,
3124 extracts,
3125 chunk,
3126 all_columns,
3127 hash_id,
3128 hash_id_columns,
3129 upsert,
3130 pk,
3131 conversions,
3132 num_records_processed,
3133 replace,
3134 ignore,
3135 )
3137 if analyze:
3138 self.analyze()
File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:2887, in Table.insert_chunk(self, alter, extracts, chunk, all_columns, hash_id, hash_id_columns, upsert, pk, conversions, num_records_processed, replace, ignore)
2885 for query, params in queries_and_params:
2886 try:
-> 2887 result = self.db.execute(query, params)
2888 except OperationalError as e:
2889 if alter and ("" column"" in e.args[0]):
2890 # Attempt to add any missing columns, then try again
File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:484, in Database.execute(self, sql, parameters)
482 self._tracer(sql, parameters)
483 if parameters is not None:
--> 484 return self.conn.execute(sql, parameters)
485 else:
486 return self.conn.execute(sql)
IntegrityError: constraint failed
> /home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py(484)execute()
482 self._tracer(sql, parameters)
483 if parameters is not None:
--> 484 return self.conn.execute(sql, parameters)
485 else:
486 return self.conn.execute(sql)
```
```
sqlite3 --version
3.36.0 2021-06-18 18:36:39
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/511/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1431786951,I_kwDOBm6k_c5VV1XH,1876,SQL query should wrap on SQL interrupted screen,9599,simonw,closed,0,,,,,2,2022-11-01T17:14:01Z,2022-11-01T17:22:33Z,2022-11-01T17:22:33Z,OWNER,,"Just saw this:
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1876/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1400431789,PR_kwDOBm6k_c5AWyQK,1837,Make hash and size a lazy property,536941,fgregg,closed,0,,,,,2,2022-10-06T23:51:22Z,2022-10-27T20:51:21Z,2022-10-27T20:51:20Z,CONTRIBUTOR,simonw/datasette/pulls/1837,"Many apologies, @simonw. My previous PR #1835 did not really solve the problem because the name of the database is often not known to the database object in the init method.
I took a cue from how you dealt with this issue and made hash a lazy property and did something similar with size.
----
:books: Documentation preview :books:: https://datasette--1837.org.readthedocs.build/en/1837/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1837/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1405196044,PR_kwDOCGYnMM5AmYzy,499,feat: recreate fts triggers after table transform,7908073,chapmanjacobd,open,0,,,,,2,2022-10-11T20:35:39Z,2022-10-26T17:54:51Z,,CONTRIBUTOR,simonw/sqlite-utils/pulls/499,"https://github.com/simonw/sqlite-utils/pull/498
----
:books: Documentation preview :books:: https://sqlite-utils--499.org.readthedocs.build/en/499/
alternatively, `self.disable_fts()`",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/499/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1423182778,I_kwDOCGYnMM5U1Au6,505,Release sqlite-utils 3.30,9599,simonw,closed,0,,,,,2,2022-10-25T22:20:05Z,2022-10-25T22:41:26Z,2022-10-25T22:41:16Z,OWNER,,https://github.com/simonw/sqlite-utils/compare/3.29...defa2974c6d3abc19be28d6b319649b8028dc966,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/505/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1420055377,I_kwDOBm6k_c5UpFNR,1847,Both _local_metadata and _metadata_local?,9599,simonw,closed,0,,,,,2,2022-10-24T01:43:08Z,2022-10-24T01:53:13Z,2022-10-24T01:53:13Z,OWNER,,"Spotted this in the debugger against the `datasette` object while running tests (`pytest -k test_permissions_cascade` to be exact):
```
(Pdb) [p for p in dir(self) if p.startswith('_') and '__' not in p]
['_actor', '_asset_urls', '_connected_databases', '_crumb_items', '_local_metadata', '_metadata', '_metadata_local', '_metadata_recursive_update', '_permission_checks', '_plugins', '_prepare_connection', '_refresh_schemas', '_refresh_schemas_lock', '_register_custom_units', '_register_renderers', '_root_token', '_routes', '_secret', '_settings', '_show_messages', '_startup_hook_calculation', '_startup_hook_fired', '_startup_invoked', '_threads', '_versions', '_write_messages_to_response']
```
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1847/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1404013495,PR_kwDOCGYnMM5AicIh,498,fix: enable-fts permanently save triggers,7908073,chapmanjacobd,closed,0,,,,,2,2022-10-11T05:10:51Z,2022-10-15T04:33:08Z,2022-10-11T06:34:31Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/498,"I was wondering why all my databases were giving wild search results. Turns out create_trigger was not sticking!
Running `sqlite-utils triggers x.db` shows `[]` after running `enable-fts` using the Python API. Looking at how the counts triggers are created, it seems that is the right way to save triggers - they show up now.
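For anyone hitting the same thing, here's a minimal example of the Python API call in question (table and column names are made up):
```python
from sqlite_utils import Database
db = Database(""x.db"")
# create_triggers=True is what needs to end up persisted in the database
db[""documents""].enable_fts([""title"", ""body""], create_triggers=True)
print(db[""documents""].triggers)  # should no longer be []
```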
----
:books: Documentation preview :books:: https://sqlite-utils--498.org.readthedocs.build/en/498/
",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/498/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1386456717,PR_kwDOBm6k_c4_oHI4,1820,[SPIKE] Don't truncate query CSVs,536941,fgregg,closed,0,,,,,2,2022-09-26T17:27:01Z,2022-10-07T16:12:17Z,2022-10-07T16:12:17Z,CONTRIBUTOR,simonw/datasette/pulls/1820,"Relates to #526
This is a minimal set of changes needed for having *query* CSVs attempt to download all the rows.
What's good about it is the minimalism.
What's bad about it:
1. We are abusing the `_size` argument to indicate we don't want truncation, which isn't the most obvious thing. Additionally, there are various checks that make sure the ""_size"" URL parameter is a positive integer, which we are relying on to prevent overloading.
2. The default CSV on a table page will use the max_returned_rows argument. Changing this could be a breaking change, since that's currently a place that has some facilities for pagination. Additionally, I think there's a limit under the hood somewhere which, if removed, could lead to SQL timeouts.
3. There are similar reasons for leaving the current streaming method alone, as the current methods could allow for downloading very large files that could have a sql timeout if we tried to get them in one go.
----
:books: Documentation preview :books:: https://datasette--1820.org.readthedocs.build/en/1820/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1820/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1,
1397193691,I_kwDOBm6k_c5TR3vb,1832,__bool__ method on Results,9599,simonw,closed,0,,,,,2,2022-10-05T04:18:12Z,2022-10-05T04:32:33Z,2022-10-05T04:32:33Z,OWNER,,"Wrote this code today: https://github.com/simonw/datasette-public/blob/1401bfae50e71c1dfd2bfb6954f2e86d5a7ab21b/datasette_public/__init__.py#L41
```python
results = await db.execute(
""select 1 from _public_tables where table_name = ?"", [table_name]
)
if len(results):
return True
```
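A `__bool__` implementation on `Results` would be tiny - a sketch, assuming the class keeps its rows in `self.rows` (not the actual patch):
```python
class Results:
    # ... existing attributes, including self.rows ...
    def __bool__(self):
        # Truthy if the query returned any rows
        return len(self.rows) > 0
```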
Would be nice if I could use `if results` there instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1832/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
923270900,MDExOlB1bGxSZXF1ZXN0NjcyMDUzODEx,65,basic support for events,231498,khimaros,open,0,,,,,2,2021-06-17T00:51:30Z,2022-10-03T22:35:03Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/65,"a quick first pass at implementing the feature requested in https://github.com/dogsheep/github-to-sqlite/issues/64
testing instructions:
```
$ github-to-sqlite events events.db user/khimaros
```
if the specified user is the authenticated user, it will also include private events.
caveat: pagination appears to be broken (i don't see `next` in the response JSON from GitHub)",207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/65/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1393903845,I_kwDOBm6k_c5TFUjl,1828,word-wrap: anywhere resulting in weird display,9599,simonw,closed,0,,,,,2,2022-10-02T21:25:03Z,2022-10-02T23:01:17Z,2022-10-02T23:01:17Z,OWNER,,"e.g. on https://github-to-sqlite.dogsheep.net/github/commits
This is from a change introduced here: https://github.com/simonw/datasette/commit/bf8d84af5422606597be893cedd375020cb2b369 in #1805
https://github.com/simonw/datasette/blob/bf8d84af5422606597be893cedd375020cb2b369/datasette/static/app.css#L447-L450",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1828/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1122427321,I_kwDOBm6k_c5C5uG5,1624,Index page `/` has no CORS headers,9599,simonw,open,0,,,,,2,2022-02-02T21:56:10Z,2022-09-28T16:54:22Z,,OWNER,,"Compare the following:
```
% curl -I 'https://latest.datasette.io/fixtures'
HTTP/1.1 200 OK
link: https://latest.datasette.io/fixtures.json; rel=""alternate""; type=""application/json+datasette""
cache-control: max-age=5
referrer-policy: no-referrer
access-control-allow-origin: *
access-control-allow-headers: Authorization
access-control-expose-headers: Link
content-type: text/html; charset=utf-8
x-databases: _memory, _internal, fixtures, extra_database
Date: Wed, 02 Feb 2022 21:55:49 GMT
Server: Google Frontend
Transfer-Encoding: chunked
% curl -I 'https://latest.datasette.io/'
HTTP/1.1 200 OK
link: https://latest.datasette.io/.json; rel=""alternate""; type=""application/json+datasette""
content-type: text/html; charset=utf-8
x-databases: _memory, _internal, fixtures, extra_database
Date: Wed, 02 Feb 2022 21:55:52 GMT
Server: Google Frontend
Transfer-Encoding: chunked
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1624/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1388227245,PR_kwDOBm6k_c4_uCkO,1825,Add documentation for serving via OpenRC,1048831,asimpson,closed,0,,,,,2,2022-09-27T19:00:56Z,2022-09-28T04:21:37Z,2022-09-28T04:21:37Z,CONTRIBUTOR,simonw/datasette/pulls/1825,"I also removed a few lines which felt redundant given the following section dedicated to running behind a nginx proxy.
----
:books: Documentation preview :books:: https://datasette--1825.org.readthedocs.build/en/1825/
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1825/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1386593843,I_kwDOCGYnMM5Spb4z,494,Document how to use Just,9599,simonw,closed,0,,,,,2,2022-09-26T19:25:12Z,2022-09-26T19:32:36Z,2022-09-26T19:26:39Z,OWNER,,"I'm using `just` a lot now, based on this file - I should add that to https://sqlite-utils.datasette.io/en/latest/contributing.html
https://github.com/simonw/sqlite-utils/blob/afbd2b2cba45cccb305c3d4638d18db4dd3d4bbd/Justfile#L1-L24",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/494/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1363765916,I_kwDOCGYnMM5RSWqc,483,`sqlite-utils install` command,9599,simonw,closed,0,,,,,2,2022-09-06T20:13:55Z,2022-09-26T19:04:43Z,2022-09-26T18:57:15Z,OWNER,,"With the addition of `--functions` in:
- #471
In addition to the existing `convert` command, there are now very good reasons to want to install additional packages into the same virtual environment as `sqlite-utils` itself, to allow them to be used with those features.
This isn't easy if you installed the tool with `pipx` or `brew install sqlite-utils`.
Datasette solved this problem with the `datasette install` command:
- https://github.com/simonw/datasette/issues/925
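A rough sketch of what the equivalent could look like here (simplified - not how `datasette install` is actually implemented):
```python
import sys
from subprocess import run
import click
@click.command()
@click.argument(""packages"", nargs=-1, required=True)
def install(packages):
    ""Install packages into the same virtual environment as sqlite-utils""
    # Running pip via the current interpreter guarantees the same environment as the tool
    run([sys.executable, ""-m"", ""pip"", ""install""] + list(packages), check=True)
```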
`sqlite-utils` could benefit from the same idea.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/483/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
520508502,MDU6SXNzdWU1MjA1MDg1MDI=,31,"""friends"" command (similar to ""followers"")",9599,simonw,closed,0,,,,,2,2019-11-09T20:20:20Z,2022-09-20T05:05:03Z,2020-02-07T07:03:28Z,MEMBER,,"Current list of commands:
```
followers Save followers for specified user (defaults to...
followers-ids Populate followers table with IDs of account followers
friends-ids Populate followers table with IDs of account friends
```
Obvious omission here is `friends`, which would be powered by `https://api.twitter.com/1.1/friends/list.json`: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-list",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1378640768,I_kwDOBm6k_c5SLGOA,1816,Validate settings.json on startup in configuration directory mode,9599,simonw,closed,0,,,,,2,2022-09-19T23:35:18Z,2022-09-20T01:15:48Z,2022-09-20T01:15:48Z,OWNER,,"> It might have been useful for Datasette to show an error when started against a `settings.json` file that contains an invalid setting though.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1814#issuecomment-1251677554_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1816/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1378495690,I_kwDOBm6k_c5SKizK,1814,Static files not served,4068,frafra,closed,0,,,,,2,2022-09-19T20:38:17Z,2022-09-19T23:35:06Z,2022-09-19T23:34:30Z,NONE,,"Folder structure:
```
bibliography/
bibliography/static-files
bibliography/static-files/styles.css
bibliography/bibliography.db
bibliography/metadata.json
bibliography/settings.json
```
```
$ cat bibliography/settings.json
{
""suggest_facets"": false,
""truncate_cells_html"": 1000,
""static"": ""assets:static-files/""
}
```
File `/assets/styles.css` is not found (HTTP 404, `Database not found: assets`).
Using datasette revision d0737e4de51ce178e556fc011ccb8cc46bbb6359.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1814/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1367835380,I_kwDOCGYnMM5Rh4L0,487,Specify foreign key against compound key in other table,540968,ryanfox,closed,0,,,,,2,2022-09-09T13:32:09Z,2022-09-11T04:00:44Z,2022-09-11T04:00:44Z,NONE,,"When inserting rows via the library, is it possible to specify a foreign key to a compound primary key?
For example, suppose I create a table:
```
db = Database('events.db')
db['events'].insert_all([
{'venue': 'Times Square', 'date': '2022-12-31', 'title': 'Rockin New Year Eve'},
{'venue': 'Wembley Stadium', 'date': '2022-06-05', 'title': 'FA Cup'},
{'venue': 'Times Square', 'date': '2021-12-31', 'title': 'Rockin New Year Eve'},
], pk=('date', 'venue'))
```
And I want to add related data in another table:
```
act = {'name': 'Rick Astley', 'venue': 'Times Square', 'date': '2021-12-31' }
db['performers'].insert(act, pk=??>)
```
Is it possible to specify a value for `pk` that will point to the compound primary key in `events`?
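At the SQL level the relationship in question is straightforward - a sketch, continuing from the `db` object created above:
```python
# Sketch only: the raw SQL equivalent of what I'd like to express through the library
db.execute('''
    CREATE TABLE performers (
        name TEXT,
        date TEXT,
        venue TEXT,
        FOREIGN KEY (date, venue) REFERENCES events (date, venue)
    )
''')
```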
SQLite does support it:
https://www.sqlite.org/foreignkeys.html#fk_composite",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/487/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1353441389,I_kwDOCGYnMM5Qq-Bt,477,Conda Forge,49702524,thewchan,closed,0,,,,,2,2022-08-28T19:03:08Z,2022-09-07T03:46:55Z,2022-09-07T03:46:55Z,NONE,,"Hello! I have successfully put this package onto Conda Forge, and I am extending the invitation to the owner/maintainers of this package to be maintainers on Conda Forge as well. Let me know if you are interested! Thanks.
https://github.com/conda-forge/sqlite-utils-feedstock",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/477/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1353196970,I_kwDOCGYnMM5QqCWq,476,Release notes for 3.29,9599,simonw,closed,0,,,8355157,3.29,2,2022-08-27T23:21:21Z,2022-08-28T04:07:15Z,2022-08-28T04:07:03Z,OWNER,,https://github.com/simonw/sqlite-utils/compare/3.28...104f37fa4d2e7e5999c1d829267b62c737f74d3e,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/476/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1352946135,I_kwDOCGYnMM5QpFHX,472,Reuse the locals/globals fix from --functions for other code accepting options,9599,simonw,closed,0,,,8355157,3.29,2,2022-08-27T05:12:05Z,2022-08-27T05:20:12Z,2022-08-27T05:20:12Z,OWNER,,"I figured out a workaround for the ugly `global x` hack here:
- https://github.com/simonw/sqlite-utils/issues/471#issuecomment-1229120653",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/472/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1319881016,PR_kwDOCGYnMM48Mmde,457,Link to installation instructions,9599,simonw,closed,0,,,8355157,3.29,2,2022-07-27T17:38:36Z,2022-08-27T03:55:52Z,2022-07-27T17:57:50Z,OWNER,simonw/sqlite-utils/pulls/457,Also testing https://docs.readthedocs.io/en/stable/pull-requests.html,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/457/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1339663518,I_kwDOBm6k_c5P2aSe,1784,"Include ""entrypoint"" option on `--load-extension`?",15178711,asg017,closed,0,,,,,2,2022-08-16T00:22:57Z,2022-08-23T18:34:31Z,2022-08-23T18:34:31Z,CONTRIBUTOR,,"## Problem
SQLite extensions have the option to define multiple ""entrypoints"" in each loadable extension. For example, the upcoming version of `sqlite-lines` will have 2 entrypoints: the default `sqlite3_lines_init` (which SQLite will automatically guess for) and `sqlite3_lines_noread_init`. The `sqlite3_lines_noread_init` version omits functions that read from the filesystem, which is necessary for security purposes when running untrusted SQL (which Datasette does).
(Similar multiple entrypoints will also be added for sqlite-http).
The `--load-extension` flag, however, doesn't give the option to specify a different entrypoint, so the default one is always used.
## Proposal
I want there to be a new command line option of the `--load-extension` flag to specify a custom entrypoint like so:
```
datasette my.db \
--load-extension ./lines0 sqlite3_lines0_noread_init
```
Then, under the hood, this line of code:
https://github.com/simonw/datasette/blob/7af67b54b7d9bca43e948510fc62f6db2b748fa8/datasette/app.py#L562
Would look something like this:
```python
conn.execute(""SELECT load_extension(?, ?)"", [extension, entrypoint])
```
One potential problem: For backward compatibility, I'm not sure if Click allows cli flags to have variable number of options (""arity""). So I guess it could also use a `:` delimiter like `--static`:
```
datasette my.db \
--load-extension ./lines0:sqlite3_lines0_noread_init
```
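Under the hood the `:` form would only need a little parsing - a sketch (helper name assumed; Windows-style paths containing `:` would need more care):
```python
def parse_load_extension(value):
    # ""./lines0:sqlite3_lines0_noread_init"" -> (""./lines0"", ""sqlite3_lines0_noread_init"")
    if "":"" in value:
        path, entrypoint = value.rsplit("":"", 1)
        return path, entrypoint
    return value, None
```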
Or maybe even a new flag name?
```
datasette my.db \
--load-extension-entrypoint ./lines0 sqlite3_lines0_noread_init
```
Personally I prefer the `:` option... and maybe even `--load-extension` -> `--load`? Definitely out of scope for this issue tho
```
datasette my.db \
--load./lines0:sqlite3_lines0_noread_init
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1784/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1342357149,PR_kwDOCGYnMM49Wsnq,465,beanbag-docutils>=2.0,9599,simonw,closed,0,,,,,2,2022-08-17T22:41:39Z,2022-08-17T23:38:07Z,2022-08-17T23:38:02Z,OWNER,simonw/sqlite-utils/pulls/465,Refs #464,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/465/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1339444565,I_kwDOBm6k_c5P1k1V,1783,Better guidance as to what to do after you've installed Datasette,9599,simonw,open,0,,,,,2,2022-08-15T20:11:06Z,2022-08-15T20:14:01Z,,OWNER,,"Feedback [from Discord](https://discord.com/channels/823971286308356157/823971286941302908/1008822978793984060):
> hello, love the project and came for help and to point out a possible gap in the docs. starting with ""getting started"" and ""installation"" every thing looks great, but then there's a giant leap after you have it installed and running. from the user perspective of ""i have a csv of set of csvs that i want to turn into a table(s), what do i do next?"" --- so something like maybe a page for creating your first project should go after ""installation"".
- https://docs.datasette.io/en/0.62/getting_started.html
- https://docs.datasette.io/en/0.62/installation.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1783/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1338278056,I_kwDOBm6k_c5PxICo,1782,Release notes for Datasette 0.62,9599,simonw,closed,0,,,8303187,Datasette 0.62,2,2022-08-14T15:26:45Z,2022-08-14T17:40:45Z,2022-08-14T17:32:54Z,OWNER,,"I've written a lot of these already for the alphas:
- https://github.com/simonw/datasette/releases/tag/0.62a0
- https://github.com/simonw/datasette/releases/tag/0.62a1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1782/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1223527226,I_kwDOBm6k_c5I7Ys6,1738,"""Cannot use _sort and _sort_desc at the same time""",9599,simonw,closed,0,,,8303187,Datasette 0.62,2,2022-05-03T01:06:24Z,2022-08-14T16:13:55Z,2022-08-14T16:13:55Z,OWNER,,"Triggered this error while playing with the sort desc checkbox and the apply button that are only visible on this page at mobile screen width:
https://latest.datasette.io/fixtures/compound_three_primary_keys?_sort_desc=pk1
Navigate to that page (with the browser narrow enough to show the box), un-check the box and click Apply:
![sort-bug](https://user-images.githubusercontent.com/9599/166390804-cb289b29-63dc-4986-b7f9-81cf2ae04914.gif)
Also notable: I managed to get to a page with `?_sort_desk=pk1` in the URL three times by clicking around with that button.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1738/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1326391841,PR_kwDOCGYnMM48iLGF,462,Discord badge,9599,simonw,closed,0,,,,,2,2022-08-02T20:56:04Z,2022-08-02T21:15:57Z,2022-08-02T21:15:52Z,OWNER,simonw/sqlite-utils/pulls/462,"Also testing fix for:
- https://github.com/readthedocs/readthedocs-preview/issues/10
----
:books: Documentation preview :books:: https://sqlite-utils--462.org.readthedocs.build/en/462/
",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/462/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1292377561,PR_kwDOBm6k_c46wdOW,1766,Keep track of config_dir,25778,eyeseast,closed,0,,,,,2,2022-07-03T17:37:02Z,2022-07-18T01:12:45Z,2022-07-18T01:12:45Z,CONTRIBUTOR,simonw/datasette/pulls/1766,"Closes #1764
Small change that adds `self.config_dir = config_dir` to `Datasette.__init__`. This will let plugins also use `config_dir`, if available.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1766/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1298531653,I_kwDOCGYnMM5NZgVF,451,Make sqlite_utils.utils.chunks a documented function,9599,simonw,closed,0,,,,,2,2022-07-08T06:01:04Z,2022-07-15T22:09:34Z,2022-07-15T21:59:33Z,OWNER,,I want to use it in another project: https://github.com/simonw/sqlite-utils/blob/8a9fe6498faf783a1fdeb1793e661ad194a05267/sqlite_utils/utils.py#L471-L474,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/451/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1261826957,PR_kwDOBm6k_c45Kojn,1753,Bump furo from 2022.4.7 to 2022.6.4.1,49699333,dependabot[bot],closed,0,,,,,2,2022-06-06T13:10:22Z,2022-06-22T13:22:37Z,2022-06-22T13:22:35Z,CONTRIBUTOR,simonw/datasette/pulls/1753,"Bumps [furo](https://github.com/pradyunsg/furo) from 2022.4.7 to 2022.6.4.1.
Changelog
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=furo&package-manager=pip&previous-version=2022.4.7&new-version=2022.6.4.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1753/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1277328147,I_kwDOCGYnMM5MInsT,446,Use Just to automate running tests and linters locally,9599,simonw,closed,0,,,,,2,2022-06-20T19:51:09Z,2022-06-21T19:28:35Z,2022-06-20T19:54:50Z,OWNER,,I keep committing code that fails additional tests like `mypy` and `flake8` and `black`. Automate those using Just.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/446/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1269998342,I_kwDOCGYnMM5LsqMG,443,Make `utils.rows_from_file()` a documented API,9599,simonw,closed,0,,,,,2,2022-06-13T21:53:24Z,2022-06-20T19:49:37Z,2022-06-14T20:12:46Z,OWNER,,"> `rows_from_file()` isn't part of the documented API but maybe it should be!
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/440#issuecomment-1154385916_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/443/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1224112817,I_kwDOCGYnMM5I9nqx,430,Document how to use `PRAGMA temp_store` to avoid errors when running VACUUM against huge databases,9308268,rayvoelker,open,0,,,,,2,2022-05-03T13:33:58Z,2022-06-14T23:26:37Z,,NONE,,"I'm trying to figure out a way to get the `table.extract()` method to complete successfully -- I'm wondering if the cause of this on Ubuntu Server 22.04 (and a possible solution) is to adjust some of the PRAGMA values within SQLite itself ... on another Linux system (PopOS), using this method on this same database appears to work just fine.
Here's the bit that's causing the error, and the resulting error output:
```python
# combine these columns into 1 table ""bib_properties"" :
# best_title
# bib_level_code
# mat_type
# material_code
# best_author
db[""circ_trans""].extract(
[""best_title"", ""bib_level_code"", ""mat_type"", ""material_code"", ""best_author""],
table=""bib_properties"",
fk_column=""bib_properties_id""
)
db[""circ_trans""].extract(
[""call_number""],
table=""call_number"",
fk_column=""call_number_id"",
rename={""call_number"": ""value""}
)
```
```python
---------------------------------------------------------------------------
OperationalError Traceback (most recent call last)
Input In [17], in ()
1 # combine these columns into 1 table ""bib_properties"" :
2 # best_title
3 # bib_level_code
4 # mat_type
5 # material_code
6 # best_author
----> 7 db[""circ_trans""].extract(
8 [""best_title"", ""bib_level_code"", ""mat_type"", ""material_code"", ""best_author""],
9 table=""bib_properties"",
10 fk_column=""bib_properties_id""
11 )
13 db[""circ_trans""].extract(
14 [""call_number""],
15 table=""call_number"",
16 fk_column=""call_number_id"",
17 rename={""call_number"": ""value""}
18 )
File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:1764, in Table.extract(self, columns, table, fk_column, rename)
1761 column_order.append(c.name)
1763 # Drop the unnecessary columns and rename lookup column
-> 1764 self.transform(
1765 drop=set(columns),
1766 rename={magic_lookup_column: fk_column},
1767 column_order=column_order,
1768 )
1770 # And add the foreign key constraint
1771 self.add_foreign_key(fk_column, table, ""id"")
File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:1526, in Table.transform(self, types, rename, drop, pk, not_null, defaults, drop_foreign_keys, column_order)
1524 with self.db.conn:
1525 for sql in sqls:
-> 1526 self.db.execute(sql)
1527 # Run the foreign_key_check before we commit
1528 if pragma_foreign_keys_was_on:
File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:465, in Database.execute(self, sql, parameters)
463 return self.conn.execute(sql, parameters)
464 else:
--> 465 return self.conn.execute(sql)
OperationalError: database or disk is full
```
This database is about 17G in total size, so I'm assuming the error is coming from the vacuum ... which is presumably trying to write its temp storage to a location that doesn't have sufficient room. Disk space is more than ample on the host in question (1.8T is free in the directory where the sqlite db resides), but the `/tmp` directory is limited to a smaller disk associated with the OS.
I'm trying to work out whether it's `PRAGMA temp_store` or maybe `temp_store_directory` that I'm after ... something to make the temp files use the same local directory where the database file is located (maybe this is a property of the version of sqlite on the system?).
```python
# SET the temp file store to be a file ...
print(db.execute('PRAGMA temp_store').fetchall())
print(db.execute('PRAGMA temp_store=FILE').fetchall())
print(db.execute('PRAGMA temp_store').fetchall())
# the users home directory ...
print(db.execute(""PRAGMA temp_store_directory='/home/plchuser/'"").fetchall())
print(db.execute(""PRAGMA sqlite3_temp_directory='/home/plchuser/'"").fetchall())
print(db.execute(""PRAGMA temp_store_directory"").fetchall())
print(db.execute(""PRAGMA sqlite3_temp_directory"").fetchall())
```
```text
[(1,)]
[]
[(1,)]
[]
[]
[('/home/plchuser/',)]
[]
```
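One avenue that might be worth trying (an assumption on my part, based on the temp-files docs linked below): on unix builds SQLite also consults the `SQLITE_TMPDIR` environment variable, so pointing that at the roomy disk before opening the database might avoid touching PRAGMAs entirely:
```python
import os
# Assumption: must be set before the connection does any work that needs temp space
# (VACUUM, table transforms); the path is just an example location with plenty of room
os.environ[""SQLITE_TMPDIR""] = ""/home/plchuser/sqlite-tmp""
```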
Here's the docs on the Temporary File Storage Locations
https://www.sqlite.org/tempfiles.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/430/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1269886084,I_kwDOCGYnMM5LsOyE,442,`maximize_csv_field_size_limit()` utility function,9599,simonw,closed,0,,,,,2,2022-06-13T19:54:54Z,2022-06-14T21:55:15Z,2022-06-14T21:31:49Z,OWNER,,"This code here runs only if `cli.py` is imported: https://github.com/simonw/sqlite-utils/blob/7ddf5300886a32d6daf60cf1d71efe492b65c87e/sqlite_utils/cli.py#L50-L59
I found myself needing the same fix in another library:
- https://github.com/simonw/datasette-socrata/issues/13
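For reference, the pattern in question is roughly this (a sketch of the `cli.py` logic linked above, not the exact code):
```python
import csv
import sys
def maximize_csv_field_size_limit():
    # Raise the csv module's field size limit as far as the platform allows;
    # sys.maxsize overflows on some platforms, so halve until a value is accepted
    field_size_limit = sys.maxsize
    while True:
        try:
            csv.field_size_limit(field_size_limit)
            break
        except OverflowError:
            field_size_limit = int(field_size_limit / 2)
```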
It should be a documented utility function.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/442/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1250161887,I_kwDOCGYnMM5Kg_Tf,438,illegal UTF-16 surrogate,4068,frafra,closed,0,,,,,2,2022-05-26T22:49:52Z,2022-05-27T08:21:53Z,2022-05-27T08:21:53Z,NONE,,"I am trying to insert `https://artsdatabanken.no/Fab2018/api/export/csv` into a SQLite database, but I have an error when using `sqlite-utils`:
```
sqlite-utils insert --csv --delimiter "";"" --encoding=""utf-16-le"" --pk ""Id"" csv fremmedart test.db
[------------------------------------] 0%
Error: 'utf-16-le' codec can't decode bytes in position 98-99: illegal UTF-16 surrogate
The input you provided uses a character encoding other than utf-8.
You can fix this by passing the --encoding= option with the encoding of the file.
If you do not know the encoding, running 'file filename.csv' may tell you.
It's often worth trying: --encoding=latin-1
```
I tried to convert the file using `iconv -f ""utf-16le"" -t ""utf-8""`, but I still get a similar error (slightly different position):
```
sqlite-utils insert --csv --delimiter "";"" --encoding=utf-8 --pk ""Id"" csv_utf8 fremmedart test.db
[------------------------------------] 0%
Error: 'utf-8' codec can't decode byte 0xd9 in position 99: invalid continuation byte
The input you provided uses a character encoding other than utf-8.
You can fix this by passing the --encoding= option with the encoding of the file.
If you do not know the encoding, running 'file filename.csv' may tell you.
It's often worth trying: --encoding=latin-1
```
I have no issues reading such file using this Python code:
```python
content = open('csv', encoding='utf-16-le').read()
```
`in2csv` works too.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/438/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1243704847,I_kwDOCGYnMM5KIW4P,435,Switch to Furo documentation theme,9599,simonw,closed,0,,,,,2,2022-05-20T21:46:39Z,2022-05-20T21:56:10Z,2022-05-20T21:54:43Z,OWNER,,"As seen in:
- https://github.com/simonw/datasette/issues/1746
- https://github.com/simonw/shot-scraper/issues/77",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/435/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1243517592,I_kwDOBm6k_c5KHpKY,1748,Add copy buttons next to code examples in the documentation,9599,simonw,closed,0,,,,,2,2022-05-20T19:09:00Z,2022-05-20T19:15:00Z,2022-05-20T19:11:32Z,OWNER,,Similar to the ones in `datasette-copyable` which are implemented here: https://github.com/executablebooks/sphinx-copybutton/tree/f84c001a0507f8ec46779d0701b079a265564583,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1748/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1221849746,I_kwDOBm6k_c5I0_KS,1732,Custom page variables aren't decoded,52649,tannewt,open,0,,,,,2,2022-04-30T14:55:46Z,2022-05-03T01:50:45Z,,NONE,,"I have a page `templates/filer/{filer_id}.html`. It uses `filer_id` in a `sql()` call to fetch data. With 0.61.1 this no longer works because the spaces in IDs aren't preserved. Instead, the escaped version is passed into the template and the ID isn't present in my db.
Datasette should unescape the url component before passing them into the template.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1732/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1223177069,PR_kwDOCGYnMM43LrKB,429,Depend on click-default-group-wheel,9599,simonw,closed,0,,,,,2,2022-05-02T18:03:10Z,2022-05-02T18:52:42Z,2022-05-02T18:05:00Z,OWNER,simonw/sqlite-utils/pulls/429,"Trying to get this to work with Pyodide.
Refs: https://github.com/simonw/click-default-group-wheel/issues/3",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/429/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1129052172,I_kwDOBm6k_c5DS_gM,1633,base_url or prefix does not work with _exact match,6613091,henrikek,open,0,,,,,2,2022-02-09T21:45:07Z,2022-04-28T09:12:56Z,,NONE,,"When I hit the ""Apply"" button to search with the ""_exact"" syntax for a column, the URL prefix is removed from the URL.
![image](https://user-images.githubusercontent.com/6613091/153293758-0b757d55-5757-4987-992e-9426e69a7956.png)
And the result is:
![image](https://user-images.githubusercontent.com/6613091/153294672-87be7809-bb7b-455d-bf1a-41e90bbfa4ae.png)
If I add the marked row to url_builder.py it seems to work:
![image](https://user-images.githubusercontent.com/6613091/153295231-bdd52e37-efcf-4b21-9d37-69f182a922f4.png)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1633/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1213281044,PR_kwDOBm6k_c42qyUI,1717,Add timeout option to Cloudrun build,127565,wragge,closed,0,,,,,2,2022-04-23T11:51:21Z,2022-04-24T14:03:08Z,2022-04-24T14:03:08Z,CONTRIBUTOR,simonw/datasette/pulls/1717,I've found that the Cloudrun build phase often hits a timeout limit with large databases. I believe the default timeout is 10 minutes. This pull request just adds a `--timeout` option to the cloudrun `publish` command and passes the value on to the build step.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1717/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1182227211,I_kwDOBm6k_c5Gd1sL,1692,[plugins][feature request]: Support additional script tag attributes when loading custom JS,9020979,hydrosquall,open,0,,,,,2,2022-03-27T01:16:03Z,2022-03-30T06:14:51Z,,CONTRIBUTOR,,"## Motivation
- The build system for my new [plugin](https://github.com/hydrosquall/datasette-nteract-data-explorer) has two output JS files, one for browsers that support ES modules, one for browsers that don't. At present, I'm only passing one of them into Datasette.
- I'd like to specify the non-es-module script as a fallback for older browsers. I don't want to load it by default, because browsers will only need one, and it's heavy, so for now I'm only supporting modern browsers.
To be able to support legacy browsers without slowing down users with modern browsers, I would like to be able to set additional HTML attributes on the tag fallback script, `nomodule` and `defer`. My injected scripts should look something like this:
```html
<!-- reconstructed from the extra_js_urls example below; the original tags were stripped from this export -->
<script type=""module"" src=""/index.my-es-module-bundle.js""></script>
<script nomodule defer src=""/index.my-legacy-fallback-bundle.js""></script>
```
## Proposal
To achieve this, I propose additional optional properties to the API accepted by the `extra_js_urls` hook and custom JS field the `metadata.json` [described here](https://docs.datasette.io/en/stable/custom_templates.html#custom-css-and-javascript).
Under this API, I'd write something like this to get the above HTML rendered in Datasette.
```json
{
""extra_js_urls"": [
{
""url"": ""/index.my-es-module-bundle.js"",
""module"": true,
},
{
""url"": ""/index.my-legacy-fallback-bundle.js"",
""nomodule"": """",
""defer"": true
}
]
}
```
## Resources
- [MDN on the script tag](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/script)
- There may be other properties that could be added that are potentially valuable, like `async` or `referrerpolicy`, but I don't have an immediate need for those.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1692/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1185868354,I_kwDOBm6k_c5GrupC,1695,Option to un-filter facet not shown for `?col__exact=value`,9599,simonw,open,0,,,,,2,2022-03-30T04:44:02Z,2022-03-30T04:46:18Z,,OWNER,,"Spotted this on a page with `COUNTY__exact=Lee` in the URL:
![CleanShot 2022-03-29 at 21 41 46@2x](https://user-images.githubusercontent.com/9599/160752849-a9039343-3770-4655-920b-f19e25687a57.png)
With `COUNTY=Lee` you get this instead:
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1695/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1182141761,I_kwDOBm6k_c5Gdg1B,1690,"Idea: `datasette.set_actor_cookie(response, actor)`",9599,simonw,open,0,,,,,2,2022-03-26T22:41:52Z,2022-03-26T22:43:00Z,,OWNER,,"I just wrote this code in a plugin and it felt like it could benefit from an abstraction: https://github.com/simonw/datasette-auth0/blob/152e6eb21e96e9b73bd9c205f9749a1297d0ef0b/datasette_auth0/__init__.py#L79-L92
```python
redirect_response = Response.redirect(""/"")
expires_at = int(time.time()) + (24 * 60 * 60)
redirect_response.set_cookie(
""ds_actor"",
datasette.sign(
{
""a"": profile_response.json(),
""e"": baseconv.base62.encode(expires_at),
},
""actor"",
),
)
return redirect_response
```
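A sketch of what the abstraction could look like, just wrapping the pattern above (the signature and default expiry are assumptions, not an existing Datasette API):
```python
import time
import baseconv  # python-baseconv, as used in the snippet above
def set_actor_cookie(datasette, response, actor, expire_after=24 * 60 * 60):
    # Hypothetical helper: sign the actor dictionary and set the ds_actor cookie
    expires_at = int(time.time()) + expire_after
    response.set_cookie(
        ""ds_actor"",
        datasette.sign(
            {""a"": actor, ""e"": baseconv.base62.encode(expires_at)},
            ""actor"",
        ),
    )
```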
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1690/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1177101697,I_kwDOBm6k_c5GKSWB,1681,Potential bug in numeric handling where_clause for filters,9599,simonw,open,0,,,,,2,2022-03-22T17:43:50Z,2022-03-22T17:49:09Z,,OWNER,,"> Note that Datasette does already have special logic to convert parameters to integers for numeric comparisons like `>`:
>
> https://github.com/simonw/datasette/blob/c4c9dbd0386e46d2bf199f0ed34e4895c98cb78c/datasette/filters.py#L203-L212
>
> Though... it looks like there's a bug in that? It doesn't account for `float` values - `""3.5"".isdigit()` return `False` - probably for the best, because `int(3.5)` would break that value anyway.
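A float-aware version of that coercion might look something like this (a sketch, not the actual fix):
```python
def to_numeric(value):
    # Coerce a string to int or float for numeric comparisons; leave it unchanged otherwise
    try:
        return int(value)
    except ValueError:
        try:
            return float(value)
        except ValueError:
            return value
```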
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1671#issuecomment-1075432283_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1681/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1171599874,I_kwDOCGYnMM5F1TIC,415,Convert with `--multi` and `--dry-run` flag does not work,3976183,dotcs,closed,0,,,,,2,2022-03-16T21:59:46Z,2022-03-21T04:18:24Z,2022-03-21T04:18:24Z,NONE,,"It's not possible to combine the `--multi` and `--dry-run` flags in the `convert` command.
Let's first create a simple database from JSON string
```console
$ echo '[{""foo"": ""abc""}]' | sqlite-utils insert demo.db demo -
$ sqlite-utils query demo.db ""SELECT * FROM demo""
[{""foo"": ""abc""}]
```
and then try to convert the ""foo"" column with a static value ""bar"" (see docs [Converting a column into multiple columns](https://sqlite-utils.datasette.io/en/stable/cli.html#converting-a-column-into-multiple-columns))
```console
$ sqlite-utils convert demo.db demo foo '{""foo"": ""bar""}' --multi --dry-run
Traceback (most recent call last):
File ""/home/dotcs/anaconda3/envs/tools/bin/sqlite-utils"", line 8, in
sys.exit(cli())
File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__
return self.main(*args, **kwargs)
File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1053, in main
rv = self.invoke(ctx)
File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 754, in invoke
return __callback(*args, **kwargs)
File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 2686, in convert
for row in db.conn.execute(sql, where_args).fetchall():
sqlite3.OperationalError: user-defined function raised exception
```
But without the `--dry-run` flag it does work as expected:
```console
$ sqlite-utils convert demo.db demo foo '{""foo"": ""bar""}' --multi
$ sqlite-utils query demo.db ""SELECT * FROM demo""
[{""foo"": ""bar""}]
```
```console
$ sqlite-utils --version
sqlite-utils, version 3.25.1
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/415/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
910088936,MDU6SXNzdWU5MTAwODg5MzY=,1355,datasette --get should efficiently handle streaming CSV,9599,simonw,open,0,,,,,2,2021-06-03T04:40:40Z,2022-03-20T22:38:53Z,,OWNER,,"It would be great if you could use `datasette --get` to run queries that return streaming CSV data without running out of RAM.
Current implementation looks like it loads the entire result into memory first: https://github.com/simonw/datasette/blob/f78ebdc04537a6102316d6dbbf6c887565806078/datasette/cli.py#L546-L552",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1355/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1174404647,I_kwDOBm6k_c5F__4n,1669,Release 0.61 alpha,9599,simonw,closed,0,,,,,2,2022-03-20T00:35:35Z,2022-03-20T01:24:36Z,2022-03-20T01:24:36Z,OWNER,,"> I'm going to release this as a 0.61 alpha so I can more easily depend on it from `datasette-hashed-urls`.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1668#issuecomment-1073136896_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1669/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1170554975,I_kwDOBm6k_c5FxUBf,1663,Document the internals that were used in datasette-hashed-urls,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-03-16T05:17:08Z,2022-03-19T04:04:50Z,2022-03-17T21:32:38Z,OWNER,,"The https://github.com/simonw/datasette-hashed-urls plugin used a couple of currently undocumented features:
- `db.hash`
- `Datasette(..., immutables=[...])`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1663/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1122416919,I_kwDOBm6k_c5C5rkX,1623,/-/patterns returns link: alternate JSON header to 404,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-02-02T21:42:49Z,2022-03-19T04:04:49Z,2022-02-02T21:48:56Z,OWNER,,"Bug from:
- #1620
```
% curl -s -I 'https://latest.datasette.io/-/patterns' | grep link
link: https://latest.datasette.io/-/patterns.json; rel=""alternate""; type=""application/json+datasette""
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1623/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
626593402,MDU6SXNzdWU2MjY1OTM0MDI=,780,Internals documentation for datasette.metadata() method,9599,simonw,open,0,,,3268330,Datasette 1.0,2,2020-05-28T15:14:22Z,2022-03-15T20:50:34Z,,OWNER,,https://github.com/simonw/datasette/blob/40885ef24e32d91502b6b8bbad1c7376f50f2830/datasette/app.py#L297-L328,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/780/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1088816961,I_kwDODEm0Qs5A5gdB,62,KeyError: 'created_at' for private accounts?,6764957,swyxio,closed,0,,,,,2,2021-12-26T17:51:51Z,2022-03-12T02:36:32Z,2022-02-24T18:10:18Z,NONE,,"hey Simon!
i was running `twitter-to-sqlite user-timeline twitter.db` for [my private alt](https://twitter.com/swyxio) and ran into this error:
![image](https://user-images.githubusercontent.com/6764957/147416165-46b69c30-100a-406f-8534-8612b75547ae.png)
```bash
Traceback (most recent call last):
File ""/Users/swyx/Work/datasette/env/bin/twitter-to-sqlite"", line 8, in
sys.exit(cli())
File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__
return self.main(*args, **kwargs)
File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1053, in main
rv = self.invoke(ctx)
File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 754, in invoke
return __callback(*args, **kwargs)
File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 291, in user_timeline
profile = utils.get_profile(db, session, **kwargs)
File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 133, in get_profile
save_users(db, [profile])
File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 453, in save_users
transform_user(user)
File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 285, in transform_user
user[""created_at""] = parser.parse(user[""created_at""])
KeyError: 'created_at'
```
this looks awfully like #37 but it can't be, because i'm authed into my account and obviously i have perms to read my own account. wonder if there's any diagnostic methods i should apply here? just filing an issue for others to find while i investigate.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1161937073,I_kwDOBm6k_c5FQcCx,1653,Mechanism to default a table to sorting by multiple columns,9599,simonw,open,0,,,,,2,2022-03-07T21:20:11Z,2022-03-07T21:23:39Z,,OWNER,,"### Discussed in https://github.com/simonw/datasette/discussions/1652
Originally posted by **zaneselvans** March 7, 2022
It's easy to tell datasette to sort tables using a single column, as [described in the docs](https://docs.datasette.io/en/stable/metadata.html#setting-a-default-sort-order):
```yaml
databases:
ferc1:
tables:
f1_edcfu_epda:
sort: created_time
```
But is there some way to tell it to sort using a composite key, like you would in an `ORDER BY` clause instead? For example, the way it's being done **[in this query](https://data.catalyst.coop/ferc1?sql=select%0D%0A++rowid%2C%0D%0A++respondent_id%2C%0D%0A++report_year%2C%0D%0A++spplmnt_num%2C%0D%0A++row_number%2C%0D%0A++row_seq%2C%0D%0A++row_prvlg%2C%0D%0A++acct_num%2C%0D%0A++depr_plnt_base%2C%0D%0A++est_avg_srvce_lf%2C%0D%0A++net_salvage%2C%0D%0A++apply_depr_rate%2C%0D%0A++mrtlty_crv_typ%2C%0D%0A++avg_remaining_lf%2C%0D%0A++report_prd%0D%0Afrom%0D%0A++f1_edcfu_epda%0D%0Awhere%0D%0A++respondent_id+%3D+210%0D%0A++AND+report_year+%3D+2020%0D%0Aorder+by%0D%0A++report_year%2C+report_prd%2C+respondent_id%2C+spplmnt_num%2C+row_number%0D%0Alimit%0D%0A++1000)** on our Datasette?
```sql
SELECT
respondent_id,
report_year,
spplmnt_num,
row_number,
row_seq,
row_prvlg,
acct_num,
depr_plnt_base,
est_avg_srvce_lf,
net_salvage,
apply_depr_rate,
mrtlty_crv_typ,
avg_remaining_lf,
report_prd
FROM
f1_edcfu_epda
WHERE
respondent_id = 210
AND report_year = 2020
ORDER BY
report_year, report_prd, respondent_id, spplmnt_num, row_number
LIMIT
1000
```
The problem here is that by default it's using `rowid` (the SQLite assigned autoincrementing integer key) to order the records. The table **should** have a natural composite primary key, but the original database that this data is being migrated from doesn't enforce unique primary keys, so there are dupes, and we don't want to drop those rows. The records are somehow getting jumbled in the database (the `rowid` ordering isn't lined up with the expected ordering based on the composite primary key, though it's close), and this jumbling is confusing to users who expect to see the data ordered based on the natural primary key.
I've tried setting the `sort` metadata parameter to a list of column names, a tuple of column names, a quoted string of comma-separated column names, a quoted string of a tuple of column names...
```yaml
databases:
ferc1:
tables:
f1_edcfu_epda:
sort: ""(report_year, report_prd, respondent_id, spplmnt_num, row_number)""
```
and they all give me server errors like:
```
Cannot sort table by (report_year, report_prd, respondent_id, spplmnt_num, row_number)
```
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1653/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1148725876,I_kwDOBm6k_c5EeCp0,1640,"Support static assets where file length may change, e.g. logs",57859326,broccolihighkicks,open,0,,,,,2,2022-02-24T00:34:42Z,2022-03-05T01:19:25Z,,NONE,,"This is a bit of an oxymoron.
I am serving a log.txt file for a background process using the Datasette --static CLI. This is useful as I can observe a background process from the web UI to see any errors that occur (instead of spelunking the logs via docker exec/ssh etc).
I get this error, which I think is because Datasette assumes that the size of the content does not change (but appending new log lines means the content length changes).
```python
Traceback (most recent call last):
File ""/usr/local/lib/python3.9/site-packages/datasette/app.py"", line 1181, in route_path
response = await view(request, send)
File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 305, in inner_static
await asgi_send_file(send, full_path, chunk_size=chunk_size)
File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 280, in asgi_send_file
await send(
File ""/usr/local/lib/python3.9/site-packages/asgi_csrf.py"", line 104, in wrapped_send
await send(event)
File ""/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py"", line 460, in send
output = self.conn.send(event)
File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 468, in send
data_list = self.send_with_data_passthrough(event)
File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 501, in send_with_data_passthrough
writer(event, data_list.append)
File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 58, in __call__
self.send_data(event.data, write)
File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 78, in send_data
raise LocalProtocolError(""Too much data for declared Content-Length"")
h11._util.LocalProtocolError: Too much data for declared Content-Length
ERROR: Exception in ASGI application
Traceback (most recent call last):
File ""/usr/local/lib/python3.9/site-packages/datasette/app.py"", line 1181, in route_path
response = await view(request, send)
File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 305, in inner_static
await asgi_send_file(send, full_path, chunk_size=chunk_size)
File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 280, in asgi_send_file
await send(
File ""/usr/local/lib/python3.9/site-packages/asgi_csrf.py"", line 104, in wrapped_send
await send(event)
File ""/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py"", line 460, in send
output = self.conn.send(event)
File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 468, in send
data_list = self.send_with_data_passthrough(event)
File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 501, in send_with_data_passthrough
writer(event, data_list.append)
File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 58, in __call__
self.send_data(event.data, write)
File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 78, in send_data
raise LocalProtocolError(""Too much data for declared Content-Length"")
h11._util.LocalProtocolError: Too much data for declared Content-Length
```
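To illustrate the mechanism, here is a rough sketch of the failure mode (the `send_headers` and `send_body` helpers are hypothetical stand-ins for the ASGI send calls, not real Datasette functions):
```python
import os

# The Content-Length is captured once, from the file size at the start of
# the response. If the log file grows while it is being streamed, more
# bytes get sent than were declared and h11 raises LocalProtocolError.
def send_static_file(path, send_headers, send_body, chunk_size=4096):
    declared_length = os.path.getsize(path)
    send_headers(content_length=declared_length)
    with open(path, 'rb') as fp:
        while True:
            chunk = fp.read(chunk_size)
            if not chunk:
                break
            send_body(chunk)  # can overshoot declared_length if the file grew
```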
Thanks, I am finding Datasette very useful.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1640/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1128139375,I_kwDOCGYnMM5DPgpv,405,"`Database(memory_name=""name"")` constructor argument",9599,simonw,closed,0,,,,,2,2022-02-09T07:15:03Z,2022-02-16T01:23:16Z,2022-02-16T01:23:16Z,OWNER,,"SQLite in-memory databases can be named, in which case multiple connections can be opened to a shared in-memory database running within the same process.
Datasette supports this - `sqlite-utils` could support it too.
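For reference, a minimal sketch of how a named, shared in-memory database behaves with the standard library `sqlite3` module (the `demo_mem` name is just an example):
```python
import sqlite3

# Two connections to the same named in-memory database, shared within this
# process via SQLite's URI syntax. The database lives for as long as at
# least one connection stays open.
conn1 = sqlite3.connect('file:demo_mem?mode=memory&cache=shared', uri=True)
conn2 = sqlite3.connect('file:demo_mem?mode=memory&cache=shared', uri=True)

conn1.execute('create table t (id integer)')
conn1.execute('insert into t values (1)')
conn1.commit()

# The second connection can see data written through the first one.
print(conn2.execute('select id from t').fetchall())  # [(1,)]
```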
https://docs.datasette.io/en/0.60.2/internals.html#database-ds-path-none-is-mutable-false-is-memory-false-memory-name-none",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/405/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1128120451,I_kwDOCGYnMM5DPcCD,404,Add example of `--convert` to the help for `sqlite-utils insert`,9599,simonw,closed,0,,,,,2,2022-02-09T06:49:09Z,2022-02-09T06:56:35Z,2022-02-09T06:55:16Z,OWNER,,"https://sqlite-utils.datasette.io/en/3.23/cli-reference.html#insert would be more useful if it included an example of `--convert` in action.
I can maybe use an example from https://simonwillison.net/2022/Jan/11/sqlite-utils/",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/404/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1109783030,I_kwDOBm6k_c5CJfH2,1607,More detailed information about installed SpatiaLite version,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2022-01-20T21:28:03Z,2022-02-09T06:42:02Z,2022-02-09T06:32:28Z,OWNER,,"https://www.gaia-gis.it/gaia-sins/spatialite-sql-5.0.0.html#version has a whole bunch of interesting functions for things like `freexl_version()` and `geos_version()` and `HasMathSQL()` and suchlike.
These could be shown on the `/-/versions` page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1607/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1101705012,PR_kwDOBm6k_c4w7eqc,1593,"Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.18",49699333,dependabot[bot],closed,0,,,,,2,2022-01-13T13:11:50Z,2022-02-07T13:13:24Z,2022-02-07T13:13:23Z,CONTRIBUTOR,simonw/datasette/pulls/1593,"Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
Release notes
pytest-asyncio: pytest support for asyncio
pytest-asyncio is an Apache2 licensed library, written in Python, for testing asyncio code with pytest.
asyncio code is usually written in the form of coroutines, which makes it slightly more difficult to test using normal testing tools. pytest-asyncio provides useful fixtures and markers to make testing easier.
```python
@pytest.mark.asyncio
async def test_some_asyncio_code():
    res = await library.do_something()
    assert b"expected result" == res
```
pytest-asyncio has been strongly influenced by pytest-tornado.
Features:
- fixtures for creating and injecting versions of the asyncio event loop
- fixtures for injecting unused tcp/udp ports
- pytest markers for treating tests as asyncio coroutines
- easy testing with non-default event loops
- support for `async def` fixtures and async generator fixtures
- support auto mode to handle all async fixtures and tests automatically by asyncio; provide strict mode if a test suite should work with different async frameworks simultaneously, e.g. asyncio and trio.
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1593/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1125081640,I_kwDOCGYnMM5DD2Io,401,Update SpatiaLite example in the documentation,9599,simonw,closed,0,,,,,2,2022-02-06T02:02:07Z,2022-02-06T02:05:03Z,2022-02-06T02:03:24Z,OWNER,,"This one here: https://sqlite-utils.datasette.io/en/3.23/python-api.html#converting-column-values-using-sql-functions
It should take advantage of the new methods from:
- #79",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/401/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
723460107,MDU6SXNzdWU3MjM0NjAxMDc=,187,Maybe: Utility method / CLI tool for initializing SpatiaLite,9599,simonw,closed,0,,,,,2,2020-10-16T19:04:03Z,2022-02-05T00:04:26Z,2020-10-16T19:15:13Z,OWNER,,"> I think this should initialize SpatiaLite against the current database if it has not been initialized already.
>
> Relevant code: https://github.com/simonw/shapefile-to-sqlite/blob/e754d0747ca2facf9a7433e2d5d15a6a37a9cf6e/shapefile_to_sqlite/utils.py#L112-L126",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/187/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
557825032,MDU6SXNzdWU1NTc4MjUwMzI=,77,Ability to insert data that is transformed by a SQL function,9599,simonw,closed,0,,,,,2,2020-01-30T23:45:55Z,2022-02-05T00:04:25Z,2020-01-31T00:24:32Z,OWNER,,"I want to be able to run the equivalent of this SQL insert:
```python
# Convert to ""Well Known Text"" format
wkt = shape(geojson['geometry']).wkt
# Insert and commit the record
conn.execute(""INSERT INTO places (id, name, geom) VALUES(null, ?, GeomFromText(?, 4326))"", (
""Wales"", wkt
))
conn.commit()
```
From the Datasette SpatiaLite docs: https://datasette.readthedocs.io/en/stable/spatialite.html
To do this, I need a way of telling `sqlite-utils` that a specific column should be wrapped in `GeomFromText(?, 4326)`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/77/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1119413338,PR_kwDOBm6k_c4x1kCu,1616,Bump black from 21.12b0 to 22.1.0,49699333,dependabot[bot],closed,0,,,,,2,2022-01-31T13:13:46Z,2022-02-02T22:23:52Z,2022-02-02T22:23:51Z,CONTRIBUTOR,simonw/datasette/pulls/1616,"Bumps [black](https://github.com/psf/black) from 21.12b0 to 22.1.0.
Release notes
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.12b0&new-version=22.1.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1616/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1117132741,I_kwDOBm6k_c5ClhfF,1615,Potential simplified publishing mechanism,369053,aidansteele,closed,0,,,,,2,2022-01-28T08:34:50Z,2022-02-02T07:34:21Z,2022-02-02T07:34:17Z,NONE,,"Hi,
Forewarning: this idea is one I've only been thinking about for a while and it's not fully fleshed-out yet.
I love Datasette and what it stands for. I was thinking about how we could make it accessible to more people, especially those without access to credit cards required for a lot of hosting options. Or they might not feel comfortable signing up for said services.
So I was thinking I might create a service that hosts Datasette instances for folks. I'd probably stick it on AWS Lambda and limit requests to something like n/month to avoid bankrupting myself. If I did build such a hypothetical service, I was thinking I would rely on GitHub Actions to do the heavy lifting.
E.g. user `johndoe` creates a repo `my-animals` with a couple of files: `dogs.csv`, `cats.csv` and the following GitHub Actions workflow:
```yaml
# .github/workflows/push.yml
on:
push
# this allows the publish action to use OIDC to authenticate johndoe/my-animals
permissions:
id-token: write
contents: read
jobs:
publish:
runs-on: ubuntu-latest
steps:
- uses: actions/setup-python@v2
- run: pip install sqlite-utils
- uses: actions/checkout@v2
- run: |
set -eux
sqlite-utils create-database animals.db
sqlite-utils insert animals.db dogs dogs.csv --csv
sqlite-utils insert animals.db cats cats.csv --csv
- uses: datasette-hub/publish@v1
with:
db: animals.db
metadata: meta.yml
# this step is helpful for debugging why the
# generated sqlite db was rejected
- uses: actions/upload-artifact@v2
if: failure()
with:
path: animals.db
retention-days: 1
```
This would then cause a Datasette instance to be available at `https://johndoe-my-animals.datasette-hub.test/`. It feels like this could significantly reduce the friction involved in going from a data set to a Datasette instance.
What do you think? Does this address a real need? Or am I perhaps misunderstanding the main friction points? As a bonus: it feels like this would pair well with [git scraping](https://simonwillison.net/2020/Oct/9/git-scraping/).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1615/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1114638930,I_kwDOCGYnMM5CcApS,391,`sqlite-utils bulk` progress bar,9599,simonw,closed,0,,,,,2,2022-01-26T05:14:49Z,2022-01-26T05:17:20Z,2022-01-26T05:16:51Z,OWNER,,"It can easily have a progress bar because it works by looping through an iterator: https://github.com/simonw/sqlite-utils/blob/a9fca7efa4184fbb2a65ca1275c326950ed9d3c1/sqlite_utils/cli.py#L1014-L1018
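For illustration, a minimal sketch of wrapping an iterator in a Click progress bar (the function name and label are made up for the example):
```python
import click

def insert_rows_with_progress(rows, total):
    # click.progressbar() can wrap any iterator; passing length= lets it
    # show a percentage even when the iterator has no len().
    with click.progressbar(rows, length=total, label='Inserting rows') as bar:
        for row in bar:
            pass  # insert each row here
```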
Should also support the `--silent` option if I add this.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/391/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
471818939,MDU6SXNzdWU0NzE4MTg5Mzk=,48,"Jupyter notebook demo of the library, launchable on Binder",9599,simonw,closed,0,,,,,2,2019-07-23T17:05:05Z,2022-01-26T02:08:46Z,2022-01-26T02:08:39Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1114544727,I_kwDOCGYnMM5CbppX,389,Plausible analytics for documentation,9599,simonw,closed,0,,,,,2,2022-01-26T01:58:35Z,2022-01-26T02:07:41Z,2022-01-26T02:07:41Z,OWNER,,"```html
```
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/388#issuecomment-1021785268_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/389/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
842416110,MDU6SXNzdWU4NDI0MTYxMTA=,1278,SpatiaLite timezones demo is broken,9599,simonw,closed,0,,,,,2,2021-03-27T04:45:27Z,2022-01-20T21:29:43Z,2021-03-27T16:17:13Z,OWNER,,https://github.com/simonw/datasette/blob/5fd02890650db790b2ffdb90eb9f78f8e0639c37/docs/spatialite.rst#L96,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1278/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
443040665,MDU6SXNzdWU0NDMwNDA2NjU=,466,"Move ""no such module: VirtualSpatialIndex"" code elsewhere",9599,simonw,closed,0,,,4305096,0.28,2,2019-05-11T22:09:00Z,2022-01-20T21:29:41Z,2019-05-11T22:57:22Z,OWNER,,"We currently show a useful warning (from #331) when the user tries to open a spatialite database without first loading the module:
https://github.com/simonw/datasette/blob/c692cd291111050483a32bea1ee08e994a0b781b/datasette/app.py#L547-L554
This code is part of `.inspect()` which is going away - see #462 - so I need to find somewhere else for it to live.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/466/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
563347679,MDU6SXNzdWU1NjMzNDc2Nzk=,668,Make it easier to load SpatiaLite,9599,simonw,closed,0,,,,,2,2020-02-11T17:03:43Z,2022-01-20T21:29:41Z,2021-01-04T20:18:39Z,OWNER,,"```
$ datasette spatial.db
Serve! files=('spatial.db',) (immutables=()) on port 8001
ERROR: conn=, sql = 'PRAGMA table_info(SpatialIndex);', params = None: no such module: VirtualSpatialIndex
Usage: datasette serve [OPTIONS] [FILES]...
Error: It looks like you're trying to load a SpatiaLite database without first loading the SpatiaLite module.
Read more: https://datasette.readthedocs.io/en/latest/spatialite.html
```
This error message could sniff around in the common locations for the SpatiaLite module and output the CLI command you should use to enable it:
```
datasette spatial.db --load-extension=/usr/local/lib/mod_spatialite.dylib
```
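A rough sketch of what sniffing those common locations might look like (the paths listed are examples of where `mod_spatialite` is often installed, not a definitive list):
```python
import os

SPATIALITE_PATHS = [
    '/usr/lib/x86_64-linux-gnu/mod_spatialite.so',
    '/usr/local/lib/mod_spatialite.dylib',
    '/usr/local/lib/mod_spatialite.so',
]

def find_spatialite():
    # Return the first common SpatiaLite module path that exists, or None.
    for path in SPATIALITE_PATHS:
        if os.path.exists(path):
            return path
    return None
```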
Even better: if Datasette had a `--spatialite` option which automatically loads the extension from common locations, if it can find it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/668/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
336936010,MDU6SXNzdWUzMzY5MzYwMTA=,331,Datasette throws error when loading spatialite db without extension loaded,82988,psychemedia,closed,0,,,,,2,2018-06-29T09:51:14Z,2022-01-20T21:29:40Z,2018-07-10T15:13:36Z,CONTRIBUTOR,,"When starting datasette on a SpatialLite database *without* loading the SpatiaLite extension (using eg `--load-extension=/usr/local/lib/mod_spatialite.dylib`) an error is thrown and the server fails to start:
```
datasette -p 8003 adminboundaries.db
Serve! files=('adminboundaries.db',) on port 8003
Traceback (most recent call last):
File ""/Users/ajh59/anaconda3/bin/datasette"", line 11, in
sys.exit(cli())
File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 722, in __call__
return self.main(*args, **kwargs)
File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 697, in main
rv = self.invoke(ctx)
File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 1066, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 895, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 535, in invoke
return callback(*args, **kwargs)
File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/cli.py"", line 552, in serve
ds.inspect()
File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/app.py"", line 273, in inspect
""tables"": inspect_tables(conn, self.metadata.get(""databases"", {}).get(name, {}))
File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/inspect.py"", line 79, in inspect_tables
""PRAGMA table_info({});"".format(escape_sqlite(table))
sqlite3.OperationalError: no such module: VirtualSpatialIndex
```
It would be nice to trap this and return a message saying something like:
```
It looks like you're trying to load a SpatiaLite database? Make sure you load in the SpatiaLite extension when starting datasette.
Read more: https://datasette.readthedocs.io/en/latest/spatialite.html
```
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/331/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1097040427,I_kwDOBm6k_c5BY4Ir,1587,Add `sqlite_stat1`(-4) tables to hidden table list,9599,simonw,closed,0,,,,,2,2022-01-08T21:28:20Z,2022-01-20T04:12:59Z,2022-01-20T04:12:59Z,OWNER,,"> Running `ANALYZE` creates a new visible table called `sqlite_stat1`: https://www.sqlite.org/fileformat.html#the_sqlite_stat1_table
>
> This should be added to the default list of hidden tables in Datasette.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
991467558,MDU6SXNzdWU5OTE0Njc1NTg=,1466,Add Datasette Desktop to installation documentation,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-09-08T19:41:27Z,2022-01-13T22:28:28Z,2022-01-13T21:55:18Z,OWNER,,See https://datasette.io/desktop and https://simonwillison.net/2021/Sep/8/datasette-desktop/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1466/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1083921371,I_kwDOBm6k_c5Am1Pb,1570,Separate db.execute_write() into three methods,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T18:45:54Z,2022-01-13T22:27:38Z,2021-12-18T18:57:25Z,OWNER,,"> Rather than adding a `executemany=True` parameter, I'm now thinking a better design might be to have three methods:
>
> - `db.execute_write(sql, params=None, block=False)`
> - `db.execute_write_script(sql, block=False)`
> - `db.execute_write_many(sql, params_seq, block=False)`
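A rough sketch of how these three proposed methods might be called once implemented (the table and parameter values are made up for the example):
```python
async def write_examples(db):
    # Single statement with parameters
    await db.execute_write(
        'insert into logs (message) values (?)', ['hello'], block=True
    )
    # A script containing multiple statements
    await db.execute_write_script(
        'create table if not exists a (id integer); '
        'create table if not exists b (id integer);',
        block=True,
    )
    # The same statement executed once per sequence of parameters
    await db.execute_write_many(
        'insert into logs (message) values (?)', [['one'], ['two']], block=True
    )
```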
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997267416_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1570/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1083895395,I_kwDOBm6k_c5Amu5j,1569,"db.execute_write(..., executescript=True) parameter",9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T18:20:47Z,2022-01-13T22:27:27Z,2021-12-18T18:34:18Z,OWNER,,"> Idea: teach `execute_write` to accept an optional `executescript=True` parameter, like this:
```diff
diff --git a/datasette/database.py b/datasette/database.py
index 468e936..1a424f5 100644
--- a/datasette/database.py
+++ b/datasette/database.py
@@ -94,10 +94,14 @@ class Database:
f""file:{self.path}{qs}"", uri=True, check_same_thread=False
)
- async def execute_write(self, sql, params=None, block=False):
+ async def execute_write(self, sql, params=None, executescript=False, block=False):
+ assert not (executescript and params), ""Cannot use params with executescript=True""
def _inner(conn):
with conn:
- return conn.execute(sql, params or [])
+ if executescript:
+ return conn.executescript(sql)
+ else:
+ return conn.execute(sql, params or [])
with trace(""sql"", database=self.name, sql=sql.strip(), params=params):
results = await self.execute_write_fn(_inner, block=block)
```
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997248364_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1083726550,I_kwDOBm6k_c5AmFrW,1568,Trace should show queries on the write connection too,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-18T02:34:12Z,2022-01-13T22:27:23Z,2021-12-18T02:42:34Z,OWNER,,"> Here's why - `trace` only applies to read, not write SQL operations: https://github.com/simonw/datasette/blob/7c8f8aa209e4ba7bf83976f8495d67c28fbfca24/datasette/database.py#L209-L211
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1555#issuecomment-997128508_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1083573206,I_kwDOBm6k_c5AlgPW,1563,Datasette(... files=) should not be a required argument,9599,simonw,closed,0,,,7571612,Datasette 0.60,2,2021-12-17T19:54:18Z,2022-01-13T22:27:18Z,2021-12-18T02:19:40Z,OWNER,,"```pycon
>>> ds = Datasette(memory=True)
Traceback (most recent call last):
File """", line 1, in
TypeError: __init__() missing 1 required positional argument: 'files'
>>> ds = Datasette(memory=True, files=[])
```
I wanted to create an in-memory Datasette for running some tests, no point in forcing me to pass `files=[]` to do that.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1563/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
712202333,MDU6SXNzdWU3MTIyMDIzMzM=,982,"SQL editor should allow execution of write queries, if you have permission",9599,simonw,open,0,,,,,2,2020-09-30T19:04:35Z,2022-01-13T22:21:29Z,,OWNER,,"The `datasette-write` plugin provides this at the moment https://github.com/simonw/datasette-write - but it feels like it should be a built-in capability, protected by a default permission.
UI concept: if you have write permission then the existing SQL editor gets an ""execute write"" checkbox underneath it.
JavaScript can spot if you appear to be trying to execute an UPDATE or INSERT or DELETE query and check that checkbox for you.
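As a sketch of that detection idea (a rough heuristic shown in Python rather than the JavaScript that would actually run in the browser):
```python
import re

WRITE_SQL_RE = re.compile(
    r'^\s*(insert|update|delete|create|drop|alter|replace)\b', re.IGNORECASE
)

def looks_like_write_query(sql):
    # Rough heuristic only - not a real SQL parser.
    return bool(WRITE_SQL_RE.match(sql))

print(looks_like_write_query('UPDATE docs SET title = ?'))  # True
print(looks_like_write_query('select * from docs'))         # False
```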
If you link to a query page with a non-SELECT then that query will be displayed in the box ready for you to POST submit it. The page will also then get ""cannot be embedded"" headers to protect against clickjacking.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/982/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1099584685,I_kwDOCGYnMM5BilSt,381,`sqlite-utils rows` options `--limit` and `--offset`,9599,simonw,closed,0,,,,,2,2022-01-11T20:23:12Z,2022-01-11T23:33:37Z,2022-01-11T23:19:36Z,OWNER,,Because I often want to use it just to preview a few rows from the database. Piping through `| head -n 20` works for JSON and CSV (they stream) but not for `--table`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/381/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1097041471,PR_kwDOCGYnMM4wsVM6,367,Initial prototype of .analyze() methods,9599,simonw,closed,0,,,7558727,3.21,2,2022-01-08T21:35:12Z,2022-01-10T19:31:08Z,2022-01-10T19:31:08Z,OWNER,simonw/sqlite-utils/pulls/367,Refs #366,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/367/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1,
1077628073,I_kwDOBm6k_c5AO0yp,1550,Research option for returning all rows from arbitrary query,9599,simonw,open,0,,,,,2,2021-12-11T19:31:11Z,2021-12-11T23:43:24Z,,OWNER,,"Inspired by thinking about #1549 - returning ALL rows from an arbitrary query is a lot easier if you just run that query and keep iterating over the cursor.
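A minimal sketch of that approach using the standard library (not Datasette's internals):
```python
import sqlite3

def stream_all_rows(db_path, sql, params=None):
    # Iterate over the cursor instead of calling fetchall(), so rows are
    # yielded as SQLite produces them rather than held in memory at once.
    conn = sqlite3.connect(db_path)
    try:
        for row in conn.execute(sql, params or []):
            yield row
    finally:
        conn.close()
```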
I've avoided doing that in the past because it could tie up a connection for a long time - but in private instances this wouldn't be such a problem.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1550/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1073712378,I_kwDOBm6k_c4__4z6,1544,Code that detects the label column for a table is case-sensitive,9599,simonw,closed,0,,,,,2,2021-12-07T20:01:25Z,2021-12-07T20:03:43Z,2021-12-07T20:03:43Z,OWNER,,I just noticed that a column called `Name` is not being picked up as the label column for a table.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1544/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1066501534,I_kwDOCGYnMM4_kYWe,345,`table.strict` introspection boolean for identifying STRICT mode tables,9599,simonw,closed,0,,,,,2,2021-11-29T21:05:10Z,2021-11-29T22:45:26Z,2021-11-29T22:44:36Z,OWNER,,"> From the STRICT docs:
>> The SQLite parser accepts a comma-separated list of table options after the final close parenthesis in a CREATE TABLE statement. As of this writing (2021-08-23) only two options are recognized:
>>
>> - STRICT
>> - [WITHOUT ROWID](https://www.sqlite.org/withoutrowid.html)
>
> So I think I need to read the `CREATE TABLE` statement from the `sqlite_master` table, split on the last `)`, split those tokens on `,` and see if `strict` is in there (case insensitive).
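A minimal sketch of that approach, written against the standard library rather than sqlite-utils itself:
```python
import sqlite3

def table_is_strict(conn, table):
    # Look at the text after the final ')' of the CREATE TABLE statement
    # for a STRICT table option, case insensitively.
    row = conn.execute(
        'select sql from sqlite_master where type = ? and name = ?',
        ('table', table),
    ).fetchone()
    if row is None or row[0] is None:
        return False
    options = row[0].rsplit(')', 1)[-1]
    return any(token.strip().lower() == 'strict' for token in options.split(','))
```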
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/344#issuecomment-982020757_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/345/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1059549523,I_kwDOBm6k_c4_J3FT,1526,"Add to vercel.json, rather than overwriting it.",192568,mroswell,closed,0,,,,,2,2021-11-22T00:47:12Z,2021-11-22T04:49:45Z,2021-11-22T04:13:47Z,CONTRIBUTOR,,"I'd like to be able to add to vercel.json. But Datasette overwrites whatever I put in that file. I originally reported this here:
https://github.com/simonw/datasette-publish-vercel/issues/51
In that case, I wanted to do a rewrite... and now I need to do 301 redirects (because we had to rename our site).
Can this be addressed?
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1526/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1057996111,I_kwDOBm6k_c4_D71P,1517,Let `register_routes()` over-ride default routes within Datasette,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2021-11-19T00:22:15Z,2021-11-19T03:20:00Z,2021-11-19T03:07:27Z,OWNER,,"See https://github.com/simonw/datasette/issues/878#issuecomment-973554024_ - right now `register_routes()` can't replace default Datasette routes.
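For context, a sketch of what a plugin that tries to take over a route looks like today (the URL pattern and response text are made up for the example):
```python
from datasette import hookimpl
from datasette.utils.asgi import Response

@hookimpl
def register_routes():
    async def my_view(request):
        return Response.text('replacement page')
    # Returning a pattern that matches an existing Datasette URL does not
    # currently replace the default route.
    return [(r'^/-/versions$', my_view)]
```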
It would be neat if plugins could do this - especially if there was a neat documented way for them to then re-dispatch to the original route code after making some kind of modification.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1517/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1056117435,PR_kwDOBm6k_c4up0R0,1514,Bump black from 21.9b0 to 21.11b0,49699333,dependabot[bot],closed,0,,,,,2,2021-11-17T13:13:55Z,2021-11-18T13:11:17Z,2021-11-18T13:11:15Z,CONTRIBUTOR,simonw/datasette/pulls/1514,"Bumps [black](https://github.com/psf/black) from 21.9b0 to 21.11b0.
Release notes
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.9b0&new-version=21.11b0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1514/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1041158024,PR_kwDOBm6k_c4t7RKr,1500,Bump black from 21.9b0 to 21.10b0,49699333,dependabot[bot],closed,0,,,,,2,2021-11-01T13:11:23Z,2021-11-17T13:14:00Z,2021-11-17T13:13:58Z,CONTRIBUTOR,simonw/datasette/pulls/1500,"Bumps [black](https://github.com/psf/black) from 21.9b0 to 21.10b0.
Release notes
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.9b0&new-version=21.10b0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1500/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1026794056,I_kwDOCGYnMM49M6JI,331,Mypy error: found module but no type hints or library stubs,53032010,andreaslongo,closed,0,,,,,2,2021-10-14T20:29:50Z,2021-11-14T23:21:08Z,2021-11-14T23:21:08Z,NONE,,"```
Python 3.9.5
mypy 0.910
sqlite-utils 3.17.1
```
While using sqlite-utils as a library, when I use mypy for static type checking, it throws an error:
```
mypy .
src/etl.py:5: error: Skipping analyzing ""sqlite_utils"": found module but no type hints or library stubs
import sqlite_utils
^
src/etl.py:5: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports
test/test_etl.py:4: error: Skipping analyzing ""sqlite_utils"": found module but no type hints or library stubs
import sqlite_utils
^
Found 2 errors in 2 files (checked 7 source files)
```
When I add a `py.typed` file to the sqlite-utils package to mark it as PEP 561 compatible, the error goes away.
```
al@nbal ..b/python3.9/site-packages/sqlite_utils (git)-[main] % la
total 200
drwx------ 3 al al 4096 Oct 14 22:00 .
drwx------ 117 al al 4096 Oct 12 21:12 ..
-rw------- 1 al al 64409 Oct 12 21:11 cli.py
-rw------- 1 al al 109092 Oct 12 21:11 db.py
-rw------- 1 al al 0 Oct 14 22:00 py.typed
-rw------- 1 al al 684 Oct 12 21:11 recipes.py
-rw------- 1 al al 7988 Oct 12 21:11 utils.py
-rw------- 1 al al 113 Oct 12 21:11 __init__.py
```
I would like to suggest adding a `py.typed` file to the repository.
See also the mypy docs on creating PEP 561 compatible packages:
https://mypy.readthedocs.io/en/stable/installed_packages.html#creating-pep-561-compatible-packages
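For reference, a sketch of how a `py.typed` marker is usually shipped with a setuptools-based package (the names mirror the sqlite-utils layout, but this is not the project's actual setup.py):
```python
from setuptools import setup

setup(
    name='sqlite-utils',
    packages=['sqlite_utils'],
    # Ship the empty py.typed marker so type checkers treat the package
    # as PEP 561 compatible.
    package_data={'sqlite_utils': ['py.typed']},
)
```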
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/331/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1021550542,I_kwDOBm6k_c4845_O,1482,Support Python 3.10,9599,simonw,closed,0,,,,,2,2021-10-09T00:30:52Z,2021-10-24T22:21:40Z,2021-10-24T22:19:55Z,OWNER,,"I started work on this in #1481 where I found a Python 3.10 bug that needs a workaround in Janus, see:
- https://github.com/aio-libs/janus/issues/358
This is a tracking issue for anything else that shows up.
This is also needed for the Homebrew package to upgrade to 3.10:
- https://github.com/Homebrew/homebrew-core/pull/86932",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1482/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
950664971,MDU6SXNzdWU5NTA2NjQ5NzE=,1401,unordered list is not rendering bullet points in description_html on database page,536941,fgregg,open,0,,,,,2,2021-07-22T13:24:18Z,2021-10-23T13:09:10Z,,CONTRIBUTOR,,"Thanks for this tremendous package, @simonw!
In the `description_html` for a database, I [have an unordered list](https://github.com/labordata/warehouse/blob/fcea4502e5b615b0eb3e0bdcb45ec634abe20bb6/warehouse_metadata.yml#L19-L22).
However, on the database page on the deployed site, it is not rendering this as a bulleted list.
![Screenshot 2021-07-22 at 09-21-51 nlrb](https://user-images.githubusercontent.com/536941/126645923-2777b7f1-fd4c-4d2d-af70-a35e49a07675.png)
Page here: https://labordata-warehouse.herokuapp.com/nlrb-9da4ae5
The documentation gives an [example of using an unordered list](https://docs.datasette.io/en/stable/metadata.html#using-yaml-for-metadata) in a `description_html`, so I expected this will work.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1401/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
991237645,MDExOlB1bGxSZXF1ZXN0NzI5NzMxNDQx,326,Test against 3.10-dev,9599,simonw,closed,0,,,,,2,2021-09-08T15:01:15Z,2021-10-13T21:49:28Z,2021-10-13T21:49:28Z,OWNER,simonw/sqlite-utils/pulls/326,"This tests against the latest 3.10 RC, https://www.python.org/downloads/release/python-3100rc2/",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/326/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
994450961,MDU6SXNzdWU5OTQ0NTA5NjE=,1469,"Column cog shows ""facet by this"" when already default faceted",9599,simonw,closed,0,,,,,2,2021-09-13T04:51:26Z,2021-10-13T21:20:07Z,2021-10-13T21:20:07Z,OWNER,,"e.g. on https://covid-19.datasettes.com/covid/economist_excess_deaths
But if you add `?_facet=country` to the URL that goes away: https://covid-19.datasettes.com/covid/economist_excess_deaths?_facet_size=5&_facet=country
The logic that decides if the ""Facet by this"" item is shown does not take default `metadata.json` facets into account.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1469/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
602533481,MDU6SXNzdWU2MDI1MzM0ODE=,3,"Import EXIF data into SQLite - lens used, ISO, aperture etc",9599,simonw,open,0,,,5324096,Apple Photos online and securely browsable,2,2020-04-18T19:24:31Z,2021-10-05T12:38:24Z,,MEMBER,,,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
999902754,I_kwDOBm6k_c47mU4i,1473,base logo link visits `undefined` rather than href url,192568,mroswell,open,0,,,,,2,2021-09-18T04:17:04Z,2021-09-19T00:45:32Z,,CONTRIBUTOR,,"I have two connected sites:
http://www.SaferOrToxic.org
(a Hugo website)
and:
http://disinfectants.SaferOrToxic.org/disinfectants/listN
(a datasette table page)
The latter is linked as ""The List"" in the former's menu.
(I'd love a prettier URL, but that's what I've got.)
On:
http://disinfectants.SaferOrToxic.org/disinfectants/listN
... all the other menu links should point back to:
https://www.SaferOrToxic.org
And they do!
But the logo, for some reason--though it has an href pointing to:
https://www.SaferOrToxic.org
Keeps going to this instead:
https://disinfectants.saferortoxic.org/disinfectants/undefined
What is causing that? How can I fix it?
In #1284 back in March, I was doing battle with the index.html template, in a still unresolved issue. (I wanted only a single table page at the root.)
But I thought, well, if I can't resolve that, at least I could just point the main website to the datasette page (""The List,"") and then have the List point back to the home website.
The menu hrefs to https://www.SaferOrToxic.org work just fine, exactly as they should, from the datasette page. Even the Home link works properly.
But the logo link keeps rewriting to: https://disinfectants.saferortoxic.org/disinfectants/undefined
This is the HTML:
```
```
Is this somehow related to cloudflare?
Or something in the datasette code?
I'm starting to think it's a cloudflare issue.
Can I at least rule out it being a datasette issue?
My repository is here:
https://github.com/mroswell/list-N
(BTW, I couldn't figure out how to reference a local image, either, on the datasette side, which is why I'm using the image from the www home page.)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1473/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
982780906,MDExOlB1bGxSZXF1ZXN0NzIyNDgwNTQy,1453,Bump black from 21.7b0 to 21.8b0,49699333,dependabot[bot],closed,0,,,,,2,2021-08-30T13:13:39Z,2021-09-14T13:10:40Z,2021-09-14T13:10:38Z,CONTRIBUTOR,simonw/datasette/pulls/1453,"Bumps [black](https://github.com/psf/black) from 21.7b0 to 21.8b0.
Release notes
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.7b0&new-version=21.8b0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1453/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
994390593,MDU6SXNzdWU5OTQzOTA1OTM=,1468,Faceting for custom SQL queries,72577720,MichaelTiemannOSC,closed,0,,,,,2,2021-09-13T02:52:16Z,2021-09-13T04:54:22Z,2021-09-13T04:54:17Z,CONTRIBUTOR,,"Facets are awesome. But not when I need to join to tidy tables together. Or even just running explicitly the default SQL query that simply lists all the rows and columns of a table (up to SIZE). That is to say, when I browse a table, I see facets:
https://latest.datasette.io/fixtures/compound_three_primary_keys
But when I run a custom query, I don't:
https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+pk3%2C+content+from+compound_three_primary_keys+order+by+pk1%2C+pk2%2C+pk3+limit+101
Is there an idiom to cause custom SQL to come back with facet suggestions?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1468/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
602585497,MDU6SXNzdWU2MDI1ODU0OTc=,7,Integrate image content hashing,9599,simonw,open,0,,,,,2,2020-04-19T00:36:58Z,2021-08-26T02:01:01Z,,MEMBER,,To spot duplicate images (where the file content differs such that the sha256 is no longer a match) it would be useful to calculate and store perceptual hashes of some sort.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/7/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,
976405225,MDU6SXNzdWU5NzY0MDUyMjU=,320,sqlite-utils memory --analyze option,9599,simonw,closed,0,,,,,2,2021-08-22T15:37:10Z,2021-08-22T15:46:56Z,2021-08-22T15:44:29Z,OWNER,,To provide a way of running [analyze-tables](https://sqlite-utils.datasette.io/en/stable/cli.html#analyzing-tables) directly against JSON or CSV data.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/320/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
975166271,MDU6SXNzdWU5NzUxNjYyNzE=,20,Add index on workout_points.date,9599,simonw,open,0,,,,,2,2021-08-20T01:08:04Z,2021-08-20T01:12:48Z,,MEMBER,,"Sorting that by date makes sense for seeing most recent points, and my DB has 2.5m points in so it's an expensive sort!",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
779211940,MDExOlB1bGxSZXF1ZXN0NTQ5MjA0MDYz,55,Fix archive imports,21148,jacobian,closed,0,,,,,2,2021-01-05T15:54:48Z,2021-08-20T00:02:49Z,2021-08-20T00:02:49Z,CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/55,This fixes the issues discussed in #54,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
965210966,MDU6SXNzdWU5NjUyMTA5NjY=,314,Type signatures for `.create_table()` and `.create_table_sql()` and `.create()` and `Table.__init__`,9599,simonw,closed,0,,,,,2,2021-08-10T18:03:59Z,2021-08-18T22:25:21Z,2021-08-18T22:25:21Z,OWNER,,"> Adding type signatures to `create_table()` and `.create_table_sql()` is a bit too involved, I'll do that in a separate issue.
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/312#issuecomment-896200682_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/314/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
974067156,MDU6SXNzdWU5NzQwNjcxNTY=,318,Research: handle gzipped CSV directly,9599,simonw,open,0,,,,,2,2021-08-18T21:23:04Z,2021-08-18T21:25:30Z,,OWNER,,"Would it be worthwhile for the `sqlite-utils` command-line tool to grow features to efficiently directly interact with gzipped CSV data?
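A minimal sketch of the underlying mechanics with the standard library (not a proposal for the exact sqlite-utils API):
```python
import csv
import gzip

def rows_from_gzipped_csv(path):
    # Stream dicts out of a gzipped CSV file without decompressing it
    # to disk first.
    with gzip.open(path, 'rt', newline='') as fp:
        yield from csv.DictReader(fp)
```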
Maybe add `--gz` options to both `insert` and to the various commands that output query results.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/318/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
832687563,MDExOlB1bGxSZXF1ZXN0NTkzODA1ODA0,247,FTS quote functionality from datasette,16001974,DeNeutoy,closed,0,,,,,2,2021-03-16T11:17:34Z,2021-08-18T18:43:12Z,2021-08-18T18:43:12Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/247,"Addresses #246 - this is a bit of a kludge because it doesn't actually *validate* the FTS string, just makes sure that it will not crash when executed, but I figured that building a query parser is a bit out of the scope of sqlite-utils and if you actually want to use the query language, you probably need to parse that yourself.
",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/247/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
969548935,MDU6SXNzdWU5Njk1NDg5MzU=,1429,UI for setting `?_size=max` on table page,9599,simonw,open,0,,,,,2,2021-08-12T20:52:09Z,2021-08-13T04:37:41Z,,OWNER,,"It defaults to 100 per page, but you can increase that to 1000 per page using `?_size=max` (or higher if `max_returned_rows` is set higher than that).
But... that's only available to people who know how to hack URLs.
Solution: add a link that sets that option to the pagination block at the bottom of the table:
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1429/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
961008507,MDU6SXNzdWU5NjEwMDg1MDc=,308,Add an interactive tutorial as a Jupyter notebook,9599,simonw,open,0,,,,,2,2021-08-04T20:34:22Z,2021-08-04T21:30:59Z,,OWNER,,Can show people how to open this up in Binder.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/308/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
959284434,MDExOlB1bGxSZXF1ZXN0NzAyNDIyMjYz,1418,Spelling corrections plus CI job for codespell,9599,simonw,closed,0,,,,,2,2021-08-03T16:21:19Z,2021-08-03T16:36:39Z,2021-08-03T16:36:38Z,OWNER,simonw/datasette/pulls/1418,Refs #1417.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1418/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
951581763,MDU6SXNzdWU5NTE1ODE3NjM=,298,Read lines with JSON object,2172260,qqilihq,closed,0,,,,,2,2021-07-23T13:28:52Z,2021-08-03T06:50:47Z,2021-08-02T21:55:16Z,NONE,,"I found this posted on HN a while ago and love it -- thank you!
As a minor improvement, it would be great to have the ability to parse a file with line-separated JSON objects. Currently the parser obviously requires an array wrapping all these objects.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/298/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
958516743,MDU6SXNzdWU5NTg1MTY3NDM=,306,Configure sphinx.ext.extlinks for issues,9599,simonw,closed,0,,,,,2,2021-08-02T21:19:19Z,2021-08-02T21:39:34Z,2021-08-02T21:29:22Z,OWNER,,As seen in Datasette: https://github.com/simonw/datasette/issues/1227,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/306/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
957298475,MDU6SXNzdWU5NTcyOTg0NzU=,1407,OSError: AF_UNIX path too long in ds_unix_domain_socket_server,9599,simonw,closed,0,,,,,2,2021-07-31T18:36:06Z,2021-07-31T19:03:44Z,2021-07-31T19:03:44Z,OWNER,,"Got this exception while working on #1406.
```
@pytest.fixture(scope=""session"")
def ds_unix_domain_socket_server(tmp_path_factory):
socket_folder = tmp_path_factory.mktemp(""uds"")
uds = str(socket_folder / ""datasette.sock"")
ds_proc = subprocess.Popen(
[""datasette"", ""--memory"", ""--uds"", uds],
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
cwd=tempfile.gettempdir(),
)
# Give the server time to start
time.sleep(1.5)
# Check it started successfully
> assert not ds_proc.poll(), ds_proc.stdout.read().decode(""utf-8"")
E AssertionError: INFO: Started server process [48453]
E INFO: Waiting for application startup.
E INFO: Application startup complete.
E Traceback (most recent call last):
E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/bin/datasette"", line 33, in
E sys.exit(load_entry_point('datasette', 'console_scripts', 'datasette')())
E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 1137, in __call__
E return self.main(*args, **kwargs)
E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 1062, in main
E rv = self.invoke(ctx)
E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 1668, in invoke
E return _process_result(sub_ctx.command.invoke(sub_ctx))
E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 1404, in invoke
E return ctx.invoke(self.callback, **ctx.params)
E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/click/core.py"", line 763, in invoke
E return __callback(*args, **kwargs)
E File ""/Users/simon/Dropbox/Development/datasette/datasette/cli.py"", line 583, in serve
E uvicorn.run(ds.app(), **uvicorn_kwargs)
E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/main.py"", line 393, in run
E server.run()
E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py"", line 50, in run
E loop.run_until_complete(self.serve(sockets=sockets))
E File ""/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/base_events.py"", line 616, in run_until_complete
E return future.result()
E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py"", line 67, in serve
E await self.startup(sockets=sockets)
E File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/uvicorn/server.py"", line 133, in startup
E server = await asyncio.start_unix_server(
E File ""/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/streams.py"", line 132, in start_unix_server
E return await loop.create_unix_server(factory, path, **kwds)
E File ""/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/unix_events.py"", line 296, in create_unix_server
E sock.bind(path)
E OSError: AF_UNIX path too long
E
E assert not 1
E + where 1 = >()
E + where > = .poll
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1407/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
957302085,MDU6SXNzdWU5NTczMDIwODU=,1408,"Review places in codebase that use os.chdir(), in particular relating to tests",9599,simonw,open,0,,,,,2,2021-07-31T18:57:06Z,2021-07-31T19:00:32Z,,OWNER,,"> To clarify: the core problem here is that an error is thrown any time you call `os.getcwd()` but the directory you are currently in has been deleted.
>
> `runner.isolated_filesystem()` assumes that the current directory has not been deleted. But the various temporary directory utilities in `pytest` work by creating directories and then deleting them.
>
> Maybe there's a larger problem here that I play a bit fast and loose with `os.chdir()` in both the test suite and in various lines of code in Datasette itself (in particular in the publish commands)?
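A sketch of the kind of fixture the current hack implies (hypothetical code, not the actual conftest):
```python
import os

import pytest

@pytest.fixture
def restore_working_directory(tmp_path):
    # Remember where we started; fall back to a fresh temp dir if the
    # current directory has already been deleted by another fixture
    try:
        previous = os.getcwd()
    except OSError:
        previous = str(tmp_path)
    yield
    os.chdir(previous)
```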
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1406#issuecomment-890390198_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1408/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
951185411,MDU6SXNzdWU5NTExODU0MTE=,1402,feature request: social meta tags,536941,fgregg,open,0,,,,,2,2021-07-23T01:57:23Z,2021-07-26T19:31:41Z,,CONTRIBUTOR,,"It would be very nice if Twitter, Slack, and other social media platforms could make rich cards when people post a link to a Datasette instance
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1402/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
792652391,MDU6SXNzdWU3OTI2NTIzOTE=,1199,Experiment with PRAGMA mmap_size=N,9599,simonw,open,0,,,,,2,2021-01-23T21:24:09Z,2021-07-17T17:39:17Z,,OWNER,,"https://sqlite.org/mmap.html - SQLite supports memory-mapped I/O but it's disabled by default. The `PRAGMA mmap_size=N` option can be used to enable it.
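A minimal way to experiment with this from Python (the 256MB figure is arbitrary):
```python
import sqlite3

conn = sqlite3.connect('fixtures.db')
# Ask SQLite to memory-map up to 256MB of the database file
conn.execute('PRAGMA mmap_size = 268435456')
# Returns the mmap size actually in effect
print(conn.execute('PRAGMA mmap_size').fetchone()[0])
```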
It would be very interesting to understand the impact this could have on Datasette performance for various different shapes of data.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1199/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
940891698,MDU6SXNzdWU5NDA4OTE2OTg=,1390,Mention restarting systemd in documentation,9599,simonw,closed,0,,,,,2,2021-07-09T16:05:15Z,2021-07-09T16:32:57Z,2021-07-09T16:32:33Z,OWNER,,"https://docs.datasette.io/en/stable/deploying.html#running-datasette-using-systemd
Need to clarify that if you add a new database or change metadata, you need to restart the systemd service.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1390/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
927766296,MDU6SXNzdWU5Mjc3NjYyOTY=,291,Adopt flake8,9599,simonw,closed,0,,,,,2,2021-06-23T01:19:37Z,2021-06-24T17:50:27Z,2021-06-24T17:50:27Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/291/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
914130834,MDExOlB1bGxSZXF1ZXN0NjY0MDcyMDQ2,1370,Ensure db.path is a string before trying to insert into internal database,25778,eyeseast,closed,0,,,,,2,2021-06-08T01:16:48Z,2021-06-21T15:57:39Z,2021-06-21T15:57:39Z,CONTRIBUTOR,simonw/datasette/pulls/1370,"Fixes #1365
This is the simplest possible fix, with a test that will fail without it. There are a bunch of places where `db.path` is getting converted to and from a `Path` type, so this fix errs on the side of calling `str(db.path)` right before it's inserted.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1370/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
923612361,MDExOlB1bGxSZXF1ZXN0NjcyMzU5NjA5,277,add -h support closes #276,601708,mcint,closed,0,,,,,2,2021-06-17T08:08:26Z,2021-06-18T14:56:59Z,2021-06-18T14:56:59Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/277,This appears to be the [canonical solution](https://click.palletsprojects.com/en/7.x/documentation/#help-parameter-customization).,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/277/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
503190241,MDU6SXNzdWU1MDMxOTAyNDE=,584,Codec error in some CSV exports,9599,simonw,closed,0,,,,,2,2019-10-07T01:15:34Z,2021-06-17T18:13:20Z,2019-10-18T05:23:16Z,OWNER,,"Got this exploring my Swarm checkins:
![448DBFC4-71F8-4846-83C0-BEA511B2157A](https://user-images.githubusercontent.com/9599/66279259-3af53480-e865-11e9-9651-04fd2d895392.jpeg)
`/swarm/stickers.csv?stickerType=messageOnly&_size=max`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/584/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
516748849,MDU6SXNzdWU1MTY3NDg4NDk=,612,CSV export is broken for tables with null foreign keys,9599,simonw,closed,0,,,,,2,2019-11-02T22:52:47Z,2021-06-17T18:13:20Z,2019-11-02T23:12:53Z,OWNER,,"Following on from #406 - this CSV export appears to be broken:
https://14da705.datasette.io/fixtures/foreign_key_references.csv?_labels=on&_size=max
```csv
pk,foreign_key_with_label,foreign_key_with_label_label,foreign_key_with_no_label,foreign_key_with_no_label_label
1,1,hello,1,1
2,,
```
That second row should have 5 values, but it only has 4.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/612/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
736365306,MDU6SXNzdWU3MzYzNjUzMDY=,1083,Advanced CSV export for arbitrary queries,9599,simonw,open,0,,,,,2,2020-11-04T19:23:05Z,2021-06-17T18:12:31Z,,OWNER,,"There's no link to download the CSV file - the table page has that as an advanced export option, but this is missing from the query page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1083/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
759695780,MDU6SXNzdWU3NTk2OTU3ODA=,1133,Option to omit header row in CSV export,9599,simonw,closed,0,,,,,2,2020-12-08T18:54:46Z,2021-06-17T18:12:31Z,2020-12-10T23:28:51Z,OWNER,,`?_header=off` - for symmetry with existing option `?_nl=on`.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1133/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
919508498,MDU6SXNzdWU5MTk1MDg0OTg=,1375,JSON export dumps JSON fields as TEXT,4068,frafra,closed,0,,,,,2,2021-06-12T09:45:08Z,2021-06-14T09:41:59Z,2021-06-13T15:37:58Z,NONE,,"Hi!
When a user tries to export data as JSON, I would expect to see the value of JSON columns represented as JSON instead of being rendered as a string. What do you think?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1375/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
913809802,MDU6SXNzdWU5MTM4MDk4MDI=,1366,Get rid of this `restore_working_directory` hack entirely,9599,simonw,open,0,,,,,2,2021-06-07T18:01:21Z,2021-06-07T18:03:03Z,,OWNER,,"> That seems to have fixed it. I'd love to get rid of this `restore_working_directory` hack entirely.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1361#issuecomment-855308811_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1366/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
912464443,MDU6SXNzdWU5MTI0NjQ0NDM=,1360,"Security flaw, to be fixed in 0.56.1 and 0.57",9599,simonw,closed,0,,,,,2,2021-06-05T21:53:51Z,2021-06-05T22:23:23Z,2021-06-05T22:22:06Z,OWNER,,"See security advisory here for details: https://github.com/simonw/datasette/security/advisories/GHSA-xw7c-jx9m-xh5g - the `?_trace=1` debugging option was not correctly escaping its JSON output, resulting in a [reflected cross-site scripting](https://owasp.org/www-community/attacks/xss/#reflected-xss-attacks) vulnerability.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1360/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
849582643,MDExOlB1bGxSZXF1ZXN0NjA4MzM0MDk2,1291,Update docs: explain allow_download setting,5413548,louispotok,closed,0,,,,,2,2021-04-03T05:28:33Z,2021-06-05T19:48:51Z,2021-06-05T19:48:51Z,CONTRIBUTOR,simonw/datasette/pulls/1291,"This fixes one possible source of confusion seen in #502 and clarifies
when database downloads will be shown and allowed.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1291/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
828811618,MDU6SXNzdWU4Mjg4MTE2MTg=,1257,Table names containing single quotes break things,9599,simonw,closed,0,,,,,2,2021-03-11T06:29:38Z,2021-06-02T03:28:29Z,2021-06-02T03:28:29Z,OWNER,,"e.g. I found a table called `Yesterday's ELRs by County`
It threw an error inside the `detect_fts()` function attempting to run this SQL query:
```sql
select name from sqlite_master
where rootpage = 0
and (
sql like '%VIRTUAL TABLE%USING FTS%content=""Yesterday's ELRs by County""%'
or sql like '%VIRTUAL TABLE%USING FTS%content=[Yesterday's ELRs by County]%'
or (
tbl_name = ""Yesterday's ELRs by County""
and sql like '%VIRTUAL TABLE%USING FTS%'
)
)
```
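One possible fix is to escape embedded single quotes before interpolating the table name into those `like` patterns - a sketch with a hypothetical helper, not the actual Datasette code:
```python
def escape_single_quotes(name):
    # Double up embedded single quotes so the name is safe inside a SQL
    # string literal: Yesterday's -> Yesterday''s
    return name.replace(""'"", ""''"")
```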
Here's the code at fault: https://github.com/simonw/datasette/blob/640ac7071b73111ba4423812cd683756e0e1936b/datasette/utils/__init__.py#L534-L548",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1257/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
906977719,MDU6SXNzdWU5MDY5Nzc3MTk=,1350,?_nofacets=1 query string argument for disabling facets and suggested facets,9599,simonw,closed,0,,,,,2,2021-05-31T02:22:29Z,2021-06-01T16:19:38Z,2021-05-31T02:39:18Z,OWNER,,"This is needed as an internal option for #1349. `datasette-graphql` can benefit from this too - maybe can even use it so that if you pass `?_shape=array` it gets automatically added, fixing #263.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1350/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
908446997,MDU6SXNzdWU5MDg0NDY5OTc=,1353,?_nocount=1 for opting out of table counts,9599,simonw,closed,0,,,,,2,2021-06-01T15:53:27Z,2021-06-01T16:18:54Z,2021-06-01T16:17:04Z,OWNER,,"Running a trace against a CSV streaming export with the new `_trace=1` feature from #1351 shows that the following code is executing a `select count(*) from table` for every page of results returned: https://github.com/simonw/datasette/blob/d1d06ace49606da790a765689b4fbffa4c6deecb/datasette/views/table.py#L700-L705
This is inefficient - a new `?_nocount=1` option would let us disable this count in the same way as #1349: https://github.com/simonw/datasette/blob/d1d06ace49606da790a765689b4fbffa4c6deecb/datasette/views/base.py#L264-L276
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1353/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
904598267,MDExOlB1bGxSZXF1ZXN0NjU1NzQxNDI4,1348,DRAFT: add test and scan for docker images,10801138,blairdrummond,open,0,,,,,2,2021-05-28T03:02:12Z,2021-05-28T03:06:16Z,,CONTRIBUTOR,simonw/datasette/pulls/1348,"**NOTE: I don't think this PR is ready, since the arm/v6 and arm/v7 images are failing pytest due to missing dependencies (gcc and friends). But it's pretty close.**
Closes https://github.com/simonw/datasette/issues/1344 . Using a build-matrix for the platforms and [this test](https://github.com/simonw/datasette/issues/1344#issuecomment-849820019), we test all the platforms in parallel. I also threw in container scanning.
### Switch `pip install` to use either tags or commit shas
Notably! This also [changes the Dockerfile](https://github.com/blairdrummond/datasette/blob/7fe5315d68e04fce64b5bebf4e2d7feec44f8546/Dockerfile#L20) so that it accepts tags or commit-shas.
```
# It's backwards compatible with tags, but also lets you use shas
root@712071df17af:/# pip install git+git://github.com/simonw/datasette.git@0.56
Collecting git+git://github.com/simonw/datasette.git@0.56
Cloning git://github.com/simonw/datasette.git (to revision 0.56) to /tmp/pip-req-build-u6dhm945
Running command git clone -q git://github.com/simonw/datasette.git /tmp/pip-req-build-u6dhm945
Running command git checkout -q af5a7f1c09f6a902bb2a25e8edf39c7034d2e5de
Collecting Jinja2<2.12.0,>=2.10.3
Downloading Jinja2-2.11.3-py2.py3-none-any.whl (125 kB)
```
This lets you build the containers in CI every push for testing, which maybe resolves [this problem](https://github.com/simonw/datasette/issues/1272#issuecomment-808648974)?
# Workflow run example
You can see the results in my workflow [here](https://github.com/blairdrummond/datasette/pull/2/checks?check_run_id=2690570717). The commit history is different because I squashed this branch, also in the testing branch I had to change `github.com/simonw` to `github.com/blairdrummond` for the CI to pick up my git_sha.
## Why did the builds fail?
**NOTE:** The results of all the tests fail, but for different reasons! A few fail to install Rust, the amd64 passes the tests (phew!) but has critical CVEs which fail the container scan, the Arm/v6 and Arm/v7 seem to fail to install the test dependencies due to missing programs like `gcc`. (`gcc` is not sufficient though, as [this run](https://github.com/blairdrummond/datasette/pull/3/checks?check_run_id=2690672982) indicates) ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1348/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
892457208,MDU6SXNzdWU4OTI0NTcyMDg=,1327,Support Unicode characters in metadata.json,20846286,GmGniap,closed,0,,,,,2,2021-05-15T14:33:58Z,2021-05-24T19:10:21Z,2021-05-24T19:10:21Z,NONE,,"Hello, when I use Burmese (Unicode) characters in metadata.json like below -
![image](https://user-images.githubusercontent.com/20846286/118364978-cba70100-b5c0-11eb-967c-7dc3b62478f2.png)
It gives wrong results when I run datasette -
![image](https://user-images.githubusercontent.com/20846286/118365025-fc873600-b5c0-11eb-97ce-19541b8cc6d8.png)
It would be great and helpful for us if metadata.json could support Unicode-based Asian languages.
Thanks & Regards. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1327/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
890073989,MDExOlB1bGxSZXF1ZXN0NjQzMTQ5MzY0,1325,"Update itsdangerous requirement from ~=1.1 to >=1.1,<3.0",49699333,dependabot[bot],closed,0,,,,,2,2021-05-12T13:09:03Z,2021-05-22T23:54:25Z,2021-05-22T23:54:25Z,CONTRIBUTOR,simonw/datasette/pulls/1325,"Updates the requirements on [itsdangerous](https://github.com/pallets/itsdangerous) to permit the latest version.
Release notes
Follow our blog, Twitter, or GitHub to see future announcements.
This represents a significant amount of work, and there are quite a few changes. Be sure to carefully read the changelog, and use tools such as pip-compile and Dependabot to pin your dependencies and control your updates.
JWS support (JSONWebSignatureSerializer,
TimedJSONWebSignatureSerializer) is deprecated. Use a dedicated
JWS/JWT library such as authlib instead. :issue:129
Importing itsdangerous.json is deprecated. Import Python's
json module instead. :pr:152
Simplejson is no longer used if it is installed. To use a different
library, pass it as Serializer(serializer=...). :issue:146
datetime values are timezone-aware with timezone.utc. Code
using TimestampSigner.unsign(return_timestamp=True) or
BadTimeSignature.date_signed may need to change. :issue:150
If a signature has an age less than 0, it will raise
SignatureExpired rather than appearing valid. This can happen if
the timestamp offset is changed. :issue:126
BadTimeSignature.date_signed is always a datetime object
rather than an int in some cases. :issue:124
Added support for key rotation. A list of keys can be passed as
secret_key, oldest to newest. The newest key is used for
signing, all keys are tried for unsigning. :pr:141
Removed the default SHA-512 fallback signer from
default_fallback_signers. :issue:155
Add type information for static typing tools. :pr:186
Version 1.1.0
Released 2018-10-26
Change default signing algorithm back to SHA-1. :pr:113
Added a default SHA-512 fallback for users who used the yanked 1.0.0
release which defaulted to SHA-512. :pr:114
Add support for fallback algorithms during deserialization to
support changing the default in the future without breaking existing
signatures. :pr:113
Changed capitalization of packages back to lowercase as the change
in capitalization broke some tooling. :pr:113
Version 1.0.0
Released 2018-10-18
YANKED
... (truncated)
Commits
d101100 Merge pull request #235 from pallets/release-2.0.0
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1325/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
847423559,MDU6SXNzdWU4NDc0MjM1NTk=,253,fixtures.db example error in sql-utils blog post,192568,mroswell,closed,0,,,,,2,2021-03-31T22:07:36Z,2021-05-19T03:31:48Z,2021-05-19T03:31:47Z,NONE,,"En route to trying to understand column order transform documentation, I tried the instructions here:
https://simonwillison.net/2020/Sep/23/sqlite-advanced-alter-table/
I get a malformed database schema syntax error.
```
$ wget https://latest.datasette.io/fixtures.db
--2021-03-31 18:00:23-- https://latest.datasette.io/fixtures.db
Resolving latest.datasette.io (latest.datasette.io)... 2607:f8b0:4004:801::2013, 142.250.73.211
Connecting to latest.datasette.io (latest.datasette.io)|2607:f8b0:4004:801::2013|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [application/octet-stream]
Saving to: ‘fixtures.db’
fixtures.db [ <=> ] 260.00K --.-KB/s in 0.1s
2021-03-31 18:00:23 (2.41 MB/s) - ‘fixtures.db’ saved [266240]
$ sqlite3 fixtures.db '.schema facetable'
Error: malformed database schema (generated_columns) - near ""AS"": syntax error
$ sqlite3 fixtures.db
SQLite version 3.28.0 2019-04-15 14:49:49
Enter "".help"" for usage hints.
sqlite> .schema
Error: malformed database schema (generated_columns) - near ""AS"": syntax error
```
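(For context - this is an inference, not stated above - `fixtures.db` uses generated columns, which require SQLite 3.31.0 or newer, while the shell above reports 3.28.0. A quick way to check which SQLite library a given Python links against:)
```python
import sqlite3

# Prints the SQLite library version the sqlite3 module was compiled against;
# generated columns need 3.31.0 or newer
print(sqlite3.sqlite_version)
```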
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/253/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
861622839,MDU6SXNzdWU4NjE2MjI4Mzk=,256,inserting with --nl errors with: sqlite3.OperationalError: table has no column named,279769,rathboma,closed,0,,,,,2,2021-04-19T18:01:03Z,2021-05-19T03:26:54Z,2021-05-19T03:26:54Z,NONE,,"I have a `jsonl` file, it is 10,000 lines long.
Inserting from the cli with `sqlite-utils insert db table file --nl --batch-size 10000` fails with this missing column error, even though I'm telling it to use the whole file in the first batch.
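If the failure comes from later lines introducing keys that were not present in the first record (an assumption - the file itself is not shown here), the `--alter` option may be a workaround. The equivalent Python call, with placeholder names:
```python
import json

import sqlite_utils

db = sqlite_utils.Database('data.db')
with open('records.jsonl') as fp:
    rows = (json.loads(line) for line in fp)
    # alter=True (the --alter flag on the CLI) adds any columns that only
    # appear in later records instead of raising OperationalError
    db['records'].insert_all(rows, alter=True, batch_size=10000)
```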
This seems similar to #18 and #139, but maybe it's unique to `--nl`?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/256/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
890073940,MDExOlB1bGxSZXF1ZXN0NjQzMTQ5MzIw,1324,"Update jinja2 requirement from <2.12.0,>=2.10.3 to >=2.10.3,<3.1.0",49699333,dependabot[bot],closed,0,,,,,2,2021-05-12T13:08:59Z,2021-05-17T17:19:41Z,2021-05-17T17:19:40Z,CONTRIBUTOR,simonw/datasette/pulls/1324,"Updates the requirements on [jinja2](https://github.com/pallets/jinja) to permit the latest version.
Release notes
Follow our blog, Twitter, or GitHub to see future announcements.
This represents a significant amount of work, and there are quite a few changes. Be sure to carefully read the changelog, and use tools such as pip-compile and Dependabot to pin your dependencies and control your updates.
Use :pep:451 API to load templates with
:class:~loaders.PackageLoader. :issue:1168
Fix a bug that caused imported macros to not have access to the
current template's globals. :issue:688
Add ability to ignore trim_blocks using +%}. :issue:1036
Fix a bug that caused custom async-only filters to fail with
constant input. :issue:1279
Fix UndefinedError incorrectly being thrown on an undefined variable
instead of Undefined being returned on
NativeEnvironment on Python 3.10. :issue:1335
Blocks can be marked as required. They must be overridden at
some point, but not necessarily by the direct child. :issue:1147
Deprecate the autoescape and with extensions, they are
built-in to the compiler. :issue:1203
The urlize filter recognizes mailto: links and takes
extra_schemes (or env.policies["urlize.extra_schemes"]) to
recognize other schemes. It tries to balance parentheses within a
URL instead of ignoring trailing characters. The parsing in general
has been updated to be more efficient and match more cases. URLs
without a scheme are linked as https:// instead of http://.
:issue:522, 827, 1172, :pr:1195
Filters that get attributes, such as map and groupby, can
use a false or empty value as a default. :issue:1331
Fix a bug that prevented variables set in blocks or loops from
being accessed in custom context functions. :issue:768
Fix a bug that caused scoped blocks from accessing special loop
variables. :issue:1088
Update the template globals when calling
Environment.get_template(globals=...) even if the template was
already loaded. :issue:295
Do not raise an error for undefined filters in unexecuted
if-statements and conditional expressions. :issue:842
Add is filter and is test tests to test if a name is a
registered filter or test. This allows checking if a filter is
available in a template before using it. Test functions can be
decorated with @pass_environment, @pass_eval_context,
or @pass_context. :issue:842, :pr:1248
Support pgettext and npgettext (message contexts) in i18n
extension. :issue:441
The |indent filter's width argument can be a string to
... (truncated)
Commits
417f822 Merge pull request #1417 from pallets/release-3.0.0
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1324/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
876431852,MDExOlB1bGxSZXF1ZXN0NjMwNTc4NzM1,1318,Bump black from 21.4b2 to 21.5b0,49699333,dependabot[bot],closed,0,,,,,2,2021-05-05T13:07:51Z,2021-05-11T13:12:32Z,2021-05-11T13:12:31Z,CONTRIBUTOR,simonw/datasette/pulls/1318,"Bumps [black](https://github.com/psf/black) from 21.4b2 to 21.5b0.
Release notes
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=21.4b2&new-version=21.5b0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1318/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
842862708,MDU6SXNzdWU4NDI4NjI3MDg=,1280,Ability to run CI against multiple SQLite versions,9599,simonw,open,0,,,,,2,2021-03-28T23:54:50Z,2021-05-10T19:07:46Z,,OWNER,,"Issue #1276 happened because I didn't run tests against a SQLite version prior to 3.16.0 (released 2017-01-02).
Glitch is a deployment target and runs SQLite 3.11.0 from 2016-02-15.
If CI ran against that version of SQLite this bug could have been avoided.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1280/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
870125126,MDU6SXNzdWU4NzAxMjUxMjY=,1310,I'm creating a plugin to export a spreadsheet file (.ods or .xlsx),3747136,ColinMaudry,closed,0,,,,,2,2021-04-28T16:20:11Z,2021-04-30T07:26:11Z,2021-04-30T06:58:46Z,NONE,,"Hi,
I have started developing a plugin to export records as a spreadsheet file. It could be ods or xlsx, whatever is easier.
I have spotted the following packages:
- ods files: https://pypi.org/project/odswriter/
- xlsx files: https://openpyxl.readthedocs.io/en/stable/index.html (quite powerful) or https://xlsxwriter.readthedocs.io/ (faster)
This is the code I have so far; I test it with the `--plugins-dir` option:
```python
from datasette import hookimpl
from datasette.utils.asgi import Response
import odswriter as ods
def render_spreadsheet(rows):
with ods.writer(open(""test.ods"",""wb"")) as odsfile:
for row in rows:
odsfile.writerow([""String"", ""ABCDEF123456"", ""123456""])
return Response(odsfile, content_type=""application/vnd.oasis.opendocument.spreadsheet"", status=200)
@hookimpl
def register_output_renderer():
return {""extension"": ""ods"", ""render"": render_spreadsheet}
```
I get the following error:
```
Traceback (most recent call last):
File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1128, in route_path
await response.asgi_send(send)
File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 339, in asgi_send
body = body.encode(""utf-8"")
AttributeError: 'ODSWriter' object has no attribute 'encode'
ERROR: Exception in ASGI application
Traceback (most recent call last):
File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1128, in route_path
await response.asgi_send(send)
File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 339, in asgi_send
body = body.encode(""utf-8"")
AttributeError: 'ODSWriter' object has no attribute 'encode'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File ""/home/colin/.local/lib/python3.8/site-packages/uvicorn/protocols/http/h11_impl.py"", line 396, in run_asgi
result = await app(self.scope, self.receive, self.send)
File ""/home/colin/.local/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py"", line 45, in __call__
return await self.app(scope, receive, send)
File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 161, in __call__
await self.app(scope, receive, send)
File ""/home/colin/.local/lib/python3.8/site-packages/datasette/tracer.py"", line 75, in __call__
await self.app(scope, receive, send)
File ""/home/colin/.local/lib/python3.8/site-packages/asgi_csrf.py"", line 107, in app_wrapped_with_csrf
await app(scope, receive, wrapped_send)
File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1086, in __call__
return await self.route_path(scope, receive, send, path)
File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1133, in route_path
return await self.handle_500(request, send, exception)
File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1267, in handle_500
await asgi_send_html(
File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 217, in asgi_send_html
await asgi_send(
File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 237, in asgi_send
await asgi_start(send, status, headers, content_type)
File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 246, in asgi_start
await send(
File ""/home/colin/.local/lib/python3.8/site-packages/asgi_csrf.py"", line 103, in wrapped_send
await send(event)
File ""/home/colin/.local/lib/python3.8/site-packages/uvicorn/protocols/http/h11_impl.py"", line 482, in send
raise RuntimeError(msg % message_type)
RuntimeError: Expected ASGI message 'http.response.body', but got 'http.response.start'.
```
I tried with `AsgiFileDownload` like in [DatabaseDownload](https://github.com/simonw/datasette/blob/main/datasette/views/database.py#L150) to deal with the binary nature of the ods file, but the renderer expects a Response:
> should be dict or Response
However, the `Response` class only supports the following methods, none of which handle binary content:
- html
- text
- json
- redirect
How would you suggest I proceed to have my ods file downloaded?
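For reference, a minimal sketch of the buffering half of this, assuming `odswriter` accepts any writable binary file object:
```python
import io

import odswriter as ods

def render_spreadsheet(rows):
    # Build the .ods file in an in-memory buffer instead of handing the
    # ODSWriter object itself to Response (which is what triggers the
    # 'encode' AttributeError above)
    buffer = io.BytesIO()
    with ods.writer(buffer) as odsfile:
        for row in rows:
            odsfile.writerow([str(value) for value in row])
    # The open question is how to return these raw bytes, since
    # Response.html/.text/.json all expect strings
    return buffer.getvalue()
```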
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1310/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
871046111,MDExOlB1bGxSZXF1ZXN0NjI2MTMwMTM1,1313,Bump black from 20.8b1 to 21.4b2,27856297,dependabot-preview[bot],closed,0,,,,,2,2021-04-29T13:58:06Z,2021-04-29T15:47:50Z,2021-04-29T15:47:49Z,CONTRIBUTOR,simonw/datasette/pulls/1313,"Bumps [black](https://github.com/psf/black) from 20.8b1 to 21.4b2.
Release notes
Fix crash when atypical whitespace is cleaned out of docstrings (#2120)
Reflect the --skip-magic-trailing-comma and --experimental-string-processing flags
in the name of the cache file. Without this fix, changes in these flags would not take
effect if the cache had already been populated. (#2131)
Bump pathspec to >= 0.8.1 to solve invalid .gitignore exclusion handling
21.4b0
Black
Fixed a rare but annoying formatting instability created by the combination of
optional trailing commas inserted by Black and optional parentheses looking at
pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many
duplicates. (#2126)
Black now processes one-line docstrings by stripping leading and trailing spaces,
and adding a padding space when needed to break up """". (#1740)
Black now cleans up leading non-breaking spaces in comments (#2092)
Black now respects --skip-string-normalization when normalizing multiline
docstring quotes (#1637)
Black no longer removes all empty lines between non-function code and decorators
when formatting typing stubs. Now Black enforces a single empty line. (#1646)
Fix crash when atypical whitespace is cleaned out of docstrings (#2120)
Reflect the --skip-magic-trailing-comma and --experimental-string-processing flags
in the name of the cache file. Without this fix, changes in these flags would not take
effect if the cache had already been populated. (#2131)
Bump pathspec to >= 0.8.1 to solve invalid .gitignore exclusion handling
21.4b0
Black
Fixed a rare but annoying formatting instability created by the combination of
optional trailing commas inserted by Black and optional parentheses looking at
pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many
duplicates. (#2126)
Black now processes one-line docstrings by stripping leading and trailing spaces,
and adding a padding space when needed to break up """". (#1740)
Black now cleans up leading non-breaking spaces in comments (#2092)
Black now respects --skip-string-normalization when normalizing multiline
docstring quotes (#1637)
[![Dependabot compatibility score](https://api.dependabot.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b2)](https://dependabot.com/compatibility-score/?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b2)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
- `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme
Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):
- Update frequency (including time of day and day of week)
- Pull request limits (per update run and/or open at any time)
- Out-of-range updates (receive only lockfile updates, if desired)
- Security updates (receive only security updates, if desired)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1313/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
870227815,MDExOlB1bGxSZXF1ZXN0NjI1NDU3NTc5,1311,Bump black from 20.8b1 to 21.4b1,27856297,dependabot-preview[bot],closed,0,,,,,2,2021-04-28T18:25:58Z,2021-04-29T13:58:11Z,2021-04-29T13:58:09Z,CONTRIBUTOR,simonw/datasette/pulls/1311,"Bumps [black](https://github.com/psf/black) from 20.8b1 to 21.4b1.
Release notes
Fix crash when atypical whitespace is cleaned out of docstrings (#2120)
Reflect the --skip-magic-trailing-comma and --experimental-string-processing flags
in the name of the cache file. Without this fix, changes in these flags would not take
effect if the cache had already been populated. (#2131)
Bump pathspec to >= 0.8.1 to solve invalid .gitignore exclusion handling
21.4b0
Black
Fixed a rare but annoying formatting instability created by the combination of
optional trailing commas inserted by Black and optional parentheses looking at
pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many
duplicates. (#2126)
Black now processes one-line docstrings by stripping leading and trailing spaces,
and adding a padding space when needed to break up """". (#1740)
Black now cleans up leading non-breaking spaces in comments (#2092)
Black now respects --skip-string-normalization when normalizing multiline
docstring quotes (#1637)
Black no longer removes all empty lines between non-function code and decorators
when formatting typing stubs. Now Black enforces a single empty line. (#1646)
Black no longer adds an incorrect space after a parenthesized assignment expression
in if/while statements (#1655)
Added --skip-magic-trailing-comma / -C to avoid using trailing commas as a reason
to split lines (#1824)
Fix crash when atypical whitespace is cleaned out of docstrings (#2120)
Reflect the --skip-magic-trailing-comma and --experimental-string-processing flags
in the name of the cache file. Without this fix, changes in these flags would not take
effect if the cache had already been populated. (#2131)
Bump pathspec to >= 0.8.1 to solve invalid .gitignore exclusion handling
21.4b0
Black
Fixed a rare but annoying formatting instability created by the combination of
optional trailing commas inserted by Black and optional parentheses looking at
pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many
duplicates. (#2126)
Black now processes one-line docstrings by stripping leading and trailing spaces,
and adding a padding space when needed to break up """". (#1740)
Black now cleans up leading non-breaking spaces in comments (#2092)
Black now respects --skip-string-normalization when normalizing multiline
docstring quotes (#1637)
Black no longer removes all empty lines between non-function code and decorators
when formatting typing stubs. Now Black enforces a single empty line. (#1646)
Black no longer adds an incorrect space after a parenthesized assignment expression
in if/while statements (#1655)
Added --skip-magic-trailing-comma / -C to avoid using trailing commas as a reason
to split lines (#1824)
[![Dependabot compatibility score](https://api.dependabot.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b1)](https://dependabot.com/compatibility-score/?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b1)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
- `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme
Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):
- Update frequency (including time of day and day of week)
- Pull request limits (per update run and/or open at any time)
- Out-of-range updates (receive only lockfile updates, if desired)
- Security updates (receive only security updates, if desired)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1311/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
869237023,MDExOlB1bGxSZXF1ZXN0NjI0NjM1NDQw,1309,Bump black from 20.8b1 to 21.4b0,27856297,dependabot-preview[bot],closed,0,,,,,2,2021-04-27T20:28:11Z,2021-04-28T18:26:06Z,2021-04-28T18:26:04Z,CONTRIBUTOR,simonw/datasette/pulls/1309,"Bumps [black](https://github.com/psf/black) from 20.8b1 to 21.4b0.
Release notes
Fixed a rare but annoying formatting instability created by the combination of
optional trailing commas inserted by Black and optional parentheses looking at
pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many
duplicates. (#2126)
Black now processes one-line docstrings by stripping leading and trailing spaces,
and adding a padding space when needed to break up """". (#1740)
Black now cleans up leading non-breaking spaces in comments (#2092)
Black now respects --skip-string-normalization when normalizing multiline
docstring quotes (#1637)
Black no longer removes all empty lines between non-function code and decorators
when formatting typing stubs. Now Black enforces a single empty line. (#1646)
Black no longer adds an incorrect space after a parenthesized assignment expression
in if/while statements (#1655)
Added --skip-magic-trailing-comma / -C to avoid using trailing commas as a reason
to split lines (#1824)
Fixed a rare but annoying formatting instability created by the combination of
optional trailing commas inserted by Black and optional parentheses looking at
pre-existing "magic" trailing commas. This fixes issue #1629 and all of its many many
duplicates. (#2126)
Black now processes one-line docstrings by stripping leading and trailing spaces,
and adding a padding space when needed to break up """". (#1740)
Black now cleans up leading non-breaking spaces in comments (#2092)
Black now respects --skip-string-normalization when normalizing multiline
docstring quotes (#1637)
Black no longer removes all empty lines between non-function code and decorators
when formatting typing stubs. Now Black enforces a single empty line. (#1646)
Black no longer adds an incorrect space after a parenthesized assignment expression
in if/while statements (#1655)
Added --skip-magic-trailing-comma / -C to avoid using trailing commas as a reason
to split lines (#1824)
[![Dependabot compatibility score](https://api.dependabot.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b0)](https://dependabot.com/compatibility-score/?dependency-name=black&package-manager=pip&previous-version=20.8b1&new-version=21.4b0)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
- `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme
Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):
- Update frequency (including time of day and day of week)
- Pull request limits (per update run and/or open at any time)
- Out-of-range updates (receive only lockfile updates, if desired)
- Security updates (receive only security updates, if desired)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1309/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
281110295,MDU6SXNzdWUyODExMTAyOTU=,173,I18n and L10n support,50138,janimo,open,0,,,,,2,2017-12-11T17:49:58Z,2021-04-26T12:10:01Z,,NONE,,It would be less geeky and more user friendly if the display strings in the filter menu and possibly other parts could be localized.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/173/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
791237799,MDU6SXNzdWU3OTEyMzc3OTk=,1196,Access Denied Error in Windows,2826376,QAInsights,open,0,,,,,2,2021-01-21T15:40:40Z,2021-04-14T19:28:38Z,,NONE,,"I am trying to publish a db to Vercel, but issuing the below command throws an `Access Denied` error, which leads to `RecursionError: maximum recursion depth exceeded while calling a Python object`.
I am using PyCharm and Python 3.9. I have reinstalled both and launched PyCharm as Admin in Windows 10. But still the issue persists.
Issued command `datasette publish vercel jmeter.db --project jmeter --install datasette-vega`
PS: localhost is working fine.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
826700095,MDU6SXNzdWU4MjY3MDAwOTU=,1255,Facets timing out but work when filtering,1219001,robroc,open,0,,,,,2,2021-03-09T22:01:39Z,2021-04-02T20:50:08Z,,NONE,,"System info:
Windows 10
Datasette 0.55 installed via pip
Python 3.8.5 in a conda environment
I'm getting the message `These facets timed out` on any faceting operation. However, when I apply a filter, the facets appear in the filtered view. The error returns when the filter is removed. My data only has 38,450 rows.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1255/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
843739658,MDExOlB1bGxSZXF1ZXN0NjAzMDgyMjgw,1282,Fix little typo,192568,mroswell,closed,0,,,,,2,2021-03-29T19:45:28Z,2021-03-29T19:57:34Z,2021-03-29T19:57:34Z,CONTRIBUTOR,simonw/datasette/pulls/1282,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1282/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
576722115,MDU6SXNzdWU1NzY3MjIxMTU=,696,Single failing unit test when run inside the Docker image,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2020-03-06T06:16:36Z,2021-03-29T17:04:19Z,2021-03-07T07:41:18Z,OWNER,,"```
docker run -it -v `pwd`:/mnt datasetteproject/datasette:latest /bin/bash
root@0e1928cfdf79:/# cd /mnt
root@0e1928cfdf79:/mnt# pip install -e .[test]
root@0e1928cfdf79:/mnt# pytest
```
I get one failure!
It was for `test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]`
```
def test_searchable(app_client, path, expected_rows):
response = app_client.get(path)
> assert expected_rows == response.json[""rows""]
E AssertionError: assert [[1, 'barry c...sel', 'puma']] == []
E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther']
E Full diff:
E + []
E - [[1, 'barry cat', 'terry dog', 'panther'],
E - [2, 'terry dog', 'sara weasel', 'puma']]
```
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/695#issuecomment-595614469_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/696/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
831163537,MDExOlB1bGxSZXF1ZXN0NTkyNTQ4MTAz,1260,Fix: code quality issues,25361949,withshubh,closed,0,,,,,2,2021-03-14T13:56:10Z,2021-03-29T00:22:41Z,2021-03-29T00:22:41Z,NONE,simonw/datasette/pulls/1260,"### Description
Hi :wave: I work at [DeepSource](https://deepsource.io). I ran DeepSource analysis on the forked copy of this repo and found some interesting [code quality issues](https://deepsource.io/gh/withshubh/datasette/issues/?category=recommended) in the codebase, and I'm opening this PR so you can assess whether our platform is right and helpful for you.
### Summary of changes
- Replaced ternary syntax with if expression
- Removed redundant `None` default
- Used `is` to compare type of objects
- Iterated dictionary directly
- Removed unnecessary lambda expression
- Refactored unnecessary `else` / `elif` when `if` block has a `return` statement
- Refactored unnecessary `else` / `elif` when `if` block has a `raise` statement
- Added .deepsource.toml to continuously analyze and detect code quality issues",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
825217564,MDExOlB1bGxSZXF1ZXN0NTg3MzMyNDcz,1252,Add back styling to lists within table cells (fixes #1141),7476523,bobwhitelock,closed,0,,,,,2,2021-03-09T03:00:57Z,2021-03-29T00:14:04Z,2021-03-29T00:14:04Z,CONTRIBUTOR,simonw/datasette/pulls/1252,"This overrides the Datasette reset - see https://github.com/simonw/datasette/blob/d0fd833b8cdd97e1b91d0f97a69b494895d82bee/datasette/static/app.css#L35-L38 - to add back the default styling of list items displayed within Datasette table cells.
Following this change, the same content as in the original issue looks like this:
![2021-03-09_02:57:32](https://user-images.githubusercontent.com/7476523/110411982-63e5ae80-8083-11eb-9b5c-e5dc825073e2.png)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1252/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
842556944,MDExOlB1bGxSZXF1ZXN0NjAyMTA3OTM1,1279,Minor Docs Update. Added `--app` to fly install command.,1019791,koaning,closed,0,,,,,2,2021-03-27T16:58:08Z,2021-03-29T00:11:55Z,2021-03-29T00:11:55Z,CONTRIBUTOR,simonw/datasette/pulls/1279,"Without this flag, there's an error locally.
```
> datasette publish fly bigmac.db
Usage: datasette publish fly [OPTIONS] [FILES]...
Try 'datasette publish fly --help' for help.
Error: Missing option '-a' / '--app'.
```
I also got an error message which later turned out to be because I hadn't added my credit card information yet to `fly`. I wasn't sure if I should add that mention to the docs here, or to submit a bug-report over at https://github.com/simonw/datasette-publish-fly. ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1279/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
839008371,MDU6SXNzdWU4MzkwMDgzNzE=,1274,Might there be some way to comment metadata.json?,192568,mroswell,closed,0,,,,,2,2021-03-23T18:33:00Z,2021-03-23T20:14:54Z,2021-03-23T20:14:54Z,CONTRIBUTOR,,"I don't know what license to use... Would be nice to be able to add a comment regarding that uncertainty in my metadata.json file
I like laktak's little video comment in favor of Human json (Hjson)
https://stackoverflow.com/questions/244777/can-comments-be-used-in-json
Hmmm... one of the commenters there said comments are allowed in yaml... so that's a good argument for yaml.
Anyhow, just came to mind, and thought I'd mention it here. Looks like https://hjson.github.io/ has the details.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1274/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
279547886,MDU6SXNzdWUyNzk1NDc4ODY=,163,Document the querystring argument for setting a different time limit,9599,simonw,closed,0,,,,,2,2017-12-05T22:05:08Z,2021-03-23T02:44:33Z,2017-12-06T15:06:57Z,OWNER,,"http://datasette.readthedocs.io/en/latest/sql_queries.html#query-limits
Need to explain why this is useful too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/163/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
837348479,MDU6SXNzdWU4MzczNDg0Nzk=,1269,Don't attempt to run count(*) against virtual tables,9599,simonw,closed,0,,,,,2,2021-03-22T05:57:43Z,2021-03-22T17:40:42Z,2021-03-22T17:40:41Z,OWNER,,"Counting the rows in a virtual table doesn't seem very interesting to me, and it's the cause of at least one crashing bug with SpatiaLite 5.0 on Linux, see https://github.com/simonw/datasette/issues/1268",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1269/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
836963850,MDU6SXNzdWU4MzY5NjM4NTA=,249,Full text search possibly broken?,36287,prabhur,closed,0,,,,,2,2021-03-21T02:03:44Z,2021-03-21T02:43:32Z,2021-03-21T02:43:32Z,NONE,,"I'm not quite sure if this is an issue with sqlite-utils or datasette.
**Background**
I was previously using sqlite-utils version < 3.6. I have a bunch of csv files that have some data scraped from a website.
```
sqlite-utils create-table mydb.db post \
posted_date text \
url text \
title text \
raw_text text \
--not-null posted_date \
--not-null url \
--pk=url
```
FTS is enabled via
`sqlite-utils enable-fts ./mydb.db post title raw_text`
Data is loaded to the table via
`sqlite-utils insert ./mydb.db post ${filename} --csv`
Note that the data contains text in my language Tamil.
Loading happens just fine.
datasette serves the db file just fine. It recognizes FTS and shows the ""search"" box. However, none of the queries work. Whatever text I supply, it always returns 0 rows. I literally copy words from the row listing on the screen and paste them into the search box.
Interestingly, the only thing I can remember changing is switching to sqlite-utils 3.6. I had to do this because the prior version had an issue with column size.
I have attached one of the csv files that can be loaded to the table. Substitute ""${filename}"" with that file for the sqlite-utils insert command.
[posts_20200417-20201231.csv.zip](https://github.com/simonw/sqlite-utils/files/6176697/posts_20200417-20201231.csv.zip)
Interestingly, the FTS based search from datasette worked just fine before this version upgrade. That is, the queries returned results. I will try to downgrade just to see if the theory is correct.
I appreciate any help here. Thanks.
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/249/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
824067604,MDU6SXNzdWU4MjQwNjc2MDQ=,1250,Research: Plugin hook for alternative database connections,9599,simonw,closed,0,,,,,2,2021-03-08T00:28:15Z,2021-03-12T01:01:25Z,2021-03-12T01:01:17Z,OWNER,,"The `Database` class is a natural looking fit for a plugin hook to load custom database connections... potentially even databases other than SQLite. DuckDB (refs #968) could make for a great starting point, since it looks very compatible with the existing SQLite code.
The real win would be if this could lead to running Datasette against PostgreSQL. I made some initial explorations in that direction a while ago in #670.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1250/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
794554881,MDU6SXNzdWU3OTQ1NTQ4ODE=,1208,A lot of open(file) functions are used without a context manager thus producing ResourceWarning: unclosed file <_io.TextIOWrapper,4488943,kbaikov,closed,0,,,,,2,2021-01-26T20:56:28Z,2021-03-11T16:15:49Z,2021-03-11T16:15:49Z,CONTRIBUTOR,,"Your code is full of open files that are never closed, especially when you deal with reading/writing json/yaml files.
If you run Python with warnings enabled, this problem becomes evident.
This probably contributes to some memory leaks in long-running datasettes if the GC does not 'collect' those resources properly.
This is easily fixed by using a context manager instead of just using open:
```python
with open('some_file', 'w') as opened_file:
opened_file.write('string')
```
In some newer parts of the code you use the Path objects' 'read_text' and 'write_text' functions, which close the file properly and are preferred in some cases.
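For illustration, a minimal sketch (the file name here is made up) of the pathlib equivalent that avoids the unclosed-file problem:
```python
from pathlib import Path

# write_text / read_text open and close the file for you
config = Path("example-settings.json")  # hypothetical file, for illustration only
config.write_text('{"key": "value"}')
print(config.read_text())
```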
If you want, I can create a PR for all the places I found this pattern in.
Below is a fraction of the places where I found a ResourceWarning:
```python
update-docs-help.py:
20 actual = actual.replace(""Usage: cli "", ""Usage: datasette "")
21: open(docs_path / filename, ""w"").write(actual)
22
datasette\app.py:
210 ):
211: inspect_data = json.load((config_dir / ""inspect-data.json"").open())
212 if immutables is None:
266 if config_dir and (config_dir / ""settings.json"").exists() and not config:
267: config = json.load((config_dir / ""settings.json"").open())
268 self._settings = dict(DEFAULT_SETTINGS, **(config or {}))
445 self._app_css_hash = hashlib.sha1(
446: open(os.path.join(str(app_root), ""datasette/static/app.css""))
447 .read()
datasette\cli.py:
130 else:
131: out = open(inspect_file, ""w"")
132 loop = asyncio.get_event_loop()
459 if inspect_file:
460: inspect_data = json.load(open(inspect_file))
461
```
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1208/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
806918878,MDExOlB1bGxSZXF1ZXN0NTcyMjU0MTAz,1223,Add compile option to Dockerfile to fix failing test (fixes #696),7476523,bobwhitelock,closed,0,,,,,2,2021-02-12T03:38:05Z,2021-03-07T12:01:12Z,2021-03-07T07:41:17Z,CONTRIBUTOR,simonw/datasette/pulls/1223,"This test was failing when run inside the Docker container: `test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]`,
with this error:
```
def test_searchable(app_client, path, expected_rows):
response = app_client.get(path)
> assert expected_rows == response.json[""rows""]
E AssertionError: assert [[1, 'barry c...sel', 'puma']] == []
E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther']
E Full diff:
E + []
E - [[1, 'barry cat', 'terry dog', 'panther'],
E - [2, 'terry dog', 'sara weasel', 'puma']]
```
The issue was that the version of sqlite3 built inside the Docker container was built with FTS3 and FTS4 enabled, but without the
`SQLITE_ENABLE_FTS3_PARENTHESIS` compile option passed, which adds support for using `AND` and `NOT` within `match` expressions (see https://sqlite.org/fts3.html#compiling_and_enabling_fts3_and_fts4 and https://www.sqlite.org/compile.html).
Without this, the `AND` used in the search in this test was being interpreted as a literal string, and so no matches were found. Adding this compile option fixes this.
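For anyone wanting to check their own build, a small sketch of how the compile options can be inspected from Python (an illustrative snippet, not part of this fix):
```python
import sqlite3

# list the FTS-related compile options of the sqlite3 library Python is linked against
conn = sqlite3.connect(":memory:")
options = [row[0] for row in conn.execute("PRAGMA compile_options")]
print([o for o in options if "FTS" in o])
# if ENABLE_FTS3_PARENTHESIS is absent, AND / NOT / parentheses inside FTS3/4
# match expressions are treated as literal search terms
```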
---
I actually ran into this issue because the same test was failing when I ran the test suite on my own machine, outside of Docker, and so I eventually tracked this down to my system sqlite3 also being compiled without this option.
I wonder if this is a sign of a slightly deeper issue, that Datasette can silently behave differently based on the version and compilation of sqlite3 it is being used with. On my own system I fixed the test suite by running `pip install pysqlite3-binary`, so that this would be picked up instead of the `sqlite` package, as this seems to be compiled using this option, . Maybe using `pysqlite3-binary` could be installed/recommended by default so a more deterministic version of sqlite is used? Or there could be some feature detection done on the available sqlite version, to know what features are available and can be used/tested?",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1223/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
617323873,MDU6SXNzdWU2MTczMjM4NzM=,766,Enable wildcard-searches by default,2181410,clausjuhl,open,0,,,,,2,2020-05-13T10:14:48Z,2021-03-05T16:35:21Z,,NONE,,"Hi Simon.
It seems that datasette currently has wildcard-searches disabled by default (along with the boolean search-options, NEAR-queries and more, and despite the docs). If I try out the search-url provided in the [docs](https://datasette.readthedocs.io/en/stable/full_text_search.html#the-table-page-and-table-view-api) (https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort), it does not handle wildcard-searches, and I'm unable to make it work on my datasette-instance.
I would argue that wildcard-searches are such a standard query that they should be enabled by default. Requiring ""_searchmode=raw"" when using prefix-searches seems unnecessary. Plus: what happens to non-ascii searches when using ""_searchmode=raw""? Is the ""escape_fts"" function from datasette.utils ignored?
Thanks!
/Claus",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/766/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
815955014,MDExOlB1bGxSZXF1ZXN0NTc5Njk3ODMz,1243,fix small typo,306240,UtahDave,closed,0,,,,,2,2021-02-25T00:22:34Z,2021-03-04T05:46:10Z,2021-03-04T05:46:10Z,CONTRIBUTOR,simonw/datasette/pulls/1243,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1243/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
818430405,MDU6SXNzdWU4MTg0MzA0MDU=,1247,datasette.add_memory_database() method,9599,simonw,closed,0,,,,,2,2021-03-01T03:48:38Z,2021-03-01T04:02:26Z,2021-03-01T04:02:26Z,OWNER,,"I just wrote this code:
https://github.com/simonw/datasette/blob/47eb885cc2c3aafa03645c330c6f597bee9b3b25/tests/test_facets.py#L334-L335
It would be nice if you didn't have to separately instantiate a database object here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1247/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
621286870,MDU6SXNzdWU2MjEyODY4NzA=,113,Syntactic sugar for ATTACH DATABASE,9599,simonw,closed,0,,,,,2,2020-05-19T21:10:00Z,2021-02-19T05:09:12Z,2021-02-19T04:56:36Z,OWNER,,"https://www.sqlite.org/lang_attach.html
Maybe something like this:
```python
db.attach(""other_db"", ""other_db.db"")
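# for illustration: this would presumably execute ATTACH DATABASE 'other_db.db' AS other_db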
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
811589344,MDU6SXNzdWU4MTE1ODkzNDQ=,1235,Upgrade Python version used by official Datasette Docker image,9599,simonw,closed,0,,,,,2,2021-02-19T00:47:40Z,2021-02-19T01:48:31Z,2021-02-19T01:48:30Z,OWNER,,"Currently uses 3.7.2:
https://github.com/simonw/datasette/blob/73bed175631a79e13a521eee82f8451dd0477eb3/Dockerfile#L1
There's a security fix for Python which it would be good to ship in this image (even though I'm reasonably confident it doesn't affect Datasette): https://bugs.python.org/issue42938",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1235/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
797159961,MDExOlB1bGxSZXF1ZXN0NTY0MjE1MDEx,225,fix for problem in Table.insert_all on search for columns per chunk of rows,261237,nieuwenhoven,closed,0,,,,,2,2021-01-29T20:16:07Z,2021-02-14T21:04:13Z,2021-02-14T21:04:13Z,NONE,simonw/sqlite-utils/pulls/225,"Hi,
I ran into a problem when trying to create a database from my Apple Healthkit data using [healthkit-to-sqlite](https://github.com/dogsheep/healthkit-to-sqlite). The program crashed because of an invalid insert statement that was generated for table `rDistanceCycling`.
The actual problem turned out to be in [sqlite-utils](https://github.com/simonw/sqlite-utils). `Table.insert_all` processes the data to be inserted in chunks of rows and checks for every chunk which columns are used, and it will collect all column names in the variable `all_columns`. The collection of columns is done using a nested list comprehension that is not completely correct.
I'm using a Windows machine and had to make a few adjustments to the tests in order to be able to run them because they had a posix dependency.
Thanks, kind regards,
Frans
```
# this is a (condensed) chunk of data from my Apple healthkit export that caused the problem.
# the 3 last items in the chunk have additional keys: metadata_HKMetadataKeySyncVersion and metadata_HKMetadataKeySyncIdentifier
chunk = [{'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '7.0.1',
'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:7.0.1>',
'unit': 'km', 'creationDate': '2020-10-10 12:29:09 +0100', 'startDate': '2020-10-10 12:29:06 +0100',
'endDate': '2020-10-10 12:29:07 +0100', 'value': '0.00518016'},
{'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '7.0.1',
'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:7.0.1>',
'unit': 'km', 'creationDate': '2020-10-10 12:29:10 +0100', 'startDate': '2020-10-10 12:29:07 +0100',
'endDate': '2020-10-10 12:29:08 +0100', 'value': '0.00544049'},
{'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '6.2.6',
'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:6.2.6>',
'unit': 'km', 'creationDate': '2020-10-14 05:54:12 +0100', 'startDate': '2020-07-15 16:40:50 +0100',
'endDate': '2020-07-15 16:42:49 +0100', 'value': '0.952092', 'metadata_HKMetadataKeySyncVersion': '1',
'metadata_HKMetadataKeySyncIdentifier': '3:674DBCDB-3FE8-40D1-9FC1-E54A2B413805:616520450.99823:616520569.99360:119'},
{'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '6.2.6',
'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:6.2.6>',
'unit': 'km', 'creationDate': '2020-10-14 05:54:12 +0100', 'startDate': '2020-07-15 16:42:49 +0100',
'endDate': '2020-07-15 16:44:51 +0100', 'value': '0.848983', 'metadata_HKMetadataKeySyncVersion': '1',
'metadata_HKMetadataKeySyncIdentifier': '3:674DBCDB-3FE8-40D1-9FC1-E54A2B413805:616520569.99360:616520691.98826:119'},
{'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '6.2.6',
'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:6.2.6>',
'unit': 'km', 'creationDate': '2020-10-14 05:54:12 +0100', 'startDate': '2020-07-15 16:44:51 +0100',
'endDate': '2020-07-15 16:46:50 +0100', 'value': '0.834403', 'metadata_HKMetadataKeySyncVersion': '1',
'metadata_HKMetadataKeySyncIdentifier': '3:674DBCDB-3FE8-40D1-9FC1-E54A2B413805:616520691.98826:616520810.98305:119'}]
def all_columns_old():
all_columns = [col for col in chunk[0]]
all_columns += [column for record in chunk
for column in record if column not in all_columns]
return all_columns
def all_columns_new():
all_columns = [col for col in chunk[0]]
for record in chunk:
all_columns += [column for column in record if column not in all_columns]
return all_columns
if __name__ == '__main__':
from pprint import pprint
print('problem: ')
pprint(all_columns_old())
print('\nfix: ')
pprint(all_columns_new())
```
",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/225/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
808028757,MDU6SXNzdWU4MDgwMjg3NTc=,231,"limit=X, offset=Y parameters for more Python methods",9599,simonw,closed,0,,,,,2,2021-02-14T19:31:23Z,2021-02-14T20:03:08Z,2021-02-14T20:03:08Z,OWNER,,"> I'm going to add a `offset=` parameter to support this case. Thanks for the suggestion!
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/224#issuecomment-778828495_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/231/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
792297010,MDExOlB1bGxSZXF1ZXN0NTYwMjA0MzA2,224,Add fts offset docs.,37962604,polyrand,closed,0,,,,,2,2021-01-22T20:50:58Z,2021-02-14T19:31:06Z,2021-02-14T19:31:06Z,NONE,simonw/sqlite-utils/pulls/224,"The limit can be passed as a string to the query builder to have an offset. I have tested it using the shorthand `limit=f""15, 30""`; the standard syntax should work too.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/224/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
792851444,MDU6SXNzdWU3OTI4NTE0NDQ=,11,XML parse error,3613583,dskrad,closed,0,,,,,2,2021-01-24T17:38:54Z,2021-02-11T21:18:58Z,2021-02-11T21:18:48Z,NONE,,"I am on Windows 10 using Windows Subsystem for Linux, Python 3.8. I installed evernote-to-sqlite via pipx (in a venv). I tried using enex files from the latest version of Evernote for Windows (10.6.9 which only lets you export 50 notes at a time) and from Legacy Evernote (6.25.2.9198 which lets you export all your notes at once). The enex file from latest evernote gives this error:
File ""/usr/lib/python3.8/xml/etree/ElementTree.py"", line 1320, in XML parser.feed(text)
xml.etree.ElementTree.ParseError: XML or text declaration not at start of entity: line 2, column 6
The enex file from Legacy Evernote gives this error:
File ""/home/david/.local/pipx/venvs/evernote-to-sqlite/lib/python3.8/site-packages/evernote_to_sqlite/utils.py"", line 28, in save_note
updated = note.find(""updated"").text
AttributeError: 'NoneType' object has no attribute 'text'",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
792890765,MDU6SXNzdWU3OTI4OTA3NjU=,1200,?_size=10 option for the arbitrary query page would be useful,9599,simonw,open,0,,,,,2,2021-01-24T20:55:35Z,2021-02-11T03:13:59Z,,OWNER,,"https://latest.datasette.io/fixtures?sql=select+*+from+compound_three_primary_keys&_size=10 - `_size=10` does not do anything at the moment. It would be useful if it did.
Would also be good if it persisted in a hidden form field.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1200/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
803929694,MDU6SXNzdWU4MDM5Mjk2OTQ=,1219,Try profiling Datasette using scalene,9599,simonw,open,0,,,,,2,2021-02-08T20:37:06Z,2021-02-08T22:13:00Z,,OWNER,,https://github.com/emeryberger/scalene looks like an interesting profiling tool.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1219/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
801780625,MDU6SXNzdWU4MDE3ODA2MjU=,9,SSL Error,12669260,jfeiwell,open,0,,,,,2,2021-02-05T02:12:56Z,2021-02-07T18:45:04Z,,NONE,,"Here's the error I get when running `pip install pocket-to-sqlite`:
```
Could not fetch URL https://pypi.python.org/simple/pocket-to-sqlite/: There was a problem confirming the ssl certificate: [SSL: TLSV1_ALERT_PROTOCOL_VERSION] tlsv1 alert protocol version (_ssl.c:661) - skipping
Could not find a version that satisfies the requirement pocket-to-sqlite (from versions: )
No matching distribution found for pocket-to-sqlite
```
Does this require python 3? ",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
788527932,MDU6SXNzdWU3ODg1Mjc5MzI=,223,--delimiter option for CSV import,9599,simonw,closed,0,,,,,2,2021-01-18T20:25:03Z,2021-02-06T01:39:47Z,2021-02-06T01:34:54Z,OWNER,,"https://bruxellesdata.opendatasoft.com/explore/dataset/dog-toilets/export/?location=12,50.85802,4.38054 says:
> CSV uses semicolon (;) as a separator.
Would be useful to be able to do this:
sqlite-utils insert places.db places places.csv --delimiter ';'
`--delimiter` could imply `--csv`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/223/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
796234313,MDU6SXNzdWU3OTYyMzQzMTM=,1210,Immutable Database w/ Canned Queries,525780,heyarne,closed,0,,,,,2,2021-01-28T18:08:29Z,2021-02-05T11:30:34Z,2021-02-05T11:30:34Z,NONE,,"I have a database that I only want to read from; when instructing datasette to treat the database as immutable my defined canned queries disappear. Are these two features incompatible or have I hit an unintended bug? Thanks for datasette in any way, it's a joy to use!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1210/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
771208009,MDU6SXNzdWU3NzEyMDgwMDk=,1154,Documentation for new _internal database and tables,9599,simonw,closed,0,,,6346396,Datasette 0.54,2,2020-12-18T22:34:52Z,2021-01-25T00:09:22Z,2021-01-25T00:08:41Z,OWNER,,"> Needs documentation, but I can wait to write that until I've tested out the feature a bit more.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1150#issuecomment-748352106_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1154/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
792931244,MDU6SXNzdWU3OTI5MzEyNDQ=,1202,Documentation convention for marking unstable APIs.,9599,simonw,closed,0,,,6346396,Datasette 0.54,2,2021-01-24T23:47:18Z,2021-01-25T00:01:02Z,2021-01-25T00:01:02Z,OWNER,,"> I'm going to document this but mark it as unstable, using a new documentation convention for marking unstable APIs.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1154#issuecomment-766462197_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1202/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
743400216,MDU6SXNzdWU3NDM0MDAyMTY=,11,Error thrown: sqlite3.OperationalError: table users has no column named lastName,61791,beaugunderson,closed,0,,,,,2,2020-11-16T01:21:18Z,2021-01-18T04:35:22Z,2021-01-18T04:35:22Z,NONE,,"Just installed `swarm-to-sqlite-0.3.2` and tried according to the docs:
```
Traceback (most recent call last):
File ""/usr/local/bin/swarm-to-sqlite"", line 8, in
sys.exit(cli())
File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 829, in __call__
return self.main(*args, **kwargs)
File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 782, in main
rv = self.invoke(ctx)
File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 610, in invoke
return callback(*args, **kwargs)
File ""/usr/local/lib/python3.9/site-packages/swarm_to_sqlite/cli.py"", line 73, in cli
save_checkin(checkin, db)
File ""/usr/local/lib/python3.9/site-packages/swarm_to_sqlite/utils.py"", line 82, in save_checkin
checkins_table.m2m(""users"", user, m2m_table=""likes"", pk=""id"")
File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1914, in m2m
id = other_table.insert(record, pk=pk, replace=True).last_pk
File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1647, in insert
return self.insert_all(
File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1765, in insert_all
self.insert_chunk(
File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1575, in insert_chunk
result = self.db.execute(query, params)
File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 200, in execute
return self.conn.execute(sql, parameters)
sqlite3.OperationalError: table users has no column named lastName
```",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
782708469,MDU6SXNzdWU3ODI3MDg0Njk=,1183,"Take advantage of sqlite-utils cached table counts, if available",9599,simonw,open,0,,,,,2,2021-01-09T23:51:48Z,2021-01-12T02:42:08Z,,OWNER,,"sqlite-utils 3.2 now has a mechanism for creating a `_counts` table with triggers to maintain counts for individual tables. Datasette could look for this table and use it for counts, dramatically speeding up places that currently suffer from slow counts. Refs #859.
https://sqlite-utils.datasette.io/en/stable/python-api.html#cached-table-counts-using-triggers",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1183/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
780767542,MDU6SXNzdWU3ODA3Njc1NDI=,1180,Lazily evaluated arguments for call_with_supported_arguments,9599,simonw,open,0,,,,,2,2021-01-06T18:43:34Z,2021-01-07T18:56:24Z,,OWNER,,"While building https://github.com/simonw/datasette-export-notebook I thought it would be nice to be able to show a count of exported records on the page ""This will stream 10,422 records to your notebook"".
None of the documented arguments on https://docs.datasette.io/en/0.53/plugin_hooks.html#register-output-renderer-datasette expose the count. The closest is `sql` which could be executed as `select count(*) from ({sql})` but that's a bit inelegant.
So, idea: if your defined render function takes a `total` argument, the caller runs a `count(*)` query and passes you that total.
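As a rough sketch of the idea (names here are made up, and real code would need an explicit way to mark an argument as lazy rather than guessing from callability):
```python
import inspect

async def call_with_supported_arguments_lazy(fn, **available):
    # sketch: resolve a lazily-provided argument only if fn actually declares it
    wanted = inspect.signature(fn).parameters
    kwargs = {}
    for name in wanted:
        if name not in available:
            continue
        value = available[name]
        if callable(value):
            value = value()
            if inspect.isawaitable(value):
                value = await value
        kwargs[name] = value
    return fn(**kwargs)

# e.g. total=lambda: expensive_count_query() would only run when fn takes "total"
```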
To implement this I would need to teach the `call_with_supported_arguments` that some arguments are lazy - they should execute a function (or an async function) but only if they are needed.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1180/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
738514367,MDU6SXNzdWU3Mzg1MTQzNjc=,202,sqlite-utils insert -f colname - for configuring full-text search,9599,simonw,closed,0,,,,,2,2020-11-08T17:30:09Z,2021-01-03T05:00:36Z,2021-01-03T05:00:27Z,OWNER,,"A mechanism for specifying columns that should be configured for full-text search as part of the initial data import:
sqlite-utils insert mydb.db articles articles.csv --csv -f title -f body",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/202/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
777543336,MDU6SXNzdWU3Nzc1NDMzMzY=,217,Rename .escape() to .quote(),9599,simonw,closed,0,,,,,2,2021-01-02T23:40:52Z,2021-01-03T04:27:38Z,2021-01-03T04:15:23Z,OWNER,,"`.quote()` is a better name because it reflects that the method adds quotes around the value.
This method has never been documented so I'm going to rename it without a major version bump. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/217/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
777529979,MDU6SXNzdWU3Nzc1Mjk5Nzk=,213,db.enable_counts() method,9599,simonw,closed,0,,,,,2,2021-01-02T21:44:55Z,2021-01-02T22:04:02Z,2021-01-02T22:04:02Z,OWNER,,"Following #212 it would be useful if there was a utility method for enabling counts for ALL tables in a database:
```python
db.enable_counts()
```
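A rough sketch of what the method might do, assuming the per-table `enable_counts()` from #212 (not the actual implementation):
```python
def enable_counts(self):
    # sketch only: enable the count-maintaining triggers for every table
    for table in self.tables:
        table.enable_counts()
```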
Open question: should this setup triggers for virtual tables such as FTS tables? Could that break things?
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/213/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
450032134,MDU6SXNzdWU0NTAwMzIxMzQ=,495,facet_m2m gets confused by multiple relationships,9599,simonw,open,0,,,,,2,2019-05-29T21:37:28Z,2020-12-17T05:08:22Z,,OWNER,,"I got this for a database I was playing with:
I think this is because of these three tables:
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/495/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
769376447,MDU6SXNzdWU3NjkzNzY0NDc=,2,killed by oomkiller on large location-history,231498,khimaros,open,0,,,,,2,2020-12-17T00:32:24Z,2020-12-17T00:48:32Z,,NONE,,"Memory seems to grow unbounded and the process is oom-killed after about 20GB of memory usage.
This is happening while loading a ~1GB uncompressed location history.",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
767561886,MDU6SXNzdWU3Njc1NjE4ODY=,1148,Syntax error with + symbol when deployed to Vercel,765871,kaihendry,closed,0,,,,,2,2020-12-15T12:53:12Z,2020-12-16T21:57:42Z,2020-12-16T21:51:54Z,NONE,,"Works locally:
```
(ins)[hendry@t14s aws-partners]$ sqlite3 partners.db < test.sql
5|{""href"":""https://partners.amazonaws.com/partners/001E000000NaBI0IAN"",""label"":""Slalom""}
6|{""href"":""https://partners.amazonaws.com/partners/001E000000VHBQIIA5"",""label"":""Accenture""}
7|{""href"":""https://partners.amazonaws.com/partners/001E000000Rp588IAB"",""label"":""Druva""}
8|{""href"":""https://partners.amazonaws.com/partners/001E0000013FeQXIA0"",""label"":""Palo Alto Networks""}
9|{""href"":""https://partners.amazonaws.com/partners/001E000000Rl12lIAB"",""label"":""New Relic""}
10|{""href"":""https://partners.amazonaws.com/partners/001E000000NaBHWIA3"",""label"":""Deloitte""}
11|{""href"":""https://partners.amazonaws.com/partners/001E000000Rp5GDIAZ"",""label"":""MegazoneCloud""}
12|{""href"":""https://partners.amazonaws.com/partners/001E000000NaBHMIA3"",""label"":""iret, Inc.""}
(ins)[hendry@t14s aws-partners]$ cat test.sql
select A.launch_rank, A.partner_info from summary A INNER JOIN summary B
ON A.launch_rank>=B.launch_rank-3 AND A.launch_rank<=B.launch_rank+4
WHERE B.""partner_info"" LIKE '%Palo Alto%'
```
Copy the SQL into https://aws-partners-singapore.vercel.app/partners and get syntax error:
![image](https://user-images.githubusercontent.com/765871/102217393-78e4f280-3f17-11eb-9e13-ca79ed62ec34.png)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1148/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
769150394,MDU6SXNzdWU3NjkxNTAzOTQ=,58,Readme HTML has broken internal links,9599,simonw,closed,0,,,,,2,2020-12-16T17:58:11Z,2020-12-16T19:20:14Z,2020-12-16T19:20:14Z,MEMBER,,"From https://github.com/simonw/datasette.io/issues/46
```html
```
So this is a bug in GitHub's API, but we need to work around it.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
760960559,MDU6SXNzdWU3NjA5NjA1NTk=,205,"sqlite3.OperationalError: near ""("": syntax error",765871,kaihendry,closed,0,,,,,2,2020-12-10T06:44:40Z,2020-12-10T19:18:22Z,2020-12-10T07:24:23Z,NONE,,"The sqlite version is 3.22.0 2018-01-22 18:45:57 0c55d179733b46d8d0ba4d88e01a25e10677046ee3da1d5b1581e86726f2alt1
sqlite-utils, version 3.0
It fails here: https://github.com/kaihendry/aws-partners-datasette/runs/1528432635?check_suite_focus=true
I'm not sure where the problem is, since it works _fine locally_ on an Archlinux system running 3.34.0 2020-12-01 16:14:00 a26b6597e3ae272231b96f9982c3bcc17ddec2f2b6eb4df06a224b91089fed5b
https://github.com/kaihendry/aws-partners-datasette/blob/main/create-summary-view.sh
Maybe I need to bump up from ubuntu-latest to ?
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
756867924,MDExOlB1bGxSZXF1ZXN0NTMyMzQyMDI1,1128,Fix startup error on windows,3243482,abdusco,closed,0,,,,,2,2020-12-04T07:12:26Z,2020-12-06T08:41:45Z,2020-12-05T19:35:04Z,CONTRIBUTOR,simonw/datasette/pulls/1128,"Fixes https://github.com/simonw/datasette/issues/1094
This import isn't used at all, and causes an error on startup on Windows.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1128/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
754179035,MDExOlB1bGxSZXF1ZXN0NTMwMTI1Njk1,1122,Fix misaligned table actions cog,3243482,abdusco,closed,0,,,,,2,2020-12-01T08:41:46Z,2020-12-03T10:56:40Z,2020-12-03T00:33:37Z,CONTRIBUTOR,simonw/datasette/pulls/1122,Fixes https://github.com/simonw/datasette/issues/1121,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1122/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
747702144,MDU6SXNzdWU3NDc3MDIxNDQ=,1100,Error on OPTIONS request to database,1319404,akehrer,closed,0,,,,,2,2020-11-20T18:16:43Z,2020-12-03T00:57:35Z,2020-12-03T00:50:17Z,NONE,,"When I perform an OPTIONS request against a database or table datasette fails with an internal error.
All these tests result in the traceback below.
```
curl -XOPTIONS http://127.0.0.1:8001/test-db
curl -XOPTIONS http://127.0.0.1:8001/test-db/table1
curl -XOPTIONS http://127.0.0.1:8001/test-db/table1\?_search\=test
```
```
Traceback (most recent call last):
File ""[path-to-python]/site-packages/datasette/app.py"", line 1033, in route_path
response = await view(request, send)
File ""[path-to-python]/site-packages/datasette/views/base.py"", line 146, in view
request, **request.scope[""url_route""][""kwargs""]
File ""[path-to-python]/site-packages/datasette/views/base.py"", line 118, in dispatch_request
return await handler(request, *args, **kwargs)
TypeError: object Response can't be used in 'await' expression
```
Making the `options` function in the `DataView` class async fixed it for me.
```python
async def options(self, request, *args, **kwargs):
r = Response.text(""ok"")
if self.ds.cors:
r.headers[""Access-Control-Allow-Origin""] = ""*""
return r
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
681228542,MDExOlB1bGxSZXF1ZXN0NDY5NjUxNzMy,48,Add pull requests,755825,adamjonas,closed,0,,,,,2,2020-08-18T17:58:44Z,2020-11-29T23:51:09Z,2020-11-29T23:51:09Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/48,"ref #46
Issues don't have merge information on them, which means that PRs need to be pulled separately.
Did my best to mimic the API of issues.",207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
750330029,MDU6SXNzdWU3NTAzMzAwMjk=,1110,datasette publish option for installing extra apt-get packages,9599,simonw,closed,0,,,6055094,Datasette 0.52,2,2020-11-25T03:03:43Z,2020-11-28T23:28:56Z,2020-11-25T03:05:41Z,OWNER,,I ran into a need for this while playing with https://github.com/simonw/datasette-ripgrep - I need to install the `ripgrep` Ubuntu package when I deploy the plugin using Cloud Run.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
749982022,MDU6SXNzdWU3NDk5ODIwMjI=,1105,Rebrand config as settings,9599,simonw,closed,0,,,6055094,Datasette 0.52,2,2020-11-24T19:35:12Z,2020-11-24T21:40:28Z,2020-11-24T21:40:28Z,OWNER,,"I realized I need a tracking ticket for this.
I want to start splitting things like plugin configuration and default facets / sort order out of `metadata.json` - so I want to start calling those things configuration. But the term configuration is already used for the `--config` family of global settings. So I'm rebranding that type of configuration as settings to free up the name ""configuration"" for more run-time concerns (default sort order) and plugin configuration.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
749981663,MDU6SXNzdWU3NDk5ODE2NjM=,1104,config.json in directory config mode should be settings.json,9599,simonw,closed,0,,,6055094,Datasette 0.52,2,2020-11-24T19:34:38Z,2020-11-24T20:37:42Z,2020-11-24T20:37:41Z,OWNER,,"Another knock-on effect of #992.
https://github.com/simonw/datasette/blob/4bac9f18f9d04e5ed10f072502bcc508e365438e/docs/config.rst#L51-L55",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
749979454,MDU6SXNzdWU3NDk5Nzk0NTQ=,1103,Rename /-/config to /-/settings,9599,simonw,closed,0,,,6055094,Datasette 0.52,2,2020-11-24T19:31:00Z,2020-11-24T20:19:20Z,2020-11-24T20:19:19Z,OWNER,,"As part of rebranding config to settings, see also #992.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
315738696,MDU6SXNzdWUzMTU3Mzg2OTY=,226,Unit tests for installable plugins,9599,simonw,closed,0,,,,,2,2018-04-19T06:05:32Z,2020-11-24T19:52:51Z,2020-11-24T19:52:46Z,OWNER,,"I'd like more thorough unit test coverage of the plugins mechanism - in particular for installable plugins.
I think I can do this while still having the code live in the same repo, by creating a subdirectory in tests/example_plugin with its own setup.py and then running `python setup.py install` as part of the test runner.
I imagine I will need to bump the version number every time I change the plugin in case someone runs the test again in the same virtual environment.
If that doesn't work I can instead ship a datasette-plugins-tests two to PyPI and add that as a tests_require dependency.
Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/226/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
706167456,MDU6SXNzdWU3MDYxNjc0NTY=,168,Automate (as much as possible) updates published to Homebrew,9599,simonw,closed,0,,,,,2,2020-09-22T08:08:37Z,2020-11-09T07:43:30Z,2020-11-09T07:43:30Z,OWNER,,I'd like to get new `sqlite-utils` (and Datasette) releases submitted to Homebrew as painlessly as possible.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/168/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
730199464,MDU6SXNzdWU3MzAxOTk0NjQ=,1054,Switch from versioneer to concrete version in setup.py,9599,simonw,closed,0,,,6026070,0.51,2,2020-10-27T07:38:08Z,2020-10-29T03:38:18Z,2020-10-29T03:38:17Z,OWNER,,"The new PyPI resolver keeps on showing me warnings like this one when I install Datasette directly from GitHub using `pip install https://github.com/simonw/datasette/archive/main.zip`:
```
Successfully built datasette
Installing collected packages: datasette
Attempting uninstall: datasette
Found existing installation: datasette 0.50.2
Uninstalling datasette-0.50.2:
Successfully uninstalled datasette-0.50.2
ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts.
We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default.
datasette-upload-csvs 0.5 requires datasette>=0.47, but you'll have datasette 0+unknown which is incompatible.
datasette-publish-vercel 0.8 requires datasette>=0.44, but you'll have datasette 0+unknown which is incompatible.
datasette-psutil 0.2 requires datasette>=0.44, but you'll have datasette 0+unknown which is incompatible.
datasette-leaflet-geojson 0.6 requires datasette>=0.48, but you'll have datasette 0+unknown which is incompatible.
datasette-edit-schema 0.3 requires datasette>=0.44, but you'll have datasette 0+unknown which is incompatible.
datasette-cluster-map 0.13 requires datasette>=0.48, but you'll have datasette 0+unknown which is incompatible.
Successfully installed datasette-0+unknown
```
This is because we use versioneer. I'm going to drop that in favour of embedding the version directly in `setup.py`, like I do in other projects such as `sqlite-utils`.
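For illustration, a minimal sketch of what that looks like (the values here are placeholders, not the actual release numbers):
```python
# setup.py (sketch)
from setuptools import setup

VERSION = "0.51.dev0"  # placeholder value for illustration

setup(
    name="datasette",
    version=VERSION,
    # ... the rest of the existing setup() arguments stay unchanged
)
```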
I'll use a `.dev` suffix in the development version, as suggested by https://www.python.org/dev/peps/pep-0440/#developmental-releases",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1054/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
731740458,MDU6SXNzdWU3MzE3NDA0NTg=,191,Idea: @db.register_function(deterministic=True),9599,simonw,closed,0,,,,,2,2020-10-28T19:45:18Z,2020-10-28T21:31:06Z,2020-10-28T21:31:06Z,OWNER,,"Python 3.8 added a `deterministic` parameter to `db.create_function()`: https://docs.python.org/3/library/sqlite3.html#sqlite3.Connection.create_function
`sqlite-utils` could expose this in the decorator, only actually applying it if the Python version supports it (using feature detection) - since nothing will break if it's not applied.
https://sqlite-utils.readthedocs.io/en/stable/python-api.html#registering-custom-sql-functions
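For illustration, one hedged sketch of how that feature detection could work (hypothetical helper, not the actual sqlite-utils code):
```python
import sqlite3

def create_function_safely(conn, name, num_args, fn, deterministic=False):
    # sketch: try the new keyword first, fall back where it isn't supported
    if deterministic:
        try:
            conn.create_function(name, num_args, fn, deterministic=True)
            return
        except (TypeError, sqlite3.NotSupportedError):
            # TypeError on Python < 3.8, NotSupportedError on older SQLite builds
            pass
    conn.create_function(name, num_args, fn)
```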
```python
db = Database(memory=True)
@db.register_function(deterministic=True)
def reverse_string(s):
return """".join(reversed(list(s)))
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/191/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
731445447,MDExOlB1bGxSZXF1ZXN0NTExNTQ5Mzc0,1059,"Update aiofiles requirement from <0.6,>=0.4 to >=0.4,<0.7",27856297,dependabot-preview[bot],closed,0,,,,,2,2020-10-28T13:32:40Z,2020-10-28T17:08:29Z,2020-10-28T17:08:28Z,CONTRIBUTOR,simonw/datasette/pulls/1059,"Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.
Commits
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
- `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme
Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):
- Update frequency (including time of day and day of week)
- Pull request limits (per update run and/or open at any time)
- Out-of-range updates (receive only lockfile updates, if desired)
- Security updates (receive only security updates, if desired)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1059/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
708289783,MDU6SXNzdWU3MDgyODk3ODM=,976,Idea: -o could open to a more convenient location,9599,simonw,closed,0,,,,,2,2020-09-24T15:56:35Z,2020-10-26T05:07:10Z,2020-10-26T05:06:26Z,OWNER,,"> Idea: if a database only has a single table, this could open straight to `/db/table`. If it has multiple tables but a single database it could open straight to `/db`.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/970#issuecomment-698434236_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/976/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
728895193,MDU6SXNzdWU3Mjg4OTUxOTM=,1046,Link to blob downloads in the right places,9599,simonw,closed,0,,,6026070,0.51,2,2020-10-24T23:00:41Z,2020-10-25T00:13:21Z,2020-10-25T00:13:21Z,OWNER,,"Split from #1040, refs #1036.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1046/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
727916744,MDExOlB1bGxSZXF1ZXN0NTA4NzIwNjYw,1044,Add minimum supported python,45380,bollwyvl,closed,0,,,,,2,2020-10-23T05:08:03Z,2020-10-23T20:53:08Z,2020-10-23T20:53:08Z,CONTRIBUTOR,simonw/datasette/pulls/1044,"Thanks for `datasette`!
This PR adds `python_requires` to formally signal the [minimum supported python version](https://packaging.python.org/guides/dropping-older-python-versions/#specify-the-version-ranges-for-supported-python-distributions) (which is pointed out with classifiers, so seems pretty straightforward).",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1044/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
608613033,MDU6SXNzdWU2MDg2MTMwMzM=,745,Extract the hash-URL mechanism out into a plugin,9599,simonw,closed,0,,,,,2,2020-04-28T21:00:38Z,2020-10-23T19:47:18Z,2020-10-23T19:47:10Z,OWNER,,"0.28 in May 2019 made this feature not-the-default: https://datasette.readthedocs.io/en/stable/changelog.html#v0-28 - see #418
I've not felt the need to use it myself since. I think I should move it into a plugin.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/745/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
713209404,MDU6SXNzdWU3MTMyMDk0MDQ=,988,Mechanism for plugins to construct URLs that respect base_url,9599,simonw,closed,0,,,6026070,0.51,2,2020-10-01T21:54:15Z,2020-10-23T19:44:05Z,2020-10-15T23:01:02Z,OWNER,,"> Had a thought: this is likely to break in plugins too, such as `datasette-edit-schema` which constructs URLs for redirects e.g. here: https://github.com/simonw/datasette-edit-schema/blob/dbd0abee6dd3385b114cfe9671f7ead1c4855b60/datasette_edit_schema/__init__.py#L46-L48
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/865#issuecomment-702418045_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/988/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
707407567,MDU6SXNzdWU3MDc0MDc1Njc=,171,Idea: transitive closure tables for tree structures,649467,mhalle,closed,0,,,,,2,2020-09-23T14:17:33Z,2020-10-22T04:38:35Z,2020-10-22T04:07:14Z,NONE,,"I just read that sqlite has a transitive closure table extension using a virtual table in order to represent trees:
https://charlesleifer.com/blog/querying-tree-structures-in-sqlite-using-python-and-the-transitive-closure-extension/
Even without this extension, though, a util to build a transitive closure table would allow trees to be queried easily. Since it relies on self-referential foreign keys, the relationships might even be able to be automatically detected. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/171/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
726154220,MDExOlB1bGxSZXF1ZXN0NTA3MjY3MDg3,1038,DOC: Fix syntax error,194147,gerrymanoim,closed,0,,,,,2,2020-10-21T05:45:38Z,2020-10-21T22:57:21Z,2020-10-21T22:44:17Z,CONTRIBUTOR,simonw/datasette/pulls/1038,"If I understand https://docs.datasette.io/en/stable/plugin_hooks.html#register-routes correctly, `register_routes` should return a `List[Tuple[str, Callable]]`. I believe the current code in documentation has a syntax error (extra `)`). ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1038/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
721830990,MDExOlB1bGxSZXF1ZXN0NTAzNjg1MDc3,1022,Fix table name in spatialite example command,639012,jsfenfen,closed,0,,,,,2,2020-10-14T22:19:34Z,2020-10-14T23:46:46Z,2020-10-14T23:46:46Z,CONTRIBUTOR,simonw/datasette/pulls/1022,The example query for creating a new point geometry seems to be using a table called 'museums' but at one point it instead uses 'events'. I *believe* it is intended to be museums (the example makes more sense if so). ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1022/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
648245071,MDU6SXNzdWU2NDgyNDUwNzE=,8,Error thrown: table photos has no column named hasSticker,18504,harperreed,closed,0,,,,,2,2020-06-30T14:54:37Z,2020-10-12T20:35:06Z,2020-10-12T20:25:24Z,NONE,,"While running `swarm-to-sqlite` it throws an error:
```
harper@:~/dogsheep/swarm$ swarm-to-sqlite checkins.db --save=checkins.json
Please provide your Foursquare OAuth token:
Importing 8127 checkins [#################-------------------] 49% 00:01:52
Traceback (most recent call last):
File ""/home/harper/.local/bin/swarm-to-sqlite"", line 11, in
sys.exit(cli())
File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 829, in __call__
return self.main(*args, **kwargs)
File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 782, in main
rv = self.invoke(ctx)
File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 610, in invoke
return callback(*args, **kwargs)
File ""/home/harper/.local/lib/python3.6/site-packages/swarm_to_sqlite/cli.py"", line 73, in cli
save_checkin(checkin, db)
File ""/home/harper/.local/lib/python3.6/site-packages/swarm_to_sqlite/utils.py"", line 94, in save_checkin
photos_table.insert(photo, replace=True)
File ""/home/harper/.local/lib/python3.6/site-packages/sqlite_utils/db.py"", line 963, in insert
alter = self.value_or_default(""alter"", alter)
File ""/home/harper/.local/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1142, in insert_all
def upsert_all(
sqlite3.OperationalError: table photos has no column named hasSticker
```
Where should i dig in?",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
718949182,MDU6SXNzdWU3MTg5NDkxODI=,6,Better handling of OCR data,9599,simonw,closed,0,,,,,2,2020-10-11T23:20:52Z,2020-10-12T00:04:10Z,2020-10-12T00:04:10Z,MEMBER,,"> I haven't done the FTS on OCR yet. I'm going to move that to another ticket because it requires more thought.
_Originally posted by @simonw in https://github.com/dogsheep/evernote-to-sqlite/issues/4#issuecomment-706784028_",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
718938508,MDU6SXNzdWU3MTg5Mzg1MDg=,4,Configure FTS + add an index on the date columns,9599,simonw,closed,0,,,,,2,2020-10-11T22:14:40Z,2020-10-11T23:41:29Z,2020-10-11T23:41:29Z,MEMBER,,"Sort by date descending is likely the most common way of sorting, so that column should be indexed.
Also add FTS configuration for both notes and the OCR column on resources.",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
628572716,MDU6SXNzdWU2Mjg1NzI3MTY=,791,Tutorial: building a something-interesting with writable canned queries,9599,simonw,open,0,,,,,2,2020-06-01T16:32:05Z,2020-10-10T23:34:42Z,,OWNER,,"Initial idea: TODO list, as a tutorial for #698 writable canned queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/791/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
718264811,MDU6SXNzdWU3MTgyNjQ4MTE=,1006,Documentation for datasette.client,9599,simonw,closed,0,,,5971510,Datasette 0.50,2,2020-10-09T16:09:02Z,2020-10-09T17:22:31Z,2020-10-09T17:20:37Z,OWNER,,"> I'm going to document this in a separate issue.
_Originally posted by @simonw in https://github.com/simonw/datasette/pull/1000#issuecomment-706269271_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1006/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
715779909,MDExOlB1bGxSZXF1ZXN0NDk4NjMwNzA4,995,Document setting Google Cloud SDK properties,110420,ghing,closed,0,,,5971510,Datasette 0.50,2,2020-10-06T15:18:01Z,2020-10-08T23:55:30Z,2020-10-06T16:25:38Z,CONTRIBUTOR,simonw/datasette/pulls/995,Document setting Google Cloud SDK properties to avoid having to respond to interactive prompts when running `datasette publish cloudrun`.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/995/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
715072935,MDU6SXNzdWU3MTUwNzI5MzU=,993,Column action menu should show column type,9599,simonw,closed,0,,,5971510,Datasette 0.50,2,2020-10-05T18:40:49Z,2020-10-08T23:55:19Z,2020-10-06T00:33:15Z,OWNER,,"
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/993/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
710819020,MDU6SXNzdWU3MTA4MTkwMjA=,980,Another rendering glitch with column headers on mobile,9599,simonw,closed,0,,,5971510,Datasette 0.50,2,2020-09-29T06:53:13Z,2020-10-08T23:54:49Z,2020-09-29T19:21:50Z,OWNER,,"Similar to #978.
https://latest-with-plugins.datasette.io/fixtures?sql=select%0D%0A++dateutil_rrule(%27FREQ%3DHOURLY%3BCOUNT%3D5%27)%2C%0D%0A++dateutil_rrule_date(%0D%0A++++%27FREQ%3DDAILY%3BCOUNT%3D3%27%2C%0D%0A++++%271st+jan+2020%27%0D%0A++)%3B
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/980/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
706486323,MDU6SXNzdWU3MDY0ODYzMjM=,973,'bool' object is not callable error,9599,simonw,closed,0,,,5971510,Datasette 0.50,2,2020-09-22T15:30:54Z,2020-10-08T23:54:32Z,2020-09-22T15:40:35Z,OWNER,,"I'm getting this when latest is deployed to Cloud Run:
```
Traceback (most recent call last):
File ""/usr/local/bin/datasette"", line 8, in
sys.exit(cli())
File ""/usr/local/lib/python3.8/site-packages/click/core.py"", line 829, in __call__
return self.main(*args, **kwargs)
File ""/usr/local/lib/python3.8/site-packages/click/core.py"", line 782, in main
rv = self.invoke(ctx)
File ""/usr/local/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""/usr/local/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""/usr/local/lib/python3.8/site-packages/click/core.py"", line 610, in invoke
return callback(*args, **kwargs)
File ""/usr/local/lib/python3.8/site-packages/datasette/cli.py"", line 406, in serve
inspect_data = json.load(open(inspect_file))
TypeError: 'bool' object is not callable
```
I think I may have broken things in #970 - a980199e61fe7ccf02c2123849d86172d2ae54ff",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/973/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
712889459,MDExOlB1bGxSZXF1ZXN0NDk2Mjk4MTgw,986,"Allow facet by primary keys, fixes #985",39452697,MrNaif2018,closed,0,,,,,2,2020-10-01T14:18:55Z,2020-10-01T16:51:45Z,2020-10-01T16:51:45Z,NONE,simonw/datasette/pulls/986,"Hello! This PR makes it possible to facet by primary keys.
Did I get it right that just removing the condition on UI side is enough? From testing it works fine with primary keys, just as with normal keys.
If so, should I also remove unused `data-is-pk`?",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/986/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
684118950,MDU6SXNzdWU2ODQxMTg5NTA=,138,extracts= doesn't configure foreign keys,9599,simonw,closed,0,,,,,2,2020-08-23T05:21:15Z,2020-09-24T22:47:01Z,2020-09-24T22:46:52Z,OWNER,,"In using `extracts=` for `shapefiles-to-sqlite` in https://github.com/simonw/shapefile-to-sqlite/issues/9 I've run into a couple of pretty serious flaws:
- The columns in the original table are still `TEXT` even when the foreign key they are supposed to reference is an `INTEGER` - which means Datasette foreign key features don't actually work
- Those foreign key relationships aren't setup automatically - creating them is left as an exercise for the developer",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/138/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
696045581,MDU6SXNzdWU2OTYwNDU1ODE=,155,rebuild-fts command and table.rebuild_fts() method,9599,simonw,closed,0,,,,,2,2020-09-08T17:19:26Z,2020-09-24T20:35:46Z,2020-09-08T23:16:10Z,OWNER,,"https://sqlite.org/forum/forumpost/fa777fff86
> Easiest thing would be to run a 'rebuild' to rebuild the FTS index from scratch based on the contents of the content table. i.e.
>
> INSERT INTO licenses_fts(licenses_fts) VALUES('rebuild');
>
> https://www.sqlite.org/fts5.html#the_rebuild_command",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/155/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
708293114,MDU6SXNzdWU3MDgyOTMxMTQ=,176,sqlite-utils transform column order option,9599,simonw,closed,0,,,,,2,2020-09-24T16:01:21Z,2020-09-24T20:34:51Z,2020-09-24T16:11:59Z,OWNER,,Split from #175,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/176/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
697179806,MDU6SXNzdWU2OTcxNzk4MDY=,157,sqlite-utils add-foreign-keys command,9599,simonw,closed,0,,,5896742,2.19,2,2020-09-09T21:44:30Z,2020-09-24T20:34:50Z,2020-09-20T20:14:30Z,OWNER,,"Like `add-foreign-key` but can do multiple foreign keys at once. Inspired by https://github.com/simonw/calands-datasette/blob/99de39dd80a906f5c1f16724467b0cd55ba4ef36/build.sh which does this:
```
sqlite-utils add-foreign-key calands.db units_with_maps ACCESS_TYP
sqlite-utils add-foreign-key calands.db units_with_maps AGNCY_NAME
sqlite-utils add-foreign-key calands.db units_with_maps AGNCY_LEV
sqlite-utils add-foreign-key calands.db units_with_maps AGNCY_TYP
sqlite-utils add-foreign-key calands.db units_with_maps LAYER
sqlite-utils add-foreign-key calands.db units_with_maps MNG_AGENCY
sqlite-utils add-foreign-key calands.db units_with_maps MNG_AG_LEV
sqlite-utils add-foreign-key calands.db units_with_maps MNG_AG_TYP
sqlite-utils add-foreign-key calands.db units_with_maps COUNTY
sqlite-utils add-foreign-key calands.db units_with_maps DES_TP
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/157/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
705995722,MDU6SXNzdWU3MDU5OTU3MjI=,162,A decorator for registering custom SQL functions,9599,simonw,closed,0,,,,,2,2020-09-22T00:18:32Z,2020-09-22T00:40:44Z,2020-09-22T00:32:17Z,OWNER,,"Syntactic sugar for `db.conn.create_function` - it would work something like this:
```python
db = sqlite_utils.Database(""mydb.db"")
@db.register_function
def scramble(text):
chars = list(text)
random.shuffle(chars)
return """".join(chars)
```
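A minimal sketch of how such a decorator could be implemented on the Database class (illustrative only, not the shipped sqlite-utils code):
```python
import inspect

def register_function(self, fn):
    # Inspect the function for its name and arity, then register it with the
    # underlying sqlite3 connection.
    self.conn.create_function(fn.__name__, len(inspect.signature(fn).parameters), fn)
    # Returning the function unchanged lets this be used as a decorator.
    return fn
```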
The decorator would inspect the function to find its name and arity (number of arguments). Having run the above you could then do:
```python
db.execute(""select scramble('hello')"").fetchall()
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/162/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
704685890,MDU6SXNzdWU3MDQ2ODU4OTA=,25,template_debug mechanism,9599,simonw,closed,0,,,,,2,2020-09-18T22:11:09Z,2020-09-18T22:12:21Z,2020-09-18T22:12:03Z,MEMBER,,"> I'd prefer it if errors in these template fragments were displayed as errors inline where the fragment should have been inserted, rather than 500ing the whole page - especially since the template fragments are user-provided and could have all kinds of odd errors in them which should be as easy to debug as possible.
_Originally posted by @simonw in https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694554584_",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
522352520,MDU6SXNzdWU1MjIzNTI1MjA=,634,Don't run tests twice when releasing a tag,9599,simonw,closed,0,,,,,2,2019-11-13T17:02:42Z,2020-09-15T20:37:58Z,2020-09-15T20:37:58Z,OWNER,,"Shipping a release currently runs the tests twice: https://travis-ci.org/simonw/datasette/builds/611463728
It does a regular test run on Python 3.6/7/8 - then the ""Release tagged version"" step runs the tests again before publishing to PyPI! This second run is not necessary.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/634/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
687711713,MDU6SXNzdWU2ODc3MTE3MTM=,955,Release updated datasette-atom and datasette-ics,9599,simonw,closed,0,,,5818042,Datasette 0.49,2,2020-08-28T04:55:21Z,2020-09-14T22:19:46Z,2020-09-14T22:19:46Z,OWNER,,These should release straight after Datasette 0.49 with the change from #953.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/955/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
698791218,MDU6SXNzdWU2OTg3OTEyMTg=,50,"favorites --stop_after=N stops after min(N, 200)",370930,mikepqr,open,0,,,,,2,2020-09-11T03:38:14Z,2020-09-13T05:11:14Z,,CONTRIBUTOR,,"For any number greater than 200, `favorites --stop_after` stops after getting 200 tweets, e.g.
```
$ twitter-to-sqlite favorites tweets.db --stop_after=300
Importing favorites [####################################] 199
$
```
I don't _think_ this is a limitation of the API (if you omit `--stop_after` you get some very large number, possibly all of them), so I _think_ this is a bug.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/50/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
649429772,MDU6SXNzdWU2NDk0Mjk3NzI=,886,Reconsider how _actor_X magic parameter deals with missing values,9599,simonw,open,0,,,,,2,2020-07-02T00:00:38Z,2020-09-11T21:35:26Z,,OWNER,,"I had to build a custom `_actorornull` prefix for [datasette-saved-queries](https://github.com/simonw/datasette-saved-queries/blob/37c00e56ac398e1f9aa342d30357de013a9b37b4/datasette_saved_queries/__init__.py):
```python
def actorornull(key, request):
if request.actor is None:
return None
return request.actor.get(key)
@hookimpl
def register_magic_parameters():
return [
(""actorornull"", actorornull),
]
```
Maybe the `actor` magic in Datasette core should do that out of the box?
https://github.com/simonw/datasette/blob/f1f581b7ffcd5d8f3ae6c1c654d813a6641410eb/datasette/default_magic_parameters.py#L14-L17
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/886/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
696908389,MDU6SXNzdWU2OTY5MDgzODk=,961,Verification checks for metadata.json on startup,9599,simonw,open,0,,,,,2,2020-09-09T15:21:53Z,2020-09-09T15:24:31Z,,OWNER,,"I lost a bunch of time yesterday trying to figure out why a Datasette instance wasn't starting up - it turned out it was because I had a `facets:` reference that mentioned a column that did not exist.
Catching these on startup would be good.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/961/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
694500679,MDU6SXNzdWU2OTQ1MDA2Nzk=,17,"Rename ""table"" to ""type""",9599,simonw,closed,0,,,,,2,2020-09-06T19:34:41Z,2020-09-09T03:03:22Z,2020-09-09T03:03:22Z,MEMBER,,"I think ""table"" is the wrong name for the concept I'm using it for here.
Two reasons: firstly, `table` is a reserved word in SQLite. More importantly, it turns out there's not a direct mapping from tables to types of search result. In particular, for GitHub I ended up having two different ""tables"" of repositories - one for repos created by me, another for repos that I have starred.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
688659182,MDU6SXNzdWU2ODg2NTkxODI=,145,Bug when first record contains fewer columns than subsequent records,96218,simonwiles,closed,0,,,,,2,2020-08-30T05:44:44Z,2020-09-08T23:21:23Z,2020-09-08T23:21:23Z,CONTRIBUTOR,,"`insert_all()` selects the maximum batch size based on the number of fields in the first record. If the first record has fewer fields than subsequent records (and `alter=True` is passed), this can result in SQL statements with more than the maximum permitted number of host parameters. This situation is perhaps unlikely to occur, but could happen if the first record had, say, 10 columns, such that `batch_size` (based on `SQLITE_MAX_VARIABLE_NUMBER = 999`) would be 99. If the next 98 rows had 11 columns, the resulting SQL statement for the first batch would have `10 * 1 + 11 * 98 = 1088` host parameters (and subsequent batches, if the data were consistent from thereon out, would have `99 * 11 = 1089`).
I suspect that this bug is masked somewhat by the fact that while:
> [`SQLITE_MAX_VARIABLE_NUMBER`](https://www.sqlite.org/limits.html#max_variable_number) ... defaults to 999 for SQLite versions prior to 3.32.0 (2020-05-22) or 32766 for SQLite versions after 3.32.0.
it is common that it is increased at compile time. Debian-based systems, for example, seem to ship with a version of sqlite compiled with `SQLITE_MAX_VARIABLE_NUMBER` set to 250,000, and I believe this is the case for homebrew installations too.
A test for this issue might look like this:
```python
def test_columns_not_in_first_record_should_not_cause_batch_to_be_too_large(fresh_db):
# sqlite on homebrew and Debian/Ubuntu etc. is typically compiled with
# SQLITE_MAX_VARIABLE_NUMBER set to 250,000, so we need to exceed this value to
# trigger the error on these systems.
THRESHOLD = 250000
extra_columns = 1 + (THRESHOLD - 1) // 99
records = [
{""c0"": ""first record""}, # one column in first record -> batch_size = 100
# fill out the batch with 99 records with enough columns to exceed THRESHOLD
*[
dict([(""c{}"".format(i), j) for i in range(extra_columns)])
for j in range(99)
]
]
try:
fresh_db[""too_many_columns""].insert_all(records, alter=True)
except sqlite3.OperationalError:
raise
```
The best solution, I think, is simply to process all the records when determining columns, column types, and the batch size. In my tests this doesn't seem to be particularly costly at all, and cuts out a lot of complications (including obviating my implementation of #139 at #142). I'll raise a PR for your consideration.
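A rough sketch of that approach (an illustration of the idea, not the eventual PR):
```python
def plan_batches(records, max_vars=999):
    # Materialize the records and collect the full set of columns up front, so
    # the batch size reflects the widest row rather than just the first one.
    records = list(records)
    all_columns = []
    for record in records:
        for key in record:
            if key not in all_columns:
                all_columns.append(key)
    batch_size = max(1, max_vars // max(1, len(all_columns)))
    return records, all_columns, batch_size
```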
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/145/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
695556681,MDU6SXNzdWU2OTU1NTY2ODE=,19,Figure out incremental re-indexing,9599,simonw,open,0,,,,,2,2020-09-08T05:23:31Z,2020-09-08T05:27:07Z,,MEMBER,,As tables get bigger reindexing everything on a schedule (essentially recreating the entire index from scratch) will start to become a performance bottleneck.,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
695553522,MDU6SXNzdWU2OTU1NTM1MjI=,18,Deleted records stay in the search index,9599,simonw,open,0,,,,,2,2020-09-08T05:14:23Z,2020-09-08T05:15:51Z,,MEMBER,,"Here's why: https://github.com/dogsheep/dogsheep-beta/blob/24f7898d41a39218058f174c75ba62f7c0fcfff6/dogsheep_beta/utils.py#L44-L53
That should probably do `DELETE FROM index1.search_index WHERE [table] = ?` first.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
695441530,MDU6SXNzdWU2OTU0NDE1MzA=,154,OperationalError: cannot change into wal mode from within a transaction,9599,simonw,open,0,,,,,2,2020-09-07T23:42:44Z,2020-09-07T23:47:10Z,,OWNER,,"I'm getting this error when running:
sqlite-utils enable-wal beta.db
`OperationalError: cannot change into wal mode from within a transaction`
I'm worried that maybe that's because of this new code from #152:
https://github.com/simonw/sqlite-utils/blob/deb2eb013ff85bbc828ebc244a9654f0d9c3139e/sqlite_utils/db.py#L128-L129",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/154/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
695376054,MDU6SXNzdWU2OTUzNzYwNTQ=,152,Turn on recursive_triggers by default,9599,simonw,closed,0,,,,,2,2020-09-07T20:26:36Z,2020-09-07T21:17:48Z,2020-09-07T20:45:14Z,OWNER,,"https://www.sqlite.org/pragma.html#pragma_recursive_triggers says:
> Prior to SQLite [version 3.6.18](https://www.sqlite.org/releaselog/3_6_18.html) (2009-09-11), recursive triggers were not supported. The behavior of SQLite was always as if this pragma was set to OFF. Support for recursive triggers was added in version 3.6.18 but was initially turned OFF by default, for compatibility. Recursive triggers may be turned on by default in future versions of SQLite.
So I think the fix for the complex issue in #149 is to turn on `recursive_triggers` globally by default for `sqlite-utils`.
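A minimal sketch of what turning that on looks like at the connection level (the pragma itself is standard SQLite; where sqlite-utils would apply it is an assumption):
```python
import sqlite3

conn = sqlite3.connect('my.db')
# Recursive triggers are off by default for backwards compatibility; turn them
# on for this connection so cascading trigger behaviour works as expected.
conn.execute('PRAGMA recursive_triggers = on;')
```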
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688499924_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/152/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
689809225,MDU6SXNzdWU2ODk4MDkyMjU=,2,Apply porter stemming,9599,simonw,closed,0,,,,,2,2020-09-01T04:57:55Z,2020-09-01T20:42:00Z,2020-09-01T20:40:24Z,MEMBER,,This can be on by default. You can turn it off for a table in the config file using `stemming: none` - or maybe `tokenize: none` to match the terminology used by SQLite and `sqlite-utils`: https://sqlite-utils.readthedocs.io/en/stable/python-api.html#enabling-full-text-search,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
688395275,MDU6SXNzdWU2ODgzOTUyNzU=,144,Run some tests against numpy,9599,simonw,closed,0,,,,,2,2020-08-28T22:53:00Z,2020-08-28T22:57:05Z,2020-08-28T22:57:04Z,OWNER,,"Accidentally removed in #143:
https://github.com/simonw/sqlite-utils/blob/d7d3f962861ef32c5ead8f514c8756f5b6f7c4a0/.travis.yml#L18-L19",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/144/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
679650632,MDExOlB1bGxSZXF1ZXN0NDY4MzcwNjU4,936,Don't hang in db.execute_write_fn() if connection fails,9599,simonw,closed,0,,,,,2,2020-08-15T22:20:12Z,2020-08-15T22:35:33Z,2020-08-15T22:35:32Z,OWNER,simonw/datasette/pulls/936,Refs #935,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/936/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
677250834,MDU6SXNzdWU2NzcyNTA4MzQ=,926,"datasette fixtures.db --get ""/fixtures.json""",9599,simonw,closed,0,,,,,2,2020-08-11T22:55:36Z,2020-08-12T00:26:17Z,2020-08-12T00:24:42Z,OWNER,,"I can expose ALL of Datasette's functionality on the command-line (without even running a web server) by adding `--get` and `--post` options to `datasette serve`.
datasette fixtures.db --get ""/fixtures.json""
This would instantiate the Datasette ASGI app, run a fake request for `/fixtures.json` through it, dump the results out to standard output and quit.
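A sketch of the underlying idea using an ASGI test client (httpx here is just an illustration of the mechanism, not necessarily how the option would be implemented):
```python
import asyncio
import httpx

async def run_get(app, path):
    # Drive the ASGI app in-process: no web server, just a fake request whose
    # body gets dumped to standard output.
    transport = httpx.ASGITransport(app=app)
    async with httpx.AsyncClient(transport=transport, base_url='http://localhost') as client:
        response = await client.get(path)
        print(response.text)

# asyncio.run(run_get(asgi_app, '/fixtures.json'))
```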
A `--post` option could do the same for a POST request. Treating that as a stretch goal for the moment.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/926/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
668308777,MDU6SXNzdWU2NjgzMDg3Nzc=,129,"""insert-files --sqlar"" for creating SQLite archives",9599,simonw,closed,0,,,,,2,2020-07-30T02:28:29Z,2020-07-30T22:41:01Z,2020-07-30T22:40:55Z,OWNER,,"A `--sqlar` option could cause `insert-files` to behave in the same way as SQLite's own sqlar mechanism.
https://www.sqlite.org/sqlar.html and https://sqlite.org/sqlar/doc/trunk/README.md",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/129/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
667467128,MDU6SXNzdWU2Njc0NjcxMjg=,909,AsgiFileDownload: filename not correctly passed,9599,simonw,closed,0,,,,,2,2020-07-29T00:41:43Z,2020-07-30T00:56:17Z,2020-07-29T21:34:48Z,OWNER,,"https://github.com/simonw/datasette/blob/3c33b421320c0be81a625ca7307b2e4416a9ed5b/datasette/utils/asgi.py#L396-L405
`self.filename` should be passed to `asgi_send_file()`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/909/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
665819048,MDU6SXNzdWU2NjU4MTkwNDg=,126,Ability to insert binary data on the CLI using JSON,9599,simonw,closed,0,,,,,2,2020-07-26T16:54:14Z,2020-07-27T04:00:33Z,2020-07-27T03:59:45Z,OWNER,,"> I could solve round tripping (at least a bit) by allowing insert to be run with a flag that says ""these columns are base64 encoded, store the decoded data in a BLOB"".
>
> That would solve inserting binary data using JSON too.
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/125#issuecomment-664012247_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/126/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
665403403,MDU6SXNzdWU2NjU0MDM0MDM=,907,Allow documentation doesn't explain what happens with multiple allow keys,9599,simonw,closed,0,,,5607421,Datasette 0.46,2,2020-07-24T20:34:40Z,2020-07-24T22:53:07Z,2020-07-24T22:53:07Z,OWNER,,"Documentation here: https://datasette.readthedocs.io/en/0.45/authentication.html#defining-permissions-with-allow-blocks
Doesn't explain that with the following ""allow"" block:
```json
{
""allow"": {
""id"": ""simonw"",
""role"": ""staff""
}
}
```
The rule will match if EITHER the id is simonw OR the role includes staff.
The tests are missing this case too: https://github.com/simonw/datasette/blob/028f193dd6233fa116262ab4b07b13df7dcec9be/tests/test_utils.py#L504
Related to #906",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/907/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
662439034,MDExOlB1bGxSZXF1ZXN0NDUzOTk1MTc5,902,Don't install tests package,32467826,abeyerpath,closed,0,,,,,2,2020-07-21T01:08:50Z,2020-07-24T20:39:54Z,2020-07-24T20:39:54Z,CONTRIBUTOR,simonw/datasette/pulls/902,"The `exclude` argument to `find_packages` needs an iterable of package
names.
Fixes: #456 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/902/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
663317875,MDU6SXNzdWU2NjMzMTc4NzU=,905,/database.db download should include content-length header,9599,simonw,closed,0,,,,,2,2020-07-21T21:23:48Z,2020-07-22T04:59:46Z,2020-07-22T04:52:45Z,OWNER,,I can do this by modifying this function: https://github.com/simonw/datasette/blob/02dc6298bdbfb1d63e0d2a39ff597b5fcc60e06b/datasette/utils/asgi.py#L248-L270,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/905/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
659873662,MDU6SXNzdWU2NTk4NzM2NjI=,898,datasette.utils.testing module,9599,simonw,open,0,,,,,2,2020-07-18T03:53:24Z,2020-07-18T03:57:46Z,,OWNER,,"The unit tests for plugins could benefit from reusing code from Datasette's own testing fixtures, e.g.:
> I may need to borrow this function from Datasette for the tests:
> https://github.com/simonw/datasette/blob/1f6a134369e6a7efaae9db469f15b1dd2b7f3709/tests/fixtures.py#L836-L851
>
> It's not importable (it lives in `fixtures.py` and not in the `datasette` package that gets packaged for PyPI) - maybe I should fix that in Datasette by adding a `from datasette.utils.testing` module.
_Originally posted by @simonw in https://github.com/simonw/datasette-update-api/issues/4#issuecomment-660419182_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/898/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
655465863,MDU6SXNzdWU2NTU0NjU4NjM=,892,"""latest"" in new documentation navbar is invisible",9599,simonw,closed,0,,,,,2,2020-07-12T19:57:21Z,2020-07-12T20:02:35Z,2020-07-12T20:02:17Z,OWNER,,"On https://datasette.readthedocs.io/en/latest/
Compare with https://datasette.readthedocs.io/en/0.45/
Some custom CSS should fix it.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/892/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
650305298,MDExOlB1bGxSZXF1ZXN0NDQzODIzMDQw,890,Load only python files from plugins-dir.,49260,amjith,closed,0,,,,,2,2020-07-03T02:47:32Z,2020-07-03T03:08:33Z,2020-07-03T03:08:33Z,CONTRIBUTOR,simonw/datasette/pulls/890,"The current behavior for `--plugins-dir` is to load every file in that folder as a python module. This can result in errors if there are non-python files in the plugins dir (such as .mypy_cache).
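A sketch of the kind of filter this describes (illustrative only, not the exact diff in this PR):
```python
import os

def python_files_in(plugins_dir):
    # Only treat .py files as candidate plugin modules, skipping things like
    # .mypy_cache directories or other stray non-Python files.
    for filename in os.listdir(plugins_dir):
        if filename.endswith('.py'):
            yield os.path.join(plugins_dir, filename)
```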
This PR restricts the module loading to only python files. ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/890/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
276718605,MDU6SXNzdWUyNzY3MTg2MDU=,151,Set up a pattern portfolio,9599,simonw,closed,0,,,,,2,2017-11-25T02:09:49Z,2020-07-02T00:13:24Z,2020-05-03T03:13:16Z,OWNER,,"https://www.slideshare.net/nataliedowne/practical-maintainable-css/75
This will be a single page that demonstrates all of the different CSS styles and classes available to Datasette.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/151/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
648673556,MDU6SXNzdWU2NDg2NzM1NTY=,882,Release notes for 0.45,9599,simonw,closed,0,,,5533512,Datasette 0.45,2,2020-07-01T05:00:17Z,2020-07-01T21:48:08Z,2020-07-01T21:48:08Z,OWNER,,"These are mostly done thanks to the alphas, but I went to have more paragraphs of prose and less bullet points.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/882/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
644122661,MDU6SXNzdWU2NDQxMjI2NjE=,116,Documentation for table.pks introspection property,9599,simonw,closed,0,,,,,2,2020-06-23T20:27:24Z,2020-06-23T21:21:33Z,2020-06-23T21:03:14Z,OWNER,,https://github.com/simonw/sqlite-utils/blob/4d9a3204361d956440307a57bd18c829a15861db/sqlite_utils/db.py#L535-L540,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
642127307,MDU6SXNzdWU2NDIxMjczMDc=,855,Add instructions for using cookiecutter plugin template to plugin docs,9599,simonw,closed,0,,,5533512,Datasette 0.45,2,2020-06-19T17:33:25Z,2020-06-22T02:51:38Z,2020-06-22T02:51:38Z,OWNER,,Once I ship the `datasette-plugin` template: https://github.com/simonw/datasette-plugin/issues/1,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/855/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
635049296,MDU6SXNzdWU2MzUwNDkyOTY=,820,Idea: Plugin hook for registering canned queries,9599,simonw,closed,0,,,,,2,2020-06-09T01:58:21Z,2020-06-18T17:58:02Z,2020-06-18T17:58:02Z,OWNER,,"Thought of this while thinking about possible permissions plugins (#818).
Imagine an API key plugin which allows access for API keys. It could let users register new API keys by providing a writable canned query for writing to the `api_keys` table.
To do this the plugin needs to register the query. At the moment queries have to be registered in `metadata.json` - a plugin hook for registering additional queries could help solve this.
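One hypothetical shape for such a hook (the hook name, signature and query names here are assumptions for illustration, not an existing API at the time of this issue):
```python
from datasette import hookimpl

@hookimpl
def canned_queries(datasette, database):
    # Hypothetical hook: contribute extra canned queries at runtime instead of
    # listing them in metadata.json.
    return {
        'add_api_key': {
            'sql': 'insert into api_keys (key, actor_id) values (:key, :actor_id)',
            'write': True,
        }
    }
```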
One challenge: how does the plugin know which named database the query should be registered for?
It could default to the first attached database and allow users to optionally tell the plugin ""actually use this named database instead"" in plugin configuration.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/820/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
638229448,MDU6SXNzdWU2MzgyMjk0NDg=,843,Configure codecov.io,9599,simonw,closed,0,,,,,2,2020-06-13T20:45:00Z,2020-06-13T21:36:52Z,2020-06-13T21:36:52Z,OWNER,,_Originally posted by @simonw in https://github.com/simonw/datasette/issues/841#issuecomment-643660757_,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/843/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
637899539,MDU6SXNzdWU2Mzc4OTk1Mzk=,40,Demo deploy is broken,9599,simonw,closed,0,,,,,2,2020-06-12T17:20:17Z,2020-06-12T18:06:48Z,2020-06-12T18:06:48Z,MEMBER,,"https://github.com/dogsheep/github-to-sqlite/runs/766180404?check_suite_focus=true
```
The following NEW packages will be installed:
sqlite3
0 upgraded, 1 newly installed, 0 to remove and 11 not upgraded.
Need to get 752 kB of archives.
After this operation, 2482 kB of additional disk space will be used.
Ign:1 http://azure.archive.ubuntu.com/ubuntu bionic-updates/main amd64 sqlite3 amd64 3.22.0-1ubuntu0.3
Err:1 http://security.ubuntu.com/ubuntu bionic-updates/main amd64 sqlite3 amd64 3.22.0-1ubuntu0.3
404 Not Found [IP: 52.177.174.250 80]
E: Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/s/sqlite3/sqlite3_3.22.0-1ubuntu0.3_amd64.deb 404 Not Found [IP: 52.177.174.250 80]
E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing?
##[error]Process completed with exit code 100.
```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
314847571,MDU6SXNzdWUzMTQ4NDc1NzE=,220,Investigate syntactic sugar for plugins,9599,simonw,closed,0,,,,,2,2018-04-16T23:01:39Z,2020-06-11T21:50:06Z,2020-06-11T21:49:55Z,OWNER,,"Suggested by @andrewhayward on Twitter: https://twitter.com/arhayward/status/986015118965268480?s=21
> Have you considered a basic abstraction on top of that, for standard hook features?
```
@sql_function
random_integer(a,b):
return random.randint(a,b)
@template_filter
uppercase(str):
return str.upper()
```
Maybe `from datasette.plugins import template_filter`?
Would have to work out how to get this to play well with pluggy",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/220/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
629541395,MDU6SXNzdWU2Mjk1NDEzOTU=,795,response.set_cookie() method,9599,simonw,closed,0,,,5512395,Datasette 0.44,2,2020-06-02T21:57:05Z,2020-06-09T22:33:33Z,2020-06-09T22:19:48Z,OWNER,,"Mainly to clean up this code:
https://github.com/simonw/datasette/blob/4fa7cf68536628344356d3ef8c92c25c249067a0/datasette/app.py#L439-L454",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/795/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
635077656,MDU6SXNzdWU2MzUwNzc2NTY=,822,request.url_vars helper property,9599,simonw,closed,0,,,5512395,Datasette 0.44,2,2020-06-09T03:15:53Z,2020-06-09T03:40:07Z,2020-06-09T03:40:06Z,OWNER,,"This example: https://github.com/simonw/datasette/blob/f5e79adf26d0daa3831e3fba022f1b749a9efdee/docs/plugins.rst#register_routes
```python
from datasette.utils.asgi import Response
import html
async def hello_from(scope):
name = scope[""url_route""][""kwargs""][""name""]
return Response.html(""Hello from {}"".format(
html.escape(name)
))
@hookimpl
def register_routes():
return [
(r""^/hello-from/(?P.*)$""), hello_from)
]
```
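A minimal sketch of the helper property this issue proposes, assuming the request object keeps a reference to the ASGI scope (illustrative, not the shipped implementation):
```python
class Request:
    def __init__(self, scope):
        self.scope = scope

    @property
    def url_vars(self):
        # Named groups captured by the register_routes() regex, e.g. name above
        return self.scope.get('url_route', {}).get('kwargs', {})
```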
Would be nicer if you could easily get `scope[""url_route""][""kwargs""][""name""]` directly from the request object, without looking at the `scope`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/822/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
634783573,MDU6SXNzdWU2MzQ3ODM1NzM=,816,Come up with a new example for extra_template_vars plugin,9599,simonw,closed,0,,,5512395,Datasette 0.44,2,2020-06-08T16:57:59Z,2020-06-08T19:06:44Z,2020-06-08T19:06:11Z,OWNER,,"This example is obsolete, it's from a time before `request.actor` and authentication as a built-in concept (#699):
https://github.com/simonw/datasette/blob/0c064c5fe220b7b3d8dcf85b02b4e60452c47232/docs/plugins.rst#L696-L700
https://github.com/simonw/datasette/blob/0c064c5fe220b7b3d8dcf85b02b4e60452c47232/docs/plugins.rst#extra_template_varstemplate-database-table-view_name-request-datasette",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/816/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
632918799,MDU6SXNzdWU2MzI5MTg3OTk=,808,Permission check for every view in Datasette (plus docs),9599,simonw,closed,0,,,5512395,Datasette 0.44,2,2020-06-07T01:59:23Z,2020-06-07T05:30:49Z,2020-06-07T05:30:49Z,OWNER,,"Every view in Datasette should perform a permission check to see if the current user/actor is allowed to view that page.
This permission check will default to allowed, but having this check will allow plugins to lock down access selectively or even to everything in a Datasette instance.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/808/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
628087971,MDU6SXNzdWU2MjgwODc5NzE=,786,Documentation page describing Datasette's authentication system,9599,simonw,closed,0,,,5512395,Datasette 0.44,2,2020-06-01T01:10:06Z,2020-06-06T19:40:20Z,2020-06-06T19:40:20Z,OWNER,,_Originally posted by @simonw in https://github.com/simonw/datasette/issues/699#issuecomment-636562999_,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/786/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
628121234,MDU6SXNzdWU2MjgxMjEyMzQ=,788, /-/permissions debugging tool,9599,simonw,closed,0,,,5512395,Datasette 0.44,2,2020-06-01T03:13:47Z,2020-06-06T00:43:40Z,2020-06-01T05:01:01Z,OWNER,,"> Debugging tool idea: `/-/permissions` page which shows you the actor and lets you type in the strings for `action`, `resource_type` and `resource_identifier` - then shows you EVERY plugin hook that would have executed and what it would have said, plus when the chain would have terminated.
>
> Bonus: if you're logged in as the `root` user (or a user that matches some kind of permission check, maybe a check for `permissions_debug`) you get to see a rolling log of the last 30 permission checks and what the results were across the whole of Datasette. This should make figuring out permissions policies a whole lot easier.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/699#issuecomment-636576603_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/788/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
631789422,MDU6SXNzdWU2MzE3ODk0MjI=,799,TestResponse needs to handle multiple set-cookie headers,9599,simonw,closed,0,,,,,2,2020-06-05T17:39:52Z,2020-06-05T18:34:10Z,2020-06-05T18:34:10Z,OWNER,,"Seeing this test failure on #798:
```
_______________________ test_auth_token _______________________
app_client =
def test_auth_token(app_client):
""The /-/auth-token endpoint sets the correct cookie""
assert app_client.ds._root_token is not None
path = ""/-/auth-token?token={}"".format(app_client.ds._root_token)
response = app_client.get(path, allow_redirects=False,)
assert 302 == response.status
assert ""/"" == response.headers[""Location""]
> assert {""id"": ""root""} == app_client.ds.unsign(response.cookies[""ds_actor""], ""actor"")
E KeyError: 'ds_actor'
datasette/tests/test_auth.py:12: KeyError
```
It looks like that's happening because the ASGI middleware is adding another set-cookie header - but those two set-cookie headers are combined into one when the TestResponse is constructed:
https://github.com/simonw/datasette/blob/0c064c5fe220b7b3d8dcf85b02b4e60452c47232/tests/fixtures.py#L113-L127",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/799/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
629535669,MDU6SXNzdWU2Mjk1MzU2Njk=,794,Show hooks implemented by each plugin on /-/plugins,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2020-06-02T21:44:38Z,2020-06-02T22:30:17Z,2020-06-02T21:50:10Z,OWNER,,"e.g.
```json
{
""name"": ""qs_actor.py"",
""static"": false,
""templates"": false,
""version"": null,
""hooks"": [
""actor_from_request""
]
}
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/794/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
628156527,MDU6SXNzdWU2MjgxNTY1Mjc=,789,Mechanism for enabling pluggy tracing,9599,simonw,open,0,,,,,2,2020-06-01T05:10:14Z,2020-06-01T05:11:03Z,,OWNER,,"Could be useful for debugging plugins: https://pluggy.readthedocs.io/en/latest/#call-tracing
I tried this out by adding these two lines in `plugins.py`:
```python
pm = pluggy.PluginManager(""datasette"")
pm.add_hookspecs(hookspecs)
# Added these:
pm.trace.root.setwriter(print)
pm.enable_tracing()
```
Output looked something like this:
```
INFO: 127.0.0.1:52724 - ""GET /-/-/static/app.css HTTP/1.1"" 404 Not Found
actor_from_request [hook]
datasette:
request:
finish actor_from_request --> [] [hook]
extra_body_script [hook]
template: show_json.html
database: None
table: None
view_name: json_data
datasette:
finish extra_body_script --> [] [hook]
extra_template_vars [hook]
template: show_json.html
database: None
table: None
view_name: json_data
request:
datasette:
finish extra_template_vars --> [] [hook]
extra_css_urls [hook]
template: show_json.html
database: None
table: None
datasette:
finish extra_css_urls --> [] [hook]
extra_js_urls [hook]
template: show_json.html
database: None
table: None
datasette:
finish extra_js_urls --> [] [hook]
INFO: 127.0.0.1:52724 - ""GET /-/actor HTTP/1.1"" 200 OK
actor_from_request [hook]
datasette:
request:
finish actor_from_request --> [] [hook]
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/789/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
268453968,MDU6SXNzdWUyNjg0NTM5Njg=,37,Ability to serialize massive JSON without blocking event loop,9599,simonw,closed,0,,,,,2,2017-10-25T15:58:03Z,2020-05-30T17:29:20Z,2020-05-30T17:29:20Z,OWNER,,"We run the risk of someone attempting a select statement that returns thousands of rows and hence takes several seconds just to JSON encode the response, effectively blocking the event loop and pausing all other traffic.
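One simple mitigation is to push the CPU-bound encoding onto a worker thread so the event loop stays responsive (a sketch of that option; it is not the incremental approach referenced below):
```python
import asyncio
import json

async def encode_json_off_loop(rows):
    # json.dumps is synchronous and CPU-bound; run it in the default thread
    # pool so the event loop keeps serving other requests while it encodes.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, json.dumps, rows)
```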
The Twisted community have a solution for this, can we adapt that in some way? http://as.ynchrono.us/2010/06/asynchronous-json_18.html?m=1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
374953006,MDU6SXNzdWUzNzQ5NTMwMDY=,369,Interface should show same JSON shape options for custom SQL queries,416374,gfrmin,open,0,,,3268330,Datasette 1.0,2,2018-10-29T10:39:15Z,2020-05-30T17:24:06Z,,CONTRIBUTOR,,"At the moment the page returning a custom SQL query shows the JSON and CSV APIs, but not the multiple JSON shapes. However, adding the `_shape` parameter to the JSON API URL manually still works, so perhaps there should be consistency in the interface by having the same ""Advanced Export"" box for custom SQL queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/369/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
626001501,MDU6SXNzdWU2MjYwMDE1MDE=,773,All plugin hooks should have unit tests,9599,simonw,closed,0,,,5471110,Datasette 0.43,2,2020-05-27T20:17:41Z,2020-05-28T04:12:11Z,2020-05-28T04:09:25Z,OWNER,,"Four hooks currently missing tests:
- [x] prepare_jinja2_environment
- [x] publish_subcommand
- [x] register_facet_classes
- [x] register_output_renderer
```
$ pytest -k test_plugin_hooks_have_tests -vv
====================================== test session starts ======================================
platform darwin -- Python 3.7.7, pytest-5.2.4, py-1.8.1, pluggy-0.13.1 -- /Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/bin/python
cachedir: .pytest_cache
rootdir: /Users/simon/Dropbox/Development/datasette, inifile: pytest.ini
plugins: asyncio-0.10.0
collected 486 items / 475 deselected / 11 selected
tests/test_plugins.py::test_plugin_hooks_have_tests[asgi_wrapper] XPASS [ 9%]
tests/test_plugins.py::test_plugin_hooks_have_tests[extra_body_script] XPASS [ 18%]
tests/test_plugins.py::test_plugin_hooks_have_tests[extra_css_urls] XPASS [ 27%]
tests/test_plugins.py::test_plugin_hooks_have_tests[extra_js_urls] XPASS [ 36%]
tests/test_plugins.py::test_plugin_hooks_have_tests[extra_template_vars] XPASS [ 45%]
tests/test_plugins.py::test_plugin_hooks_have_tests[prepare_connection] XPASS [ 54%]
tests/test_plugins.py::test_plugin_hooks_have_tests[prepare_jinja2_environment] XFAIL [ 63%]
tests/test_plugins.py::test_plugin_hooks_have_tests[publish_subcommand] XFAIL [ 72%]
tests/test_plugins.py::test_plugin_hooks_have_tests[register_facet_classes] XFAIL [ 81%]
tests/test_plugins.py::test_plugin_hooks_have_tests[register_output_renderer] XFAIL [ 90%]
tests/test_plugins.py::test_plugin_hooks_have_tests[render_cell] XPASS [100%]
========================= 475 deselected, 4 xfailed, 7 xpassed in 1.70s =========================
```
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/771#issuecomment-634915104_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/773/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
620969465,MDU6SXNzdWU2MjA5Njk0NjU=,767,Allow to specify a URL fragment for canned queries,2657547,rixx,closed,0,,,5471110,Datasette 0.43,2,2020-05-19T13:17:42Z,2020-05-27T21:52:25Z,2020-05-27T21:52:25Z,CONTRIBUTOR,,"Canned queries are very useful to direct users to prepared data and views. I like to use them with charts using datasette-vega a lot, because people get a direct impression at first glance.
datasette-vega doesn't show up by default though, and users have to click through to it. Also, datasette-vega does not always guess the best way to render columns correctly though, so it would be nice if I could specify a URL fragment in my canned queries to make sure people see what I want them to see.
My current workaround is to include a fragement link in ``description_html`` and ask people to reload the page, like [here](https://data.rixx.de/songs/show_by_bpm#g.mark=bar&g.x_column=bpm_floor&g.x_type=ordinal&g.y_column=bpm_count&g.y_type=quantitative), which is a bit hacky.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/767/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
615477131,MDU6SXNzdWU2MTU0NzcxMzE=,111,sqlite-utils drop-table and drop-view commands,9599,simonw,closed,0,,,,,2,2020-05-10T21:10:42Z,2020-05-11T01:58:36Z,2020-05-11T00:44:26Z,OWNER,,Would be useful to be able to drop views and tables from the CLI.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/111/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
612089949,MDU6SXNzdWU2MTIwODk5NDk=,756,Add pipx to installation documentation,9599,simonw,closed,0,,,,,2,2020-05-04T18:49:01Z,2020-05-04T19:19:06Z,2020-05-04T19:10:33Z,OWNER,,"Add to this page: https://datasette.readthedocs.io/en/stable/installation.html
Here's how to install plugins: https://twitter.com/simonw/status/1257348687979778050
```
$ datasette plugins
[]
$ pipx inject datasette datasette-json-html
injected package datasette-json-html into venv datasette
done! ✨ 🌟 ✨
$ datasette plugins
[
{
""name"": ""datasette-json-html"",
""static"": false,
""templates"": false,
""version"": ""0.6""
}
]
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/756/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
611997130,MDU6SXNzdWU2MTE5OTcxMzA=,754,Clean up aiofiles warnings on 3.8,9599,simonw,closed,0,,,,,2,2020-05-04T16:14:59Z,2020-05-04T16:22:30Z,2020-05-04T16:22:30Z,OWNER,,"https://travis-ci.org/github/simonw/datasette/jobs/682624476
Lots of warnings like this:
```
/home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/utils.py:33
/home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/utils.py:33
/home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/utils.py:33: DeprecationWarning: ""@coroutine"" decorator is deprecated since Python 3.8, use ""async def"" instead
def method(self, *args, **kwargs):
/home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/__init__.py:27
/home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/__init__.py:27: DeprecationWarning: ""@coroutine"" decorator is deprecated since Python 3.8, use ""async def"" instead
def _open(file, mode='r', buffering=-1, encoding=None, errors=None, newline=None,
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/754/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
611222968,MDU6SXNzdWU2MTEyMjI5Njg=,107,sqlite-utils create-view CLI command,9599,simonw,closed,0,,,,,2,2020-05-02T16:15:13Z,2020-05-03T15:36:58Z,2020-05-03T15:36:37Z,OWNER,,Can go with #27 - `sqlite-utils create-table`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/107/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
611326701,MDU6SXNzdWU2MTEzMjY3MDE=,108,Documentation unit tests for CLI commands,9599,simonw,closed,0,,,,,2,2020-05-03T03:58:42Z,2020-05-03T04:13:57Z,2020-05-03T04:13:57Z,OWNER,,Have a test that ensures all CLI commands are documented.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
610842926,MDU6SXNzdWU2MTA4NDI5MjY=,36,Add view for better display of dependent repos,9599,simonw,closed,0,,,,,2,2020-05-01T16:33:44Z,2020-05-02T16:50:31Z,2020-05-02T16:30:11Z,MEMBER,,"```sql
select
repos.full_name as repo,
'https://github.com/' || repos2.full_name as dependent,
repos2.created_at as dependent_repo_created,
repos2.updated_at as dependent_repo_updated,
repos2.stargazers_count as dependent_repo_stars,
repos2.watchers_count as dependent_repo_watchers
from
dependents
join repos as repos2 on dependents.dependent = repos2.id
join repos on dependents.repo = repos.id
order by
repos2.created_at desc
```
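If this ships as a SQL view, it could also be created programmatically with sqlite-utils - a sketch (the view name here is just illustrative):
```python
import sqlite_utils

db = sqlite_utils.Database(""github.db"")
# wrap the query above in a view so it shows up on the database page
db.create_view(
    ""dependent_repos"",
    ""select repos.full_name as repo, repos2.full_name as dependent, ""
    ""repos2.created_at as dependent_repo_created ""
    ""from dependents ""
    ""join repos as repos2 on dependents.dependent = repos2.id ""
    ""join repos on dependents.repo = repos.id ""
    ""order by repos2.created_at desc"",
)
```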
https://dogsheep.simonwillison.net/github?sql=select%0D%0A++repos.full_name+as+repo%2C%0D%0A++%27https%3A%2F%2Fgithub.com%2F%27+%7C%7C+repos2.full_name+as+dependent%2C%0D%0A++repos2.created_at+as+dependent_repo_created%2C%0D%0A++repos2.updated_at+as+dependent_repo_updated%2C%0D%0A++repos2.stargazers_count+as+dependent_repo_stars%2C%0D%0A++repos2.watchers_count+as+dependent_repo_watchers%0D%0Afrom%0D%0A++dependents%0D%0A++join+repos+as+repos2+on+dependents.dependent+%3D+repos2.id%0D%0A++join+repos+on+dependents.repo+%3D+repos.id%0D%0Aorder+by%0D%0A++repos2.created_at+desc",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
605806386,MDU6SXNzdWU2MDU4MDYzODY=,735,"Error when I click on ""View and edit SQL""",30607,aborruso,closed,0,,,,,2,2020-04-23T19:31:32Z,2020-04-28T06:10:20Z,2020-04-27T19:00:30Z,NONE,,"Hi,
when I do it [here](https://my-database.now.sh/commissioniComunePalermo/youtube), I get an ""unrecognized token: ""["""" error.
Is it normal?
Thank you",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/735/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
607888367,MDU6SXNzdWU2MDc4ODgzNjc=,13,Also upload movie files,9599,simonw,open,0,,,,,2,2020-04-27T22:11:25Z,2020-04-28T00:39:45Z,,MEMBER,,"The `upload` command currently only handles static images:
https://github.com/dogsheep/photos-to-sqlite/blob/d939455af00e07866686457ee2fcb9b2d1b7194e/photos_to_sqlite/utils.py#L26-L33
Need to cover movies taken by my phone and DSLR too.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
605938063,MDU6SXNzdWU2MDU5MzgwNjM=,9,"upload command should be resumable, should only upload photos not already uploaded",9599,simonw,closed,0,,,,,2,2020-04-23T23:31:08Z,2020-04-23T23:39:14Z,2020-04-23T23:39:14Z,MEMBER,,Follow on from #4. ,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
605147638,MDU6SXNzdWU2MDUxNDc2Mzg=,8,Should I have used MD5 instead of SHA256?,9599,simonw,closed,0,,,,,2,2020-04-23T00:02:08Z,2020-04-23T00:03:35Z,2020-04-23T00:03:35Z,MEMBER,,"https://docs.aws.amazon.com/AmazonS3/latest/API/RESTCommonResponseHeaders.html
> Objects created by the PUT Object, POST Object, or Copy operation, or through the AWS Management Console, and are encrypted by SSE-S3 or plaintext, have ETags that are an MD5 digest of their object data.
",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
603618244,MDU6SXNzdWU2MDM2MTgyNDQ=,30,Issues milestone column is the wrong type,9599,simonw,closed,0,,,,,2,2020-04-21T00:24:34Z,2020-04-21T00:45:23Z,2020-04-21T00:36:22Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issues?milestone=2857392
![2A4C1185-2434-4F29-9EA0-3246E2F03F77](https://user-images.githubusercontent.com/9599/79811760-b7e08b00-832b-11ea-9ad7-684a6ae097a6.jpeg)
It is TEXT when it should be an INTEGER - which is why the foreign key label is not correctly displayed.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
602575575,MDU6SXNzdWU2MDI1NzU1NzU=,6,Add progress bar to upload command,9599,simonw,closed,0,,,,,2,2020-04-18T23:32:41Z,2020-04-19T00:15:24Z,2020-04-19T00:15:24Z,MEMBER,,Upload was added in #4 ,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
601271612,MDU6SXNzdWU2MDEyNzE2MTI=,26,Topics are missing from repositories,9599,simonw,closed,0,,,,,2,2020-04-16T17:36:32Z,2020-04-16T17:41:11Z,2020-04-16T17:41:11Z,MEMBER,,"I'm sure this used to work, but right now repositories are fetched without their topics.
https://developer.github.com/v3/repos/ says you need to send a custom `Accept` header of `application/vnd.github.mercy-preview+json` to get topics.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
598640234,MDU6SXNzdWU1OTg2NDAyMzQ=,99,.upsert_all() should maybe error if dictionaries passed to it do not have the same keys,9599,simonw,closed,0,,,,,2,2020-04-13T03:02:25Z,2020-04-13T03:05:20Z,2020-04-13T03:05:04Z,OWNER,,"While investigating #98 I stumbled across this:
```
def test_upsert_compound_primary_key(fresh_db):
table = fresh_db[""table""]
table.upsert_all(
[
{""species"": ""dog"", ""id"": 1, ""name"": ""Cleo"", ""age"": 4},
{""species"": ""cat"", ""id"": 1, ""name"": ""Catbag""},
],
pk=(""species"", ""id""),
)
table.upsert_all(
[
{""species"": ""dog"", ""id"": 1, ""age"": 5},
{""species"": ""dog"", ""id"": 2, ""name"": ""New Dog"", ""age"": 1},
],
pk=(""species"", ""id""),
)
> assert [
{""species"": ""dog"", ""id"": 1, ""name"": ""Cleo"", ""age"": 5},
{""species"": ""cat"", ""id"": 1, ""name"": ""Catbag"", ""age"": None},
{""species"": ""dog"", ""id"": 2, ""name"": ""New Dog"", ""age"": 1},
] == list(table.rows)
E AssertionError: assert [{'age': 5, '...cies': 'dog'}] == [{'age': 5, '...cies': 'dog'}]
E At index 0 diff: {'species': 'dog', 'id': 1, 'name': 'Cleo', 'age': 5} != {'species': 'dog', 'id': 1, 'name': None, 'age': 5}
E Full diff:
E - [{'age': 5, 'id': 1, 'name': 'Cleo', 'species': 'dog'},
E ? ^^^ --
E + [{'age': 5, 'id': 1, 'name': None, 'species': 'dog'},
E ? ^^^
E {'age': None, 'id': 1, 'name': 'Catbag', 'species': 'cat'},
E {'age': 1, 'id': 2, 'name': 'New Dog', 'species': 'dog'}]
```
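If erroring is the desired behaviour, the check itself could be small - a sketch of the idea, not what the library currently does:
```python
def check_same_keys(records):
    # raise early if the dictionaries passed to .upsert_all() disagree on keys
    expected = None
    for record in records:
        keys = set(record.keys())
        if expected is None:
            expected = keys
        elif keys != expected:
            raise ValueError(
                ""All records must have the same keys: {} != {}"".format(
                    sorted(expected), sorted(keys)
                )
            )
```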
If you run `.upsert_all()` with multiple dictionaries it doesn't quite have the effect you might expect.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/99/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
521323012,MDExOlB1bGxSZXF1ZXN0MzM5NzIyNzkw,627,"Support Python 3.8, stop supporting Python 3.5",9599,simonw,closed,0,,,,,2,2019-11-12T04:36:33Z,2020-04-05T10:23:58Z,2019-11-12T05:09:12Z,OWNER,simonw/datasette/pulls/627,Refs #622,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/627/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
587322443,MDU6SXNzdWU1ODczMjI0NDM=,710,Remove Zeit Now v1 support,9599,simonw,closed,0,,,,,2,2020-03-24T22:39:49Z,2020-04-04T23:05:12Z,2020-04-04T23:05:12Z,OWNER,,It will remain supported as a plugin but since no-one can sign up for Docker hosting any more (for over a year now) there's no point including it in Datasette core.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/710/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
589801352,MDExOlB1bGxSZXF1ZXN0Mzk1MjU4Njg3,96,Add type conversion for Panda's Timestamp,32605365,b0b5h4rp13,closed,0,,,,,2,2020-03-29T14:13:09Z,2020-03-31T04:40:49Z,2020-03-31T04:40:48Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/96,"Add type conversion for Pandas' Timestamp, if the Pandas library is present on the system
(thanks for this project, I was about to do the same thing from scratch)",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/96/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
587398703,MDU6SXNzdWU1ODczOTg3MDM=,711,Release notes for Datasette 0.39,9599,simonw,closed,0,,,5234079,Datasette 0.39,2,2020-03-25T02:31:13Z,2020-03-25T04:06:55Z,2020-03-25T04:06:55Z,OWNER,,Then I can ship it.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/711/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
587302139,MDExOlB1bGxSZXF1ZXN0MzkzMjc0NDMz,708,"base_url configuration setting, refs #394",9599,simonw,closed,0,,,5234079,Datasette 0.39,2,2020-03-24T21:52:00Z,2020-03-25T00:18:44Z,2020-03-25T00:18:44Z,OWNER,simonw/datasette/pulls/708,Pull request implementing #394,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/708/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
586561727,MDU6SXNzdWU1ODY1NjE3Mjc=,21,Turn GitHub API errors into exceptions,9599,simonw,closed,0,,,5225818,1.0,2,2020-03-23T22:37:24Z,2020-03-23T23:48:23Z,2020-03-23T23:48:22Z,MEMBER,,"This would have really helped in debugging the mess in #13. Running with this `auth.json` is a useful demo:
```json
{""github_personal_token"": """"}
```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
493671014,MDU6SXNzdWU0OTM2NzEwMTQ=,5,"Add ""incomplete"" boolean to users table for incomplete profiles",9599,simonw,closed,0,,,,,2,2019-09-14T22:01:50Z,2020-03-23T19:23:31Z,2020-03-23T19:23:30Z,MEMBER,,"User profiles that are fetched from e.g. stargazers (#4) are incomplete - they have a login but they don't have name, company etc.
Add an `incomplete` boolean flag to the `users` table to record this. Then later I can add a `backfill-users` command which loops through and fetches missing data for those incomplete profiles.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
585850715,MDU6SXNzdWU1ODU4NTA3MTU=,19,"Enable full-text search for more stuff (like commits, issues and issue_comments)",9599,simonw,closed,0,,,5225818,1.0,2,2020-03-23T00:19:56Z,2020-03-23T19:06:39Z,2020-03-23T19:06:39Z,MEMBER,,Currently FTS is only enabled for repos and releases.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
585266763,MDU6SXNzdWU1ODUyNjY3NjM=,34,IndexError running user-timeline command,9599,simonw,closed,0,,,,,2,2020-03-20T18:54:08Z,2020-03-20T19:20:52Z,2020-03-20T19:20:37Z,MEMBER,,"```
$ twitter-to-sqlite user-timeline data.db --screen_name Allen_Joines
Traceback (most recent call last):
File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite"", line 11, in
load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')()
File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 764, in __call__
return self.main(*args, **kwargs)
File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 717, in main
rv = self.invoke(ctx)
File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 555, in invoke
return callback(*args, **kwargs)
File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 256, in user_timeline
utils.save_tweets(db, chunk)
File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 289, in save_tweets
db[""users""].upsert(user, pk=""id"", alter=True)
File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1128, in upsert
conversions=conversions,
File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1157, in upsert_all
upsert=True,
File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1096, in insert_all
row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0]
IndexError: list index out of range
```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
578883725,MDU6SXNzdWU1Nzg4ODM3MjU=,17,Command for importing commits,9599,simonw,closed,0,,,,,2,2020-03-10T21:55:12Z,2020-03-11T02:47:37Z,2020-03-11T02:47:37Z,MEMBER,,Using this API: https://api.github.com/repos/dogsheep/github-to-sqlite/commits,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
534530973,MDU6SXNzdWU1MzQ1MzA5NzM=,649,Reduce table counts on index page with many databases,9599,simonw,closed,0,,,,,2,2019-12-08T11:56:37Z,2020-02-29T01:08:29Z,2020-02-29T01:08:29Z,OWNER,,"Since #467 the index page has attempted to optimistically count table rows.
My personal Dogsheep has enough connected databases and tables that the page can still take way too long to load - sometimes more than twenty seconds.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/649/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
539204432,MDU6SXNzdWU1MzkyMDQ0MzI=,70,Implement ON DELETE and ON UPDATE actions for foreign keys,26292069,LucasElArruda,open,0,,,,,2,2019-12-17T17:19:10Z,2020-02-27T04:18:53Z,,NONE,,"Hi! I did not find any mention on the library about ON DELETE and ON UPDATE actions for foreign keys. Are those expected to be implemented? If not, it would be a nice thing to include!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/70/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
569268612,MDU6SXNzdWU1NjkyNjg2MTI=,679,Release 0.36,9599,simonw,closed,0,,,,,2,2020-02-22T02:41:01Z,2020-02-22T03:52:13Z,2020-02-22T03:52:13Z,OWNER,,"I think we have enough changes to warrant a release - and I want to take advantage of the changes to the `prepare_connection()` plugin hook in #678
Changes since 0.35 so far: https://github.com/simonw/datasette/compare/0.35...be2265b0e811d0ac2875c2f748125c17b0f9289e
- [x] Update ecosystem page
- [x] Write release notes
- [x] Ship the release",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/679/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
565837965,MDU6SXNzdWU1NjU4Mzc5NjU=,87,Should detect collections.OrderedDict as a regular dictionary,9599,simonw,closed,0,,,,,2,2020-02-16T02:06:34Z,2020-02-16T02:20:59Z,2020-02-16T02:20:59Z,OWNER,,"```
File ""...python3.7/site-packages/sqlite_utils/db.py"", line 292, in create_table
column_type=COLUMN_TYPE_MAPPING[column_type],
KeyError: <class 'collections.OrderedDict'>
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/87/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
562787785,MDU6SXNzdWU1NjI3ODc3ODU=,667,Allow injecting configuration data from plugins,870184,xrotwang,closed,0,,,,,2,2020-02-10T19:50:15Z,2020-02-12T16:18:22Z,2020-02-12T09:21:22Z,NONE,,"I'm trying to customize datasette as an explorer for [CLDF](https://cldf.clld.org) datasets. Such datasets can be converted automatically to SQLite, which can then be fed to datasette (e.g. https://github.com/cldf/cookbook/blob/master/recipes/datasette/README.md).
Part of this customization would be support for the ""special"" data types described in the [CLDF ontology](https://cldf.clld.org/v1.0/terms.rdf). But while rendering of the values can be customized via the `render_cell` hook in a plugin, other things (e.g. custom labels for foreign keys) must be specified through the config file.
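A hypothetical sketch of what plugin-provided configuration could look like (the `extra_metadata` hook name is made up for illustration and is not an existing Datasette API):
```python
from datasette import hookimpl

@hookimpl
def extra_metadata(datasette):
    # hypothetical hook: return config to merge into the metadata/config file
    return {
        ""databases"": {
            ""cldf"": {""tables"": {""LanguageTable"": {""label_column"": ""Name""}}}
        }
    }
```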
It would be nice to be able to programmatically inject config data from plugins as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/667/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
550293770,MDU6SXNzdWU1NTAyOTM3NzA=,658,How do I use the app.css as style sheet?,49656826,null92,open,0,,,,,2,2020-01-15T16:27:57Z,2020-02-07T00:29:50Z,,NONE,,"Simon,
I'm trying to use the app.css (in the static folder) as a style sheet but the datasette on Heroku simply ignores it! I read everything about customization here and on readthedocs but still can't get it to work.
Is this possible?
Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/658/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
529376481,MDExOlB1bGxSZXF1ZXN0MzQ2MjY0OTI2,67,Run tests against 3.5 too,9599,simonw,closed,0,,,,,2,2019-11-27T14:20:35Z,2019-12-31T01:29:44Z,2019-12-31T01:29:43Z,OWNER,simonw/sqlite-utils/pulls/67,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/67/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
361764460,MDExOlB1bGxSZXF1ZXN0MjE2NjUxMzE3,365,fix small doc typo,418191,jaywgraves,closed,0,,,,,2,2018-09-19T14:02:02Z,2019-12-19T02:30:33Z,2018-09-19T17:15:43Z,CONTRIBUTOR,simonw/datasette/pulls/365,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/365/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
464987783,MDExOlB1bGxSZXF1ZXN0Mjk1MTI3MjEz,546,Facet by delimiter,9599,simonw,open,0,,,,,2,2019-07-07T20:06:05Z,2019-11-18T23:46:01Z,,OWNER,simonw/datasette/pulls/546,Refs #510,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/546/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
521329771,MDU6SXNzdWU1MjEzMjk3NzE=,628,Render jinja2 templates in async mode,9599,simonw,closed,0,,,,,2,2019-11-12T05:01:55Z,2019-11-14T23:28:09Z,2019-11-14T23:14:24Z,OWNER,,"I started playing with this in #404 and got good results but it didn't work in Python 3.5. As of #627 I don't support 3.5 any more so this can go ahead.
Rendering templates in async mode will mean that template plugins can include async code... which opens the door to custom template functions that execute SQL queries!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/628/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
382471625,MDExOlB1bGxSZXF1ZXN0MjMyMTcyMTA2,389,Bump dependency versions,9599,simonw,closed,0,,,,,2,2018-11-20T02:23:12Z,2019-11-13T19:13:41Z,2019-11-13T19:13:41Z,OWNER,simonw/datasette/pulls/389,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/389/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
521995039,MDU6SXNzdWU1MjE5OTUwMzk=,632,Upgrade datasette publish Heroku runtime,9599,simonw,closed,0,,,,,2,2019-11-13T06:46:19Z,2019-11-13T16:44:07Z,2019-11-13T16:43:23Z,OWNER,,"```
Python has released a security update! Please consider upgrading to python-3.6.9
```
https://devcenter.heroku.com/articles/python-support#supported-runtimes shows 3.8.0 is now supported.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/632/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
520718056,MDExOlB1bGxSZXF1ZXN0MzM5MjM2NjQ3,623,Test against Python 3.8 in Travis,9599,simonw,closed,0,,,,,2,2019-11-11T03:24:54Z,2019-11-11T03:45:35Z,2019-11-11T03:45:35Z,OWNER,simonw/datasette/pulls/623,Needed for #622,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/623/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
520507306,MDU6SXNzdWU1MjA1MDczMDY=,618,Mechanism for seeing indexes on a specific table,9599,simonw,closed,0,,,,,2,2019-11-09T20:10:41Z,2019-11-10T01:40:05Z,2019-11-10T01:30:25Z,OWNER,,"The only way to see the indexes that apply to a specific table at the moment is to run the following SQL manually:
```sql
select * from sqlite_master where type = 'index' and tbl_name=?
```
For example:
It would be good if this list of indexes was displayed in a neater way on the table page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/618/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
518739697,MDU6SXNzdWU1MTg3Mzk2OTc=,30,`followers` fails because `transform_user` is called twice,21148,jacobian,closed,0,,,,,2,2019-11-06T20:44:52Z,2019-11-09T20:15:28Z,2019-11-09T19:55:52Z,CONTRIBUTOR,,"Trying to run `twitter-to-sqlite followers` errors out:
```
Traceback (most recent call last):
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite"", line 10, in
sys.exit(cli())
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 764, in __call__
return self.main(*args, **kwargs)
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 717, in main
rv = self.invoke(ctx)
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 555, in invoke
return callback(*args, **kwargs)
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 130, in followers
go(bar.update)
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 116, in go
utils.save_users(db, [profile])
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 302, in save_users
transform_user(user)
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 181, in transform_user
user[""created_at""] = parser.parse(user[""created_at""])
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 1374, in parse
return DEFAULTPARSER.parse(timestr, **kwargs)
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 646, in parse
res, skipped_tokens = self._parse(timestr, **kwargs)
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 725, in _parse
l = _timelex.split(timestr) # Splits the timestr into tokens
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 207, in split
return list(cls(s))
File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 76, in __init__
'{itype}'.format(itype=instream.__class__.__name__))
TypeError: Parser must be a string or character stream, not datetime
```
This appears to be because https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L111 calls `transform_user`, and then https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116 calls `transform_user` again, which fails because the user is already transformed.
I was able to work around this by commenting out https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116.
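One possible fix (just a sketch, there may be a better approach) would be to make `transform_user` idempotent, so that calling it twice is harmless:
```python
from dateutil import parser

def transform_user(user):
    # only parse created_at if it is still a string; a second call becomes a no-op
    created_at = user.get(""created_at"")
    if isinstance(created_at, str):
        user[""created_at""] = parser.parse(created_at)
```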
Shall I work up a patch for that, or is there a better approach?",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
518506242,MDU6SXNzdWU1MTg1MDYyNDI=,616,Datasette FTS detection bug,49656826,null92,closed,0,,,,,2,2019-11-06T14:25:47Z,2019-11-08T15:31:33Z,2019-11-08T02:06:56Z,NONE,,"I'm having trouble with datasette. I deployed EXACTLY the same project on two different apps on Heroku. Both have databases (though not all of them) with FTS activated, but only one of the apps detects it and works fine. You can take a look here:
With search: http://teste-templates.herokuapp.com/amazonia_protege/car
Without search: http://bases.vortex.media/amazonia_protege/car
![teste](https://user-images.githubusercontent.com/49656826/68306310-11a80e00-0088-11ea-8d1c-db3bd3375518.jpg)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/616/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
491219910,MDU6SXNzdWU0OTEyMTk5MTA=,61,importing CSV to SQLite as library,17739,witeshadow,closed,0,,,,,2,2019-09-09T17:12:40Z,2019-11-04T16:25:01Z,2019-11-04T16:25:01Z,NONE,,"CSV can be imported to SQLite when used CLI, but I don't see documentation for when using as library. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
406055201,MDU6SXNzdWU0MDYwNTUyMDE=,406,Support nullable foreign keys in _labels mode,9599,simonw,closed,0,9599,simonw,,,2,2019-02-03T05:34:20Z,2019-11-02T22:39:28Z,2019-11-02T22:30:27Z,OWNER,,"Currently if there's a null in a foreign key we get ""None"" displayed in the inflated view:
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/406/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
494685791,MDU6SXNzdWU0OTQ2ODU3OTE=,574,Improve usage description of --host option,132978,terrycojones,closed,0,,,,,2,2019-09-17T15:12:12Z,2019-11-01T21:58:17Z,2019-11-01T21:57:54Z,NONE,,"It would be nice if the `--host` option had a clearer description. I tried to get datasette running on an AWS instance and it took a while to realize it was only listening on localhost. So I wanted to make it listen on a non-localhost interface and tried giving a couple of values to `--host` (a host name, then an interface name), but none of them worked. In the end I read the source to see that the option is passed to `uvicorn` and looked at the uvicorn docs, which also didn't help. Then I searched the web for ""example running datasette on a host"" which led me to https://github.com/simonw/datasette/issues/514 where I saw someone using `-h 0.0.0.0`. I tried that and it works. That usage could be mentioned somewhere, and might save someone else some time.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/574/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
470542938,MDU6SXNzdWU0NzA1NDI5Mzg=,562,Facet by array shouldn't suggest for arrays that are not arrays-of-strings,9599,simonw,closed,0,,,,,2,2019-07-19T20:51:29Z,2019-11-01T19:42:10Z,2019-11-01T19:37:55Z,OWNER,,"It's triggering for arrays that look like this at the moment:
```json
[
{
""type"": ""HKWorkoutEventTypeSegment"",
""date"": ""2019-05-21 09:43:50 -0700"",
""duration"": ""12.2780519704024"",
""durationUnit"": ""min""
},
{
""type"": ""HKWorkoutEventTypeSegment"",
""date"": ""2019-05-21 09:43:50 -0700"",
""duration"": ""19.467273102204"",
""durationUnit"": ""min""
}
]
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/562/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
509612217,MDExOlB1bGxSZXF1ZXN0MzMwMTI5MzU4,603,always pop as_format off args dict,6025893,chris48s,closed,0,,,,,2,2019-10-20T15:44:22Z,2019-10-30T19:12:22Z,2019-10-21T02:03:09Z,CONTRIBUTOR,simonw/datasette/pulls/603,closes #563,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/603/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
510076368,MDU6SXNzdWU1MTAwNzYzNjg=,605,Support queries at the table level,12617395,bsilverm,open,0,,,,,2,2019-10-21T15:58:30Z,2019-10-30T18:55:37Z,,NONE,,"Per the issue described in [issue #588](https://github.com/simonw/datasette/issues/588), it was determined queries are not supported at the table level. Per my last comment in the issue, I'd like to request support for this as it would help eliminate errors in the event certain tables are not present in the database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
508190730,MDU6SXNzdWU1MDgxOTA3MzA=,23,Extremely simple migration system,9599,simonw,closed,0,,,,,2,2019-10-17T02:13:57Z,2019-10-17T16:57:17Z,2019-10-17T16:57:17Z,MEMBER,,"Needed for #12. This is going to be an incredibly simple version of the Django migration system.
* A `migrations` table, keeping track of which migrations were applied (and when)
* A `migrate()` function which applies any pending migrations
* A `MIGRATIONS` constant which is a list of functions to be applied
The function names will be detected and used as the names of the migrations.
Every time you run the CLI tool it will call the `migrate()` function before doing anything else.
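A rough sketch of what this could look like (the function and column names here are illustrative, not the final implementation):
```python
import datetime

def migration_add_since_id_column(db):
    pass  # each migration is a function that mutates the database

MIGRATIONS = [migration_add_since_id_column]

def migrate(db):
    # track which migrations have been applied, and when
    if ""migrations"" not in db.table_names():
        db[""migrations""].create({""name"": str, ""applied"": str}, pk=""name"")
    applied = {row[""name""] for row in db[""migrations""].rows}
    for fn in MIGRATIONS:
        if fn.__name__ not in applied:
            fn(db)
            db[""migrations""].insert(
                {""name"": fn.__name__, ""applied"": datetime.datetime.utcnow().isoformat()}
            )
```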
Needs to take into account that there might be no tables at all. As such, migration functions should sanity check that the tables they are going to work on actually exist.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
508578780,MDU6SXNzdWU1MDg1Nzg3ODA=,25,Ensure migrations don't accidentally create foreign key twice,9599,simonw,closed,0,,,,,2,2019-10-17T16:08:50Z,2019-10-17T16:56:47Z,2019-10-17T16:56:47Z,MEMBER,,"Is it possible for these lines to run against a database table that already has these foreign keys?
https://github.com/dogsheep/twitter-to-sqlite/blob/c9295233f219c446fa2085cace987067488a31b9/twitter_to_sqlite/migrations.py#L21-L22",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
504238461,MDU6SXNzdWU1MDQyMzg0NjE=,6,sqlite3.OperationalError: table users has no column named bio,1055831,dazzag24,closed,0,,,,,2,2019-10-08T19:39:52Z,2019-10-13T05:31:28Z,2019-10-13T05:30:19Z,NONE,,"```
$ github-to-sqlite repos github.db
$ github-to-sqlite starred github.db dazzag24
Traceback (most recent call last):
File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/bin/github-to-sqlite"", line 10, in
sys.exit(cli())
File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 764, in __call__
return self.main(*args, **kwargs)
File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 717, in main
rv = self.invoke(ctx)
File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 1137, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 555, in invoke
return callback(*args, **kwargs)
File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/cli.py"", line 106, in starred
utils.save_stars(db, user, stars)
File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 177, in save_stars
user_id = save_user(db, user)
File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 61, in save_user
return db[""users""].upsert(to_save, pk=""id"").last_pk
File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1067, in upsert
extracts=extracts,
File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 916, in insert
extracts=extracts,
File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1024, in insert_all
result = self.db.conn.execute(sql, values)
sqlite3.OperationalError: table users has no column named bio
```
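For what it's worth, sqlite-utils can add missing columns automatically when asked to: passing `alter=True` in `save_user` would avoid this class of error. A sketch of the changed call (not necessarily the fix that shipped):
```python
# alter=True lets sqlite-utils add missing columns (such as bio) on the fly
return db[""users""].upsert(to_save, pk=""id"", alter=True).last_pk
```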
```
$ pipenv graph
github-to-sqlite==0.4
- requests [required: Any, installed: 2.22.0]
- certifi [required: >=2017.4.17, installed: 2019.9.11]
- chardet [required: >=3.0.2,<3.1.0, installed: 3.0.4]
- idna [required: >=2.5,<2.9, installed: 2.8]
- urllib3 [required: >=1.21.1,<1.26,!=1.25.1,!=1.25.0, installed: 1.25.6]
- sqlite-utils [required: ~=1.11, installed: 1.11]
- click [required: Any, installed: 7.0]
- click-default-group [required: Any, installed: 1.2.2]
- click [required: Any, installed: 7.0]
- tabulate [required: Any, installed: 0.8.5]
Python 3.6.8
```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
505674949,MDU6SXNzdWU1MDU2NzQ5NDk=,17,import command should empty all archive-* tables first,9599,simonw,closed,0,,,,,2,2019-10-11T06:58:43Z,2019-10-11T15:40:08Z,2019-10-11T15:40:08Z,MEMBER,,Can have a CLI option for NOT doing that.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
488835586,MDU6SXNzdWU0ODg4MzU1ODY=,4,Command for importing data from a Twitter Export file,9599,simonw,closed,0,,,,,2,2019-09-03T21:34:13Z,2019-10-11T06:45:02Z,2019-10-11T06:45:02Z,MEMBER,,"Twitter lets you export all of your data as an archive file: https://twitter.com/settings/your_twitter_data
A command for importing this data into SQLite would be extremely useful.
$ twitter-to-sqlite import twitter.db path-to-archive.zip
",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
503218205,MDU6SXNzdWU1MDMyMTgyMDU=,586,Enable browser caching for plugin statics with datasette-auth,9599,simonw,closed,0,,,,,2,2019-10-07T03:47:14Z,2019-10-07T15:46:04Z,2019-10-07T15:46:03Z,OWNER,,"An authenticated Datasette I run is seeing delays on every page load. On looking at the network inspector it turns out it's because datasette-vega is nearly 1MB and a `cache-control: private` is preventing it from being cached!
This may well turn out to be a bug in `datasette-auth-github` but it's still worth tracking here because caching of static assets from plugins is very important.
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
493670426,MDU6SXNzdWU0OTM2NzA0MjY=,3,Command to fetch all repos belonging to a user or organization,9599,simonw,closed,0,,,,,2,2019-09-14T21:54:21Z,2019-09-17T00:17:53Z,2019-09-17T00:17:53Z,MEMBER,,"How about this:
$ github-to-sqlite repos simonw",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
487847945,MDExOlB1bGxSZXF1ZXN0MzEzMDA3NDgz,56,Escape the table name in populate_fts and search.,49260,amjith,closed,0,,,,,2,2019-09-01T06:29:05Z,2019-09-02T17:23:21Z,2019-09-02T17:23:21Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/56,"The table names weren't escaped using double quotes in the populate_fts method.
Reproducible case:
```
>>> import sqlite_utils
>>> db = sqlite_utils.Database(""abc.db"")
>>> db[""http://example.com""].insert_all([
... {""id"": 1, ""age"": 4, ""name"": ""Cleo""},
... {""id"": 2, ""age"": 2, ""name"": ""Pancakes""}
... ], pk=""id"")
>>> db[""http://example.com""].enable_fts([""name""])
Traceback (most recent call last):
File """", line 1, in
db[""http://example.com""].enable_fts([""name""])
File ""/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/sqlite_utils/db.py"", l
ine 705, in enable_fts
self.populate_fts(columns)
File ""/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/sqlite_utils/db.py"", l
ine 715, in populate_fts
self.db.conn.executescript(sql)
sqlite3.OperationalError: unrecognized token: "":""
>>>
```",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/56/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
462430920,MDU6SXNzdWU0NjI0MzA5MjA=,35,table.update(...) method,9599,simonw,closed,0,,,,,2,2019-06-30T18:06:15Z,2019-07-28T15:43:52Z,2019-07-28T15:43:52Z,OWNER,,"Spun off from #23 - this method will allow a user to update a specific row.
Currently the only way to do that is to call `.upsert({full record})` with the primary key field matching an existing record - but this does not support partial updates.
```python
db[""events""].update(3, {""name"": ""Renamed""})
```
This method only works on an existing table, so there's no need for a `pk=""id""` specifier - it can detect the primary key by looking at the table.
If the primary key is compound the first argument can be a tuple:
```python
db[""events_venues""].update((3, 2), {""custom_label"": ""Label""})
```
The method can be called without the second dictionary argument. Doing this selects the row specified by the primary key (throwing an error if it does not exist) and remembers it so that chained operations can be carried out - see proposal in https://github.com/simonw/sqlite-utils/issues/23#issuecomment-507055345
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
467862459,MDExOlB1bGxSZXF1ZXN0Mjk3NDEyNDY0,38,table.update() method,9599,simonw,closed,0,,,,,2,2019-07-14T17:03:49Z,2019-07-28T15:43:51Z,2019-07-28T15:43:51Z,OWNER,simonw/sqlite-utils/pulls/38,"Refs #35
Still to do:
- [x] Unit tests
- [x] Switch to using `.get()`
- [x] Better exceptions, plus unit tests for what happens if pk does not exist
- [x] Documentation
- [x] Ensure compound primary keys work properly
- [x] `alter=True` support",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
471628483,MDU6SXNzdWU0NzE2Mjg0ODM=,44,Utilities for building lookup tables,9599,simonw,closed,0,,,,,2,2019-07-23T10:59:58Z,2019-07-23T13:07:01Z,2019-07-23T13:07:01Z,OWNER,,"While building https://github.com/dogsheep/healthkit-to-sqlite I found a need for a neat mechanism for easily building lookup tables - tables where each unique value in a column is replaced by a foreign key to a separate table.
csvs-to-sqlite currently creates those with its ""extract"" mechanism - but that's written as custom code against Pandas. I'd like to eventually replace Pandas with sqlite-utils there.
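For illustration, the kind of API this could enable - a sketch, with the `lookup()` method name as just one possibility:
```python
import sqlite_utils

db = sqlite_utils.Database(""demo.db"")
# get-or-create a row in the lookup table, then use its primary key as a foreign key
species_id = db[""species""].lookup({""name"": ""Dog""})
db[""animals""].insert({""name"": ""Cleo"", ""species_id"": species_id})
```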
See also #42 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
470691622,MDU6SXNzdWU0NzA2OTE2MjI=,5,Add progress bar,9599,simonw,closed,0,,,,,2,2019-07-20T16:29:07Z,2019-07-22T03:30:13Z,2019-07-22T02:49:22Z,MEMBER,,"Showing a progress bar would be nice, using Click.
The easiest way to do this would probably be to hook it up to the length of the compressed content, and update it as this code pushes more XML bytes through the parser:
https://github.com/dogsheep/healthkit-to-sqlite/blob/d64299765064501f4efdd9a0b21dbdba9ec4287f/healthkit_to_sqlite/utils.py#L6-L10",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
465003070,MDU6SXNzdWU0NjUwMDMwNzA=,551,Ship many-to-many faceting support (and facet-by-delimiter),9599,simonw,open,0,,,,,2,2019-07-07T23:11:45Z,2019-07-08T15:45:23Z,,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
464990184,MDU6SXNzdWU0NjQ5OTAxODQ=,547,Release notes for 0.29,9599,simonw,closed,0,,,4471010,Datasette 0.29,2,2019-07-07T20:30:28Z,2019-07-08T03:31:59Z,2019-07-08T03:31:59Z,OWNER,,There's a lot of stuff... https://github.com/simonw/datasette/compare/0.28...master,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/547/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
464779810,MDU6SXNzdWU0NjQ3Nzk4MTA=,541,Plugin hook for adding extra template context variables,9599,simonw,closed,0,,,,,2,2019-07-05T21:37:05Z,2019-07-06T00:05:59Z,2019-07-06T00:05:59Z,OWNER,,"It turns out I need this for https://github.com/simonw/datasette-auth-github/issues/5
It can be modelled on the `extra_body_script` hook: https://datasette.readthedocs.io/en/stable/plugins.html#extra-body-script-template-database-table-view-name-datasette",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/541/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
464040911,MDExOlB1bGxSZXF1ZXN0Mjk0NDAwNDQ2,539,Secret plugin configuration options,9599,simonw,closed,0,,,,,2,2019-07-04T03:21:20Z,2019-07-04T05:36:45Z,2019-07-04T05:36:45Z,OWNER,simonw/datasette/pulls/539,Refs #538 ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/539/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
462423839,MDU6SXNzdWU0NjI0MjM4Mzk=,33,index_foreign_keys / index-foreign-keys utilities,9599,simonw,closed,0,,,,,2,2019-06-30T16:42:03Z,2019-06-30T23:54:11Z,2019-06-30T23:50:55Z,OWNER,,"Sometimes it's good to have indices on all columns that are foreign keys, to allow for efficient reverse lookups.
This would be a useful utility:
$ sqlite-utils index-foreign-keys database.db
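A matching Python API call could then look like this (a sketch mirroring the proposed CLI command):
```python
import sqlite_utils

db = sqlite_utils.Database(""database.db"")
# add an index to every foreign key column that does not already have one
db.index_foreign_keys()
```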
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
438048318,MDExOlB1bGxSZXF1ZXN0Mjc0MTc0NjE0,437,Add inspect and prepare_sanic hooks,45057,russss,closed,0,,,,,2,2019-04-28T11:53:34Z,2019-06-24T16:38:57Z,2019-06-24T16:38:56Z,CONTRIBUTOR,simonw/datasette/pulls/437,"This adds two new plugin hooks:
The `inspect` hook allows plugins to add data to the inspect dictionary.
The `prepare_sanic` hook allows plugins to hook into the web router. I've attached a warning to this hook in the docs in light of #272 but I want this hook now...
On quick inspection, I don't think it's worthwhile to try and make this hook independent of the web framework (but it looks like Starlette would make the hook implementation a bit nicer).
Ref #14",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/437/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
459714943,MDU6SXNzdWU0NTk3MTQ5NDM=,525,Add section on sqite-utils enable-fts to the search documentation,9599,simonw,closed,0,9599,simonw,,,2,2019-06-24T06:39:16Z,2019-06-24T16:36:35Z,2019-06-24T16:29:43Z,OWNER,,"https://datasette.readthedocs.io/en/stable/full_text_search.html already has a section about csvs-to-sqlite, sqlite-utils is even more relevant.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
276455748,MDU6SXNzdWUyNzY0NTU3NDg=,146,datasette publish gcloud,9599,simonw,closed,0,,,,,2,2017-11-23T18:55:03Z,2019-06-24T06:48:20Z,2019-06-24T06:48:20Z,OWNER,,"See also #103
It looks like you can start a Google Cloud VM with a ""docker container"" option - and the Google Cloud Registry is easy to push containers to. So it would be feasible to have `datasette publish gcloud ...` automatically build a container, push it to GCR, then start a new VM instance with it:
https://cloud.google.com/container-registry/docs/pushing-and-pulling
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/146/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
291639118,MDU6SXNzdWUyOTE2MzkxMTg=,183,Custom Queries - escaping strings,82988,psychemedia,closed,0,,,,,2,2018-01-25T16:49:13Z,2019-06-24T06:45:07Z,2019-06-24T06:45:07Z,CONTRIBUTOR,,"If a SQLite table column name contains spaces, they are usually referred to in double quotes:
`SELECT * FROM mytable WHERE ""gappy column name""=""my value"";`
In the JSON metadata file, this is passed by escaping the double quotes:
`""queries"": {""my query"": ""SELECT * FROM mytable WHERE \""gappy column name\""=\""my value\"";""}`
When specifying a custom query in `metadata.json` using double quotes, these are then rendered in the *datasette* query box using single quotes:
`SELECT * FROM mytable WHERE 'gappy column name'='my value';`
which does not work.
Alternatively, a valid custom query can be passed using backticks (\`) to quote the column name and single (unescaped) quotes for the matched value:
``""queries"": {""my query"": ""SELECT * FROM mytable WHERE `gappy column name`='my value';""}``
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/183/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
329147284,MDU6SXNzdWUzMjkxNDcyODQ=,305,Add contributor guidelines to docs,9599,simonw,closed,0,,,,,2,2018-06-04T17:25:30Z,2019-06-24T06:40:19Z,2019-06-24T06:40:19Z,OWNER,,https://channels.readthedocs.io/en/latest/contributing.html is a nice example of this done well.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/305/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
459627549,MDU6SXNzdWU0NTk2Mjc1NDk=,523,Show total/unfiltered row count when filtering,2657547,rixx,closed,0,,,,,2,2019-06-23T22:56:48Z,2019-06-24T01:38:14Z,2019-06-24T01:38:14Z,CONTRIBUTOR,,"When I'm seeing a filtered view of a table, I'd like to be able to see something like '2 rows where status != ""closed"" (of 1000 total)' to have a context for the data I'm seeing – e.g. currently my database is being filled by an importer, so this information would be super helpful.
Since this information would be a performance hit, maybe something like '12 rows where status != ""closed"" (of ??? total)' with lazy-loading on-click(?) could be applied (Or via a ""How many total?"" tooltip, or …)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/523/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
457201907,MDU6SXNzdWU0NTcyMDE5MDc=,513,Is it possible to publish to Heroku despite slug size being too large?,7936571,chrismp,closed,0,,,,,2,2019-06-18T00:12:02Z,2019-06-21T22:35:54Z,2019-06-21T22:35:54Z,NONE,,"I'm trying to push more than 1.5GB worth of SQLite databases -- 535MB compressed -- to Heroku but I get this error when I run the `datasette publish heroku` command.
Compiled slug size: 535.5M is too large (max is 500M).
Can I publish the databases and make datasette work on Heroku despite the large slug size?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
448395665,MDU6SXNzdWU0NDgzOTU2NjU=,22,Release notes for 1.0,9599,simonw,closed,0,,,4348046,1.0,2,2019-05-25T00:58:03Z,2019-05-25T01:18:27Z,2019-05-25T01:06:52Z,OWNER,,https://github.com/simonw/sqlite-utils/compare/0.14...251e473,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
413871266,MDU6SXNzdWU0MTM4NzEyNjY=,18,.insert/.upsert/.insert_all/.upsert_all should add missing columns,9599,simonw,closed,0,,,4348046,1.0,2,2019-02-24T21:36:11Z,2019-05-25T00:42:11Z,2019-05-25T00:42:11Z,OWNER,,"This is a larger change, but it would be incredibly useful: if you attempt to insert or update a document with a field that does not currently exist in the underlying table, sqlite-utils should add the appropriate column for you.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
448189298,MDU6SXNzdWU0NDgxODkyOTg=,486,Ability to add extra routes and related templates,2181410,clausjuhl,closed,0,,,,,2,2019-05-24T14:04:25Z,2019-05-24T14:43:28Z,2019-05-24T14:43:09Z,NONE,,"Hi Simon
Thanks for an excellent job! Datasette is such an obviously good idea (once you have that idea!) and so well done. The only thing that I miss is the ability to add extra routes (with associated jinja2 templates). For most of the datasets that I would like to publish, I would also like at least a page that describes the data (semantics, provenance, biases...) and a page explaining our cookie and privacy policies (which would allow us to use something like Google Analytics).
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/486/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
421985685,MDU6SXNzdWU0MjE5ODU2ODU=,421,Documentation for ?_hash=1 and Datasette's hashed URL caching,9599,simonw,closed,0,,,4305096,0.28,2,2019-03-17T23:08:36Z,2019-05-19T05:32:37Z,2019-05-19T05:31:27Z,OWNER,,Follow on from #418 - the Datasette documentation needs an entire section (probably a new page) describing exactly how the hash-in-URL caching mechanism works.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/421/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
444711254,MDU6SXNzdWU0NDQ3MTEyNTQ=,467,Index page row counts only for DBs with < 30 tables (10ms count limit per table),9599,simonw,closed,0,,,4305096,0.28,2,2019-05-16T01:21:36Z,2019-05-16T03:03:45Z,2019-05-16T03:03:45Z,OWNER,,"Split out from #460.
If a database is mutable, calculating row counts gets expensive. I'm only going to calculate row counts for the index page if it has less than X tables (both hidden and non-hidden) AND each table can be counted in less than 10ms.
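For illustration only, a hard per-table time limit could be enforced with the standard `sqlite3` module like this - a sketch, not the actual implementation; the helper name is made up and the default mirrors the 10ms threshold described here:
```python
import sqlite3
import threading

def count_rows_with_limit(conn, table, limit_ms=10):
    # Interrupt the count if it runs longer than limit_ms milliseconds
    timer = threading.Timer(limit_ms / 1000, conn.interrupt)
    timer.start()
    try:
        return conn.execute(f'select count(*) from [{table}]').fetchone()[0]
    except sqlite3.OperationalError:
        # The query was interrupted - report the count as unknown
        return None
    finally:
        timer.cancel()
```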
If any count takes longer than 10ms I'll cancel the counting entirely. We currently show an inaccurate count if this happens, which is just confusing.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/467/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
275755475,MDU6SXNzdWUyNzU3NTU0NzU=,140,Heatmap visualization plugin,9599,simonw,open,0,,,,,2,2017-11-21T15:34:23Z,2019-05-13T18:33:51Z,,OWNER,,Could use https://github.com/scottbedard/svelte-heatmap,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/140/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
288438570,MDU6SXNzdWUyODg0Mzg1NzA=,179,More metadata options for template authors ,9599,simonw,open,0,,,,,2,2018-01-14T20:51:04Z,2019-05-13T18:33:33Z,,OWNER,,See this thread on Twitter: https://twitter.com/simonw/status/952637152797458432,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/179/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
327383759,MDU6SXNzdWUzMjczODM3NTk=,295,Extract unit tests for inspect out to test_inspect.py,9599,simonw,closed,0,,,,,2,2018-05-29T15:55:04Z,2019-05-11T21:40:32Z,2019-05-11T21:40:32Z,OWNER,,"Right now they are bundled up as API unit tests for a relatively unimportant endpoint. They should be their own thing.
Blocks #294",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/295/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
440304714,MDExOlB1bGxSZXF1ZXN0Mjc1OTA5MTk3,450,Coalesce hidden table count to 0,45057,russss,closed,0,,,,,2,2019-05-04T09:37:10Z,2019-05-11T18:10:09Z,2019-05-11T18:10:09Z,CONTRIBUTOR,simonw/datasette/pulls/450,"For some reason I'm hitting a `None` here with a FTS table. I'm not
entirely sure why but this makes the logic work the same as with
non-hidden tables.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/450/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
442330564,MDU6SXNzdWU0NDIzMzA1NjQ=,457,"Ability to ""publish cloudrun"" with no user input",9599,simonw,closed,0,,,,,2,2019-05-09T16:42:51Z,2019-05-09T19:41:31Z,2019-05-09T16:45:08Z,OWNER,,"If you attempt to deploy a new version of a cloudrun deployment, the script currently pauses and asks for user input for the service name like this:
```77d4d7de-3dfc-4acc-9a23-efe16230f318 2019-05-09T15:01:48+00:00 52S gs://datasette-222320_cloudbuild/source/1557414063.1-3a82df8096e9434b93511b0588d8d155.tgz gcr.io/datasette-222320/sf-trees (+1 more) SUCCESS
Service name: (sf-trees): USER INPUT REQUIRED HERE
Deploying container to Cloud Run service [sf-trees] in project [datasette-222320] region [us-central1]
✓ Deploying... Done.
✓ Creating Revision...
✓ Routing traffic...
✓ Setting IAM Policy...
```
This is incompatible with running under CI.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/457/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
374675798,MDExOlB1bGxSZXF1ZXN0MjI2MzE0ODYy,367,Mark codemirror files as vendored,48517,jaap3,closed,0,,,,,2,2018-10-27T18:41:25Z,2019-05-03T21:12:09Z,2019-05-03T21:11:20Z,CONTRIBUTOR,simonw/datasette/pulls/367,"GitHub lists datasette as a Javascript project, primarily because of the vendored codemirror files. This is somewhat confusing when you're looking for datasette, knowing it's written in Python.
Luckily it's possible to exclude certain files from GitHub's code statistics: https://github.com/github/linguist#using-gitattributes",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/367/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
438450757,MDExOlB1bGxSZXF1ZXN0Mjc0NDc4NzYx,442,Suppress rendering of binary data,45057,russss,closed,0,,,,,2,2019-04-29T18:36:41Z,2019-05-03T18:26:48Z,2019-05-03T16:44:49Z,CONTRIBUTOR,simonw/datasette/pulls/442,"Binary columns (including spatialite geographies) get shown as ugly
binary strings in the HTML by default. Nobody wants to see that mess.
Show the size of the column in bytes instead. If you want to decode
the binary data, you can use a plugin to do it.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/442/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
438200529,MDU6SXNzdWU0MzgyMDA1Mjk=,438,Plugins are loaded when running pytest,45057,russss,closed,0,,,,,2,2019-04-29T08:25:58Z,2019-05-02T05:09:18Z,2019-05-02T05:09:11Z,CONTRIBUTOR,,"If I have a datasette plugin installed on my system, its hooks are called when running the main datasette tests. This is probably undesirable, especially with the inspect hook in #437, as the plugin may rely on inspected state that the tests don't know about.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/438/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
438240541,MDExOlB1bGxSZXF1ZXN0Mjc0MzEzNjI1,439,[WIP] Add primary key to the extra_body_script hook arguments,45057,russss,closed,0,,,,,2,2019-04-29T10:08:23Z,2019-05-01T09:58:32Z,2019-05-01T09:58:30Z,CONTRIBUTOR,simonw/datasette/pulls/439,"This allows the row to be identified on row pages. The context here is that I want to access the row's data to plot it on a map.
I considered passing the entire template context through to the hook function. This would expose the actual row data and potentially avoid a further fetch request in JS, but it does make the plugin API a lot more leaky.
(At any rate, using the selected row data is tricky in my case because of Spatialite's infuriating custom binary representation...)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/439/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
427429265,MDExOlB1bGxSZXF1ZXN0MjY2MDM1Mzgy,424,Column types in inspected metadata,45057,russss,closed,0,,,,,2,2019-03-31T18:46:33Z,2019-04-29T18:30:50Z,2019-04-29T18:30:46Z,CONTRIBUTOR,simonw/datasette/pulls/424,"This PR does two things:
* Adds the sqlite column type for each column to the inspected table info.
* Stops binary columns from being rendered to HTML, unless a plugin handles it.
There's a bit more detail in the changeset descriptions.
These changes are intended as a precursor to a plugin which adds first-class support for Spatialite geographic primitives, and perhaps more useful geo-stuff.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/424/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
430103450,MDU6SXNzdWU0MzAxMDM0NTA=,425,Submitting SQL on hide page is broken,9599,simonw,closed,0,,,,,2,2019-04-07T04:21:31Z,2019-04-12T05:12:13Z,2019-04-12T05:00:53Z,OWNER,,Clicking the submit button here doesn't work correctly: https://3a208a4.datasette.io/fixtures?sql=select+%2A+from+compound_three_primary_keys+order+by+pk1%2C+pk2%2C+pk3+limit+101&_hide_sql=1,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/425/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
432371762,MDU6SXNzdWU0MzIzNzE3NjI=,428,Make ?_fts_table=x and ?_fts_pk=y available as URL parameters on table view,9599,simonw,closed,0,,,,,2,2019-04-12T03:30:55Z,2019-04-12T04:30:29Z,2019-04-12T04:21:25Z,OWNER,,"These can currently only be set using `metadata.json`: https://datasette.readthedocs.io/en/0.27/full_text_search.html#configuring-full-text-search-for-a-table-or-view
There's no reason not to support these as URL parameters as well. That way it would be easy to use FTS search against a view without having to use `metadata.json`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/428/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
411257981,MDU6SXNzdWU0MTEyNTc5ODE=,412,Linked Data(sette),43340,sfkeller,open,0,,,,,2,2019-02-18T00:38:14Z,2019-03-19T10:09:46Z,,NONE,,"I've a radical feature idea (possibly first as an extension in order to experiment?):
I'd like to link to a remote table from a remote database, e.g. with a function ""linked_datasette()"". So one could do the following query:
```
SELECT foo.id, foo.a, remote_party.b
FROM foo
JOIN linked_datasette(""https://parlgov.datasettes.com/parlgov-b42a2f2"") AS remote_party
ON foo.id=remote_party.id
```
This is inspired by SPARQL's SERVICE keyword for remote RDF ""endpoints"".
There's a foundation in the SQL Standard called SQL/MED (https://rhaas.blogspot.com/2011/01/why-sqlmed-is-cool.html ).
And here's an implementation from me in Postgres FDW to connect another Postgres ""endpoint"": https://pastebin.com/Fz2v64Cz .",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/412/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
413740684,MDU6SXNzdWU0MTM3NDA2ODQ=,11,Detect numpy types when creating tables,9599,simonw,closed,0,,,,,2,2019-02-23T21:09:35Z,2019-02-24T04:02:20Z,2019-02-24T04:02:20Z,OWNER,,Inspired by #8,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
403617881,MDU6SXNzdWU0MDM2MTc4ODE=,405,.json?_nl=on option for exporting newline-delimited JSON,9599,simonw,closed,0,,,,,2,2019-01-28T01:10:45Z,2019-01-28T01:49:00Z,2019-01-28T01:48:37Z,OWNER,,"The neat thing about newline-delimited JSON is that you don't have to read an entire array (of potentially thousands of objects) into memory in order to parse it - you can parse things a line at a time instead.
It will look like this:
`https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on`
```
{""pk"": 1, ""planet_int"": 1, ""on_earth"": 1, ""state"": ""CA"", ""city_id"": 1, ""neighborhood"": ""Mission""}
{""pk"": 2, ""planet_int"": 1, ""on_earth"": 1, ""state"": ""CA"", ""city_id"": 1, ""neighborhood"": ""Dogpatch""}
{""pk"": 3, ""planet_int"": 1, ""on_earth"": 1, ""state"": ""CA"", ""city_id"": 1, ""neighborhood"": ""SOMA""}
{""pk"": 4, ""planet_int"": 1, ""on_earth"": 1, ""state"": ""CA"", ""city_id"": 1, ""neighborhood"": ""Tenderloin""}
{""pk"": 5, ""planet_int"": 1, ""on_earth"": 1, ""state"": ""CA"", ""city_id"": 1, ""neighborhood"": ""Bernal Heights""}
```
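A client could consume that output one record at a time - a minimal sketch using only the standard library, assuming the `_nl=on` option described above exists:
```python
import json
import urllib.request

url = 'https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on'

# Each line is an independent JSON object, so rows can be processed one at a
# time without reading the whole response into memory first
with urllib.request.urlopen(url) as response:
    for line in response:
        row = json.loads(line)
        print(row['neighborhood'])
```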
I added this as part of the `sqlite-utils json` CLI command in this commit - I think Datasette should offer it as well: https://github.com/simonw/sqlite-utils/commit/5466c9745dfef858286146ea158ffd5a71391d10
It can be offered alongside `_stream=on` (which currently only works for CSV, but it could work for JSON as well thanks to this trick).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/405/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
400511206,MDU6SXNzdWU0MDA1MTEyMDY=,403,How does persistence work?,1794527,ccorcos,closed,0,,,,,2,2019-01-17T23:41:57Z,2019-01-19T05:47:55Z,2019-01-18T06:51:14Z,NONE,,I was under the impression that now.sh is for stateless microservices. So where are these SQLite databases stored and when do they get created and destroyed?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
398089089,MDU6SXNzdWUzOTgwODkwODk=,399,/-/versions for official Docker image returns wrong Datasette version,9599,simonw,closed,0,,,,,2,2019-01-11T01:19:58Z,2019-01-13T23:31:59Z,2019-01-13T23:10:45Z,OWNER,,"```
docker run -p 8001:8001 datasetteproject/datasette datasette -p 8001 -h 0.0.0.0
```
http://0.0.0.0:8001/-/versions returns this:
```
{
""datasette"": {
""version"": ""0+unknown""
},
...
```
This is because the Docker image is built by copying in the Datasette source code, which confuses versioneer. Maybe the Docker image should install the code using a wheel or similar?
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/399/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
369716228,MDU6SXNzdWUzNjk3MTYyMjg=,366,Default built image size over Zeit Now 100MiB limit,416374,gfrmin,closed,0,,,,,2,2018-10-12T21:27:17Z,2018-11-05T06:23:32Z,2018-11-05T06:23:32Z,CONTRIBUTOR,,"Using `datasette publish now` with no other custom options on a small (43KB) sqlite database leads to the error ""The built image size (373.5M) exceeds the 100MiB limit"". I think this is because of a recent Zeit change: https://github.com/zeit/now-cli/issues/1523",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/366/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
330826972,MDU6SXNzdWUzMzA4MjY5NzI=,308,"Support extra Heroku apps:create options - region, space, team",78156,annapowellsmith,open,0,,,,,2,2018-06-08T23:08:33Z,2018-09-21T14:09:28Z,,NONE,,"It would be useful to document how to pass Heroku CLI options on `datasette publish`, e.g. `--region eu`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/308/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
344701755,MDU6SXNzdWUzNDQ3MDE3NTU=,350,Don't list default plugins on /-/plugins,9599,simonw,closed,0,,,,,2,2018-07-26T05:38:00Z,2018-08-28T17:13:50Z,2018-08-28T16:48:19Z,OWNER,,"https://dbbe707.datasette.io/-/plugins is showing ""datasette.publish.now"" and ""datasette.publish.heroku""",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/350/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
339095976,MDU6SXNzdWUzMzkwOTU5NzY=,334,extra_options not passed to heroku publisher,719357,kamicut,closed,0,,,,,2,2018-07-06T23:26:12Z,2018-07-24T04:53:21Z,2018-07-10T01:46:04Z,NONE,,"I might be wrong, but I was not able to publish to `heroku` with `--extra-options`; I think `extra_options` is not being used in this function [here](https://github.com/simonw/datasette/blob/master/datasette/utils.py#L369).
Any help appreciated! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/334/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
327459829,MDU6SXNzdWUzMjc0NTk4Mjk=,298,URLify URLs in results from custom SQL statements / views,9599,simonw,closed,0,,,,,2,2018-05-29T19:41:07Z,2018-07-24T04:53:20Z,2018-07-24T03:56:50Z,OWNER,,"Consider this custom query:
https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3?sql=select+user%2C+%28%27https%3A%2F%2Ftwitter.com%2F%27+%7C%7C+user%29+as+user_url%2C+created_at%2C+text%2C+url+from+%5Btwitter-ratio%2Fsenators%5D+limit+10%3B
```select user, ('https://twitter.com/' || user) as user_url, created_at, text, url from [twitter-ratio/senators] limit 10;```
![2018-05-29 at 12 38 pm](https://user-images.githubusercontent.com/9599/40681177-44a36d5c-633d-11e8-935b-c49dad7ac682.png)
It would be nice if these URLs were turned into links, as happens on the table view page: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3/twitter-ratio%2Fsenators
![2018-05-29 at 12 39 pm](https://user-images.githubusercontent.com/9599/40681206-5c69c47c-633d-11e8-9f3a-08899f8659b8.png)
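For illustration, the per-value check involved is roughly this - a sketch only, not the actual `display_columns_and_rows()` logic; the helper name is made up and `markupsafe` is just used here for escaping:
```python
from markupsafe import Markup, escape

def maybe_linkify(value):
    # Render values that look like http(s) URLs as links; leave everything else alone
    if isinstance(value, str) and value.startswith(('http://', 'https://')):
        return Markup(f'<a href=""{escape(value)}"">{escape(value)}</a>')
    return value
```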
This currently does not happen because the table view render logic takes a different path through `display_columns_and_rows()` which includes this bit:
https://github.com/simonw/datasette/blob/b0a95da96386ddf99816911e08df86178ffa9a89/datasette/views/table.py#L195-L202",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/298/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
275493851,MDU6SXNzdWUyNzU0OTM4NTE=,139,Build a visualization plugin for Vega,9599,simonw,closed,0,,,,,2,2017-11-20T20:47:41Z,2018-07-10T17:48:18Z,2018-07-10T17:48:18Z,OWNER,,"https://vega.github.io/vega/examples/population-pyramid/ for example looks pretty easy to hook up to Datasette.
Depends on #14 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/139/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
334149717,MDU6SXNzdWUzMzQxNDk3MTc=,319,Incorrect display of compound primary keys with foreign key relationships,9599,simonw,closed,0,,,3439337,0.23.1,2,2018-06-20T16:09:36Z,2018-06-21T15:58:15Z,2018-06-21T14:56:41Z,OWNER,,"https://registry.datasette.io/registry-7d4f81f/datasette_tags
![2018-06-20 at 9 07 am](https://user-images.githubusercontent.com/9599/41670542-68cc4dec-7469-11e8-9521-3bbc6465eccb.png)
Underlying JSON looks [like this](https://registry.datasette.io/registry-7d4f81f/datasette_tags.json?_labels=on):
```
{
""database"": ""registry"",
""table"": ""datasette_tags"",
""is_view"": false,
""human_description_en"": """",
""rows"": [
{
""datasette_id"": {
""value"": 1,
""label"": ""Global Power Plant Database""
},
""tag"": {
""value"": ""geospatial"",
""label"": ""geospatial""
}
},
````
Bug is likely somewhere in here: https://github.com/simonw/datasette/blob/e04f5b0d348ef7275a0a5ab9eb53527105132885/datasette/views/table.py#L143-L207",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/319/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
333326107,MDU6SXNzdWUzMzMzMjYxMDc=,317,Travis CI fails to upload new releases to PyPI,9599,simonw,closed,0,,,3439337,0.23.1,2,2018-06-18T15:44:26Z,2018-06-21T15:45:47Z,2018-06-21T15:45:47Z,OWNER,,"https://travis-ci.org/simonw/datasette/jobs/393684139
```
...
removing build/bdist.linux-x86_64/wheel
Uploading distributions to https://upload.pypi.org/legacy/
Uploading datasette-0.23-py3-none-any.whl
100%|██████████| 201k/201k [00:00<00:00, 1.02MB/s]
HTTPError: 403 Client Error: Invalid or non-existent authentication information. for url: https://upload.pypi.org/legacy/
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/317/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
331343824,MDU6SXNzdWUzMzEzNDM4MjQ=,309,On 404s with a trailing slash redirect to that page without a trailing slash,9599,simonw,closed,0,,,3439337,0.23.1,2,2018-06-11T20:46:49Z,2018-06-21T15:22:02Z,2018-06-21T15:13:15Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/309/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
328171513,MDU6SXNzdWUzMjgxNzE1MTM=,302,test-2.3.sqlite database filename throws a 404,9599,simonw,closed,0,,,3439337,0.23.1,2,2018-05-31T14:50:58Z,2018-06-21T15:21:17Z,2018-06-21T15:21:16Z,OWNER,,"The following almost works:
datasette test-2.3.sqlite
http://127.0.0.1:8001/test-2.3-c88bc35/HighWays loads OK, but http://127.0.0.1:8001/test-2.3-c88bc35 throws a 404:
![2018-05-31 at 7 50 am](https://user-images.githubusercontent.com/9599/40789434-447ae934-64a7-11e8-9a07-4eeba87147d5.png)
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/302/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
332998752,MDExOlB1bGxSZXF1ZXN0MTk1MzM5MTEx,311,"?_labels=1 to expand foreign keys (in csv and json), refs #233",9599,simonw,closed,0,,,,,2,2018-06-16T16:31:12Z,2018-06-16T22:20:31Z,2018-06-16T22:20:31Z,OWNER,simonw/datasette/pulls/311,"Output looks something like this:
{
""rowid"": 233,
""TreeID"": 121240,
""qLegalStatus"": {
""value"" 2,
""label"": ""Private""
}
""qSpecies"": {
""value"": 16,
""label"": ""Sycamore""
}
""qAddress"": ""91 Commonwealth Ave"",
...
}",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/311/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
318737808,MDU6SXNzdWUzMTg3Mzc4MDg=,243,--spatialite option for datasette publish commands,9599,simonw,closed,0,,,,,2,2018-04-29T18:19:32Z,2018-05-31T14:17:53Z,2018-05-31T14:17:53Z,OWNER,,Performs the necessary incantations to install Spatialite on Zeit Now or Heroku and sets the corresponding environment variable to ensure the module is correctly loaded by datasette serve.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/243/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
273296178,MDU6SXNzdWUyNzMyOTYxNzg=,73,_nocache=1 query string option for use with sort-by-random,9599,simonw,closed,0,,,,,2,2017-11-13T02:57:10Z,2018-05-28T17:25:15Z,2018-05-28T17:25:15Z,OWNER,,The one place where we wouldn’t want caching is if we have something which uses sort by random to return random items. We can offer a _nocache=1 querystring argument to support this.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/73/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
326189744,MDU6SXNzdWUzMjYxODk3NDQ=,285,num_threads and cache_max_age should be --config options,9599,simonw,closed,0,,,,,2,2018-05-24T16:04:51Z,2018-05-27T00:53:35Z,2018-05-27T00:43:33Z,OWNER,,"https://github.com/simonw/datasette/blob/58b5a37dbbf13868a46bcbb284509434e66eca25/datasette/app.py#L106
And
https://github.com/simonw/datasette/blob/58b5a37dbbf13868a46bcbb284509434e66eca25/datasette/views/base.py#L325
Refs #275 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/285/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
326599525,MDU6SXNzdWUzMjY1OTk1MjU=,286,Database hash should include current datasette version,9599,simonw,open,0,,,,,2,2018-05-25T17:03:42Z,2018-05-25T17:07:36Z,,OWNER,,"Right now deploying a new version of datasette doesn't invalidate existing URLs, so users may still see a cached copy of the old templates.
We can fix this by including the current datasette version in the input to the hash function (which is currently just the database file contents).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/286/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
324451322,MDU6SXNzdWUzMjQ0NTEzMjI=,273,Figure out a way to have /-/version return current git commit hash,9599,simonw,closed,0,,,,,2,2018-05-18T15:16:56Z,2018-05-22T19:35:22Z,2018-05-22T19:35:22Z,OWNER,,"https://fivethirtyeight.datasettes.com/-/versions reports Datasette version `0.21`
This isn't actually correct. The deploy script for that site actually deploys current master using `https://github.com/simonw/datasette/archive/master.zip`: https://github.com/simonw/fivethirtyeight-datasette/blob/66b4b0dfedd7237bc8c02d3e26d905bca7b84069/Dockerfile#L9
Ideally this would show the current commit hash, but I'm not at all sure if it's possible to derive that from `pip install https://github.com/simonw/datasette/archive/master.zip`. Is there another mechanism that could be used to reliably `pip install` current master but still provide access to the most recent commit hash?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/273/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
324836533,MDExOlB1bGxSZXF1ZXN0MTg5MzE4NDUz,277,Refactor inspect logic,45057,russss,closed,0,,,,,2,2018-05-21T08:49:31Z,2018-05-22T16:07:24Z,2018-05-22T14:03:07Z,CONTRIBUTOR,simonw/datasette/pulls/277,"This pulls the logic for inspect out into a new file which makes it a bit easier to understand.
This was going to be the first part of an implementation for #276, but it seems like that might take a while so I'm going to PR a few bits of refactoring individually.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/277/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
324652142,MDU6SXNzdWUzMjQ2NTIxNDI=,274,"Rename --limit to --config, add --help-config",9599,simonw,closed,0,,,,,2,2018-05-19T18:57:42Z,2018-05-20T17:04:55Z,2018-05-20T17:04:11Z,OWNER,,"#270 introduced `--limit` but on further thought it should be called `--config` instead.
`--page_size` should become `--config default_page_size:1000`
Add `--help-config` to show full help showing all config settings.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/274/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
317475156,MDU6SXNzdWUzMTc0NzUxNTY=,237,Support for ?_search_colname=blah searches,9599,simonw,closed,0,,,,,2,2018-04-25T04:29:53Z,2018-05-05T22:56:42Z,2018-05-05T22:33:23Z,OWNER,,"Right now the `_search=` argument searches across all fields in a full-text index, for example:
https://san-francisco.datasettes.com/sf-film-locations-84594a7/Film_Locations_in_San_Francisco?_search=justin
SQLite FTS also supports searches within a specified field, for example:
https://san-francisco.datasettes.com/sf-film-locations-84594a7?sql=select+rowid%2C+*+from+Film_Locations_in_San_Francisco+where+rowid+in+%28select+rowid+from+%5BFilm_Locations_in_San_Francisco_fts%5D+where+%5BLocations%5D+match+%3Asearch%29+order+by+rowid+limit+101&search=justin
```
select rowid, * from Film_Locations_in_San_Francisco
where rowid in (
select rowid from [Film_Locations_in_San_Francisco_fts]
where [Locations] match :search
) order by rowid limit 101
```
The `_search=` parameter could be extended to support this using `_search_colname=`.
This should also be able to support columns with spaces and special characters in their names, something like this:
`_search_Column%20With%20Spaces=foo`
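To illustrate how such parameters might translate into SQL, here is a hypothetical helper - nothing like this exists in Datasette yet, and the names are made up:
```python
import urllib.parse

def column_search_clauses(args):
    # Hypothetical helper: turn ?_search_<column>=<query> parameters into
    # per-column MATCH fragments plus their bound parameters
    clauses, params = [], {}
    for i, (key, value) in enumerate(sorted(args.items())):
        if not key.startswith('_search_'):
            continue
        column = urllib.parse.unquote(key[len('_search_'):])
        clauses.append(f'[{column}] match :search_{i}')
        params[f'search_{i}'] = value
    return ' and '.join(clauses), params

# Prints: ('[Locations] match :search_0', {'search_0': 'justin'})
print(column_search_clauses({'_search_Locations': 'justin'}))
```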
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/237/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
319954545,MDU6SXNzdWUzMTk5NTQ1NDU=,248,/-/plugins should show version of each installed plugin,9599,simonw,closed,0,,,,,2,2018-05-03T14:50:45Z,2018-05-04T18:25:40Z,2018-05-04T18:05:04Z,OWNER,,"Refs #244
https://stackoverflow.com/questions/20180543/how-to-check-version-of-python-modules
```
>>> import pkg_resources
>>> pkg_resources.get_distribution('datasette_cluster_map').version
'0.4'
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/248/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
275135393,MDU6SXNzdWUyNzUxMzUzOTM=,125,Plot rows on a map with Leaflet and Leaflet.markercluster,9599,simonw,closed,0,,,,,2,2017-11-19T06:05:05Z,2018-04-26T15:14:31Z,2018-04-26T15:14:31Z,OWNER,,"https://github.com/Leaflet/Leaflet.markercluster would allow us to paginate-load in an enormous set of rows with latitude/longitude points, e.g. https://australian-dunnies.now.sh/
Here's a demo of it loading 50,000 markers: https://leaflet.github.io/Leaflet.markercluster/example/marker-clustering-realworld.50000.html - and it looks like it's easy to support progress bars for if we were iteratively loading 1,000 markers at a time using datasette pagination.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/125/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
317760361,MDU6SXNzdWUzMTc3NjAzNjE=,239,Support for hidden tables in metadata.json,9599,simonw,closed,0,,,,,2,2018-04-25T19:21:17Z,2018-04-26T03:45:12Z,2018-04-26T03:43:10Z,OWNER,,"Since we already have a hidden feature, let's expose it more to our users ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/239/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
316621102,MDU6SXNzdWUzMTY2MjExMDI=,235,Add limit on the size in KB of data returned from a single query,9599,simonw,open,0,,,,,2,2018-04-22T23:01:15Z,2018-04-24T00:30:02Z,,OWNER,,"Datasette limits the number of rows returned to 1,000 and limits the time spent executing a SQL query to 1000ms - and both of these limits can be customized.
It does not have a limit on the size of the response returned. It's possible to compose maliciously large SQL responses in a small number of rows using mechanisms like the `group_concat()` aggregate function. It would be good to avoid malicious SQL creating 100MB+ responses and potentially crashing the server.
I think the easiest place to implement that is here:
https://github.com/simonw/datasette/blob/f3f42957128c1e7ece584d45d9167f2ac003a3b8/datasette/app.py#L175-L190
Currently we use `cursor.fetchmany()` to fetch up to 1,001 rows at once. Instead, we could switch to iterating through `cursor.fetchone()` (or just using `for row in cursor`) and keeping a running tally of the size of the response as we go - maybe just using `rough_response_size += len(str(row))`. If that goes above a certain threshold we can terminate the response with an error, like we do with timelimits.
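A minimal sketch of that running-tally idea (the threshold, names and error type are illustrative, not real Datasette settings):
```python
MAX_RESPONSE_BYTES = 100 * 1024  # illustrative threshold, not a real setting

def fetch_rows_with_size_limit(cursor, max_bytes=MAX_RESPONSE_BYTES):
    # Iterate row by row, keeping a rough running tally of response size,
    # and stop early rather than building an enormous response in memory
    rows = []
    rough_response_size = 0
    for row in cursor:
        rough_response_size += len(str(row))
        if rough_response_size > max_bytes:
            raise ValueError('response would exceed the size limit')
        rows.append(row)
    return rows
```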
The bigger challenge here is understanding how well this approach works and what impact it will have on overall Datasette performance. I think I need #33 for this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/235/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
313785206,MDExOlB1bGxSZXF1ZXN0MTgxMjQ3NTY4,202,Raise 404 on nonexistent table URLs,45057,russss,closed,0,,,,,2,2018-04-12T15:47:06Z,2018-04-13T19:22:56Z,2018-04-13T18:19:15Z,CONTRIBUTOR,simonw/datasette/pulls/202,"Currently they just 500. Also cleaned the logic up a bit, I hope I didn't miss anything.
This is issue #184.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/202/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
312312125,MDU6SXNzdWUzMTIzMTIxMjU=,194,Rename table_rows and filtered_table_rows to have _count suffix,9599,simonw,closed,0,,,,,2,2018-04-08T14:53:37Z,2018-04-09T05:25:22Z,2018-04-09T05:25:22Z,OWNER,,"These fields represent counts of items:
""table_rows"": 131,
""filtered_table_rows"": 8,
But the names make it sound like they might be arrays full of rows. Adding a `_count` suffix would make this more clear:
""table_rows_count"": 131,
""filtered_table_rows_count"": 8,
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/194/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
273026602,MDU6SXNzdWUyNzMwMjY2MDI=,52,Solution for temporarily uploading DB so it can be built by docker,9599,simonw,closed,0,,,,,2,2017-11-10T18:55:25Z,2017-12-10T03:02:57Z,2017-12-10T03:02:57Z,OWNER,,For the `datasette publish` command I ideally need a way of uploading the specified DB to somewhere temporary on the internet so that when the Dockerfile is built by the final hosting location it can download that database as part of the build process.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/52/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
280014287,MDU6SXNzdWUyODAwMTQyODc=,165,metadata.json support for per-database and per-table information,9599,simonw,closed,0,,,2949431,Custom templates edition,2,2017-12-07T06:15:34Z,2017-12-07T16:48:34Z,2017-12-07T16:47:29Z,OWNER,,"Every database and every table should be able to support the following optional metadata:
title
description
description_html
license
license_url
source
source_url
If `description_html` is provided it over-rides `description` and will be displayed unescaped.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/165/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
275135719,MDU6SXNzdWUyNzUxMzU3MTk=,127,"Filtered tables should show count of all matching rows, if fast enough",9599,simonw,closed,0,,,2919870,Foreign key edition,2,2017-11-19T06:13:29Z,2017-11-24T22:02:01Z,2017-11-24T22:02:01Z,OWNER,,"Relates to #86. If you are viewing a filtered page e.g. https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9/bob-ross%2Felements-by-episode?CLOUDS=1 we should show the count of matching rows.
Since this could be an expensive operation, we will run it with a strict time limit (maybe 50ms). If the time limit is exceeded we will display ""many"" instead, perhaps? Maybe even link to a count(*) query that would get the full 1000ms time limit which the user can click on if they like (that could even Ajax-in the result).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
275164558,MDU6SXNzdWUyNzUxNjQ1NTg=,129,Hide FTS-created tables by default on the database index page,9599,simonw,closed,0,,,,,2,2017-11-19T14:50:42Z,2017-11-22T20:22:02Z,2017-11-22T20:19:04Z,OWNER,,"SQLite databases that use FTS include a number of automatically generated tables, e.g.:
https://sf-trees-search.now.sh/sf-trees-search-a899b92
Of these, only the `Street_Tree_List` table is actually relevant to the user.
We can detect which tables are FTS tables by first finding the virtual tables:
sqlite> .headers on
sqlite> select * from sqlite_master where rootpage = 0;
type|name|tbl_name|rootpage|sql
table|Search|Search|0|CREATE VIRTUAL TABLE ""Street_Tree_List_fts"" USING FTS4 (""qAddress"", ""qCaretaker"", ""qSpecies"")
Then we can parse the above to figure out which ones are USING FTS, and assume that any table which starts with the `Street_Tree_List_fts` prefix was created to support search:
sqlite> select * from sqlite_master where type='table' and tbl_name like 'Street_Tree_List_fts%';
type|name|tbl_name|rootpage|sql
table|Search_content|Search_content|10355|CREATE TABLE 'Street_Tree_List_fts_content'(docid INTEGER PRIMARY KEY, 'c0qAddress', 'c1qCaretaker', 'c2qSpecies')
table|Search_segments|Search_segments|10356|CREATE TABLE 'Street_Tree_List_fts_segments'(blockid INTEGER PRIMARY KEY, block BLOB)
table|Search_segdir|Search_segdir|10357|CREATE TABLE 'Street_Tree_List_fts_segdir'(level INTEGER,idx INTEGER,start_block INTEGER,leaves_end_block INTEGER,end_block INTEGER,root BLOB,PRIMARY KEY(level, idx))
table|Search_docsize|Search_docsize|10359|CREATE TABLE 'Street_Tree_List_fts_docsize'(docid INTEGER PRIMARY KEY, size BLOB)
table|Search_stat|Search_stat|10360|CREATE TABLE 'Street_Tree_List_fts_stat'(id INTEGER PRIMARY KEY, value BLOB)
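Putting that detection together - a rough sketch using the standard `sqlite3` module, not the code that would ship in Datasette:
```python
import sqlite3

def fts_created_tables(conn):
    # Virtual tables have rootpage = 0; keep the ones created USING FTS,
    # then collect every table whose name shares that FTS table's prefix
    hidden = set()
    for name, sql in conn.execute(
        'select name, sql from sqlite_master where rootpage = 0'
    ):
        if 'using fts' not in (sql or '').lower():
            continue
        for (shadow,) in conn.execute(
            'select name from sqlite_master where type = :type and name like :prefix',
            {'type': 'table', 'prefix': name + '%'},
        ):
            hidden.add(shadow)
    return hidden
```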
We won't hide these completely - instead, we'll default the database index view to not showing them with a message that says ""5 hidden tables"" and support ?_hidden=1 to display them.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/129/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
274578142,MDU6SXNzdWUyNzQ1NzgxNDI=,110,Add --load-extension option to datasette for loading extra SQLite extensions,9599,simonw,closed,0,,,,,2,2017-11-16T16:26:19Z,2017-11-16T18:38:30Z,2017-11-16T16:58:50Z,OWNER,,"This would allow users with extra SQLite extensions installed (like spatialite) to load them at runtime.
Inspired by this comment: https://github.com/simonw/datasette/issues/46#issuecomment-344810525",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
274160723,MDU6SXNzdWUyNzQxNjA3MjM=,100,TemplateAssertionError: no filter named 'tojson',13304454,coisnepe,closed,0,,,,,2,2017-11-15T13:43:41Z,2017-11-16T09:25:10Z,2017-11-16T00:14:13Z,NONE,,"A 500 error is raised upon clicking on the name of a table on the homepage, say _http://0.0.0.0:8001/_ to _http://0.0.0.0:8001/test_check-c1f4771/users_ The API part seems to function as intended, though...
```
2017-11-15 14:33:57 - (sanic)[ERROR]: Traceback (most recent call last):
File ""/usr/local/lib/python3.5/dist-packages/sanic/app.py"", line 503, in handle_request
response = await response
File ""/usr/local/lib/python3.5/dist-packages/datasette/app.py"", line 155, in get
return await self.view_get(request, name, hash, **kwargs)
File ""/usr/local/lib/python3.5/dist-packages/datasette/app.py"", line 219, in view_get
**context,
File ""/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py"", line 84, in render
return html(self.render_string(template, request, **context))
File ""/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py"", line 81, in render_string
return self.env.get_template(template).render(**context)
File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 812, in get_template
return self._load_template(name, self.make_globals(globals))
File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 786, in _load_template
template = self.loader.load(self, name, globals)
File ""/usr/lib/python3/dist-packages/jinja2/loaders.py"", line 125, in load
code = environment.compile(source, name, filename)
File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 565, in compile
self.handle_exception(exc_info, source_hint=source_hint)
File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 754, in handle_exception
reraise(exc_type, exc_value, tb)
File ""/usr/lib/python3/dist-packages/jinja2/_compat.py"", line 37, in reraise
raise value.with_traceback(tb)
File ""/usr/local/lib/python3.5/dist-packages/datasette/templates/table.html"", line 29, in template
params = {{ query.params|tojson(4) }}
File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 515, in _generate
return generate(source, self, name, filename, defer_init=defer_init)
File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 62, in generate
generator.visit(node)
File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit
return f(node, *args, **kwargs)
File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 849, in visit_Template
self.blockvisit(block.body, block_frame)
File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 492, in blockvisit
self.visit(node, frame)
File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit
return f(node, *args, **kwargs)
File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1172, in visit_If
self.blockvisit(node.body, if_frame)
File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 492, in blockvisit
self.visit(node, frame)
File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit
return f(node, *args, **kwargs)
File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1353, in visit_Output
self.visit(argument, frame)
File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit
return f(node, *args, **kwargs)
File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1565, in visit_Filter
self.fail('no filter named %r' % node.name, node.lineno)
File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 427, in fail
raise TemplateAssertionError(msg, lineno, self.name, self.filename)
jinja2.exceptions.TemplateAssertionError: no filter named 'tojson'
2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/test_check-c1f4771/users 500 144
2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/favicon.ico 200 0
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
267857622,MDU6SXNzdWUyNjc4NTc2MjI=,25,Endpoint that returns SQL ready to be piped into DB,9599,simonw,closed,0,,,,,2,2017-10-24T00:19:26Z,2017-11-15T05:11:12Z,2017-11-15T05:11:11Z,OWNER,,It would be cool if I could figure out a way to generate both the create table statements and the inserts for an individual table or the entire database and then stream them down to the client.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
273595473,MDExOlB1bGxSZXF1ZXN0MTUyMzYwNzQw,81,:fire: Removes DS_Store,50527,jefftriplett,closed,0,,,,,2,2017-11-13T22:07:52Z,2017-11-14T02:24:54Z,2017-11-13T22:16:55Z,CONTRIBUTOR,simonw/datasette/pulls/81,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/81/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
273569477,MDU6SXNzdWUyNzM1Njk0Nzc=,80,Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination),9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-13T20:37:46Z,2017-11-13T22:09:46Z,2017-11-13T22:09:46Z,OWNER,,Final versions should be deployed using the first released version of datasette.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/80/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
273127117,MDU6SXNzdWUyNzMxMjcxMTc=,55,Ship first version to PyPI,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-11T07:38:48Z,2017-11-13T21:19:43Z,2017-11-13T21:19:43Z,OWNER,,"Just before doing this, update the Dockerfile template to `pip install datasette` https://github.com/simonw/datasette/blob/65e350ca2a4845c25752a62c16ba58cfe2c14b9b/datasette/utils.py#L125",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
273192789,MDU6SXNzdWUyNzMxOTI3ODk=,67,Command that builds a local docker container,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-12T02:13:29Z,2017-11-13T16:17:52Z,2017-11-13T16:17:52Z,OWNER,,Be nice to indicate that this isn't just for Now. Shouldn't be too hard either.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/67/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
272694136,MDU6SXNzdWUyNzI2OTQxMzY=,50,Unit tests against application itself,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-09T19:31:49Z,2017-11-11T22:23:22Z,2017-11-11T22:23:22Z,OWNER,,"Use Sanic’s testing mechanism. Tests should create a temporary SQLite database file on disk by executing SQL that is stored in the tests themselves.
For the moment we can just test the JSON API more thoroughly and just sanity check that the HTML output doesn’t throw any errors.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/50/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
267828746,MDU6SXNzdWUyNjc4Mjg3NDY=,24,Implement full URL design,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-10-23T21:49:05Z,2017-10-24T14:12:00Z,2017-10-24T14:12:00Z,OWNER,,"Full URL design:
/database-name
/database-name.json
/database-name-7sha256
/database-name-7sha256.json
/database-name/table-name
/database-name/table-name.json
/database-name-7sha256/table-name
/database-name-7sha256/table-name.json
/database-name-7sha256/table-name/compound-pk
/database-name-7sha256/table-name/compound-pk.json
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
267517348,MDU6SXNzdWUyNjc1MTczNDg=,9,Initial test suite,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-10-23T01:28:46Z,2017-10-24T05:55:33Z,2017-10-24T05:55:33Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed