github

Custom SQL query returning 101 rows

id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | pull_request | body | repo | type | active_lock_reason | performed_via_github_app
675594325 MDU6SXNzdWU2NzU1OTQzMjU= 917 Idea: "datasette publish" option for "only if the data has changed" 9599 open 0     0 2020-08-08T21:58:27Z 2020-08-08T21:58:27Z   OWNER   This is a pattern I often find myself needing. I usually implement this in GitHub Actions like this: https://github.com/simonw/covid-19-datasette/blob/efa01c39abc832b8641fc2a92840cc3acae2fb08/.github/workflows/scheduled.yml#L52-L63

```yaml
- name: Set variables to decide if we should deploy
  id: decide_variables
  run: |-
    echo "##[set-output name=latest;]$(datasette inspect covid.db | jq '.covid.hash' -r)"
    echo "##[set-output name=deployed;]$(curl -s https://covid-19.datasettes.com/-/databases.json | jq '.[0].hash' -r)"
- name: Set up Cloud Run
  if: github.event_name == 'workflow_dispatch' || steps.decide_variables.outputs.latest != steps.decide_variables.outputs.deployed
  uses: GoogleCloudPlatform/github-actions/setup-gcloud@master
```

This is pretty fiddly. It might be good for `datasette publish` to grow a helper option that does effectively this - hashes the databases (and the `metadata.json`) and compares them to the deployed version. 107914493 issue
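The decide-to-deploy pattern above boils down to "hash the local database, fetch the deployed hash, compare". A minimal Python sketch of that check outside GitHub Actions - a plain SHA-256 of the file stands in for the content hash that `datasette inspect` computes, and the filename and URL are taken from the workflow above:

```python
import hashlib
import json
import urllib.request


def local_hash(path, chunk_size=1024 * 1024):
    # Hash the database file contents in chunks so large files stay cheap
    h = hashlib.sha256()
    with open(path, "rb") as fp:
        for chunk in iter(lambda: fp.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def deployed_hash(url="https://covid-19.datasettes.com/-/databases.json"):
    # Mirrors the jq '.[0].hash' expression in the workflow
    with urllib.request.urlopen(url) as response:
        return json.load(response)[0]["hash"]


if local_hash("covid.db") != deployed_hash():
    print("Data changed - deploy")
else:
    print("No change - skip deploy")
```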
648245071 MDU6SXNzdWU2NDgyNDUwNzE= 8 Error thrown: table photos has no column named hasSticker 18504 open 0     1 2020-06-30T14:54:37Z 2020-08-05T14:55:20Z   NONE   While running `swarm-to-sqlite` it throws an error:

```
harper@:~/dogsheep/swarm$ swarm-to-sqlite checkins.db --save=checkins.json
Please provide your Foursquare OAuth token:
Importing 8127 checkins  [#################-------------------]   49%  00:01:52
Traceback (most recent call last):
  File "/home/harper/.local/bin/swarm-to-sqlite", line 11, in <module>
    sys.exit(cli())
  File "/home/harper/.local/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/harper/.local/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/harper/.local/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/harper/.local/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/harper/.local/lib/python3.6/site-packages/swarm_to_sqlite/cli.py", line 73, in cli
    save_checkin(checkin, db)
  File "/home/harper/.local/lib/python3.6/site-packages/swarm_to_sqlite/utils.py", line 94, in save_checkin
    photos_table.insert(photo, replace=True)
  File "/home/harper/.local/lib/python3.6/site-packages/sqlite_utils/db.py", line 963, in insert
    alter = self.value_or_default("alter", alter)
  File "/home/harper/.local/lib/python3.6/site-packages/sqlite_utils/db.py", line 1142, in insert_all
    def upsert_all(
sqlite3.OperationalError: table photos has no column named hasSticker
```

Where should I dig in? 205429375 issue
673602857 MDU6SXNzdWU2NzM2MDI4NTc= 9 Define a view that displays photos correctly 9599 open 0     0 2020-08-05T14:53:39Z 2020-08-05T14:53:39Z   MEMBER   The `photos` table stores data like this:

id | createdAt | source | prefix | suffix | width | height | visibility | created | user
-- | -- | -- | -- | -- | -- | -- | -- | -- | --
5e12c9708506bc000840262a | January 06, 2020 - 05:45:20 UTC | Swarm for iOS 1 | https://fastly.4sqi.net/img/general/ | /15889193_AXxGk4I1nbzUZuyYqObgbXdJNyEHiwj6AUDq0tPZWtw.jpg | 1920 | 1440 | public | 2020-01-06T05:45:20 | 15889193

The photo URL can be derived from those pieces - define a SQL view which does that (using `datasette-json-html` to display the pictures) 205429375 issue
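Such a view is a one-liner once the URL format is pinned down. A sketch, assuming Foursquare accepts `original` as the size segment between `prefix` and `suffix` (an assumption - a `{width}x{height}` segment may be needed instead), emitting the `{"img_src": ...}` objects that `datasette-json-html` renders as images:

```python
import sqlite3

conn = sqlite3.connect("swarm.db")  # filename illustrative
# json_object() needs SQLite's JSON1 extension
conn.execute(
    """
    CREATE VIEW IF NOT EXISTS photos_with_urls AS
    SELECT
        id,
        createdAt,
        json_object('img_src', prefix || 'original' || suffix) AS photo
    FROM photos
    """
)
conn.commit()
```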
672421411 MDU6SXNzdWU2NzI0MjE0MTE= 916 Support reverse pagination (previous page, has-previous-items) 9599 open 0     0 2020-08-04T00:32:06Z 2020-08-04T00:32:20Z   OWNER   I need this for `datasette-graphql` for full compatibility with the way Relay likes to paginate - using cursors for paginating backwards as well as for paginating forwards. > This may be the kick I need to get Datasette pagination to work in reverse too. _Originally posted by @simonw in https://github.com/simonw/datasette-graphql/issues/2#issuecomment-668305853_ 107914493 issue    
671763164 MDU6SXNzdWU2NzE3NjMxNjQ= 915 Refactor TableView class so things like datasette-graphql can reuse the logic 9599 open 0     0 2020-08-03T03:13:33Z 2020-08-03T03:13:40Z   OWNER   _Originally posted by @simonw in https://github.com/simonw/datasette-graphql/issues/2#issuecomment-667780040_ 107914493 issue    
671130371 MDU6SXNzdWU2NzExMzAzNzE= 130 Support tokenize option for FTS 9599 closed 0     3 2020-08-01T19:27:22Z 2020-08-01T20:51:28Z 2020-08-01T20:51:14Z OWNER   FTS5 supports things like porter stemming using a `tokenize=` option: https://www.sqlite.org/fts5.html#tokenizers

Something like this in code:

```
CREATE VIRTUAL TABLE [{table}_fts] USING {fts_version} (
    {columns},
    tokenize='porter',
    content=[{table}]
);
```

I tried this out just now and it worked exactly as expected. So... `db[table].enable_fts(...)` should accept a `tokenize=` argument, and `sqlite-utils enable-fts ...` should support a `--tokenize` option. 140912432 issue
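For reference, here is what the generated SQL boils down to, runnable directly against Python's sqlite3 module - the table and column names are illustrative, and `tokenize='porter'` is the clause the issue proposes exposing:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE docs (id INTEGER PRIMARY KEY, body TEXT);
    CREATE VIRTUAL TABLE docs_fts USING fts5(
        body, tokenize='porter', content='docs'
    );
    """
)
conn.execute("INSERT INTO docs (body) VALUES ('running runs daily')")
# External-content FTS tables are populated by inserting rowid + columns
conn.execute("INSERT INTO docs_fts (rowid, body) SELECT id, body FROM docs")
# Porter stemming: searching for "run" matches "running" and "runs"
print(conn.execute(
    "SELECT rowid FROM docs_fts WHERE docs_fts MATCH 'run'"
).fetchall())
```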
671056788 MDU6SXNzdWU2NzEwNTY3ODg= 914 "Object of type bytes is not JSON serializable" for _nl=on 9599 open 0     0 2020-08-01T17:43:10Z 2020-08-01T17:43:15Z   OWNER   https://latest.datasette.io/fixtures/binary_data.json?_sort_desc=data&_shape=array returns this:

```json
[
    {
        "rowid": 1,
        "data": "this is binary data"
    }
]
```

But adding `&_nl=on` returns this: https://latest.datasette.io/fixtures/binary_data.json?_sort_desc=data&_shape=array&_nl=on

```json
{
    "ok": false,
    "error": "Object of type bytes is not JSON serializable",
    "status": 500,
    "title": null
}
```

I found this error by running `wget -r 127.0.0.1:8001` against my local `fixtures.db`. 107914493 issue
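A generic way out of the "not JSON serializable" error is a `json.JSONEncoder` subclass that base64-encodes `bytes` values - the `{"$base64": ...}` tagging shape below is just one possible convention, not necessarily what Datasette did in this code path:

```python
import base64
import json


class BytesFriendlyEncoder(json.JSONEncoder):
    def default(self, obj):
        # Tag binary values rather than crashing on them
        if isinstance(obj, bytes):
            return {"$base64": True, "encoded": base64.b64encode(obj).decode("ascii")}
        return super().default(obj)


print(json.dumps({"rowid": 1, "data": b"this is binary data"}, cls=BytesFriendlyEncoder))
```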
661605489 MDU6SXNzdWU2NjE2MDU0ODk= 900 Some links don't honor base_url 50220 open 0     1 2020-07-20T09:40:50Z 2020-07-31T23:56:38Z   NONE   Hi, I've been playing with Datasette behind Nginx (awesome tool, thanks!). It seems some URLs are OK but some aren't. For instance in https://github.com/simonw/datasette/blob/master/datasette/templates/query.html#L61 it seems that `url_csv` includes a `/` prefix, resulting in the `base_url` not being honored. Actually here, it seems that dropping the prefix `/` to make the link relative is enough (so it may not be strictly related to `base_url`). Additional information:

```
datasette, version 0.45+0.gf1f581b.dirty
```

Relevant Nginx configuration (note that all the trailing slashes have some effect):

```
location /datasette/ {
    proxy_pass http://127.0.0.1:9001/;
    proxy_set_header Host $host;
}
```

Relevant Datasette configuration (slashes matter too):

```
--config base_url:/datasette/
```

107914493 issue
660827546 MDU6SXNzdWU2NjA4Mjc1NDY= 899 How to set up a request limit per user 133845 closed 0     1 2020-07-19T13:08:25Z 2020-07-31T23:54:42Z 2020-07-31T23:54:42Z NONE   Hello, until now I've been using Datasette without any authentication system, but I would like to set up a configuration for limiting the number of requests per user (eventually by IP or with a cookie mechanism), and eventually allowing me to ban specific users/IPs. Is there a plugin available for this use case? If not, what are your insights regarding this use case? Should I write a plugin? Should I deploy Datasette behind a reverse proxy to manage this? 107914493 issue
670209331 MDU6SXNzdWU2NzAyMDkzMzE= 913 Mechanism for passing additional options to `datasette my.db` that affect plugins 9599 open 0     3 2020-07-31T20:38:26Z 2020-07-31T23:52:11Z   OWNER   > It's a shame there's no obvious mechanism for passing additional options to `datasette my.db` that affect how plugins work.
>
> The only way I can think of at the moment is via environment variables:
>
>     DATASETTE_INSERT_UNSAFE=1 datasette my.db
>
> This will have to do for the moment - it's ugly enough that people will at least know they are doing something unsafe, which is the goal here.

_Originally posted by @simonw in https://github.com/simonw/datasette-insert/issues/15#issuecomment-667346438_ 107914493 issue
639072811 MDU6SXNzdWU2MzkwNzI4MTE= 849 Rename master branch to main 9599 open 0   3268330 9 2020-06-15T19:05:54Z 2020-07-31T23:23:24Z   OWNER   I was waiting for consensus to form around this (and kind-of hoping for `trunk` since I like the tree metaphor) and it looks like `main` is it. I've seen convincing arguments against `trunk` too - it indicates that the branch has some special significance like in Subversion (where all branches come from trunk) when it doesn't. So `main` is better anyway. 107914493 issue    
662322234 MDExOlB1bGxSZXF1ZXN0NDUzODkwMjky 901 Use None as a default arg 56323389 closed 0     1 2020-07-20T22:18:38Z 2020-07-31T18:42:39Z 2020-07-31T18:42:39Z CONTRIBUTOR simonw/datasette/pulls/901 When passing a mutable value as a default argument in a function, the default argument is mutated anytime that value is mutated. This poses a bug risk. Instead, use None as a default and assign the mutable value inside the function. 107914493 pull    
668064778 MDU6SXNzdWU2NjgwNjQ3Nzg= 912 Add "publishing to Vercel" to the publish docs 9599 closed 0     0 2020-07-29T18:50:58Z 2020-07-31T17:06:35Z 2020-07-31T17:06:35Z OWNER   https://datasette.readthedocs.io/en/0.45/publish.html#datasette-publish currently only lists Cloud Run, Heroku and Fly. It should list Vercel too. (I should probably rename `datasette-publish-now` to `datasette-publish-vercel`) 107914493 issue    
665802405 MDU6SXNzdWU2NjU4MDI0MDU= 124 sqlite-utils query should support named parameters 9599 closed 0     1 2020-07-26T15:25:10Z 2020-07-30T22:57:51Z 2020-07-27T03:53:58Z OWNER   To help out with escaping - so you can run this: sqlite-utils query "insert into foo (blah) values (:blah)" --param blah `something here` 140912432 issue    
668308777 MDU6SXNzdWU2NjgzMDg3Nzc= 129 "insert-files --sqlar" for creating SQLite archives 9599 closed 0     2 2020-07-30T02:28:29Z 2020-07-30T22:41:01Z 2020-07-30T22:40:55Z OWNER   A `--sqlar` option could cause `insert-files` to behave in the same way as SQLite's own sqlar mechanism. https://www.sqlite.org/sqlar.html and https://sqlite.org/sqlar/doc/trunk/README.md 140912432 issue    
666040390 MDU6SXNzdWU2NjYwNDAzOTA= 127 Ability to insert files piped to insert-files stdin 9599 closed 0     3 2020-07-27T07:09:33Z 2020-07-30T03:08:52Z 2020-07-30T03:08:18Z OWNER   > Inserting files by piping them in should work - but since a filename cannot be derived this will need a `--name blah.gif` option.
>
>     cat blah.gif | sqlite-utils insert-files files.db files - --name=blah.gif

_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/122#issuecomment-664128071_ 140912432 issue
666639051 MDU6SXNzdWU2NjY2MzkwNTE= 128 Support UUID and memoryview types 9599 closed 0     1 2020-07-27T23:08:34Z 2020-07-30T01:10:43Z 2020-07-30T01:10:43Z OWNER   `psycopg2` can return data from PostgreSQL as `uuid.UUID` or `memoryview` objects. These should be supported by `sqlite-utils` - mainly for https://github.com/simonw/db-to-sqlite 140912432 issue
667467128 MDU6SXNzdWU2Njc0NjcxMjg= 909 AsgiFileDownload: filename not correctly passed 9599 closed 0     2 2020-07-29T00:41:43Z 2020-07-30T00:56:17Z 2020-07-29T21:34:48Z OWNER   https://github.com/simonw/datasette/blob/3c33b421320c0be81a625ca7307b2e4416a9ed5b/datasette/utils/asgi.py#L396-L405 `self.filename` should be passed to `asgi_send_file()` 107914493 issue    
667840539 MDExOlB1bGxSZXF1ZXN0NDU4NDM1NTky 910 Update pytest requirement from <5.5.0,>=5.2.2 to >=5.2.2,<6.1.0 27856297 closed 0     1 2020-07-29T13:21:17Z 2020-07-29T21:26:05Z 2020-07-29T21:26:04Z CONTRIBUTOR simonw/datasette/pulls/910 Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version. Key changes in pytest 6.0.0: `PytestDeprecationWarning` are now errors by default (a `filterwarnings = ignore::pytest.PytestDeprecationWarning` ini setting works as a stopgap until pytest 6.1), the `exec_()` and `is_true()` methods of `_pytest._code.Frame` were removed, the NO_COLOR and FORCE_COLOR environment variables are now supported, and `--log-file` now creates subdirectories if needed. Full list of changes: https://github.com/pytest-dev/pytest/compare/5.2.2...6.0.0 107914493 pull
668064026 MDU6SXNzdWU2NjgwNjQwMjY= 911 Rethink the --name option to "datasette publish" 9599 open 0   3268330 0 2020-07-29T18:49:49Z 2020-07-29T18:49:49Z   OWNER   `--name` works inconsistently across the different publish providers - on Cloud Run you should use `--service` instead for example. Need to review it across all of them and either remove it or clarify what it does. 107914493 issue    
665700495 MDU6SXNzdWU2NjU3MDA0OTU= 122 CLI utility for inserting binary files into SQLite 9599 closed 0     10 2020-07-26T03:27:39Z 2020-07-27T07:10:41Z 2020-07-27T07:09:03Z OWNER   SQLite BLOB columns can store entire binary files. The challenge is inserting them, since they don't neatly fit into JSON objects. It would be great if the `sqlite-utils` CLI had a trick for helping with this. Inspired by https://github.com/simonw/datasette-media/issues/14 140912432 issue    
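The core of the trick is that Python's sqlite3 module stores any `bytes` parameter as a BLOB, so no JSON round-trip is involved. A sketch of the underlying insert (schema and filenames illustrative, not necessarily the columns `insert-files` ended up creating):

```python
import pathlib
import sqlite3

conn = sqlite3.connect("files.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS files (path TEXT PRIMARY KEY, content BLOB)"
)

path = pathlib.Path("photo.jpg")  # illustrative filename
# bytes parameters are bound as BLOBs automatically
conn.execute(
    "INSERT OR REPLACE INTO files (path, content) VALUES (?, ?)",
    (str(path), path.read_bytes()),
)
conn.commit()
```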
621989740 MDU6SXNzdWU2MjE5ODk3NDA= 114 table.transform_table() method for advanced alter table 9599 open 0     12 2020-05-20T18:20:46Z 2020-07-27T04:01:13Z   OWNER   SQLite's `ALTER TABLE` can only do the following:

* Rename a table
* Rename a column
* Add a column

Notably, it cannot drop columns - so tricks like "add a float version of this text column, populate it, then drop the old one and rename" won't work. The docs here https://www.sqlite.org/lang_altertable.html describe a way of implementing full alters safely within a transaction, but it's fiddly:

1. Create new table
2. Copy data
3. Drop old table
4. Rename new into old

It would be great if `sqlite-utils` provided an abstraction to help make these kinds of changes safely. 140912432 issue
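A sketch of those four steps at the sqlite3 level, dropping a column from a hypothetical `items` table - wrapping them in one transaction means readers never observe a half-finished alter:

```python
import sqlite3

conn = sqlite3.connect("data.db")  # filename illustrative
with conn:  # single transaction: all four steps land, or none do
    # 1. Create new table with the desired shape (dropped column omitted)
    conn.execute("CREATE TABLE items_new (id INTEGER PRIMARY KEY, name TEXT)")
    # 2. Copy the data you are keeping
    conn.execute("INSERT INTO items_new (id, name) SELECT id, name FROM items")
    # 3. Drop old table
    conn.execute("DROP TABLE items")
    # 4. Rename new into old
    conn.execute("ALTER TABLE items_new RENAME TO items")
```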
665819048 MDU6SXNzdWU2NjU4MTkwNDg= 126 Ability to insert binary data on the CLI using JSON 9599 closed 0     2 2020-07-26T16:54:14Z 2020-07-27T04:00:33Z 2020-07-27T03:59:45Z OWNER   > I could solve round tripping (at least a bit) by allowing insert to be run with a flag that says "these columns are base64 encoded, store the decoded data in a BLOB". > > That would solve inserting binary data using JSON too. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/125#issuecomment-664012247_ 140912432 issue    
665817570 MDU6SXNzdWU2NjU4MTc1NzA= 125 Output binary columns in "sqlite-utils query" JSON 9599 closed 0     4 2020-07-26T16:47:02Z 2020-07-27T00:49:41Z 2020-07-27T00:48:45Z OWNER   You get an error if you try to run a query that returns data from a BLOB. 140912432 issue    
665701216 MDU6SXNzdWU2NjU3MDEyMTY= 123 --raw option for outputting binary content 9599 closed 0     0 2020-07-26T03:35:39Z 2020-07-26T16:44:11Z 2020-07-26T16:44:11Z OWNER   Related to the `insert-files` work in #122 - it should be easy to get binary data back out of the database again. One way to do that could be: sqlite-utils files.db "select content from files where key = 'foo.jpg'" --raw The `--raw` option would cause just the contents of the first column to be output directly to stdout. 140912432 issue    
646448486 MDExOlB1bGxSZXF1ZXN0NDQwNzM1ODE0 868 initial windows ci setup 702729 open 0     3 2020-06-26T18:49:13Z 2020-07-26T01:29:00Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/868 Picking up the work done on #557 with a new PR. Seeing if I can get this working. 107914493 pull    
665400224 MDU6SXNzdWU2NjU0MDAyMjQ= 906 "allow": true for anyone, "allow": false for nobody 9599 closed 0   5607421 3 2020-07-24T20:28:10Z 2020-07-25T00:07:10Z 2020-07-25T00:05:04Z OWNER   The "allow" syntax described at https://datasette.readthedocs.io/en/0.45/authentication.html#defining-permissions-with-allow-blocks currently says this:

> An allow block can specify "no-one is allowed to do this" using an empty `{}`:
>
> ```
> {
>     "allow": {}
> }
> ```

`"allow": null` allows all access, though this isn't documented (it should be though). These are not very intuitive. How about also supporting `"allow": true` for "allow anyone" and `"allow": false` for "allow nobody"? 107914493 issue
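To make the proposal concrete, a simplified illustration of an allow-block check folding in the `true`/`false` shorthands alongside the existing `{}` and `null` cases - this is not Datasette's actual implementation:

```python
def actor_allowed(allow, actor):
    if allow is None or allow is True:
        return True  # null / true: anyone, including anonymous
    if allow is False or allow == {}:
        return False  # false / {}: nobody
    if actor is None:
        return False
    # A dict matches if ANY key/value pair matches the actor (see issue 907)
    for key, value in allow.items():
        actual = actor.get(key)
        wanted = value if isinstance(value, list) else [value]
        if isinstance(actual, list):
            if set(actual) & set(wanted):
                return True
        elif actual in wanted:
            return True
    return False


assert actor_allowed(True, None)
assert not actor_allowed(False, {"id": "simonw"})
assert actor_allowed({"id": "simonw", "role": "staff"}, {"id": "simonw"})
```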
665407663 MDU6SXNzdWU2NjU0MDc2NjM= 908 Interactive debugging tool for "allow" blocks 9599 closed 0   5607421 3 2020-07-24T20:43:44Z 2020-07-25T00:06:15Z 2020-07-24T22:56:52Z OWNER   > It might be good to have a little interactive tool which helps debug these things, since there are quite a few edge-cases and the damage caused if people use them incorrectly is substantial. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/907#issuecomment-663726146_ 107914493 issue    
665403403 MDU6SXNzdWU2NjU0MDM0MDM= 907 Allow documentation doesn't explain what happens with multiple allow keys 9599 closed 0   5607421 2 2020-07-24T20:34:40Z 2020-07-24T22:53:07Z 2020-07-24T22:53:07Z OWNER   Documentation here: https://datasette.readthedocs.io/en/0.45/authentication.html#defining-permissions-with-allow-blocks Doesn't explain that with the following "allow" block:

```json
{
    "allow": {
        "id": "simonw",
        "role": "staff"
    }
}
```

The rule will match if EITHER the id is simonw OR the role includes staff. The tests are missing this case too: https://github.com/simonw/datasette/blob/028f193dd6233fa116262ab4b07b13df7dcec9be/tests/test_utils.py#L504 Related to #906 107914493 issue
442327592 MDU6SXNzdWU0NDIzMjc1OTI= 456 Installing installs the tests package 7725188 closed 0     3 2019-05-09T16:35:16Z 2020-07-24T20:39:54Z 2020-07-24T20:39:54Z CONTRIBUTOR   Because `setup.py` uses `find_packages` and `tests` is on the top-level, `pip install datasette` will install a top-level package called `tests`, which is probably not desired behavior. The offending line is here: https://github.com/simonw/datasette/blob/bfa2ae0d16d39bb82dbe4da4f3fdc3c7f6257418/setup.py#L40 And only `pip uninstall datasette` with a conflicting package would warn you by default; apparently another package had the same problem, which is why I get this message when uninstalling:

```
$ pip uninstall datasette
Uninstalling datasette-0.27:
  Would remove:
    /usr/local/bin/datasette
    /usr/local/lib/python3.7/site-packages/datasette-0.27.dist-info/*
    /usr/local/lib/python3.7/site-packages/datasette/*
    /usr/local/lib/python3.7/site-packages/tests/*
  Would not remove (might be manually added):
    [ .. snip .. ]
Proceed (y/n)?
```

This should be a relatively simple fix, and I could drop a PR if desired! Cheers 107914493 issue
662439034 MDExOlB1bGxSZXF1ZXN0NDUzOTk1MTc5 902 Don't install tests package 32467826 closed 0     2 2020-07-21T01:08:50Z 2020-07-24T20:39:54Z 2020-07-24T20:39:54Z CONTRIBUTOR simonw/datasette/pulls/902 The `exclude` argument to `find_packages` needs an iterable of package names. Fixes: #456 107914493 pull    
664793260 MDU6SXNzdWU2NjQ3OTMyNjA= 2 Yak shave 145425 open 0     0 2020-07-23T22:04:18Z 2020-07-23T22:04:18Z   NONE   Just a quick note... The 23andme data is not exactly your genome, but a SNP chip of your genome. It's "some of your genotypes." Or about 0.1% of your genome. Nice work in any case! It deserves to be liberated!!!!! 209590345 issue    
663976976 MDU6SXNzdWU2NjM5NzY5NzY= 48 Add a table of contents to the README 9599 closed 0     3 2020-07-22T18:54:33Z 2020-07-23T17:46:07Z 2020-07-22T19:03:02Z MEMBER   Using https://github.com/jonschlinkert/markdown-toc 206156866 issue    
664485022 MDU6SXNzdWU2NjQ0ODUwMjI= 46 Suggestion: Add pull requests & reviews 1326704 open 0     0 2020-07-23T13:43:45Z 2020-07-23T13:43:45Z   NONE   Hi there! I saw your [presentation at Boston Python](https://www.meetup.com/bostonpython/events/271887195). I'm already a light user of Datasette (thank you!), but wasn't aware of this project. I've been working on a "pull request dashboard" to get a comprehensive view of the state of open PR's, esp. related to reviews (i.e., pending, approved, changes requested). Currently it's a CLI command, but I thought a Datasette UI might be fun. I see that PR's are available from the `issues` command, but I don't see reviews anywhere. From the [API docs](https://docs.github.com/en/rest/reference/pulls#reviews), it looks like there are separate endpoints for those (as well as pull requests in general). What do you think about adding that? Would you accept a PR? Any sense of the level of effort? 207052882 issue    
663317875 MDU6SXNzdWU2NjMzMTc4NzU= 905 /database.db download should include content-length header 9599 closed 0     2 2020-07-21T21:23:48Z 2020-07-22T04:59:46Z 2020-07-22T04:52:45Z OWNER   I can do this by modifying this function: https://github.com/simonw/datasette/blob/02dc6298bdbfb1d63e0d2a39ff597b5fcc60e06b/datasette/utils/asgi.py#L248-L270 107914493 issue    
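The missing header only requires stat-ing the file before streaming it. A hedged sketch of the relevant piece (header tuples per the ASGI spec; the rest of the send-file machinery omitted):

```python
import os


def file_response_headers(filepath, content_type="application/octet-stream"):
    # ASGI headers are (bytes, bytes) pairs; content-length lets clients
    # show download progress for /database.db requests
    size = os.stat(filepath).st_size
    return [
        (b"content-type", content_type.encode("latin-1")),
        (b"content-length", str(size).encode("latin-1")),
    ]


print(file_response_headers("fixtures.db"))  # filename illustrative
```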
663228985 MDU6SXNzdWU2NjMyMjg5ODU= 904 Make database_url and other helpers available within .render_template() for plugins 9599 open 0     0 2020-07-21T18:42:52Z 2020-07-21T18:43:05Z   OWNER   I tried using this block of template in a plugin and got an error:

```html
{% block nav %}
    <p class="crumbs">
        <a href="{{ base_url }}">home</a> /
        <a href="{{ database_url(database) }}">{{ database }}</a> /
        <a href="{{ database_url(database) }}/{{ table|quote_plus }}">{{ table }}</a>
    </p>
    {{ super() }}
{% endblock %}
```

Error: `'database_url' is undefined`

That's because `database_url` is only made available by the BaseView template here: https://github.com/simonw/datasette/blob/d6e03b04302a0852e7133dc030eab50177c37be7/datasette/views/base.py#L110-L125 107914493 issue
663145122 MDU6SXNzdWU2NjMxNDUxMjI= 903 Add temporary plugin testing pattern to the testing docs 9599 open 0     0 2020-07-21T16:22:34Z 2020-07-21T16:22:45Z   OWNER   https://github.com/simonw/til/blob/master/pytest/registering-plugins-in-tests.md Would be useful to include this pattern on https://datasette.readthedocs.io/en/stable/testing_plugins.html 107914493 issue    
442402832 MDExOlB1bGxSZXF1ZXN0Mjc3NTI0MDcy 458 setup: add tests to package exclusion 7725188 closed 0     1 2019-05-09T19:47:21Z 2020-07-21T01:14:42Z 2019-05-10T01:54:51Z CONTRIBUTOR simonw/datasette/pulls/458 This PR fixes #456 by adding `tests` to the package exclusion list. Cheers 107914493 pull    
660355904 MDU6SXNzdWU2NjAzNTU5MDQ= 43 github-to-sqlite tags command for fetching tags 9599 closed 0     4 2020-07-18T20:14:12Z 2020-07-18T23:05:56Z 2020-07-18T21:52:15Z MEMBER   Fetches paginated data from https://api.github.com/repos/simonw/datasette/tags 207052882 issue    
660429601 MDU6SXNzdWU2NjA0Mjk2MDE= 45 Fix the demo - it breaks because of the tags table change 9599 closed 0     5 2020-07-18T22:49:32Z 2020-07-18T23:03:14Z 2020-07-18T23:03:13Z MEMBER   https://github.com/dogsheep/github-to-sqlite/runs/885773677

```
  File "/home/runner/work/github-to-sqlite/github-to-sqlite/github_to_sqlite/utils.py", line 476, in save_tags
    db["tags"].insert_all(
  File "/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/sqlite_utils/db.py", line 1145, in insert_all
    result = self.db.conn.execute(query, params)
sqlite3.OperationalError: table tags has no column named repo
```

That's because I changed the name in #44. I thought this would be safe since no-one else could possibly be using this yet (it hadn't shipped in a release) but turns out I broke my demo! 207052882 issue
660413281 MDU6SXNzdWU2NjA0MTMyODE= 44 Rename tags.repo_id column to tags.repo 9599 closed 0     0 2020-07-18T22:13:46Z 2020-07-18T22:15:12Z 2020-07-18T22:15:12Z MEMBER   For improved consistency with other tables. https://observablehq.com/@simonw/datasette-table-diagram ![datasette-table-diagram(1)](https://user-images.githubusercontent.com/9599/87862843-3cca4900-c909-11ea-9c76-58b3f4aca43f.png) 207052882 issue    
659873662 MDU6SXNzdWU2NTk4NzM2NjI= 898 datasette.utils.testing module 9599 open 0     2 2020-07-18T03:53:24Z 2020-07-18T03:57:46Z   OWNER   The unit tests for plugins could benefit from reusing code from Datasette's own testing fixtures, e.g.: > I may need to borrow this function from Datasette for the tests: > https://github.com/simonw/datasette/blob/1f6a134369e6a7efaae9db469f15b1dd2b7f3709/tests/fixtures.py#L836-L851 > > It's not importable (it lives in `fixtures.py` and not in the `datasette` package that gets packaged for PyPI) - maybe I should fix that in Datasette by adding a `from datasette.utils.testing` module. _Originally posted by @simonw in https://github.com/simonw/datasette-update-api/issues/4#issuecomment-660419182_ 107914493 issue    
659580487 MDU6SXNzdWU2NTk1ODA0ODc= 897 Request method for retrieving the unparsed request body 9599 closed 0     1 2020-07-17T19:51:40Z 2020-07-17T20:16:02Z 2020-07-17T20:12:50Z OWNER   I'm writing a plugin (https://github.com/simonw/datasette-update-api/issues/2) that implements an API for inserting JSON data. As such, I'd like to `POST` a JSON blob rather than using `key=value` form encoded data. Right now there's a `request.post_vars()` method but no `request.post_body()` one: https://github.com/simonw/datasette/blob/c5f06bc356fb5917ef7fbb6fe4693f30d711cdb3/datasette/utils/asgi.py#L93-L103 107914493 issue    
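Under ASGI the request body arrives as a series of `http.request` messages, so a `post_body()` helper is essentially this accumulation loop (a sketch of the mechanism, not necessarily the method Datasette shipped):

```python
async def post_body(receive):
    # Collect chunks until ASGI signals there is no more body
    body = b""
    more_body = True
    while more_body:
        message = await receive()
        assert message["type"] == "http.request"
        body += message.get("body", b"")
        more_body = message.get("more_body", False)
    return body
```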
658476055 MDU6SXNzdWU2NTg0NzYwNTU= 896 Use white-space: pre-wrap on ALL table cell contents 9599 closed 0     4 2020-07-16T19:05:21Z 2020-07-17T01:26:08Z 2020-07-17T01:26:08Z OWNER   Is there any reason NOT to apply `white-space: pre-wrap` to the contents of all table cells in Datasette? The default display mechanism of HTML (stripping leading/trailing whitespace and collapsing all other whitespace) doesn't really make sense for displaying the kind of data that Datasette works with. 107914493 issue
657747959 MDU6SXNzdWU2NTc3NDc5NTk= 895 SQL query output should show numeric values in a different colour 9599 open 0     1 2020-07-16T00:28:03Z 2020-07-16T00:32:47Z   OWNER   Compare https://latest.datasette.io/fixtures/sortable with https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+content%2C+sortable%2C+sortable_with_nulls%2C+sortable_with_nulls_2%2C+text+from+sortable+order+by+pk1%2C+pk2+limit+101 <img width="1574" alt="fixtures__select_pk1__pk2__content__sortable__sortable_with_nulls__sortable_with_nulls_2__text_from_sortable_order_by_pk1__pk2_limit_101_and_fixtures__sortable__201_rows" src="https://user-images.githubusercontent.com/9599/87612845-82e09c00-c6c0-11ea-806e-93764ca468c4.png"> 107914493 issue    
657572753 MDU6SXNzdWU2NTc1NzI3NTM= 894 Feature: sort by column cast to integer 9599 open 0     0 2020-07-15T18:47:48Z 2020-07-15T18:48:02Z   OWNER   If a text column actually contains numbers, being able to "sort by column, treated as numeric" would be really useful. Probably depends on column actions enabled by #690 107914493 issue    
656959584 MDU6SXNzdWU2NTY5NTk1ODQ= 893 pip3 install datasette not serving static on linuxbrew. 44167 open 0     0 2020-07-14T23:33:38Z 2020-07-14T23:33:58Z   NONE   *This error wasn't thrown*

```
Traceback (most recent call last):
  File "/home/linuxbrew/.linuxbrew/opt/python@3.8/lib/python3.8/site-packages/datasette/utils/asgi.py", line 289, in inner_static
    full_path.relative_to(root_path)
  File "/home/linuxbrew/.linuxbrew/opt/python@3.8/lib/python3.8/pathlib.py", line 904, in relative_to
    raise ValueError("{!r} does not start with {!r}"
ValueError: '/home/linuxbrew/.linuxbrew/lib/python3.8/site-packages/datasette/static/app.css' does not start with '/home/linuxbrew/.linuxbrew/opt/python@3.8/lib/python3.8/site-packages/datasette/static'
```

Linuxbrew installs python@3.8 with symbolic links, so the call to `full_path.relative_to(root_path)` throws a ValueError. This happens when you install from pip3; when you install with `python3 setup.py develop` it works fine. Either way, in the end the static files weren't being served. 107914493 issue
655974395 MDExOlB1bGxSZXF1ZXN0NDQ4MzU1Njgw 30 Handle empty bucket on first upload. Allow specifying the endpoint_url for services other than S3 (like b2 and digitalocean spaces) 110038 open 0     0 2020-07-13T16:15:26Z 2020-07-13T16:15:26Z   FIRST_TIME_CONTRIBUTOR dogsheep/dogsheep-photos/pulls/30 Finally got around to trying dogsheep-photos but I want to use backblaze's b2 service instead of AWS S3. Had to add a way to optionally specify the endpoint_url to connect to. Then with the bucket being empty the initial key retrieval would fail. Probably a better way to see that the bucket is empty than doing a test inside the paginator loop. Also probably a better way to specify the endpoint_url as we get and test for it twice using the same code in two different places but did not want to spend too much time worrying about it. 256834907 pull    
655465863 MDU6SXNzdWU2NTU0NjU4NjM= 892 "latest" in new documentation navbar is invisible 9599 closed 0     2 2020-07-12T19:57:21Z 2020-07-12T20:02:35Z 2020-07-12T20:02:17Z OWNER   On https://datasette.readthedocs.io/en/latest/ <img width="572" alt="Datasette_—_Datasette_documentation" src="https://user-images.githubusercontent.com/9599/87255376-1b67e980-c43f-11ea-8470-781ee8842336.png"> Compare with https://datasette.readthedocs.io/en/0.45/ <img width="584" alt="Datasette_—_Datasette_documentation" src="https://user-images.githubusercontent.com/9599/87255387-2fabe680-c43f-11ea-875e-c474364691ca.png"> Some custom CSS should fix it. 107914493 issue    
654405302 MDU6SXNzdWU2NTQ0MDUzMDI= 42 Option for importing just specific repos 9599 closed 0     0 2020-07-09T23:20:15Z 2020-07-09T23:25:35Z 2020-07-09T23:25:35Z MEMBER   For if you know which specific repos you care about, as opposed to loading everything owned by the authenticated user. github-to-sqlite repos specific.db -r simonw/datasette -r simonw/github-contents 207052882 issue    
652961907 MDU6SXNzdWU2NTI5NjE5MDc= 121 Improved (and better documented) support for transactions 9599 open 0     3 2020-07-08T04:56:51Z 2020-07-09T22:40:48Z   OWNER   _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/pull/118#issuecomment-655283393_ We should put some thought into how this library supports and encourages smart use of transactions. 140912432 issue    
653529088 MDU6SXNzdWU2NTM1MjkwODg= 891 Consider using enable_callback_tracebacks(True) 9599 open 0     0 2020-07-08T19:07:16Z 2020-07-08T19:07:16Z   OWNER   From https://docs.python.org/3/library/sqlite3.html#sqlite3.enable_callback_tracebacks

> `sqlite3.enable_callback_tracebacks(flag)`
>
> By default you will not get any tracebacks in user-defined functions, aggregates, converters, authorizer callbacks etc. If you want to debug them, you can call this function with *flag* set to `True`. Afterwards, you will get tracebacks from callbacks on `sys.stderr`. Use `False` to disable the feature again.

Maybe turn this on for all of Datasette? Are there any disadvantages to doing that? 107914493 issue
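A quick demonstration of what the flag changes - without it, only the generic OperationalError below surfaces; with it, the ZeroDivisionError traceback from inside the user-defined function is also printed to stderr:

```python
import sqlite3

sqlite3.enable_callback_tracebacks(True)  # module-level, affects all connections

conn = sqlite3.connect(":memory:")
conn.create_function("explode", 0, lambda: 1 / 0)
try:
    conn.execute("SELECT explode()")
except sqlite3.OperationalError as e:
    print("sqlite3 reported:", e)  # "user-defined function raised exception"
```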
652700770 MDU6SXNzdWU2NTI3MDA3NzA= 119 Ability to remove a foreign key 9599 open 0     1 2020-07-07T22:31:37Z 2020-07-08T18:10:18Z   OWNER   Useful if you add one but make a mistake and need to undo it without recreating the database from scratch. 140912432 issue    
651844316 MDExOlB1bGxSZXF1ZXN0NDQ1MDIzMzI2 118 Add insert --truncate option 79913 closed 0     9 2020-07-06T21:58:40Z 2020-07-08T17:26:21Z 2020-07-08T17:26:21Z CONTRIBUTOR simonw/sqlite-utils/pulls/118 Deletes all rows in the table (if it exists) before inserting new rows. SQLite doesn't implement a TRUNCATE TABLE statement but does optimize an unqualified DELETE FROM. This can be handy if you want to refresh the entire contents of a table but a) don't have a PK (so can't use --replace), b) don't want the table to disappear (even briefly) for other connections, and c) have to handle records that used to exist being deleted. Ideally the replacement of rows would appear instantaneous to other connections by putting the DELETE + INSERT in a transaction, but this is very difficult without breaking other code as the current transaction handling is inconsistent and non-systematic. There exists the possibility for the DELETE to succeed but the INSERT to fail, leaving an empty table. This is not much worse, however, than the current possibility of one chunked INSERT succeeding and being committed while the next chunked INSERT fails, leaving a partially complete operation. 140912432 pull    
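The transactional version described in the PR is straightforward at the sqlite3 level - the difficulty flagged above is fitting it into the library's existing chunked-commit behavior. A sketch with a hypothetical `data` table:

```python
import sqlite3


def truncate_and_insert(conn, rows):
    # DELETE + INSERT in one transaction: other connections see either
    # the old contents or the new ones, never an empty table
    with conn:
        conn.execute("DELETE FROM data")  # unqualified DELETE is optimized
        conn.executemany("INSERT INTO data (id, value) VALUES (?, ?)", rows)


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data (id INTEGER PRIMARY KEY, value TEXT)")
truncate_and_insert(conn, [(1, "a"), (2, "b")])
```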
652816158 MDExOlB1bGxSZXF1ZXN0NDQ1ODMzOTA4 120 Fix query command's support for DML 79913 closed 0     1 2020-07-08T01:36:34Z 2020-07-08T05:14:04Z 2020-07-08T05:14:04Z CONTRIBUTOR simonw/sqlite-utils/pulls/120 See commit messages for details. I ran into this while investigating another feature/issue. 140912432 pull    
628003707 MDU6SXNzdWU2MjgwMDM3MDc= 784 Ability to sign in to Datasette as a root account 9599 closed 0   5512395 5 2020-05-31T17:10:15Z 2020-07-06T19:31:53Z 2020-06-01T01:18:20Z OWNER   > I'm going to draw the line here: default Datasette supports authentication but only for a single user account ("admin"). Plugins can then add support for multiple user accounts, social auth, SSO etc. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/699#issuecomment-636498770_ 107914493 issue    
651159727 MDU6SXNzdWU2NTExNTk3Mjc= 41 Demo is failing to deploy 9599 closed 0     7 2020-07-05T22:40:33Z 2020-07-06T01:07:03Z 2020-07-06T01:07:02Z MEMBER   https://github.com/dogsheep/github-to-sqlite/runs/837714622?check_suite_focus=true

```
Creating Revision.........................................................................................................................................failed
Deployment failed
ERROR: (gcloud.run.deploy) Cloud Run error: Container failed to start. Failed to start and then listen on the port defined by the PORT environment variable. Logs for this revision might contain more information.

Traceback (most recent call last):
  File "/opt/hostedtoolcache/Python/3.8.3/x64/bin/datasette", line 8, in <module>
    sys.exit(cli())
  File "/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/datasette/publish/cloudrun.py", line 138, in cloudrun
    check_call(
  File "/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/subprocess.py", line 364, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'gcloud run deploy --allow-unauthenticated --platform=managed --image gcr.io/datasette-222320/datasette github-to-sqlite' returned non-zero exit status 1.
##[error]Process completed with exit code 1.
```

207052882 issue
650305298 MDExOlB1bGxSZXF1ZXN0NDQzODIzMDQw 890 Load only python files from plugins-dir. 49260 closed 0     2 2020-07-03T02:47:32Z 2020-07-03T03:08:33Z 2020-07-03T03:08:33Z CONTRIBUTOR simonw/datasette/pulls/890 The current behavior for `--plugins-dir` is to load every file in that folder as a python module. This can result in errors if there are non-python files in the plugins dir (such as .mypy_cache). This PR restricts the module loading to only python files. 107914493 pull    
638270441 MDExOlB1bGxSZXF1ZXN0NDM0MDg1MjM1 848 Reload support for config_dir mode. 49260 closed 0     1 2020-06-14T02:34:46Z 2020-07-03T02:44:54Z 2020-07-03T02:44:53Z CONTRIBUTOR simonw/datasette/pulls/848 A reference implementation for adding support to reload when datasette is in the config_dir mode. This implementation is flawed since it is watching the entire directory and any changes to the database will reload the server and adding unrelated files to the directory will also reload the server. 107914493 pull    
573755726 MDU6SXNzdWU1NzM3NTU3MjY= 690 Mechanism for plugins to add UI to pages in specific locations 9599 open 0   5607421 5 2020-03-02T06:48:36Z 2020-07-02T17:11:25Z   OWNER   Now that we have support for plugins that can write I'm seeing all sorts of places where a plugin might need to add UI to the table page. Some examples: - `datasette-configure-fts` needs to add a "configure search for this table" link - a plugin that lets you render or delete tables needs to add a link or button somewhere - existing plugins like `datasette-vega` and `datasette-cluster-map` already do this with JavaScript The challenge here is that multiple plugins may want to do this, so simply overriding templates and populating named blocks doesn't entirely work as templates may override each other. 107914493 issue
627794879 MDU6SXNzdWU2Mjc3OTQ4Nzk= 782 Redesign default JSON format in preparation for Datasette 1.0 9599 open 0   5607421 2 2020-05-30T18:47:07Z 2020-07-02T17:11:15Z   OWNER   The default JSON just isn't right. I find myself using `?_shape=array` for almost everything I build against the API. 107914493 issue    
649907676 MDU6SXNzdWU2NDk5MDc2NzY= 889 asgi_wrapper plugin hook is crashing at startup 49260 open 0     2 2020-07-02T12:53:13Z 2020-07-02T13:22:14Z   CONTRIBUTOR   Steps to reproduce:

1. Install datasette-media plugin: `pip install datasette-media`
2. Launch datasette: `datasette databasename.db`
3. Error

```
INFO:     Started server process [927704]
INFO:     Waiting for application startup.
ERROR:    Exception in 'lifespan' protocol
Traceback (most recent call last):
  File "/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/uvicorn/lifespan/on.py", line 48, in main
    await app(scope, self.receive, self.send)
  File "/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in __call__
    return await self.app(scope, receive, send)
  File "/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/datasette_media/__init__.py", line 9, in wrapped_app
    path = scope["path"]
KeyError: 'path'
ERROR:    Application startup failed. Exiting.
```

107914493 issue
649702801 MDU6SXNzdWU2NDk3MDI4MDE= 888 URLs in release notes point to 127.0.0.1 3243482 open 0     0 2020-07-02T07:28:04Z 2020-07-02T07:28:04Z   CONTRIBUTOR   Just a quick heads up: Release notes for 0.45 include urls that point to localhost. https://github.com/simonw/datasette/releases/tag/0.45 107914493 issue    
649429772 MDU6SXNzdWU2NDk0Mjk3NzI= 886 Reconsider how _actor_X magic parameter deals with missing values 9599 open 0   5607421 2 2020-07-02T00:00:38Z 2020-07-02T01:52:02Z   OWNER   I had to build a custom `_actorornull` prefix for [datasette-saved-queries](https://github.com/simonw/datasette-saved-queries/blob/37c00e56ac398e1f9aa342d30357de013a9b37b4/datasette_saved_queries/__init__.py):

```python
def actorornull(key, request):
    if request.actor is None:
        return None
    return request.actor.get(key)


@hookimpl
def register_magic_parameters():
    return [
        ("actorornull", actorornull),
    ]
```

Maybe the `actor` magic in Datasette core should do that out of the box? https://github.com/simonw/datasette/blob/f1f581b7ffcd5d8f3ae6c1c654d813a6641410eb/datasette/default_magic_parameters.py#L14-L17 107914493 issue
649437530 MDU6SXNzdWU2NDk0Mzc1MzA= 887 Canned query page should show the name of the canned query 9599 closed 0   5607421 3 2020-07-02T00:10:39Z 2020-07-02T00:31:33Z 2020-07-02T00:23:45Z OWNER   This page here - the URL is http://127.0.0.1:8001/data/all_tables but "all_tables" is not shown in the UI: <img width="784" alt="data__select_sqlite_master_name_as_table_name__table_info___from_sqlite_master_join_pragma_table_info_sqlite_master_name__as_table_info_order_by_sqlite_master_name__table_info_cid_and_data__insert_into_saved_queries__name__sql__author_id__v" src="https://user-images.githubusercontent.com/9599/86302528-d978b100-bbbd-11ea-9a07-7b5d7d3de321.png"> 107914493 issue    
648749062 MDExOlB1bGxSZXF1ZXN0NDQyNTA1MDg4 883 Skip counting hidden tables 3243482 open 0     4 2020-07-01T07:38:08Z 2020-07-02T00:25:44Z   CONTRIBUTOR simonw/datasette/pulls/883 Potential fix for https://github.com/simonw/datasette/issues/859. Disabling table counts for hidden tables speeds up database page quite a bit. In my setup it reduced load time by 2/3 (~300 -> ~90ms) 107914493 pull    
276718605 MDU6SXNzdWUyNzY3MTg2MDU= 151 Set up a pattern portfolio 9599 closed 0     2 2017-11-25T02:09:49Z 2020-07-02T00:13:24Z 2020-05-03T03:13:16Z OWNER   https://www.slideshare.net/nataliedowne/practical-maintainable-css/75 This will be a single page that demonstrates all of the different CSS styles and classes available to Datasette. 107914493 issue    
647103735 MDU6SXNzdWU2NDcxMDM3MzU= 875 "Logged in as: XXX - logout" navigation item 9599 closed 0   5533512 3 2020-06-29T04:31:14Z 2020-07-02T00:13:24Z 2020-06-29T18:43:50Z OWNER   _Originally posted by @simonw in https://github.com/simonw/datasette/issues/840#issuecomment-650895874_ 107914493 issue    
646992096 MDU6SXNzdWU2NDY5OTIwOTY= 872 Release non-alpha plugins when 0.45 is out 9599 closed 0   5533512 0 2020-06-28T19:42:01Z 2020-07-01T23:48:51Z 2020-07-01T23:48:51Z OWNER   I have several plugins currently marked as alphas because they depend on `0.45a3`. When 0.45 is released I can ship new versions of these plugins that are full releases, not alphas - and switch them to depending on 0.45 (as opposed to the alpha):

- [x] https://github.com/simonw/datasette-init
- [x] https://github.com/simonw/datasette-glitch
- [x] https://github.com/simonw/datasette-saved-queries
- [x] https://github.com/simonw/datasette-write

107914493 issue
649373451 MDU6SXNzdWU2NDkzNzM0NTE= 885 Blog entry about the release 9599 closed 0   5533512 1 2020-07-01T22:44:37Z 2020-07-01T22:44:48Z 2020-07-01T22:44:47Z OWNER     107914493 issue    
632724154 MDU6SXNzdWU2MzI3MjQxNTQ= 805 Writable canned queries live demo on Glitch 9599 open 0     11 2020-06-06T20:52:13Z 2020-07-01T22:44:01Z   OWNER   Needs to run somewhere with a mutable disk drive, so not Cloud Run or Heroku or Vercel. I think I'll put it on Glitch. 107914493 issue    
648673556 MDU6SXNzdWU2NDg2NzM1NTY= 882 Release notes for 0.45 9599 closed 0   5533512 2 2020-07-01T05:00:17Z 2020-07-01T21:48:08Z 2020-07-01T21:48:08Z OWNER   These are mostly done thanks to the alphas, but I want to have more paragraphs of prose and fewer bullet points. 107914493 issue
649329013 MDU6SXNzdWU2NDkzMjkwMTM= 884 Only show "log out" button if user is authenticated using a ds_actor cookie 9599 closed 0   5533512 0 2020-07-01T21:21:28Z 2020-07-01T21:26:07Z 2020-07-01T21:26:06Z OWNER   Right now the "Log out" button in the navigation will show up even if the user was authenticated by a plugin using a mechanism other than the `ds_actor` cookie. It should only show if the logged-in user has that cookie. 107914493 issue    
648637666 MDU6SXNzdWU2NDg2Mzc2NjY= 880 POST to /db/canned-query.json should be supported 9599 open 0   5607421 2 2020-07-01T03:14:43Z 2020-07-01T21:06:21Z   OWNER   Now that CSRF is solved for API requests (#835) it would be good to support API requests to the `.json` extension. 107914493 issue    
648421105 MDU6SXNzdWU2NDg0MjExMDU= 877 Consider dropping explicit CSRF protection entirely? 9599 open 0     8 2020-06-30T19:00:55Z 2020-07-01T19:12:16Z   OWNER   https://scotthelme.co.uk/csrf-is-dead/ from Feb 2017 has background here. The `SameSite=lax` cookie property effectively eliminates CSRF in modern browsers. https://caniuse.com/#search=SameSite shows 92.13% global support for it. Datasette already uses `SameSite=lax` when it sets cookies by default: https://github.com/simonw/datasette/blob/af350ba4571b8e3f9708c40f2ddb48fea7ac1084/datasette/utils/asgi.py#L327-L341 A few options then. I could ditch CSRF protection entirely. I could make it optional - turn it off by default, but let users who care about that remaining 7.87% of global users opt back into it. One catch: login CSRF: I don't see how `SameSite=lax` protects against that attack. 107914493 issue    
648659536 MDU6SXNzdWU2NDg2NTk1MzY= 881 Figure out why restore_working_directory is needed in some places 9599 open 0     0 2020-07-01T04:19:25Z 2020-07-01T04:19:25Z   OWNER   This is a frustrating workaround. I have a `restore_working_directory` fixture that I wrote to solve errors that look like this:

```
/Users/simon/Dropbox/Development/datasette/tests/test_publish_cloudrun.py:148:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/opt/python/Frameworks/Python.framework/Versions/3.7/lib/python3.7/contextlib.py:112: in __enter__
    return next(self.gen)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <click.testing.CliRunner object at 0x1135ad110>

    @contextlib.contextmanager
    def isolated_filesystem(self):
        """A context manager that creates a temporary folder and changes the
        current working directory to it for isolated filesystem tests.
        """
>       cwd = os.getcwd()
E       FileNotFoundError: [Errno 2] No such file or directory
```

Here's an example of it in use: removing the `restore_working_directory` argument from this function causes the failure. https://github.com/simonw/datasette/blob/549b1c2063db48c4622ee5c7b478a1e3cbc1ac07/tests/test_plugins.py#L689-L690 I'd like to not have to do this. 107914493 issue
634112607 MDU6SXNzdWU2MzQxMTI2MDc= 812 Ability to customize what happens when a view permission fails 9599 closed 0   5533512 3 2020-06-08T04:26:14Z 2020-07-01T04:17:46Z 2020-07-01T04:17:45Z OWNER   Currently view permission failures raise a `Forbidden` error which is transformed into a 403. It would be good if this page could offer a way forward - maybe just by linking to (or redirecting to) a login screen. This behaviour will vary based on authentication plugins, so a new plugin hook is probably the best way to do this. 107914493 issue    
642572841 MDU6SXNzdWU2NDI1NzI4NDE= 859 Database page loads too slowly with many large tables (due to table counts) 3243482 open 0     17 2020-06-21T14:23:17Z 2020-07-01T03:10:21Z   CONTRIBUTOR   Hey, I have a database that I save in HTML from couple of web scrapers. There are around 200k+, 50+ rows in a couple of tables, with the sqlite file weighing around 600MB. The app runs on a VPS with 2 core CPU, 4GB RAM and refreshing the database page regularly takes more than 10 seconds. I was suspecting that counting tables was the culprit, but manually running `select count(*) from table_name` for the largest table finishes under a second. I've looked at the source code. There's a check for the index page for mutable databases larger than 100MB https://github.com/simonw/datasette/blob/799c5d53570d773203527f19530cf772dc2eeb24/datasette/views/index.py#L15 but this check is not performed for the database page. I've manually crippled the `Database::table_counts` method:

```py
async def table_counts(self, limit=10):
    if not self.is_mutable and self.cached_table_counts is not None:
        return self.cached_table_counts
    # Try to get counts for each table, $limit timeout for each count
    counts = {}
    for table in await self.table_names():
        try:
            # table_count = (
            #     await self.execute(
            #         "select count(*) from [{}]".format(table),
            #         custom_time_limit=limit,
            #     )
            # ).rows[0][0]
            counts[table] = 10  # table_count
        # In some cases I saw "SQL Logic Error" here in addition to
        # QueryInterrupted - so we catch that too:
        except (QueryInterrupted, sqlite3.OperationalError, sqlite3.DatabaseError):
            counts[table] = None
    if not self.is_mutable:
        self.cached_table_counts = counts
    return counts
```

Now the page loads in <100ms. Is it possible to apply the size check on the database page too?

<details>
<summary> /-/versions output </summary>
<pre>
{ "python": { "version": "3.8.0", "full": "3.8.0 (default, Oct 28 2019, 16:14:01) \n[GCC 8.3.0]" }, "datasette": { "version": "0.44" }, "asgi": "3.0", "uvicorn": "0.11.5", "sqlite": { "version": "3.22.0", "fts_versions": [ "FTS5", "FTS4", "FTS3" ], "extensions": { "json1": null }, "compile_options": [ "COMPILER=gcc-7.4.0", "ENABLE_COLUMN_METADATA", "ENABLE_DBSTAT_VTAB", "ENABLE_FTS3", "ENABLE_FTS3_PARENTHESIS", "ENABLE_FTS3_TOKENIZER", "ENABLE_FTS4", "ENABLE_FTS5", "ENABLE_JSON1", "ENABLE_LOAD_EXTENSION", "ENABLE_PREUPDATE_HOOK", "ENABLE_RTREE", "ENABLE_SESSION", "ENABLE_STMTVTAB", "ENABLE_UNLOCK_NOTIFY", "ENABLE_UPDATE_DELETE_LIMIT", "HAVE_ISNAN", "LIKE_DOESNT_MATCH_BLOBS", "MAX_SCHEMA_RETRY=25", "MAX_VARIABLE_NUMBER=250000", "OMIT_LOOKASIDE", "SECURE_DELETE", "SOUNDEX", "TEMP_STORE=1", "THREADSAFE=1" ] } }
</pre>
</details>

107914493 issue
637363686 MDU6SXNzdWU2MzczNjM2ODY= 835 Mechanism for skipping CSRF checks on API posts 9599 closed 0   5533512 13 2020-06-11T22:41:10Z 2020-07-01T03:08:07Z 2020-07-01T03:08:07Z OWNER   While experimenting with https://github.com/simonw/datasette-auth-tokens I realized it's not currently possible to build API client programs that POST to Datasette because there's no mechanism for them to skip the CSRF checks added in #798. 107914493 issue    
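To illustrate the kind of client this blocks, here is a sketch of a token-authenticated POST (the endpoint and token are hypothetical); without a CSRF opt-out, a cookie-less request like this is rejected even though it carries its own credentials:

```python
import requests

# Hypothetical API client posting to a writable endpoint using a
# bearer token (e.g. one issued by datasette-auth-tokens).
response = requests.post(
    "https://example.com/data/add_record",  # hypothetical endpoint
    data={"name": "example"},
    headers={"Authorization": "Bearer xxx"},
)
print(response.status_code)
```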
647879783 MDU6SXNzdWU2NDc4Nzk3ODM= 876 Add log out link to the pattern portfolio 9599 closed 0   5533512 1 2020-06-30T05:42:15Z 2020-06-30T23:50:04Z 2020-06-30T23:47:31Z OWNER   Follows #875 107914493 issue    
648569227 MDU6SXNzdWU2NDg1NjkyMjc= 879 Database page documentation still talks about hashes in URLs 9599 closed 0   5533512 1 2020-06-30T23:43:17Z 2020-06-30T23:48:06Z 2020-06-30T23:45:42Z OWNER   https://datasette.readthedocs.io/en/0.44/pages.html > Note that these URLs end in a 7 character hash. This hash is derived from the contents of the database, and ensures that each URL is immutable: the data returned from a URL containing the hash will always be the same, since if the contents of the database file changes by even a single byte a new hash will be generated. This isn't accurate any more - that's not default behaviour, and it may be removed entirely in #647. 107914493 issue    
636722501 MDU6SXNzdWU2MzY3MjI1MDE= 832 Having view-table permission but NOT view-database should still grant access to /db/table 9599 closed 0   5533512 12 2020-06-11T05:12:59Z 2020-06-30T23:42:11Z 2020-06-30T23:42:11Z OWNER   Stumbled into this while working on `datasette-permissions-sql`. I had granted table permissions, but the permission check wasn't even executed because the user failed the previous `view-database` check. 107914493 issue    
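A sketch of the desired cascade - granting access to /db/table if any level allows it rather than failing at the first denied check (an illustration, not the committed fix):

```python
# Sketch: a user may view /db/table if ANY of view-instance,
# view-database or view-table allows it.
async def can_view_table(datasette, actor, database, table):
    for action, resource in (
        ("view-instance", None),
        ("view-database", database),
        ("view-table", (database, table)),
    ):
        if await datasette.permission_allowed(actor, action, resource=resource):
            return True
    return False
```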
646737558 MDU6SXNzdWU2NDY3Mzc1NTg= 870 Refactor default views to use register_routes 9599 open 0   3268330 10 2020-06-27T18:53:12Z 2020-06-30T19:26:35Z   OWNER   It would be much cleaner if Datasette's default views were all registered using the new `register_routes()` plugin hook. Could dramatically reduce the code in `datasette/app.py`. > The ideal fix here would be to rework my `BaseView` subclass mechanism to work with `register_routes()` so that those views don't have any special privileges above plugin-provided views. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/864#issuecomment-648580556_ 107914493 issue    
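As a sketch of what the refactor could look like, here is a default view written as an ordinary route function registered through the same hook plugins use (the path is illustrative):

```python
from datasette import hookimpl
from datasette.utils.asgi import Response


async def instance_json(datasette, request):
    # A "default view" implemented as a plain route function, with no
    # special privileges beyond what plugin-provided views get.
    return Response.json({"databases": list(datasette.databases.keys())})


@hookimpl
def register_routes():
    return [(r"^/-/instance\.json$", instance_json)]
```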
648435885 MDU6SXNzdWU2NDg0MzU4ODU= 878 BaseView should be a documented API for plugins to use 9599 open 0   3268330 0 2020-06-30T19:26:13Z 2020-06-30T19:26:26Z   OWNER   Can be part of #870 - refactoring existing views to use `register_routes()`. > I'm going to put the new `check_permissions()` method on `BaseView` as well. If I want that method to be available to plugins I can do so by turning that `BaseView` class into a documented API that plugins are encouraged to use themselves. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/832#issuecomment-651995453_ 107914493 issue    
647095487 MDU6SXNzdWU2NDcwOTU0ODc= 873 "datasette -p 0 --root" gives the wrong URL 9599 open 0     12 2020-06-29T04:03:06Z 2020-06-29T15:44:54Z   OWNER   ``` $ datasette -p 0 --root http://127.0.0.1:0/-/auth-token?token=2d498c... ``` The port is incorrect. 107914493 issue    
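The underlying issue: port 0 asks the OS to pick any free port, so the printed URL has to come from the bound socket rather than the requested port. A plain-sockets illustration (not uvicorn internals):

```python
import socket

s = socket.socket()
s.bind(("127.0.0.1", 0))  # port 0: ask the OS for an ephemeral port
print("actual port:", s.getsockname()[1])  # the port that belongs in the URL
s.close()
```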
637966833 MDU6SXNzdWU2Mzc5NjY4MzM= 840 Log out mechanism for clearing ds_actor cookie 9599 closed 0   5533512 4 2020-06-12T19:41:51Z 2020-06-29T04:31:43Z 2020-06-29T04:31:43Z OWNER   Need a cookie clearing mechanism and a way to show that you are logged in. `datasette-auth-github` had a solution for this that can be pulled into core. 107914493 issue    
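A minimal sketch of what the logout view might look like - clearing the cookie by expiring it immediately (assumes `Response.set_cookie` accepts an `expires` argument):

```python
from datasette.utils.asgi import Response


async def logout(request):
    # Clear the signed actor cookie by setting it to an empty value
    # that expires immediately (assumed set_cookie signature).
    response = Response.redirect("/")
    response.set_cookie("ds_actor", "", expires=0)
    return response
```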
647095808 MDU6SXNzdWU2NDcwOTU4MDg= 874 /favicon.ico 500 error 9599 closed 0   5533512 0 2020-06-29T04:04:22Z 2020-06-29T04:27:18Z 2020-06-29T04:27:18Z OWNER   ``` Traceback (most recent call last): File "...datasette/datasette/app.py", line 969, in route_path response = await view(request, send) TypeError: favicon() missing 1 required positional argument: 'send' ``` 107914493 issue    
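The traceback shows the router awaiting `view(request, send)`, so one plausible fix is a handler that accepts both arguments and sends an empty 200 (a sketch, assuming the `asgi_send` helper):

```python
from datasette.utils.asgi import asgi_send


async def favicon(request, send):
    # Accept both arguments the router passes, then respond with an
    # empty 200 so browsers stop triggering the 500.
    await asgi_send(send, "", 200)
```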
644309017 MDU6SXNzdWU2NDQzMDkwMTc= 864 datasette.add_message() doesn't work inside plugins 9599 closed 0   5533512 6 2020-06-24T04:30:06Z 2020-06-29T00:51:01Z 2020-06-29T00:51:01Z OWNER   Similar problem to #863 - calling `datasette.add_message()` in a view registered using the `register_routes()` plugin hook doesn't work, because the code that writes accumulated messages to the `ds_messages` signed cookie lives in the `BaseView` class here: https://github.com/simonw/datasette/blob/28bb1c51897f3956861755e345e18b8e0b1423ac/datasette/views/base.py#L94-L97 107914493 issue    
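A hypothetical plugin view that reproduces the problem - the message is accumulated but never written out as a cookie:

```python
from datasette import hookimpl
from datasette.utils.asgi import Response


async def my_view(datasette, request):
    # The message is accumulated on the request, but nothing writes the
    # ds_messages cookie because that logic lives in BaseView, which
    # register_routes() views never pass through.
    datasette.add_message(request, "Hello!")
    return Response.redirect("/")


@hookimpl
def register_routes():
    return [(r"^/my-view$", my_view)]
```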
638259643 MDU6SXNzdWU2MzgyNTk2NDM= 847 Take advantage of .coverage being a SQLite database 9599 closed 0     4 2020-06-14T00:41:25Z 2020-06-28T20:50:21Z 2020-06-28T20:50:21Z OWNER   The `.coverage` file generated by running `pytest-cov` is now a SQLite database! I could do something interesting with this. Maybe after each test run for a new commit I could store that database file somewhere? Lots of interesting challenges here. I got a change into `coveragepy` last year which helps make the custom SQL functions available for doing fun things in Datasette: https://github.com/nedbat/coveragepy/issues/868 Bigger challenge: if I have a DB file for every commit, that's hundreds (potentially thousands) of DB files. Datasette isn't designed to handle thousands of files like that. So, do I figure out how to have Datasette open a file on-command for just a single request? Or, an easier option, do I copy data from those files into a single database with a modified schema to include the commit hash in each table row? (Following on from #841 and #844) 107914493 issue    
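Since a `.coverage` file is just SQLite, it can be explored directly; a quick sketch (the exact table names are an assumed detail of coverage.py's schema):

```python
import sqlite3

conn = sqlite3.connect(".coverage")
for (name,) in conn.execute(
    "select name from sqlite_master where type = 'table' order by name"
):
    print(name)  # e.g. arc, context, file, line_bits, meta, tracer (assumed)
```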
646840273 MDU6SXNzdWU2NDY4NDAyNzM= 871 Rename the _timestamp magic parameters to _now 9599 closed 0   5533512 1 2020-06-28T04:49:08Z 2020-06-28T19:49:49Z 2020-06-28T19:49:49Z OWNER   I like the shorter name better. Follows on from #842. 107914493 issue    
637342551 MDU6SXNzdWU2MzczNDI1NTE= 834 startup() plugin hook 9599 closed 0   5533512 6 2020-06-11T21:48:14Z 2020-06-28T19:38:50Z 2020-06-13T17:56:12Z OWNER   It might be useful to have a `startup` hook which gets passed the `datasette` object as soon as Datasette has finished initializing. My initial use-case for this is configuration verification - checking that the `"plugins"` configuration block for this plugin contains valid details. I imagine there are plenty of other potential uses for this as well. 107914493 issue    
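A sketch of how a plugin might use the proposed hook for exactly that configuration check (`my-plugin` and its `api_key` setting are hypothetical):

```python
from datasette import hookimpl


@hookimpl
def startup(datasette):
    # Validate this plugin's configuration as soon as Datasette has
    # finished initializing ("my-plugin" is a hypothetical plugin name).
    config = datasette.plugin_config("my-plugin") or {}
    assert "api_key" in config, "my-plugin requires an api_key setting"
```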
638212085 MDU6SXNzdWU2MzgyMTIwODU= 842 Magic parameters for canned queries 9599 closed 0   5533512 18 2020-06-13T18:50:08Z 2020-06-28T03:30:31Z 2020-06-28T02:58:18Z OWNER   Now that writable canned queries (#698) have landed, it would be neat if they supported "magic" parameters - parameters that are automatically populated with: - the current actor ID / other actor properties - the current date and time - the user's IP or user-agent And maybe other things potentially added by plugins. 107914493 issue    
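A sketch of a writable canned query using magic parameters; the `:_actor_id` and `:_now_datetime_utc` names reflect where #869/#871 ended up and should be treated as assumptions, as should the `mydb`/`add_log` names:

```python
import json

# Build a metadata.json defining a writable canned query whose
# parameters are filled in automatically rather than by the caller.
metadata = {
    "databases": {
        "mydb": {
            "queries": {
                "add_log": {
                    "sql": (
                        "insert into log (actor, at) "
                        "values (:_actor_id, :_now_datetime_utc)"
                    ),
                    "write": True,
                }
            }
        }
    }
}
print(json.dumps(metadata, indent=2))
```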
646734280 MDExOlB1bGxSZXF1ZXN0NDQwOTQ2ODE3 869 Magic parameters for canned queries 9599 closed 0   5533512 1 2020-06-27T18:37:21Z 2020-06-28T02:58:18Z 2020-06-28T02:58:17Z OWNER simonw/datasette/pulls/869 Implementation for #842 TODO: - [x] Add tests for built-in magic parameters - [x] Magic parameters should not show up as blank form fields on the query page - [x] Update documentation for new `_request_X` (now called `_header_X`) implementation where X is a key from the ASGI scope - [x] Make sure these only work for canned queries, not for arbitrary SQL queries (security issue) - [x] Add test for the `register_magic_parameters` plugin hook - [x] Add documentation for the `register_magic_parameters` plugin hook 107914493 pull    
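A sketch of the `register_magic_parameters` hook named in that TODO list - the exact signature is an assumption; this would make a `:_uuid_new` parameter available to canned queries:

```python
import uuid

from datasette import hookimpl


def generate(key, request):
    # Called for parameters named :_uuid_*; only "new" is supported here
    if key == "new":
        return str(uuid.uuid4())
    raise KeyError


@hookimpl
def register_magic_parameters(datasette):
    # Registers the "uuid" prefix, i.e. :_uuid_new in canned query SQL
    return [("uuid", generate)]
```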
645975649 MDU6SXNzdWU2NDU5NzU2NDk= 867 register_routes() should support non-async view functions too 9599 closed 0   5533512 1 2020-06-26T03:11:25Z 2020-06-27T18:30:41Z 2020-06-27T18:30:40Z OWNER   I was looking at this: https://github.com/simonw/datasette-block-robots/blob/main/datasette_block_robots/__init__.py ```python from datasette import hookimpl from datasette.utils.asgi import Response async def robots_txt(): return Response.text("User-agent: *\nDisallow: /") @hookimpl def register_routes(): return [ (r"^/robots\.txt$", robots_txt), ] ``` And I realized that if `register_routes()` could support non-async view functions it could be reduced to this: ```python @hookimpl def register_routes(): return [ (r"^/robots\.txt$", lambda: Response.text("User-agent: *\nDisallow: /")), ] ``` 107914493 issue    
644610729 MDExOlB1bGxSZXF1ZXN0NDM5MjAzODA4 866 Update pytest-asyncio requirement from <0.13,>=0.10 to >=0.10,<0.15 27856297 closed 0     1 2020-06-24T13:21:47Z 2020-06-24T18:50:57Z 2020-06-24T18:50:56Z CONTRIBUTOR simonw/datasette/pulls/866 Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version. <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/53f3da7aefc719e62cbaa89e57ab68a7e12cc3c3"><code>53f3da7</code></a> Prepare for release</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/e99569de645c37fc048964ab4c5073529080fd86"><code>e99569d</code></a> A line is added to the changelog.</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/4099b6351793611acabd8b26d93aabd44ce200c5"><code>4099b63</code></a> One import is not needed</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/68513b33616ff2ef93972acf1b992e7b1a18c4d1"><code>68513b3</code></a> Clarify names and comments, according to yanlend comments 26 May</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/907e8f24b9444f50cb30f17265665706882bc01f"><code>907e8f2</code></a> FIX new test_cases on python 3.5 &amp; 3.6</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/51d986cec83fdbc14fa08015424c79397afc7ad9"><code>51d986c</code></a> To solve test cases that fail:</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/f97e900f1fcb51a572a1b861c95ac49e69bbfdf9"><code>f97e900</code></a> 1) Test case (test_async_fixtures_with_finalizer) refactoring to pass on pyth...</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/c1131f8b5313189508dc81d7ef1937ccb136658b"><code>c1131f8</code></a> 1) A new test case that fails with 0.12.0, and pass with this commit.</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/7a255bc4cf82aba7aa4b213c1a97c81d532c1e85"><code>7a255bc</code></a> 0.13.0 open for business</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/b8e2a45e152a196cef7fdb6ddcf3e2d67a0f01ca"><code>b8e2a45</code></a> 0.12.0</li> <li>Additional commits viewable in <a href="https://github.com/pytest-dev/pytest-asyncio/compare/v0.10.0...v0.14.0">compare view</a></li> </ul> </details> <br /> Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. 
You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a "Dependabot enabled" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired) </details> 107914493 pull    
644582921 MDU6SXNzdWU2NDQ1ODI5MjE= 865 base_url doesn't seem to work when adding criteria and clicking "apply" 6739646 open 0     2 2020-06-24T12:39:57Z 2020-06-24T18:43:08Z   NONE   Over on Apache Tika, we're using Datasette to allow users to make sense of the metadata for our file regression testing corpus. This could be user error in how I've set up the reverse proxy! I started Datasette like so: `docker run -d -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/corpora-metadata.db --config sql_time_limit_ms:60000 --config base_url:/datasette/` I then reverse proxied like so: ProxyPreserveHost On ProxyPass /datasette http://x.y.z.q:xxxx ProxyPassReverse /datasette http://x.y.z.q:xxx Regular SQL works perfectly: https://corpora.tika.apache.org/datasette/corpora-metadata?sql=select+mime_string%2C+count%281%29+as+cnt%0D%0Afrom+profiles+p%0D%0Ajoin+mimes+m+on+p.mime_id%3Dm.mime_id%0D%0Agroup+by+mime_string%0D%0Aorder+by+cnt+desc However, adding criteria and clicking 'Apply' https://corpora.tika.apache.org/datasette/corpora-metadata/tika_1_24_1_mimes?_sort=file&mime__exact=text%2Fplain bounces back to: https://corpora.tika.apache.org/corpora-metadata/tika_1_24_1_mimes?_sort=file&file__contains=bug&mime__exact=text%2Fplain 107914493 issue    
642388564 MDU6SXNzdWU2NDIzODg1NjQ= 858 publish heroku does not work on Windows 10 870912 open 0     1 2020-06-20T14:40:28Z 2020-06-24T18:42:10Z   NONE   When executing "datasette publish heroku schools.db" on Windows 10, I get the following error ```shell File "c:\users\dell\.virtualenvs\sec-schools-jn-cwk8z\lib\site-packages\datasette\publish\heroku.py", line 54, in heroku line.split()[0] for line in check_output(["heroku", "plugins"]).splitlines() File "c:\python38\lib\subprocess.py", line 411, in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, File "c:\python38\lib\subprocess.py", line 489, in run with Popen(*popenargs, **kwargs) as process: File "c:\python38\lib\subprocess.py", line 854, in __init__ self._execute_child(args, executable, preexec_fn, close_fds, File "c:\python38\lib\subprocess.py", line 1307, in _execute_child hp, ht, pid, tid = _winapi.CreateProcess(executable, args, FileNotFoundError: [WinError 2] The system cannot find the file specified ``` Changing https://github.com/simonw/datasette/blob/55a6ffb93c57680e71a070416baae1129a0243b8/datasette/publish/heroku.py#L54 to ```python line.split()[0] for line in check_output(["heroku", "plugins"], shell=True).splitlines() ``` as well as the other `check_output()` and `call()` within the same file leads me to another recursive error about temp files 107914493 issue    
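A cross-platform alternative to sprinkling `shell=True` through the file would be resolving the executable first - an assumption about a possible fix, not the committed one. On Windows the Heroku CLI is installed as `heroku.cmd`, which `shutil.which()` can locate via `PATHEXT`:

```python
import shutil
from subprocess import check_output

# A bare check_output(["heroku", ...]) fails on Windows because it
# looks for "heroku", not "heroku.cmd". Resolve the full path first.
heroku = shutil.which("heroku")
if heroku is None:
    raise RuntimeError("heroku CLI not found on PATH")
plugins = check_output([heroku, "plugins"]).splitlines()
```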
637395097 MDU6SXNzdWU2MzczOTUwOTc= 838 Incorrect URLs when served behind a proxy with base_url set 79913 open 0     4 2020-06-11T23:58:55Z 2020-06-24T12:51:48Z   NONE   I'm running `datasette serve --config base_url:/foo/ …`, proxying to it with this Apache config: ProxyPass /foo/ http://localhost:8001/ ProxyPassReverse /foo/ http://localhost:8001/ and then accessing it via `https://example.com/foo/`. Although many of the URLs in the pages are correct (presumably because they either use absolute paths which include `base_url` or relative paths), the faceting and pagination links still use fully-qualified URLs pointing at `http://localhost:8001`. I looked into this a little in the source code, and it seems to be an issue anywhere `request.url` or `request.path` is used, as these contain the values for the request between the frontend (Apache) and backend (Datasette) server. Those properties are primarily used via the `path_with_…` family of utility functions and the `Datasette.absolute_url` method. 107914493 issue    
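A toy illustration of the bug class (not Datasette code): links built from the backend request's own URL leak the internal host, while links composed from `base_url` plus a path survive the proxy:

```python
# Links derived from request.url reproduce the backend's host and port.
def leaky_link(request_url, query):
    return request_url + "?" + query  # e.g. http://localhost:8001/db?...


# Links composed from base_url and a path stay correct behind the proxy.
def proxy_safe_link(base_url, path, query):
    return base_url.rstrip("/") + "/" + path.lstrip("/") + "?" + query


print(leaky_link("http://localhost:8001/db", "_next=1"))
print(proxy_safe_link("/foo/", "db", "_next=1"))  # /foo/db?_next=1
```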
640943441 MDU6SXNzdWU2NDA5NDM0NDE= 853 Ensure register_routes() works for POST 9599 closed 0   5533512 1 2020-06-18T06:24:55Z 2020-06-24T04:30:30Z 2020-06-18T16:22:02Z OWNER   https://twitter.com/amjithr/status/1273496759684050944 107914493 issue    
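For reference, a sketch of a `register_routes()` view that handles POST (assuming the request object's `method` attribute and awaitable `post_vars()` helper):

```python
from datasette import hookimpl
from datasette.utils.asgi import Response


async def echo(request):
    if request.method == "POST":
        # post_vars() parses the form-encoded body (assumed awaitable)
        return Response.json(await request.post_vars())
    return Response.html(
        '<form method="post"><input name="q"><input type="submit"></form>'
    )


@hookimpl
def register_routes():
    return [(r"^/echo$", echo)]
```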
644283211 MDU6SXNzdWU2NDQyODMyMTE= 863 {{ csrftoken() }} doesn't work with datasette.render_template() 9599 closed 0   5533512 0 2020-06-24T03:11:49Z 2020-06-24T04:30:30Z 2020-06-24T03:24:01Z OWNER   The documentation here suggests that it will work: https://github.com/simonw/datasette/blob/eed116ac0599c7d21b7129af94d58ce03a923e4e/docs/internals.rst#L540-L546 But right now the `csrftoken` variable is set in BaseView.render, which means it's not visible to plugins that try to render templates using `datasette.render_template`: https://github.com/simonw/datasette/blob/799c5d53570d773203527f19530cf772dc2eeb24/datasette/views/base.py#L99-L106 107914493 issue    
644161221 MDU6SXNzdWU2NDQxNjEyMjE= 117 Support for compound (composite) foreign keys 9599 open 0     3 2020-06-23T21:33:42Z 2020-06-23T21:40:31Z   OWNER   It turns out SQLite supports composite foreign keys: https://www.sqlite.org/foreignkeys.html#fk_composite Their example looks like this:
```sql
CREATE TABLE album(
  albumartist TEXT,
  albumname TEXT,
  albumcover BINARY,
  PRIMARY KEY(albumartist, albumname)
);

CREATE TABLE song(
  songid INTEGER,
  songartist TEXT,
  songalbum TEXT,
  songname TEXT,
  FOREIGN KEY(songartist, songalbum) REFERENCES album(albumartist, albumname)
);
```
Here's what that looks like in sqlite-utils:
```
In [1]: import sqlite_utils

In [2]: import sqlite3

In [3]: conn = sqlite3.connect(":memory:")

In [4]: conn
Out[4]: <sqlite3.Connection at 0x1087186c0>

In [5]: conn.executescript("""
   ...: CREATE TABLE album(
   ...:   albumartist TEXT,
   ...:   albumname TEXT,
   ...:   albumcover BINARY,
   ...:   PRIMARY KEY(albumartist, albumname)
   ...: );
   ...:
   ...: CREATE TABLE song(
   ...:   songid INTEGER,
   ...:   songartist TEXT,
   ...:   songalbum TEXT,
   ...:   songname TEXT,
   ...:   FOREIGN KEY(songartist, songalbum) REFERENCES album(albumartist, albumname)
   ...: );
   ...: """)
Out[5]: <sqlite3.Cursor at 0x1088def10>

In [6]: db = sqlite_utils.Database(conn)

In [7]: db.tables
Out[7]: [<Table album (albumartist, albumname, albumcover)>, <Table song (songid, songartist, songalbum, songname)>]

In [8]: db.tables[0].foreign_keys
Out[8]: []

In [9]: db.tables[1].foreign_keys
Out[9]: [ForeignKey(table='song', column='songartist', other_table='album', other_column='albumartist'), ForeignKey(table='song', column='songalbum', other_table='album', other_column='albumname')]
```
The table appears to have two separate foreign keys, when it actually has a single compound (composite) foreign key. 140912432 issue    