issues

952 rows where repo = 107914493 sorted by updated_at descending

id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association pull_request body repo type active_lock_reason performed_via_github_app
710650633 MDU6SXNzdWU3MTA2NTA2MzM= 979 Default table view JSON should include CREATE TABLE simonw 9599 open 0     2 2020-09-28T23:54:58Z 2020-09-28T23:56:28Z   OWNER  

https://latest.datasette.io/fixtures/facetable.json doesn't currently include the CREATE TABLE statement for the page, even though it's available on the HTML version at https://latest.datasette.io/fixtures/facetable
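The statement in question lives in sqlite_master, which is presumably what the JSON output would expose. A sketch of looking it up (a hypothetical helper, not Datasette code):

```python
import sqlite3

def table_schema(conn, table):
    # The CREATE TABLE statement shown on the HTML table page comes
    # from sqlite_master; the same lookup could feed the JSON output.
    row = conn.execute(
        "select sql from sqlite_master where type = 'table' and name = ?",
        [table],
    ).fetchone()
    return row[0] if row else None
```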

datasette 107914493 issue    
710506708 MDU6SXNzdWU3MTA1MDY3MDg= 978 Rendering glitch with column headings on mobile simonw 9599 closed 0     6 2020-09-28T19:04:45Z 2020-09-28T22:43:01Z 2020-09-28T22:43:01Z OWNER  

https://latest-with-plugins.datasette.io/fixtures?sql=select%0D%0A++dateutil_parse%28%2210+october+2020+3pm%22%29%2C%0D%0A++dateutil_easter%28%222020%22%29%2C%0D%0A++dateutil_parse_fuzzy%28%22This+is+due+10+september%22%29%2C%0D%0A++dateutil_parse%28%221%2F2%2F2020%22%29%2C%0D%0A++dateutil_parse%28%222020-03-04%22%29%2C%0D%0A++dateutil_parse_dayfirst%28%222020-03-04%22%29%2C%0D%0A++dateutil_easter%282020%29

datasette 107914493 issue    
710269200 MDExOlB1bGxSZXF1ZXN0NDk0MTQ2MDQz 977 Update pytest requirement from <6.1.0,>=5.2.2 to >=5.2.2,<6.2.0 dependabot-preview[bot] 27856297 closed 0     1 2020-09-28T13:33:05Z 2020-09-28T22:16:36Z 2020-09-28T22:16:35Z CONTRIBUTOR simonw/datasette/pulls/977

Updates the requirements on pytest to permit the latest version.


Release notes

Sourced from pytest's releases: https://github.com/pytest-dev/pytest/releases



6.1.0


pytest 6.1.0 (2020-09-26)


Breaking Changes




  • #5585 (https://github-redirect.dependabot.com/pytest-dev/pytest/issues/5585): As per our policy, the following features which have been deprecated in the 5.X series are now removed:



    • The funcargnames read-only property of FixtureRequest, Metafunc, and Function classes. Use the fixturenames attribute instead.

    • @pytest.fixture no longer supports positional arguments, pass all arguments by keyword instead.

    • Direct construction of Node subclasses now raises an error, use from_parent instead.

    • The default value for junit_family has changed to xunit2. If you require the old format, add junit_family=xunit1 to your configuration file.

    • The TerminalReporter no longer has a writer attribute. Plugin authors may use the public functions of the TerminalReporter instead of accessing the TerminalWriter object directly.

    • The --result-log option has been removed. Users are recommended to use the pytest-reportlog plugin (https://github.com/pytest-dev/pytest-reportlog) instead.


    For more information consult Deprecations and Removals in the docs:
    https://docs.pytest.org/en/stable/deprecations.html





Changelog

Sourced from pytest's changelog: https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst




Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:

  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
  • `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
  • `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
  • `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
  • `@dependabot badge me` will comment on this PR with code to add a "Dependabot enabled" badge to your readme

Additionally, you can set the following in your Dependabot dashboard (https://app.dependabot.com):

  • Update frequency (including time of day and day of week)
  • Pull request limits (per update run and/or open at any time)
  • Out-of-range updates (receive only lockfile updates, if desired)
  • Security updates (receive only security updates, if desired)
datasette 107914493 pull    
642388564 MDU6SXNzdWU2NDIzODg1NjQ= 858 publish heroku does not work on Windows 10 simonlau 870912 open 0     2 2020-06-20T14:40:28Z 2020-09-27T21:23:05Z   NONE  

When executing "datasette publish heroku schools.db" on Windows 10, I get the following error

  File "c:\users\dell\.virtualenvs\sec-schools-jn-cwk8z\lib\site-packages\datasette\publish\heroku.py", line 54, in heroku
    line.split()[0] for line in check_output(["heroku", "plugins"]).splitlines()
  File "c:\python38\lib\subprocess.py", line 411, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "c:\python38\lib\subprocess.py", line 489, in run
    with Popen(*popenargs, **kwargs) as process:
  File "c:\python38\lib\subprocess.py", line 854, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "c:\python38\lib\subprocess.py", line 1307, in _execute_child
    hp, ht, pid, tid = _winapi.CreateProcess(executable, args,
FileNotFoundError: [WinError 2] The system cannot find the file specified

Changing https://github.com/simonw/datasette/blob/55a6ffb93c57680e71a070416baae1129a0243b8/datasette/publish/heroku.py#L54

to

line.split()[0] for line in check_output(["heroku", "plugins"], shell=True).splitlines()

Making the same change to the other check_output() and call() invocations in the same file leads me to another recursive error about temp files.
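The workaround described above could be wrapped in a helper like this sketch (a hypothetical function, not the eventual fix; on Windows, shell=True lets cmd.exe resolve the heroku .cmd shim that CreateProcess cannot find directly):

```python
import subprocess
import sys

def check_output_compat(args):
    # Hypothetical helper: on Windows, route the call through the
    # shell so .cmd shims like the Heroku CLI are found; elsewhere,
    # keep the safer argument-list form.
    if sys.platform == "win32":
        return subprocess.check_output(" ".join(args), shell=True)
    return subprocess.check_output(args)
```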

datasette 107914493 issue    
708185405 MDU6SXNzdWU3MDgxODU0MDU= 975 Dependabot couldn't authenticate with https://pypi.python.org/simple/ dependabot-preview[bot] 27856297 closed 0     0 2020-09-24T13:44:40Z 2020-09-25T13:34:34Z 2020-09-25T13:34:34Z CONTRIBUTOR  

Dependabot couldn't authenticate with https://pypi.python.org/simple/.

You can provide authentication details in your Dependabot dashboard by clicking into the account menu (in the top right) and selecting 'Config variables'.

View the update logs.

datasette 107914493 issue    
708289783 MDU6SXNzdWU3MDgyODk3ODM= 976 Idea: -o could open to a more convenient location simonw 9599 open 0     1 2020-09-24T15:56:35Z 2020-09-24T17:42:35Z   OWNER  

Idea: if a database only has a single table, this could open straight to /db/table. If it has multiple tables but a single database it could open straight to /db.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/970#issuecomment-698434236_
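The proposed routing decision could be sketched like this (a hypothetical helper illustrating the idea, not Datasette code):

```python
def default_open_path(databases):
    # databases: dict mapping database name -> list of table names.
    # One database with one table -> open /db/table; one database
    # with several tables -> open /db; otherwise the index page.
    if len(databases) == 1:
        db, tables = next(iter(databases.items()))
        if len(tables) == 1:
            return "/{}/{}".format(db, tables[0])
        return "/{}".format(db)
    return "/"
```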

datasette 107914493 issue    
705108492 MDU6SXNzdWU3MDUxMDg0OTI= 970 request an "-o" option on "datasette server" to open the default browser at the running url secretGeek 2861690 closed 0     4 2020-09-20T13:16:34Z 2020-09-24T15:56:50Z 2020-09-22T14:27:04Z NONE  

This is a request for a "convenience" feature, and only a nice to have. It's based on seeing this feature in several little command line hypertext server apps.

If you run, for example:

datasette.exe serve --open "mydb.s3db"

I would like it if default browser is launched, at the URL that is being served.

The angular cli does this, for example

ng serve <project> --open #see https://angular.io/cli/serve

...as does my usual mini web server of choice when inspecting local static files....

npx http-server -o # see https://www.npmjs.com/package/http-server

Just a tiny thing. Love your work!
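A minimal way to do this with the standard library (a sketch; the injectable opener parameter is only there so the behaviour can be tested, it is not any real API):

```python
import threading
import webbrowser

def open_browser_when_ready(url, opener=webbrowser.open, delay=0.5):
    # Schedule the default browser to open the served URL shortly
    # after the server starts; the small delay is a crude way to wait
    # for the listening socket to be bound.
    timer = threading.Timer(delay, opener, args=[url])
    timer.start()
    return timer
```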

datasette 107914493 issue    
275125561 MDU6SXNzdWUyNzUxMjU1NjE= 123 Datasette serve should accept paths/URLs to CSVs and other file formats simonw 9599 open 0     6 2017-11-19T02:05:48Z 2020-09-24T07:42:05Z   OWNER  

This would remove the csvs-to-sqlite step which I end up using for almost everything.

I'm hesitant to introduce pandas as a required dependency though since it require compiling numpy. Could build it so this option is only available if you have pandas installed.

datasette 107914493 issue    
707849175 MDU6SXNzdWU3MDc4NDkxNzU= 974 static assets and favicon aren't cached by the browser obra 45416 open 0     1 2020-09-24T04:44:55Z 2020-09-24T04:52:58Z   NONE  

Using datasette to solve some frustrating problems with our fulfillment provider today, I was surprised to see repeated requests for assets under /-/static and the favicon. While it won't likely be a huge performance bottleneck, I bet datasette would feel a bit zippier if you had Uvicorn serving up some caching-related headers telling the browser it was safe to cache static assets.
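The caching headers could be added by a thin ASGI layer; this is an illustrative sketch only (not Datasette's implementation, and the class name and max-age value are made up):

```python
class StaticCacheMiddleware:
    # Adds a Cache-Control header to anything served under /-/static,
    # so browsers can cache static assets and the favicon.
    def __init__(self, app, max_age=3600):
        self.app = app
        self.header_value = b"max-age=%d" % max_age

    async def __call__(self, scope, receive, send):
        if scope["type"] == "http" and scope["path"].startswith("/-/static"):
            async def send_with_cache(message):
                if message["type"] == "http.response.start":
                    headers = list(message.get("headers", []))
                    headers.append((b"cache-control", self.header_value))
                    message = dict(message, headers=headers)
                await send(message)
            await self.app(scope, receive, send_with_cache)
        else:
            await self.app(scope, receive, send)
```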

datasette 107914493 issue    
520655983 MDU6SXNzdWU1MjA2NTU5ODM= 619 "Invalid SQL" page should let you edit the SQL simonw 9599 open 0     7 2019-11-10T20:54:12Z 2020-09-23T23:31:46Z   OWNER  

https://latest.datasette.io/fixtures?sql=select%0D%0A++*%0D%0Afrom%0D%0A++%5Bfoo%5D

Would be useful if this page showed you the invalid SQL you entered so you can edit it and try again.

datasette 107914493 issue    
274615452 MDU6SXNzdWUyNzQ2MTU0NTI= 111 Add “last_updated” to metadata simonw 9599 open 0     5 2017-11-16T18:22:20Z 2020-09-23T15:29:12Z   OWNER  

To give an indication as to when the data was last updated.

This should be a field in the metadata that is then shown on the index page and in the footer, if it is set.

Also support setting it using an option to “datasette publish” and “datasette package” - which can either be a string or can be the magic string “today” to set it to today’s date:

datasette publish file.db --last_updated=today
datasette 107914493 issue    
706486323 MDU6SXNzdWU3MDY0ODYzMjM= 973 'bool' object is not callable error simonw 9599 closed 0     2 2020-09-22T15:30:54Z 2020-09-22T15:40:35Z 2020-09-22T15:40:35Z OWNER  

I'm getting this when latest is deployed to Cloud Run:

Traceback (most recent call last):
  File "/usr/local/bin/datasette", line 8, in <module>
    sys.exit(cli())
  File "/usr/local/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/datasette/cli.py", line 406, in serve
    inspect_data = json.load(open(inspect_file))
TypeError: 'bool' object is not callable

I think I may have broken things in #970 - a980199e61fe7ccf02c2123849d86172d2ae54ff
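Reading the traceback, the name open inside serve() has evidently been rebound to a boolean; a plausible culprit is the new --open flag from #970, whose parameter would shadow the builtin. A minimal reproduction of that class of bug (a guess at the cause, not the confirmed fix):

```python
import json

def serve(inspect_file, open=False):
    # A boolean CLI flag parameter named "open" shadows the builtin
    # open() inside the function body, so open(inspect_file) raises
    # TypeError: 'bool' object is not callable.
    try:
        return json.load(open(inspect_file))
    except TypeError as exc:
        return str(exc)
```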

datasette 107914493 issue    
705057955 MDU6SXNzdWU3MDUwNTc5NTU= 969 Passing additional flags to tools used during publishing betatim 1448859 open 0     1 2020-09-20T06:54:53Z 2020-09-22T15:15:14Z   NONE  

This issue is about how best to pass additional options to tools used for publishing datasettes. A concrete example is wanting to pass the --tar flag to the heroku CLI tool. I think there are at least two options for doing this: documentation for each publishing tool to explain how to set flags via env variables (if possible) or building a mechanism that lets users pass additional flags through datasette.

When using datasette publish heroku binder-launches.db --extra-options="--config facet_time_limit_ms:35000 --config sql_time_limit_ms:35000" --name=binderlytics --install=datasette-vega to publish https://binderlytics.herokuapp.com/ the following error happens:

 ›   Warning: heroku update available from 7.42.1 to 7.43.0.
 ›   Warning: heroku update available from 7.42.1 to 7.43.0.
 ›   Warning: heroku update available from 7.42.1 to 7.43.0.
Setting WEB_CONCURRENCY and restarting ⬢ binderlytics... done, v13
WEB_CONCURRENCY: 1
 ›   Warning: heroku update available from 7.42.1 to 7.43.0.
 ▸    Couldn't detect GNU tar. Builds could fail due to decompression errors
 ▸    See https://devcenter.heroku.com/articles/platform-api-deploying-slugs#create-slug-archive
 ▸    Please install it, or specify the '--tar' option
 ▸    Falling back to node's built-in compressor
buffer.js:358
    throw new ERR_INVALID_OPT_VALUE.RangeError('size', size);
    ^

RangeError [ERR_INVALID_OPT_VALUE]: The value "3303763968" is invalid for option "size"
    at Function.alloc (buffer.js:367:3)
    at new Buffer (buffer.js:281:19)
    at Readable.<anonymous> (/Users/thead/.local/share/heroku/node_modules/archiver-utils/index.js:39:15)
    at Readable.emit (events.js:322:22)
    at endReadableNT (/Users/thead/.local/share/heroku/node_modules/readable-stream/lib/_stream_readable.js:1010:12)
    at processTicksAndRejections (internal/process/task_queues.js:84:21) {
  code: 'ERR_INVALID_OPT_VALUE'
}

After installing GNU tar with brew install gnu-tar and modifying datasette/publish/heroku.py to include the --tar=/path/to/gnu-tar publishing works.

I think the problem occurs once your heroku slug reaches a certain size. At least when I add only a few hundred entries to the datasette the error does not occur.

datasette version 0.49.1
OSX 10.14.6 (18G103)
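One of the two options mentioned above, passing flags via an environment variable, could look like this sketch (DATASETTE_HEROKU_FLAGS is a hypothetical name, not an existing setting):

```python
import os
import shlex

def heroku_args(base_args):
    # Hypothetical mechanism: append user-supplied flags, e.g.
    # "--tar=/usr/local/bin/gtar", from an environment variable to
    # every heroku CLI invocation made during publishing.
    extra = shlex.split(os.environ.get("DATASETTE_HEROKU_FLAGS", ""))
    return base_args + extra
```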

datasette 107914493 issue    
681375466 MDU6SXNzdWU2ODEzNzU0NjY= 943 await datasette.client.get(path) mechanism for executing internal requests simonw 9599 open 0     31 2020-08-18T22:17:42Z 2020-09-22T15:00:39Z   OWNER  

datasette-graphql works by making internal requests to the TableView class (in order to take advantage of existing pagination logic, plus options like ?_search= and ?_where=) - see #915

I want to support a mod_rewrite style mechanism for putting nicer URLs on top of Datasette pages - I botched that together for a project here using an internal ASGI proxying trick: https://github.com/natbat/tidepools_near_me/commit/ec102c6da5a5d86f17628740d90b6365b671b5e1

If the datasette object provided a documented method for executing internal requests (in a way that makes sense with logging etc - i.e. doesn't get logged as a separate request) both of these use-cases would be much neater.
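The core idea, executing a request by calling the ASGI app directly instead of going over the network, can be sketched like this (the helper and the dummy app are illustrative only, not the eventual datasette.client API):

```python
import asyncio
import json

async def internal_get(app, path):
    # Drive the ASGI app directly: feed it a synthetic HTTP scope and
    # collect the response body messages it sends back.
    messages = []

    async def receive():
        return {"type": "http.request", "body": b"", "more_body": False}

    async def send(message):
        messages.append(message)

    scope = {"type": "http", "method": "GET", "path": path,
             "query_string": b"", "headers": []}
    await app(scope, receive, send)
    return b"".join(
        m.get("body", b"")
        for m in messages
        if m["type"] == "http.response.body"
    )

async def demo_app(scope, receive, send):
    # Stand-in for Datasette's ASGI application.
    await send({"type": "http.response.start", "status": 200, "headers": []})
    body = json.dumps({"path": scope["path"]}).encode()
    await send({"type": "http.response.body", "body": body})
```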

datasette 107914493 issue    
705840673 MDU6SXNzdWU3MDU4NDA2NzM= 972 Support faceting against arbitrary SQL queries simonw 9599 open 0     1 2020-09-21T19:00:43Z 2020-09-21T19:01:25Z   OWNER  

... support for running facets against arbitrary custom SQL queries is half-done in that facets now execute against wrapped subqueries as-of ea66c45df96479ef66a89caa71fff1a97a862646

https://github.com/simonw/datasette/blob/ea66c45df96479ef66a89caa71fff1a97a862646/datasette/facets.py#L192-L200
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/971#issuecomment-696307922_

datasette 107914493 issue    
705827457 MDU6SXNzdWU3MDU4Mjc0NTc= 971 Support the dbstat table simonw 9599 closed 0     7 2020-09-21T18:38:53Z 2020-09-21T19:00:02Z 2020-09-21T18:59:52Z OWNER  

dbstat is a table that is usually available on SQLite giving statistics about the database. For example:

https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=SELECT+*+FROM+%22dbstat%22+WHERE+name%3D%27bachelorette%2Fbachelorette%27%3B

All rows share name = bachelorette/bachelorette:

path   pageno  pagetype  ncell  payload  unused  mx_payload  pgoffset  pgsize
/          89  internal     13        0    3981           0    360448    4096
/000/      91  leaf         66     3792      32          74    368640    4096
/001/      92  leaf         67     3800      14          74    372736    4096
/002/      93  leaf         65     3717      46          70    376832    4096
/003/      94  leaf         68     3742       6          71    380928    4096
/004/      95  leaf         70     3696      42          66    385024    4096
/005/      96  leaf         69     3721      22          71    389120    4096
/006/      97  leaf         70     3737       1          72    393216    4096
/007/      98  leaf         69     3728      15          69    397312    4096
/008/      99  leaf         73     3715       8          64    401408    4096
/009/     100  leaf         73     3705      18          62    405504    4096
/00a/     101  leaf         75     3681      32          62    409600    4096
/00b/     102  leaf         77     3694       9          62    413696    4096
/00c/     103  leaf         74     3673      45          62    417792    4096
/00d/     104  leaf          5      228    3835          48    421888    4096

Other than direct select * from dbstat queries it is completely invisible.

It would be cool if https://fivethirtyeight.datasettes.com/fivethirtyeight/dbstat didn't 404 (on databases for which that table was available).
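Programmatic access works the same way; note that dbstat only exists when SQLite was compiled with SQLITE_ENABLE_DBSTAT_VTAB, so any support would need to handle its absence. A sketch:

```python
import sqlite3

def dbstat_rows(path):
    # Return rows from the dbstat virtual table, or None when this
    # SQLite build lacks SQLITE_ENABLE_DBSTAT_VTAB (which is why a
    # hard-coded /dbstat page would need to probe first).
    conn = sqlite3.connect(path)
    try:
        return conn.execute(
            "select name, pageno, pagetype, ncell from dbstat"
        ).fetchall()
    except sqlite3.OperationalError:
        return None
    finally:
        conn.close()
```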

datasette 107914493 issue    
564833696 MDU6SXNzdWU1NjQ4MzM2OTY= 670 Prototoype for Datasette on PostgreSQL simonw 9599 open 0     10 2020-02-13T17:17:55Z 2020-09-21T14:46:10Z   OWNER  

I thought this would never happen, but now that I'm deep in the weeds of running SQLite in production for Datasette Cloud I'm starting to reconsider my policy of only supporting SQLite.

Some of the factors making me think PostgreSQL support could be worth the effort:
- Serverless. I'm getting increasingly excited about writable-database use-cases for Datasette. If it could talk to PostgreSQL then users could easily deploy it on Heroku or other serverless providers that can talk to a managed RDS-style PostgreSQL.
- Existing databases. Plenty of organizations have PostgreSQL databases. They can export to SQLite using db-to-sqlite but that's a pretty big barrier to getting started - being able to run datasette postgresql://connection-string and start trying it out would be a massively better experience.
- Data size. I keep running into use-cases where I want to run Datasette against many GBs of data. SQLite can do this but PostgreSQL is much more optimized for large data, especially given the existence of tools like Citus.
- Marketing. Convincing people to trust their data to SQLite is potentially a big barrier to adoption. Even if I've convinced myself it's trustworthy I still have to convince everyone else.
- It might not be that hard? If this required a ground-up rewrite it wouldn't be worth the effort, but I have a hunch that it may not be too hard - most of the SQL in Datasette should work on both databases since it's almost all portable SELECT statements. If Datasette did DML this would be a lot harder, but it doesn't.
- Plugins! This feels like a natural surface for a plugin - at which point people could add MySQL support and suchlike in the future.

The above reasons feel strong enough to justify a prototype.

datasette 107914493 issue    
455852801 MDU6SXNzdWU0NTU4NTI4MDE= 507 Every datasette plugin on the ecosystem page should have a screenshot simonw 9599 open 0     4 2019-06-13T17:02:51Z 2020-09-17T02:47:35Z   OWNER  

https://github.com/simonw/datasette/blob/master/docs/ecosystem.rst

datasette 107914493 issue    
653529088 MDU6SXNzdWU2NTM1MjkwODg= 891 Consider using enable_callback_tracebacks(True) simonw 9599 closed 0     5 2020-07-08T19:07:16Z 2020-09-15T21:59:27Z 2020-09-15T21:59:27Z OWNER  

From https://docs.python.org/3/library/sqlite3.html#sqlite3.enable_callback_tracebacks

sqlite3.enable_callback_tracebacks(flag)

By default you will not get any tracebacks in user-defined functions, aggregates, converters, authorizer callbacks etc. If you want to debug them, you can call this function with flag set to True. Afterwards, you will get tracebacks from callbacks on sys.stderr. Use False to disable the feature again.

Maybe turn this on for all of Datasette? Are there any disadvantages to doing that?
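The effect is easy to demonstrate with the standard library alone (this is the real sqlite3 API, though the broken() function is of course contrived):

```python
import sqlite3

# Without this call, exceptions raised inside user-defined SQL
# functions are swallowed silently; with it, their tracebacks are
# printed to sys.stderr.
sqlite3.enable_callback_tracebacks(True)

conn = sqlite3.connect(":memory:")

def broken(value):
    raise ValueError("boom")

conn.create_function("broken", 1, broken)

try:
    conn.execute("select broken(1)").fetchone()
except sqlite3.OperationalError:
    # The query still fails with "user-defined function raised
    # exception" either way, but the ValueError traceback now
    # reaches stderr, which is the debugging win.
    pass
```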

datasette 107914493 issue    
688427751 MDU6SXNzdWU2ODg0Mjc3NTE= 956 Push to Docker Hub failed - but it shouldn't run for alpha releases anyway simonw 9599 closed 0     7 2020-08-29T01:09:12Z 2020-09-15T20:46:41Z 2020-09-15T20:36:34Z OWNER  

https://github.com/simonw/datasette/runs/1043709494?check_suite_focus=true

Screenshot: https://user-images.githubusercontent.com/9599/91625110-80c55a80-e959-11ea-8fea-70508c53fcfb.png

  • This step should not run if a release is an alpha or beta
  • When it DOES run it should work
  • See it work for both an alpha and a non-alpha release, then close this ticket
datasette 107914493 issue    
648421105 MDU6SXNzdWU2NDg0MjExMDU= 877 Consider dropping explicit CSRF protection entirely? simonw 9599 closed 0     9 2020-06-30T19:00:55Z 2020-09-15T20:42:05Z 2020-09-15T20:42:04Z OWNER  

https://scotthelme.co.uk/csrf-is-dead/ from Feb 2017 has background here. The SameSite=lax cookie property effectively eliminates CSRF in modern browsers. https://caniuse.com/#search=SameSite shows 92.13% global support for it.

Datasette already uses SameSite=lax when it sets cookies by default: https://github.com/simonw/datasette/blob/af350ba4571b8e3f9708c40f2ddb48fea7ac1084/datasette/utils/asgi.py#L327-L341

A few options then. I could ditch CSRF protection entirely. I could make it optional - turn it off by default, but let users who care about that remaining 7.87% of global users opt back into it.

One catch is login CSRF: I don't see how SameSite=lax protects against that attack.
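For reference, the SameSite attribute the issue relies on can be produced with the standard library (a toy example on Python 3.8+, not Datasette's actual cookie code):

```python
from http.cookies import SimpleCookie

# Build a Set-Cookie header carrying the SameSite=Lax attribute that
# modern browsers use to block cross-site request forgery.
cookie = SimpleCookie()
cookie["ds_actor"] = "signed-token"
cookie["ds_actor"]["httponly"] = True
cookie["ds_actor"]["samesite"] = "Lax"
header = cookie["ds_actor"].output(header="Set-Cookie:")
```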

datasette 107914493 issue    
649907676 MDU6SXNzdWU2NDk5MDc2NzY= 889 asgi_wrapper plugin hook is crashing at startup amjith 49260 closed 0     3 2020-07-02T12:53:13Z 2020-09-15T20:40:52Z 2020-09-15T20:40:52Z CONTRIBUTOR  

Steps to reproduce:

  1. Install datasette-media plugin
    pip install datasette-media
  2. Launch datasette
    datasette databasename.db
  3. Error
INFO:     Started server process [927704]
INFO:     Waiting for application startup.
ERROR:    Exception in 'lifespan' protocol
Traceback (most recent call last):
  File "/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/uvicorn/lifespan/on.py", line 48, in main
    await app(scope, self.receive, self.send)
  File "/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in __call__
    return await self.app(scope, receive, send)
  File "/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/datasette_media/__init__.py", line 9, in wrapped_app
    path = scope["path"]
KeyError: 'path'
ERROR:    Application startup failed. Exiting.
datasette 107914493 issue    
657747959 MDU6SXNzdWU2NTc3NDc5NTk= 895 SQL query output should show numeric values in a different colour simonw 9599 closed 0     1 2020-07-16T00:28:03Z 2020-09-15T20:40:08Z 2020-09-15T20:40:08Z OWNER  

Compare https://latest.datasette.io/fixtures/sortable with https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+content%2C+sortable%2C+sortable_with_nulls%2C+sortable_with_nulls_2%2C+text+from+sortable+order+by+pk1%2C+pk2+limit+101

Screenshot: https://user-images.githubusercontent.com/9599/87612845-82e09c00-c6c0-11ea-806e-93764ca468c4.png

datasette 107914493 issue    
649702801 MDU6SXNzdWU2NDk3MDI4MDE= 888 URLs in release notes point to 127.0.0.1 abdusco 3243482 closed 0     1 2020-07-02T07:28:04Z 2020-09-15T20:39:50Z 2020-09-15T20:39:49Z CONTRIBUTOR  

Just a quick heads up:

Release notes for 0.45 include urls that point to localhost.

https://github.com/simonw/datasette/releases/tag/0.45

datasette 107914493 issue    
522352520 MDU6SXNzdWU1MjIzNTI1MjA= 634 Don't run tests twice when releasing a tag simonw 9599 closed 0     2 2019-11-13T17:02:42Z 2020-09-15T20:37:58Z 2020-09-15T20:37:58Z OWNER  

Shipping a release currently runs the tests twice: https://travis-ci.org/simonw/datasette/builds/611463728

It does a regular test run on Python 3.6/7/8 - then the "Release tagged version" step runs the tests again before publishing to PyPI! This second run is not necessary.

datasette 107914493 issue    
639072811 MDU6SXNzdWU2MzkwNzI4MTE= 849 Rename master branch to main simonw 9599 closed 0   Datasette 1.0 3268330 10 2020-06-15T19:05:54Z 2020-09-15T20:37:14Z 2020-09-15T20:37:14Z OWNER  

I was waiting for consensus to form around this (and kind-of hoping for trunk since I like the tree metaphor) and it looks like main is it.

I've seen convincing arguments against trunk too - it indicates that the branch has some special significance like in Subversion (where all branches come from trunk) when it doesn't. So main is better anyway.

datasette 107914493 issue    
682184050 MDU6SXNzdWU2ODIxODQwNTA= 946 Exception in tracing code simonw 9599 closed 0     1 2020-08-19T21:12:27Z 2020-09-15T20:16:50Z 2020-09-15T20:16:50Z OWNER  

When using ?_trace=1:

Traceback (most recent call last):
  File "/Users/simon/.local/share/virtualenvs/rockybeaches-09H592sC/lib/python3.8/site-packages/uvicorn/protocols/http/httptools_impl.py", line 390, in run_asgi
    result = await app(self.scope, self.receive, self.send)
  File "/Users/simon/.local/share/virtualenvs/rockybeaches-09H592sC/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in __call__
    return await self.app(scope, receive, send)
  File "/Users/simon/.local/share/virtualenvs/rockybeaches-09H592sC/lib/python3.8/site-packages/datasette/utils/asgi.py", line 150, in __call__
    await self.app(scope, receive, send)
  File "/Users/simon/.local/share/virtualenvs/rockybeaches-09H592sC/lib/python3.8/site-packages/datasette/tracer.py", line 137, in __call__
    await self.app(scope, receive, wrapped_send)
  File "/usr/local/opt/python@3.8/Frameworks/Python.framework/Versions/3.8/lib/python3.8/contextlib.py", line 120, in __exit__
    next(self.gen)
  File "/Users/simon/.local/share/virtualenvs/rockybeaches-09H592sC/lib/python3.8/site-packages/datasette/tracer.py", line 63, in capture_traces
    del tracers[task_id]
KeyError: 4575365856
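The crash is a cleanup step doing `del tracers[task_id]` for an id that is missing, presumably never registered or already removed. A defensive sketch of the kind of fix this suggests (illustrative, not the actual patch):

```python
tracers = {}

def remove_tracer(task_id):
    # dict.pop() with a default never raises, unlike `del`, so the
    # cleanup is safe even when the task id was never registered or
    # was already removed by a nested call.
    tracers.pop(task_id, None)
```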
datasette 107914493 issue    
702069429 MDU6SXNzdWU3MDIwNjk0Mjk= 967 Writable canned queries with magic parameters fail if POST body is empty simonw 9599 closed 0     11 2020-09-15T16:14:43Z 2020-09-15T20:13:10Z 2020-09-15T20:13:10Z OWNER  

When I try to use the new ?_json=1 feature from #880 with magic parameters from #842 I get this error:

Incorrect number of bindings supplied. The current statement uses 1, and there are 0 supplied

datasette 107914493 issue    
449854604 MDU6SXNzdWU0NDk4NTQ2MDQ= 492 Facets not correctly persisted in hidden form fields simonw 9599 closed 0   Datasette 1.0 3268330 4 2019-05-29T14:49:39Z 2020-09-15T20:12:29Z 2020-09-15T20:12:29Z OWNER  

Steps to reproduce: visit https://2a4b892.datasette.io/fixtures/roadside_attractions?_facet_m2m=attraction_characteristic and click "Apply"

Result is a 500: no such column: attraction_characteristic

The error occurs because of this hidden HTML input:

<input type="hidden" name="_facet" value="attraction_characteristic">

This should be:

<input type="hidden" name="_facet_m2m" value="attraction_characteristic">
datasette 107914493 issue    
701584448 MDU6SXNzdWU3MDE1ODQ0NDg= 966 Remove _request_ip example from canned queries documentation simonw 9599 closed 0     0 2020-09-15T03:51:33Z 2020-09-15T03:52:45Z 2020-09-15T03:52:45Z OWNER  

_request_ip isn't valid, so it shouldn't be in the example: https://github.com/simonw/datasette/blob/cb515a9d75430adaf5e545a840bbc111648e8bfd/docs/sql_queries.rst#L320-L322

datasette 107914493 issue    
688622148 MDU6SXNzdWU2ODg2MjIxNDg= 957 Simplify imports of common classes simonw 9599 open 0   Datasette 1.0 3268330 5 2020-08-29T23:44:04Z 2020-09-14T22:20:13Z   OWNER  

There are only a few classes that plugins need to import. It would be nice if these imports were as short and memorable as possible.

For example:

from datasette.app import Datasette
from datasette.utils.asgi import Response

Could both become:

from datasette import Datasette
from datasette import Response
datasette 107914493 issue    
687711713 MDU6SXNzdWU2ODc3MTE3MTM= 955 Release updated datasette-atom and datasette-ics simonw 9599 closed 0   Datasette 0.49 5818042 2 2020-08-28T04:55:21Z 2020-09-14T22:19:46Z 2020-09-14T22:19:46Z OWNER  

These should release straight after Datasette 0.49 with the change from #953.

datasette 107914493 issue    
679808124 MDU6SXNzdWU2Nzk4MDgxMjQ= 940 Move CI to GitHub Actions simonw 9599 closed 0   Datasette 0.49 5818042 20 2020-08-16T19:06:08Z 2020-09-14T22:09:35Z 2020-09-14T22:09:35Z OWNER  

It looks like the tests take 3m33s to run in GitHub Actions, but they're taking more than 8 minutes in Travis

datasette 107914493 issue    
648637666 MDU6SXNzdWU2NDg2Mzc2NjY= 880 POST to /db/canned-query that returns JSON should be supported (for API clients) simonw 9599 closed 0   Datasette 0.49 5818042 11 2020-07-01T03:14:43Z 2020-09-14T21:28:21Z 2020-09-14T21:25:01Z OWNER  

Now that CSRF is solved for API requests (#835) it would be good to support API requests to the .json extension.

datasette 107914493 issue    
701294727 MDU6SXNzdWU3MDEyOTQ3Mjc= 965 Documentation for 404.html, 500.html templates simonw 9599 closed 0   Datasette 0.49 5818042 3 2020-09-14T17:36:59Z 2020-09-14T18:49:49Z 2020-09-14T18:47:22Z OWNER  

This mechanism is not documented: https://github.com/simonw/datasette/blob/30b98e4d2955073ca2bca92ca7b3d97fcd0191bf/datasette/app.py#L1119-L1129

datasette 107914493 issue    
700728217 MDU6SXNzdWU3MDA3MjgyMTc= 964 raise_404 mechanism for custom templates simonw 9599 closed 0   Datasette 0.49 5818042 1 2020-09-14T03:22:15Z 2020-09-14T17:49:44Z 2020-09-14T17:39:34Z OWNER  

Having tried this out I think it does need a raise_404() mechanism - which needs to be smart enough to trigger the default 404 handler without accidentally going into an infinite loop.

_Originally posted by @simonw in https://github.com/simonw/datasette/issues/944#issuecomment-691788478_

datasette 107914493 issue    
681516976 MDU6SXNzdWU2ODE1MTY5NzY= 944 Path parameters for custom pages simonw 9599 closed 0   Datasette 0.49 5818042 5 2020-08-19T03:25:17Z 2020-09-14T03:21:45Z 2020-09-14T02:34:58Z OWNER  

Custom pages let you e.g. create a templates/pages/about.html page and have it automatically served at /about.

It would be useful if these pages could capture path patterns. I like the Python format string syntax for this (also used by Starlette): /foo/bar/{slug}.

So... how about embedding those patterns in the filenames themselves?

templates/pages/museums/{slug}.html

Would capture any hits to /museums/something and use that page to serve them.
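A minimal sketch of how such a filename pattern could be compiled into a matching regex — the helper name is hypothetical, not Datasette's actual implementation:

```python
import re

# Hypothetical helper: compile "museums/{slug}" into a regex with a
# named capture group for each {name} parameter.
def pattern_to_regex(pattern):
    out = ""
    for part in re.split(r"(\{\w+\})", pattern):
        if part.startswith("{") and part.endswith("}"):
            # {slug} -> (?P<slug>[^/]+), so one path segment per parameter
            out += "(?P<%s>[^/]+)" % part[1:-1]
        else:
            out += re.escape(part)
    return re.compile("^%s$" % out)

match = pattern_to_regex("museums/{slug}").match("museums/moma")
print(match.group("slug"))  # moma
```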

datasette 107914493 issue    
459590021 MDU6SXNzdWU0NTk1OTAwMjE= 519 Decide what goes into Datasette 1.0 simonw 9599 open 0   Datasette 1.0 3268330 3 2019-06-23T15:47:41Z 2020-09-12T22:48:53Z   OWNER  

Datasette ASGI #272 is a big part of it... but 1.0 will generally be an indicator that Datasette is a stable platform for developers to write plugins and custom templates against. So lots to think about.

datasette 107914493 issue    
699947574 MDU6SXNzdWU2OTk5NDc1NzQ= 963 Currently selected array facets are not correctly persisted through hidden form fields mhalle 649467 closed 0   Datasette 0.49 5818042 1 2020-09-12T01:49:17Z 2020-09-12T21:54:29Z 2020-09-12T21:54:09Z NONE  

Faceted search uses JSON array elements as facets rather than the arrays. However, if a search is "Apply"ed (using the Apply button), the array itself rather than its elements is used.

To reproduce:
https://latest.datasette.io/fixtures/facetable?_sort=pk&_facet=created&_facet=tags&_facet_array=tags

Press "Apply", which might be done when removing a filter. Notice that the "tags" facet values are now arrays, not array elements. It appears the "&_facet_array=tags" element of the query string is dropped.

datasette 107914493 issue    
627794879 MDU6SXNzdWU2Mjc3OTQ4Nzk= 782 Redesign default JSON format in preparation for Datasette 1.0 simonw 9599 open 0     8 2020-05-30T18:47:07Z 2020-09-12T21:39:03Z   OWNER  

The default JSON just isn't right. I find myself using ?_shape=array for almost everything I build against the API.

datasette 107914493 issue    
323658641 MDU6SXNzdWUzMjM2NTg2NDE= 262 Add ?_extra= mechanism for requesting extra properties in JSON simonw 9599 open 0     3 2018-05-16T14:55:42Z 2020-09-12T18:22:44Z   OWNER  

Datasette views currently work by creating a set of data that should be returned as JSON, then defining an additional, optional template_data() function which is called if the view is being rendered as HTML.

This template_data() function calculates extra template context variables which are necessary for the HTML view but should not be included in the JSON.

Example of how that is used today: https://github.com/simonw/datasette/blob/2b79f2bdeb1efa86e0756e741292d625f91cb93d/datasette/views/table.py#L672-L704

With features like Facets in #255 I'm beginning to want to move more items into the template_data() - in the case of facets it's the suggested_facets array. This saves that feature from being calculated (involving several SQL queries) for the JSON case where it is unlikely to be used.

But... as an API user, I want to still optionally be able to access that information.

Solution: Add a ?_extra=suggested_facets&_extra=table_metadata argument which can be used to optionally request additional blocks to be added to the JSON API.

Then redefine as many of the current template_data() features as extra arguments instead, and teach Datasette to return certain extras by default when rendering templates.

This could allow the JSON representation to be slimmed down further (removing e.g. the table_definition and view_definition keys) while still making that information available to API users who need it.
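A rough sketch of the dispatch this implies — the extra names and payloads below are illustrative placeholders, not the real internals:

```python
from urllib.parse import parse_qs

# Illustrative registry of optional blocks; real extras would be computed
# from SQL queries, and these names/payloads are placeholders.
EXTRAS = {
    "suggested_facets": lambda: ["state", "city_id"],
    "table_metadata": lambda: {"title": "Facetable"},
}

def build_json(base, query_string):
    # Start from the slimmed-down default payload, then bolt on any
    # blocks requested via ?_extra=
    data = dict(base)
    for name in parse_qs(query_string).get("_extra", []):
        if name in EXTRAS:
            data[name] = EXTRAS[name]()
    return data

print(build_json({"rows": []}, "_extra=suggested_facets"))
# {'rows': [], 'suggested_facets': ['state', 'city_id']}
```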

datasette 107914493 issue    
569275763 MDU6SXNzdWU1NjkyNzU3NjM= 680 Release automation: automate the bit that posts the GitHub release simonw 9599 closed 0     5 2020-02-22T03:50:40Z 2020-09-12T18:18:50Z 2020-09-12T18:18:50Z OWNER  

The most manual part of the release process right now is having to post a GitHub release that matches the updated changelog.

This is particularly annoying because the changelog is in .rst while the GitHub release needs markdown - so I currently manually translate between the two.

Having the release script automatically post a GitHub release at the end would be much more convenient.

datasette 107914493 issue    
649429772 MDU6SXNzdWU2NDk0Mjk3NzI= 886 Reconsider how _actor_X magic parameter deals with missing values simonw 9599 open 0     2 2020-07-02T00:00:38Z 2020-09-11T21:35:26Z   OWNER  

I had to build a custom _actorornull prefix for datasette-saved-queries:

from datasette import hookimpl


def actorornull(key, request):
    if request.actor is None:
        return None
    return request.actor.get(key)


@hookimpl
def register_magic_parameters():
    return [
        ("actorornull", actorornull),
    ]

Maybe the actor magic in Datasette core should do that out of the box?

https://github.com/simonw/datasette/blob/f1f581b7ffcd5d8f3ae6c1c654d813a6641410eb/datasette/default_magic_parameters.py#L14-L17

datasette 107914493 issue    
684925907 MDU6SXNzdWU2ODQ5MjU5MDc= 948 Upgrade CodeMirror simonw 9599 closed 0   Datasette 0.49 5818042 8 2020-08-24T19:55:33Z 2020-09-11T21:34:24Z 2020-08-30T18:03:07Z OWNER  

Datasette currently bundles 5.31.0 (from October 2017) - latest version is 5.57.0 (August 2020). https://codemirror.net/doc/releases.html

datasette 107914493 issue    
691475400 MDU6SXNzdWU2OTE0NzU0MDA= 958 Upgrade to latest Black (20.8b1) simonw 9599 closed 0   Datasette 0.49 5818042 0 2020-09-02T22:24:19Z 2020-09-11T21:34:24Z 2020-09-02T22:25:10Z OWNER  

Black has some changes: https://black.readthedocs.io/en/stable/change_log.html#b0 - in particular:

  • re-implemented support for explicit trailing commas: now it works consistently within any bracket pair, including nested structures (#1288 and duplicates)
  • Black now reindents docstrings when reindenting code around it (#1053)
datasette 107914493 issue    
699622046 MDU6SXNzdWU2OTk2MjIwNDY= 962 datasette --pdb option for debugging errors simonw 9599 closed 0   Datasette 0.49 5818042 1 2020-09-11T18:33:10Z 2020-09-11T21:34:24Z 2020-09-11T18:38:01Z OWNER  

I needed to debug an exception from deep inside a Jinja template the other day. I hacked this together and it helped.

datasette 107914493 issue    
573755726 MDU6SXNzdWU1NzM3NTU3MjY= 690 Mechanism for plugins to add UI to pages in specific locations simonw 9599 open 0     5 2020-03-02T06:48:36Z 2020-09-11T21:33:40Z   OWNER  

Now that we have support for plugins that can write to the database, I'm seeing all sorts of places where a plugin might need to add UI to the table page.

Some examples:

  • datasette-configure-fts needs to add a "configure search for this table" link
  • a plugin that lets you rename or delete tables needs to add a link or button somewhere
  • existing plugins like datasette-vega and datasette-cluster-map already do this with JavaScript

The challenge here is that multiple plugins may want to do this, so simply overriding templates and populating named blocks doesn't entirely work, as templates may override each other.

datasette 107914493 issue    
684111953 MDU6SXNzdWU2ODQxMTE5NTM= 947 datasette --get exit code should reflect HTTP errors simonw 9599 closed 0   Datasette 0.49 5818042 1 2020-08-23T04:17:08Z 2020-09-11T21:33:15Z 2020-09-11T21:33:15Z OWNER  

If you run datasette . --get / and the result is a 500 or 404 error (anything that's not a 200 or a 30x) the exit code from the command should not be 0.

It should still output the returned content to stdout.

This will help with writing soundness checks, as seen in https://til.simonwillison.net/til/til/github-actions_grep-tests.md
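The status-to-exit-code mapping this describes could be as simple as the following sketch (not the shipped code):

```python
# Sketch: exit 0 only for 2xx/3xx responses, per the issue's proposal;
# any 4xx/5xx maps to a non-zero exit code so CI checks fail.
def exit_code_for_status(status: int) -> int:
    return 0 if 200 <= status < 400 else 1

print(exit_code_for_status(404))  # 1
```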

datasette 107914493 issue    
696908389 MDU6SXNzdWU2OTY5MDgzODk= 961 Verification checks for metadata.json on startup simonw 9599 open 0     2 2020-09-09T15:21:53Z 2020-09-09T15:24:31Z   OWNER  

I lost a bunch of time yesterday trying to figure out why a Datasette instance wasn't starting up - it turned out it was because I had a facets: reference that mentioned a column that did not exist.

Catching these on startup would be good.

datasette 107914493 issue    
691537426 MDU6SXNzdWU2OTE1Mzc0MjY= 959 Internals API idea: results.dicts in addition to results.rows simonw 9599 open 0     0 2020-09-03T00:50:17Z 2020-09-03T00:50:17Z   OWNER  

I just wrote this code:

    results = await database.execute(SEARCH_SQL, {"query": query})
    return [dict(r) for r in results.rows]

How about having results.dicts as a utility property that does that?
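A minimal sketch of what that property might look like — the class name and shape here are illustrative, not Datasette's internals:

```python
import sqlite3

# Minimal sketch of a results wrapper exposing .dicts alongside .rows.
class Results:
    def __init__(self, rows):
        self.rows = rows

    @property
    def dicts(self):
        # Each sqlite3.Row supports the mapping protocol, so dict() works
        return [dict(r) for r in self.rows]

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute("create table docs (id integer, title text)")
conn.execute("insert into docs values (1, 'hello')")
results = Results(conn.execute("select * from docs").fetchall())
print(results.dicts)  # [{'id': 1, 'title': 'hello'}]
```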

datasette 107914493 issue    
687245650 MDExOlB1bGxSZXF1ZXN0NDc0NzAzMDA3 952 Update black requirement from ~=19.10b0 to >=19.10,<21.0 dependabot-preview[bot] 27856297 closed 0     1 2020-08-27T13:31:36Z 2020-09-02T22:26:17Z 2020-09-02T22:26:16Z CONTRIBUTOR simonw/datasette/pulls/952

Updates the requirements on black to permit the latest version.


Changelog

Sourced from black's changelog: https://github.com/psf/black/blob/master/CHANGES.md



20.8b1


Packaging



  • explicitly depend on Click 7.1.2 or newer as Black no longer works with versions
    older than 7.0


20.8b0


Black










Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


datasette 107914493 pull    
542553350 MDU6SXNzdWU1NDI1NTMzNTA= 655 Copy and paste doesn't work reliably on iPhone for SQL editor simonw 9599 closed 0   Datasette 1.0 3268330 3 2019-12-26T13:15:10Z 2020-08-30T17:51:40Z 2020-08-30T17:51:40Z OWNER  

I'm having a lot of trouble copying and pasting from the codemirror editor on my iPhone.

datasette 107914493 issue    
687694947 MDU6SXNzdWU2ODc2OTQ5NDc= 954 Remove old register_output_renderer dict mechanism in Datasette 1.0 simonw 9599 open 0   Datasette 1.0 3268330 1 2020-08-28T04:04:23Z 2020-08-28T04:56:31Z   OWNER  

Documentation says that the old dictionary mechanism will be deprecated by 1.0:

https://github.com/simonw/datasette/blob/799ecae94824640bdff21f86997f69844048d5c3/docs/plugin_hooks.rst#L460
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/953#issuecomment-682312494_

datasette 107914493 issue    
682005535 MDU6SXNzdWU2ODIwMDU1MzU= 945 datasette install -U for upgrading packages simonw 9599 closed 0   Datasette 0.49 5818042 1 2020-08-19T17:12:04Z 2020-08-28T04:53:14Z 2020-08-19T17:20:50Z OWNER  

This will also give Homebrew a way to upgrade Datasette itself without having to wait for the latest packaged version to land in Homebrew core.

datasette 107914493 issue    
687681018 MDU6SXNzdWU2ODc2ODEwMTg= 953 register_output_renderer render function should be able to return a Response simonw 9599 closed 0   Datasette 0.49 5818042 1 2020-08-28T03:21:21Z 2020-08-28T04:53:03Z 2020-08-28T04:03:01Z OWNER  

That plugin hook was designed before Datasette had a documented Response class. It should optionally be allowed to return a Response in addition to the current custom dictionary.

datasette 107914493 issue    
685806511 MDU6SXNzdWU2ODU4MDY1MTE= 950 Private/secret databases: database files that are only visible to plugins simonw 9599 open 0     5 2020-08-25T20:46:17Z 2020-08-26T00:50:48Z   OWNER  

In thinking about the best way to implement https://github.com/simonw/datasette-auth-passwords/issues/6 (SQL-backed user accounts for datasette-auth-passwords) I realized that there are a few different use-cases where a plugin might want to store data that isn't visible to regular Datasette users:

  • Storing password hashes
  • Storing API tokens
  • Storing secrets that are used for data import integrations (secrets for talking to the Twitter API for example)

Idea: allow one or more private database files to be attached to Datasette, something like this:

datasette github.db linkedin.db -s secrets.db -m metadata.yml

The secrets.db file would not be visible using any of Datasette's usual interface or API routes - but plugins would be able to run queries against it.

So datasette-auth-passwords might then be configured like this:

plugins:
  datasette-auth-passwords:
    database: secrets
    sql: "select password_hash from passwords where username = :username"

The plugin could even refuse to operate against a database that hadn't been loaded as a secret database.
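A sketch of the lookup such a plugin might run against the secret database, using the configured SQL from the example above (the `passwords` schema here is hypothetical):

```python
import sqlite3

# Hypothetical lookup a password-auth plugin might run against a
# secret database, mirroring the configured SQL above.
def lookup_password_hash(conn, username):
    sql = "select password_hash from passwords where username = :username"
    row = conn.execute(sql, {"username": username}).fetchone()
    return row[0] if row else None

conn = sqlite3.connect(":memory:")
conn.execute("create table passwords (username text, password_hash text)")
conn.execute("insert into passwords values ('simon', 'hash-1')")
print(lookup_password_hash(conn, "simon"))  # hash-1
```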

datasette 107914493 issue    
684961449 MDU6SXNzdWU2ODQ5NjE0NDk= 949 Try out CodeMirror SQL hints simonw 9599 open 0     2 2020-08-24T20:58:21Z 2020-08-24T21:09:50Z   OWNER  

It would also be interesting to try out the SQL hint mode, which can autocomplete against tables and columns. This demo shows how to configure that: https://codemirror.net/mode/sql/

Some missing documentation: https://stackoverflow.com/questions/20023381/codemirror-how-add-tables-to-sql-hint
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/948#issuecomment-679355426_

datasette 107914493 issue    
671763164 MDU6SXNzdWU2NzE3NjMxNjQ= 915 Refactor TableView class so things like datasette-graphql can reuse the logic simonw 9599 closed 0     3 2020-08-03T03:13:33Z 2020-08-18T22:28:37Z 2020-08-18T22:28:37Z OWNER  

_Originally posted by @simonw in https://github.com/simonw/datasette-graphql/issues/2#issuecomment-667780040_

datasette 107914493 issue    
323718842 MDU6SXNzdWUzMjM3MTg4NDI= 268 Mechanism for ranking results from SQLite full-text search simonw 9599 open 0     2 2018-05-16T17:36:40Z 2020-08-18T21:18:35Z   OWNER  

This isn't particularly straightforward - all the more reason for Datasette to implement it for you. This article is helpful: http://charlesleifer.com/blog/using-sqlite-full-text-search-with-python/

datasette 107914493 issue    
681334912 MDU6SXNzdWU2ODEzMzQ5MTI= 942 Support column descriptions in metadata.json simonw 9599 open 0     3 2020-08-18T20:52:00Z 2020-08-18T21:05:24Z   OWNER  

Could look something like this:

{
    "title": "Five Thirty Eight",
    "license": "CC Attribution 4.0 License",
    "license_url": "https://creativecommons.org/licenses/by/4.0/",
    "source": "fivethirtyeight/data on GitHub",
    "source_url": "https://github.com/fivethirtyeight/data",
    "databases": {
        "fivethirtyeight": {
            "tables": {
                "mueller-polls/mueller-approval-polls": {
                    "description_html": "<p>....</p>",
                    "columns": {
                        "name_of_column": "column_description goes here"
                    }
                }
            }
        }
    }
}
datasette 107914493 issue    
647095487 MDU6SXNzdWU2NDcwOTU0ODc= 873 "datasette -p 0 --root" gives the wrong URL simonw 9599 open 0     14 2020-06-29T04:03:06Z 2020-08-18T17:26:10Z   OWNER  
$ datasette -p 0 --root
http://127.0.0.1:0/-/auth-token?token=2d498c...

The port is incorrect.

datasette 107914493 issue    
679809281 MDExOlB1bGxSZXF1ZXN0NDY4NDg0MDMx 941 Run CI on GitHub Actions, not Travis simonw 9599 closed 0     1 2020-08-16T19:13:39Z 2020-08-18T05:09:36Z 2020-08-18T05:09:35Z OWNER simonw/datasette/pulls/941

Refs #940

datasette 107914493 pull    
671056788 MDU6SXNzdWU2NzEwNTY3ODg= 914 "Object of type bytes is not JSON serializable" for _nl=on simonw 9599 closed 0     1 2020-08-01T17:43:10Z 2020-08-16T21:10:27Z 2020-08-16T18:26:59Z OWNER  

https://latest.datasette.io/fixtures/binary_data.json?_sort_desc=data&_shape=array returns this:

[
  {
    "rowid": 1,
    "data": "this is binary data"
  }
]

But adding &_nl=on returns this: https://latest.datasette.io/fixtures/binary_data.json?_sort_desc=data&_shape=array&_nl=on

{
  "ok": false,
  "error": "Object of type bytes is not JSON serializable",
  "status": 500,
  "title": null
}

I found this error by running wget -r 127.0.0.1:8001 against my local fixtures.db.

datasette 107914493 issue    
679779797 MDU6SXNzdWU2Nzk3Nzk3OTc= 939 extra_ plugin hooks should take the same arguments simonw 9599 closed 0     6 2020-08-16T16:04:54Z 2020-08-16T18:25:05Z 2020-08-16T16:50:29Z OWNER  
  • extra_css_urls(template, database, table, datasette)
  • extra_js_urls(template, database, table, datasette)
  • extra_body_script(template, database, table, view_name, datasette)
  • extra_template_vars(template, database, table, view_name, request, datasette)

_Originally posted by @simonw in https://github.com/simonw/datasette/issues/938#issuecomment-674544691_

datasette 107914493 issue    
679700269 MDU6SXNzdWU2Nzk3MDAyNjk= 938 Pass columns to extra CSS/JS/etc plugin hooks simonw 9599 closed 0     3 2020-08-16T06:37:47Z 2020-08-16T18:10:23Z 2020-08-16T18:09:59Z OWNER  

I'd like datasette-cluster-map to only add links to JavaScript on pages that have tables with latitude and longitude columns.

Passing the names of the columns to the plugin hook can support this and will be backwards compatible thanks to pluggy.
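A sketch of the kind of check datasette-cluster-map could then make — the exact column-name heuristics below are assumptions, not the plugin's code:

```python
# Decide whether to inject map JavaScript based on the column names
# a hook receives (heuristics here are illustrative assumptions).
def has_latlong_columns(columns):
    cols = {c.lower() for c in columns}
    return bool(cols & {"latitude", "lat"}) and bool(cols & {"longitude", "lng", "lon"})

print(has_latlong_columns(["Latitude", "Longitude", "name"]))  # True
```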

datasette 107914493 issue    
679660778 MDExOlB1bGxSZXF1ZXN0NDY4Mzc3MjEy 937 Docs now live at docs.datasette.io simonw 9599 closed 0     0 2020-08-15T23:53:52Z 2020-08-15T23:57:06Z 2020-08-15T23:57:05Z OWNER simonw/datasette/pulls/937 datasette 107914493 pull    
679646710 MDU6SXNzdWU2Nzk2NDY3MTA= 935 db.execute_write_fn(create_tables, block=True) hangs a thread if connection fails simonw 9599 closed 0     3 2020-08-15T21:49:17Z 2020-08-15T22:35:33Z 2020-08-15T22:35:33Z OWNER  

Discovered in https://github.com/simonw/latest-datasette-with-all-plugins/issues/3#issuecomment-674449757

datasette 107914493 issue    
679650632 MDExOlB1bGxSZXF1ZXN0NDY4MzcwNjU4 936 Don't hang in db.execute_write_fn() if connection fails simonw 9599 closed 0     2 2020-08-15T22:20:12Z 2020-08-15T22:35:33Z 2020-08-15T22:35:32Z OWNER simonw/datasette/pulls/936

Refs #935

datasette 107914493 pull    
679637501 MDU6SXNzdWU2Nzk2Mzc1MDE= 934 --get doesn't fully invoke the startup routine simonw 9599 closed 0     0 2020-08-15T20:30:25Z 2020-08-15T20:53:49Z 2020-08-15T20:53:49Z OWNER  

https://github.com/simonw/datasette/blob/7702ea602188899ee9b0446a874a6a9b546b564d/datasette/cli.py#L417-L433

Spotted this working on https://github.com/simonw/latest-datasette-with-all-plugins/issues/3 - I'd like to be able to use datasette --get / as a sanity checking test, but that doesn't work if the init hooks aren't fully executed.

datasette 107914493 issue    
678760988 MDU6SXNzdWU2Nzg3NjA5ODg= 932 End-user documentation simonw 9599 open 0     4 2020-08-13T22:04:39Z 2020-08-14T16:02:24Z   OWNER  

Datasette's documentation is aimed at people who install and configure it.

What about end users of preconfigured and deployed Datasette instances?

Something that can be linked to from the Datasette UI would be really useful.

datasette 107914493 issue    
677926613 MDU6SXNzdWU2Nzc5MjY2MTM= 931 Docker container is no longer being pushed (it's stuck on 0.45) simonw 9599 closed 0     7 2020-08-12T19:33:03Z 2020-08-12T21:36:20Z 2020-08-12T21:36:20Z OWNER  

e.g. https://travis-ci.org/github/simonw/datasette/jobs/717123725

Here's how it broke:

--2020-08-12 03:08:17--  https://www.gaia-gis.it/gaia-sins/freexl-1.0.5.tar.gz
Resolving www.gaia-gis.it (www.gaia-gis.it)... 212.83.162.51
Connecting to www.gaia-gis.it (www.gaia-gis.it)|212.83.162.51|:443... connected.
HTTP request sent, awaiting response... 404 Not Found
2020-08-12 03:08:18 ERROR 404: Not Found.
The command '/bin/sh -c wget "https://www.gaia-gis.it/gaia-sins/freexl-1.0.5.tar.gz" && tar zxf freexl-1.0.5.tar.gz     && cd freexl-1.0.5 && ./configure && make && make install' returned a non-zero code: 8
The command "docker build -f Dockerfile -t $REPO:$TRAVIS_TAG ." exited with 8.
0.07s$ docker tag $REPO:$TRAVIS_TAG $REPO:latest
Error response from daemon: No such image: [secure]/datasette:0.47.1
The command "docker tag $REPO:$TRAVIS_TAG $REPO:latest" exited with 1.
0.08s$ docker push $REPO
The push refers to repository [docker.io/[secure]/datasette]
An image does not exist locally with the tag: [secure]/datasette
The command "docker push $REPO" exited with 1.
datasette 107914493 issue    
677326155 MDU6SXNzdWU2NzczMjYxNTU= 930 Datasette sdist is missing templates (hence broken when installing from Homebrew) simonw 9599 closed 0     6 2020-08-12T02:20:16Z 2020-08-12T03:30:59Z 2020-08-12T03:30:59Z OWNER  

Pretty nasty bug this: I'm getting 500 errors for all pages that try to render a template after installing the newly released Datasette 0.47 - both from pip install and via Homebrew.

datasette 107914493 issue    
677250834 MDU6SXNzdWU2NzcyNTA4MzQ= 926 datasette fixtures.db --get "/fixtures.json" simonw 9599 closed 0     2 2020-08-11T22:55:36Z 2020-08-12T00:26:17Z 2020-08-12T00:24:42Z OWNER  

I can expose ALL of Datasette's functionality on the command-line (without even running a web server) by adding --get and --post options to datasette serve.

datasette fixtures.db --get "/fixtures.json"

This would instantiate the Datasette ASGI app, run a fake request for /fixtures.json through it, dump the results out to standard output and quit.

A --post option could do the same for a POST request. Treating that as a stretch goal for the moment.
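The "fake request through the ASGI app" idea can be sketched without Datasette at all — here against a toy ASGI app, with helper names that are illustrative only:

```python
import asyncio
import json

# Toy ASGI app standing in for the Datasette application object.
async def app(scope, receive, send):
    assert scope["type"] == "http"
    body = json.dumps({"path": scope["path"]}).encode()
    await send({"type": "http.response.start", "status": 200, "headers": []})
    await send({"type": "http.response.body", "body": body})

# Drive a GET through the app with hand-built receive/send callables -
# no web server involved, which is the point of --get.
async def fake_get(app, path):
    messages = []

    async def receive():
        return {"type": "http.request", "body": b"", "more_body": False}

    async def send(message):
        messages.append(message)

    scope = {"type": "http", "method": "GET", "path": path, "headers": []}
    await app(scope, receive, send)
    status = messages[0]["status"]
    body = b"".join(m.get("body", b"") for m in messages[1:])
    return status, body

status, body = asyncio.run(fake_get(app, "/fixtures.json"))
print(status, body.decode())  # 200 {"path": "/fixtures.json"}
```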

datasette 107914493 issue    
677265716 MDExOlB1bGxSZXF1ZXN0NDY2NDEwNzU1 927 'datasette --get' option, refs #926 simonw 9599 closed 0     5 2020-08-11T23:31:52Z 2020-08-12T00:24:42Z 2020-08-12T00:24:41Z OWNER simonw/datasette/pulls/927

Refs #926, #898

datasette 107914493 pull    
677272618 MDU6SXNzdWU2NzcyNzI2MTg= 928 Test failures caused by failed attempts to mock pip simonw 9599 closed 0     4 2020-08-11T23:53:18Z 2020-08-12T00:07:49Z 2020-08-12T00:07:49Z OWNER  

Errors like this one:

https://github.com/simonw/datasette/pull/927/checks?check_run_id=973559696

2020-08-11T23:36:39.8801334Z =================================== FAILURES ===================================
2020-08-11T23:36:39.8802411Z _________________________________ test_install _________________________________
2020-08-11T23:36:39.8803242Z 
2020-08-11T23:36:39.8804935Z thing = <module 'pip._internal.cli' from '/opt/hostedtoolcache/Python/3.8.5/x64/lib/python3.8/site-packages/pip/_internal/cli/__init__.py'>
2020-08-11T23:36:39.8806663Z comp = 'main', import_path = 'pip._internal.cli.main'
2020-08-11T23:36:39.8807696Z 
2020-08-11T23:36:39.8808728Z     def _dot_lookup(thing, comp, import_path):
2020-08-11T23:36:39.8810573Z         try:
2020-08-11T23:36:39.8812262Z >           return getattr(thing, comp)
2020-08-11T23:36:39.8817136Z E           AttributeError: module 'pip._internal.cli' has no attribute 'main'
2020-08-11T23:36:39.8843043Z 
2020-08-11T23:36:39.8855951Z /opt/hostedtoolcache/Python/3.8.5/x64/lib/python3.8/unittest/mock.py:1215: AttributeError
2020-08-11T23:36:39.8873372Z 
2020-08-11T23:36:39.8877803Z During handling of the above exception, another exception occurred:
2020-08-11T23:36:39.8906532Z 
2020-08-11T23:36:39.8925767Z     def get_src_prefix():
2020-08-11T23:36:39.8928277Z         # type: () -> str
2020-08-11T23:36:39.8930068Z         if running_under_virtualenv():
2020-08-11T23:36:39.8949721Z             src_prefix = os.path.join(sys.prefix, 'src')
2020-08-11T23:36:39.8951813Z         else:
2020-08-11T23:36:39.8969014Z             # FIXME: keep src in cwd for now (it is not a temporary folder)
2020-08-11T23:36:39.9012110Z             try:
2020-08-11T23:36:39.9013489Z >               src_prefix = os.path.join(os.getcwd(), 'src')
2020-08-11T23:36:39.9014538Z E               FileNotFoundError: [Errno 2] No such file or directory
2020-08-11T23:36:39.9016122Z 
2020-08-11T23:36:39.9017617Z /opt/hostedtoolcache/Python/3.8.5/x64/lib/python3.8/site-packages/pip/_internal/locations.py:50: FileNotFoundError
2020-08-11T23:36:39.9018802Z 
2020-08-11T23:36:39.9020070Z During handling of the above exception, another exception occurred:
2020-08-11T23:36:39.9020930Z 
2020-08-11T23:36:39.9022275Z args = (), keywargs = {}
2020-08-11T23:36:39.9023183Z 
2020-08-11T23:36:39.9024077Z     @wraps(func)
2020-08-11T23:36:39.9024984Z     def patched(*args, **keywargs):
2020-08-11T23:36:39.9028770Z >       with self.decoration_helper(patched,
2020-08-11T23:36:39.9031861Z                                     args,
2020-08-11T23:36:39.9038358Z                                     keywargs) as (newargs, newkeywargs):
2020-08-11T23:36:39.9039654Z 
2020-08-11T23:36:39.9040566Z /opt/hostedtoolcache/Python/3.8.5/x64/lib/python3.8/unittest/mock.py:1322: 
2020-08-11T23:36:39.9041492Z _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
datasette 107914493 issue    
677037043 MDU6SXNzdWU2NzcwMzcwNDM= 923 Add homebrew installation to documentation simonw 9599 closed 0     5 2020-08-11T16:54:31Z 2020-08-11T22:53:07Z 2020-08-11T22:52:46Z OWNER  

$ brew tap simonw/datasette
$ brew install simonw/datasette/datasette
$ datasette --version
datasette, version 0.46
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/335#issuecomment-672088880_

datasette 107914493 issue    
677227912 MDU6SXNzdWU2NzcyMjc5MTI= 925 "datasette install" and "datasette uninstall" commands simonw 9599 closed 0     3 2020-08-11T22:04:32Z 2020-08-11T22:34:37Z 2020-08-11T22:32:12Z OWNER  

When installing Datasette plugins it's crucial that they end up in the same virtual environment as Datasette itself.

It's not necessarily obvious how to do this, especially if you install Datasette via pipx or homebrew.

Solution: datasette install datasette-vega and datasette uninstall datasette-vega commands that know how to install to the correct place - a very thin wrapper around pip install.
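The "correct place" trick can be sketched like this — running pip via `sys.executable` pins the install to the same environment as the running process (a sketch, not the shipped implementation):

```python
import sys

# Build the pip invocation for the CURRENT interpreter's environment;
# this is what makes the wrapper land plugins next to Datasette itself.
def pip_install_command(*packages):
    return [sys.executable, "-m", "pip", "install", *packages]

# A real wrapper would then run it, e.g.:
# subprocess.run(pip_install_command("datasette-vega"), check=True)
print(pip_install_command("datasette-vega")[1:])
# ['-m', 'pip', 'install', 'datasette-vega']
```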

datasette 107914493 issue    
339505204 MDU6SXNzdWUzMzk1MDUyMDQ= 335 Package datasette for installation using homebrew simonw 9599 closed 0     12 2018-07-09T15:45:03Z 2020-08-11T16:54:06Z 2020-08-11T16:54:06Z OWNER  

https://docs.brew.sh/Python-for-Formula-Authors describes how.

Applications should be installed into a Python virtualenv environment rooted in libexec. This prevents the app’s Python modules from contaminating the system site-packages and vice versa.

It recommends using https://github.com/tdsmith/homebrew-pypi-poet

datasette 107914493 issue    
675724951 MDU6SXNzdWU2NzU3MjQ5NTE= 918 Security issue: read-only canned queries leak CSRF token in URL simonw 9599 closed 0     4 2020-08-09T16:03:01Z 2020-08-09T16:56:48Z 2020-08-09T16:11:59Z OWNER  

The HTML form for a read-only canned query includes the hidden CSRF token field added in #798 for writable canned queries (#698).

This means that submitting those read-only forms exposes the CSRF token in the URL - for example on https://latest.datasette.io/fixtures/neighborhood_search submitting the form took me to:

https://latest.datasette.io/fixtures/neighborhood_search?text=down&csrftoken=IlFubnoxVVpLU1NGT3NMVUoi.HbOPd2YH_epQmp8f_aAt0s-MxtU

This token could potentially leak to an attacker if the resulting page has a link to an external site on it and the user clicks the link, since the token would be exposed in the referral logs.
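The eventual fix amounts to only rendering the hidden token for forms that POST — sketched below with an illustrative helper name:

```python
import html

# Only POST forms need CSRF protection; read-only canned queries submit
# via GET, so emitting no token field keeps it out of the URL.
def hidden_csrf_field(method, token):
    if method.upper() != "POST":
        return ""
    return '<input type="hidden" name="csrftoken" value="%s">' % html.escape(token)

print(repr(hidden_csrf_field("GET", "abc")))  # ''
```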

datasette 107914493 issue    
675727366 MDU6SXNzdWU2NzU3MjczNjY= 919 Travis should not build the master branch, only the main branch simonw 9599 closed 0     3 2020-08-09T16:18:25Z 2020-08-09T16:26:18Z 2020-08-09T16:19:37Z OWNER  

Caused by #849 - since we are mirroring the two branches (to ensure old links to master keep working) Travis is building both.

The following in .travis.yml should fix that:

branches:
  except:
  - master
datasette 107914493 issue    
675594325 MDU6SXNzdWU2NzU1OTQzMjU= 917 Idea: "datasette publish" option for "only if the data has changed simonw 9599 open 0     0 2020-08-08T21:58:27Z 2020-08-08T21:58:27Z   OWNER  

This is a pattern I often find myself needing. I usually implement this in GitHub Actions like this:

https://github.com/simonw/covid-19-datasette/blob/efa01c39abc832b8641fc2a92840cc3acae2fb08/.github/workflows/scheduled.yml#L52-L63

    - name: Set variables to decide if we should deploy
      id: decide_variables
      run: |-
        echo "##[set-output name=latest;]$(datasette inspect covid.db | jq '.covid.hash' -r)"
        echo "##[set-output name=deployed;]$(curl -s https://covid-19.datasettes.com/-/databases.json | jq '.[0].hash' -r)"
    - name: Set up Cloud Run
      if: github.event_name == 'workflow_dispatch' || steps.decide_variables.outputs.latest != steps.decide_variables.outputs.deployed
      uses: GoogleCloudPlatform/github-actions/setup-gcloud@master

This is pretty fiddly. It might be good for datasette publish to grow a helper option that does effectively this - hashes the databases (and the metadata.json) and compares them to the deployed version.
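The core of such a helper is just "hash the inputs, compare with what's deployed" — a minimal sketch (file hashing here is illustrative; the workflow above uses `datasette inspect` hashes instead):

```python
import hashlib
import tempfile
from pathlib import Path

# Hash one or more files (database plus metadata.json, say) into a
# single digest that changes whenever any of their contents change.
def content_hash(*paths):
    h = hashlib.sha256()
    for path in sorted(str(p) for p in paths):
        h.update(Path(path).read_bytes())
    return h.hexdigest()

with tempfile.TemporaryDirectory() as d:
    db = Path(d) / "covid.db"
    db.write_bytes(b"rows")
    before = content_hash(db)
    db.write_bytes(b"rows plus new data")
    after = content_hash(db)

print(before != after)  # True
```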

datasette 107914493 issue    
672421411 MDU6SXNzdWU2NzI0MjE0MTE= 916 Support reverse pagination (previous page, has-previous-items) simonw 9599 open 0     0 2020-08-04T00:32:06Z 2020-08-04T00:32:20Z   OWNER  

I need this for datasette-graphql for full compatibility with the way Relay likes to paginate - using cursors for paginating backwards as well as for paginating forwards.

This may be the kick I need to get Datasette pagination to work in reverse too.
_Originally posted by @simonw in https://github.com/simonw/datasette-graphql/issues/2#issuecomment-668305853_

datasette 107914493 issue    
661605489 MDU6SXNzdWU2NjE2MDU0ODk= 900 Some links don't honor base_url noteed 50220 open 0     1 2020-07-20T09:40:50Z 2020-07-31T23:56:38Z   NONE  

Hi,

I've been playing with Datasette behind Nginx (awesome tool, thanks!). It seems some URLs are OK but some aren't. For instance in https://github.com/simonw/datasette/blob/master/datasette/templates/query.html#L61 it seems that url_csv includes a / prefix, resulting in the base_url not being honored.

Actually here, it seems that dropping the prefix / to make the link relative is enough (so it may not be strictly related to base_url).

Additional information:

datasette, version 0.45+0.gf1f581b.dirty

Relevant Nginx configuration (note that all the trailing slashes have some effect):

  location /datasette/ {
    proxy_pass http://127.0.0.1:9001/;
    proxy_set_header Host $host;
  }

Relevant Datasette configuration (slashes matter here too):

  --config base_url:/datasette/
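The effect of that leading slash can be illustrated with `urllib.parse.urljoin` (the URLs here are made up): a root-relative link escapes the proxy prefix, while a relative one stays under it.

```python
from urllib.parse import urljoin

base = "https://example.com/datasette/fixtures"

# Root-relative link: the /datasette/ mount point is discarded.
absolute = urljoin(base, "/fixtures.csv")
# Relative link: the proxy prefix is preserved.
relative = urljoin(base, "fixtures.csv")
```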
datasette 107914493 issue    
660827546 MDU6SXNzdWU2NjA4Mjc1NDY= 899 How to setup a request limit per user Krazybug 133845 closed 0     1 2020-07-19T13:08:25Z 2020-07-31T23:54:42Z 2020-07-31T23:54:42Z NONE  

Hello,

Until now I've been using datasette without any authentication system, but I would like to set up a configuration for limiting the number of requests per user (possibly by IP or with a cookie mechanism), ideally also allowing me to ban specific users/IPs.

Is there a plugin available for this use case?
If not, what are your insights regarding this use case?

Should I write a plugin? Should I deploy datasette behind a reverse proxy to manage this?
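One plugin-shaped answer would be to wrap the app via Datasette's asgi_wrapper hook around a per-key counter. The limiter below is a naive fixed-window sketch: in-memory and single-process only, and the hook wiring, key choice and limits are all assumptions (a production setup behind Nginx would more likely use limit_req there, or a shared store here).

```python
import time
from collections import defaultdict

class RateLimiter:
    # Naive fixed-window, per-key request limiter. In-memory and
    # single-process only; purely illustrative.
    def __init__(self, limit, window_seconds):
        self.limit = limit
        self.window = window_seconds
        self.counts = defaultdict(int)
        self.window_start = defaultdict(float)

    def allow(self, key, now=None):
        now = time.monotonic() if now is None else now
        if now - self.window_start[key] >= self.window:
            # Start a fresh window for this key.
            self.window_start[key] = now
            self.counts[key] = 0
        self.counts[key] += 1
        return self.counts[key] <= self.limit
```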

datasette 107914493 issue    
670209331 MDU6SXNzdWU2NzAyMDkzMzE= 913 Mechanism for passing additional options to `datasette my.db` that affect plugins simonw 9599 open 0     3 2020-07-31T20:38:26Z 2020-07-31T23:52:11Z   OWNER  

It's a shame there's no obvious mechanism for passing additional options to datasette my.db that affect how plugins work.

The only way I can think of at the moment is via environment variables:

DATASETTE_INSERT_UNSAFE=1 datasette my.db

This will have to do for the moment - it's ugly enough that people will at least know they are doing something unsafe, which is the goal here.
_Originally posted by @simonw in https://github.com/simonw/datasette-insert/issues/15#issuecomment-667346438_
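A plugin could read such a variable at request time. A minimal sketch — the variable name comes from the issue, but the helper itself is hypothetical:

```python
import os

def insert_allowed():
    # Hypothetical plugin-side check: opt in to unsafe inserts only
    # when the environment variable is explicitly set to "1".
    return os.environ.get("DATASETTE_INSERT_UNSAFE") == "1"
```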

datasette 107914493 issue    
662322234 MDExOlB1bGxSZXF1ZXN0NDUzODkwMjky 901 Use None as a default arg fcatus 56323389 closed 0     1 2020-07-20T22:18:38Z 2020-07-31T18:42:39Z 2020-07-31T18:42:39Z CONTRIBUTOR simonw/datasette/pulls/901

When passing a mutable value as a default argument in a function, the default argument is mutated anytime that value is mutated. This poses a bug risk. Instead, use None as a default and assign the mutable value inside the function.
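A minimal illustration of the hazard this PR guards against, with the None-default fix alongside:

```python
def append_buggy(item, items=[]):
    # The [] is created once, at function definition time, and is then
    # shared by every call that relies on the default.
    items.append(item)
    return items

def append_fixed(item, items=None):
    # None as the sentinel gives each call its own fresh list.
    if items is None:
        items = []
    items.append(item)
    return items
```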

datasette 107914493 pull    
668064778 MDU6SXNzdWU2NjgwNjQ3Nzg= 912 Add "publishing to Vercel" to the publish docs simonw 9599 closed 0     0 2020-07-29T18:50:58Z 2020-07-31T17:06:35Z 2020-07-31T17:06:35Z OWNER  

https://datasette.readthedocs.io/en/0.45/publish.html#datasette-publish currently only lists Cloud Run, Heroku and Fly. It should list Vercel too.

(I should probably rename datasette-publish-now to datasette-publish-vercel)

datasette 107914493 issue    
667467128 MDU6SXNzdWU2Njc0NjcxMjg= 909 AsgiFileDownload: filename not correctly passed simonw 9599 closed 0     2 2020-07-29T00:41:43Z 2020-07-30T00:56:17Z 2020-07-29T21:34:48Z OWNER  

https://github.com/simonw/datasette/blob/3c33b421320c0be81a625ca7307b2e4416a9ed5b/datasette/utils/asgi.py#L396-L405
self.filename should be passed to asgi_send_file()
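The intended behavior — only emit a Content-Disposition header when a filename was supplied — can be sketched like this (a hedged sketch, not the real asgi_send_file() signature):

```python
def download_headers(content_type, filename=None):
    # Attach Content-Disposition only when a filename was actually
    # supplied; illustrative of the intended fix, not Datasette's code.
    headers = [("content-type", content_type)]
    if filename:
        headers.append(
            ("content-disposition", 'attachment; filename="{}"'.format(filename))
        )
    return headers
```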

datasette 107914493 issue    
667840539 MDExOlB1bGxSZXF1ZXN0NDU4NDM1NTky 910 Update pytest requirement from <5.5.0,>=5.2.2 to >=5.2.2,<6.1.0 dependabot-preview[bot] 27856297 closed 0     1 2020-07-29T13:21:17Z 2020-07-29T21:26:05Z 2020-07-29T21:26:04Z CONTRIBUTOR simonw/datasette/pulls/910

Updates the requirements on pytest to permit the latest version.


Release notes

Sourced from pytest's releases.

6.0.0

pytest 6.0.0 (2020-07-28)

(Please see the full set of changes for this release also in the 6.0.0rc1 notes below)

Breaking Changes

Features

Improvements

Bug Fixes

Changelog

Sourced from pytest's changelog.

Commits
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
- `@dependabot badge me` will comment on this PR with code to add a "Dependabot enabled" badge to your readme

Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):
- Update frequency (including time of day and day of week)
- Pull request limits (per update run and/or open at any time)
- Out-of-range updates (receive only lockfile updates, if desired)
- Security updates (receive only security updates, if desired)
datasette 107914493 pull    
668064026 MDU6SXNzdWU2NjgwNjQwMjY= 911 Rethink the --name option to "datasette publish" simonw 9599 open 0   Datasette 1.0 3268330 0 2020-07-29T18:49:49Z 2020-07-29T18:49:49Z   OWNER  

--name works inconsistently across the different publish providers - on Cloud Run you should use --service instead for example. Need to review it across all of them and either remove it or clarify what it does.

datasette 107914493 issue    
646448486 MDExOlB1bGxSZXF1ZXN0NDQwNzM1ODE0 868 initial windows ci setup joshmgrant 702729 open 0     3 2020-06-26T18:49:13Z 2020-07-26T01:29:00Z   FIRST_TIME_CONTRIBUTOR simonw/datasette/pulls/868

Picking up the work done on #557 with a new PR. Seeing if I can get this working.

datasette 107914493 pull    
665400224 MDU6SXNzdWU2NjU0MDAyMjQ= 906 "allow": true for anyone, "allow": false for nobody simonw 9599 closed 0   Datasette 0.46 5607421 3 2020-07-24T20:28:10Z 2020-07-25T00:07:10Z 2020-07-25T00:05:04Z OWNER  

The "allow" syntax described at https://datasette.readthedocs.io/en/0.45/authentication.html#defining-permissions-with-allow-blocks currently says this:

An allow block can specify "no-one is allowed to do this" using an empty {}:

{ "allow": {} }

"allow": null allows all access, though this isn't documented (it should be though).

These are not very intuitive. How about also supporting "allow": true for "allow anyone" and "allow": false for "allow nobody"?
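The proposal plus the existing cases could be interpreted like this (a sketch of the suggested semantics, not Datasette's implementation):

```python
def allow_means_anyone(allow):
    # Interpret an "allow" value, combining current behavior with the
    # proposed true/false shortcuts:
    #   None -> anyone, {} -> nobody, True -> anyone, False -> nobody.
    if allow is None or allow is True:
        return True
    if allow is False or allow == {}:
        return False
    return None  # a real allow block: defer to actor matching
```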

datasette 107914493 issue    
665407663 MDU6SXNzdWU2NjU0MDc2NjM= 908 Interactive debugging tool for "allow" blocks simonw 9599 closed 0   Datasette 0.46 5607421 3 2020-07-24T20:43:44Z 2020-07-25T00:06:15Z 2020-07-24T22:56:52Z OWNER  

It might be good to have a little interactive tool which helps debug these things, since there are quite a few edge-cases and the damage caused if people use them incorrectly is substantial.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/907#issuecomment-663726146_

datasette 107914493 issue    
665403403 MDU6SXNzdWU2NjU0MDM0MDM= 907 Allow documentation doesn't explain what happens with multiple allow keys simonw 9599 closed 0   Datasette 0.46 5607421 2 2020-07-24T20:34:40Z 2020-07-24T22:53:07Z 2020-07-24T22:53:07Z OWNER  

Documentation here: https://datasette.readthedocs.io/en/0.45/authentication.html#defining-permissions-with-allow-blocks

Doesn't explain that with the following "allow" block:

{
  "allow": {
    "id": "simonw",
    "role": "staff"
  }
}

The rule will match if EITHER the id is simonw OR the role includes staff.

The tests are missing this case too: https://github.com/simonw/datasette/blob/028f193dd6233fa116262ab4b07b13df7dcec9be/tests/test_utils.py#L504

Related to #906
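A sketch of the any-key-matches semantics described above (illustrative only, not the actual actor_matches_allow() implementation):

```python
def actor_matches_allow(actor, allow):
    # The block matches if ANY key matches: "id" of simonw OR a role
    # that includes staff. List values on either side are allowed.
    for key, allowed in allow.items():
        if not isinstance(allowed, list):
            allowed = [allowed]
        value = (actor or {}).get(key)
        values = value if isinstance(value, list) else [value]
        if set(allowed) & set(values):
            return True
    return False
```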

datasette 107914493 issue    
442327592 MDU6SXNzdWU0NDIzMjc1OTI= 456 Installing installs the tests package hellerve 7725188 closed 0     3 2019-05-09T16:35:16Z 2020-07-24T20:39:54Z 2020-07-24T20:39:54Z CONTRIBUTOR  

Because setup.py uses find_packages and tests is on the top-level, pip install datasette will install a top-level package called tests, which is probably not desired behavior.

The offending line is here:
https://github.com/simonw/datasette/blob/bfa2ae0d16d39bb82dbe4da4f3fdc3c7f6257418/setup.py#L40

And only pip uninstall datasette with a conflicting package would warn you by default; apparently another package had the same problem, which is why I get this message when uninstalling:

$ pip uninstall datasette
Uninstalling datasette-0.27:
  Would remove:
    /usr/local/bin/datasette
    /usr/local/lib/python3.7/site-packages/datasette-0.27.dist-info/*
    /usr/local/lib/python3.7/site-packages/datasette/*
    /usr/local/lib/python3.7/site-packages/tests/*
  Would not remove (might be manually added):
    [ .. snip .. ]
Proceed (y/n)? 

This should be a relatively simple fix, and I could drop a PR if desired!

Cheers

datasette 107914493 issue    
662439034 MDExOlB1bGxSZXF1ZXN0NDUzOTk1MTc5 902 Don't install tests package abeyerpath 32467826 closed 0     2 2020-07-21T01:08:50Z 2020-07-24T20:39:54Z 2020-07-24T20:39:54Z CONTRIBUTOR simonw/datasette/pulls/902

The exclude argument to find_packages needs an iterable of package
names.

Fixes: #456
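The fix boils down to passing an exclude iterable to find_packages(). This throwaway layout (built in a temp directory; requires setuptools) demonstrates the difference:

```python
import os
import tempfile
from setuptools import find_packages

# Build a throwaway layout with a top-level tests/ package, to show
# what find_packages() collects with and without the exclusion.
project = tempfile.mkdtemp()
for pkg in ("datasette", "tests"):
    os.makedirs(os.path.join(project, pkg))
    open(os.path.join(project, pkg, "__init__.py"), "w").close()

everything = set(find_packages(project))
without_tests = set(find_packages(project, exclude=("tests", "tests.*")))
```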

datasette 107914493 pull    
663317875 MDU6SXNzdWU2NjMzMTc4NzU= 905 /database.db download should include content-length header simonw 9599 closed 0     2 2020-07-21T21:23:48Z 2020-07-22T04:59:46Z 2020-07-22T04:52:45Z OWNER  

I can do this by modifying this function: https://github.com/simonw/datasette/blob/02dc6298bdbfb1d63e0d2a39ff597b5fcc60e06b/datasette/utils/asgi.py#L248-L270

datasette 107914493 issue    
663228985 MDU6SXNzdWU2NjMyMjg5ODU= 904 Make database_url and other helpers available within .render_template() for plugins simonw 9599 open 0     0 2020-07-21T18:42:52Z 2020-07-21T18:43:05Z   OWNER  

I tried using this block of template in a plugin and got an error:

{% block nav %}
    <p class="crumbs">
        <a href="{{ base_url }}">home</a> /
        <a href="{{ database_url(database) }}">{{ database }}</a> /
        <a href="{{ database_url(database) }}/{{ table|quote_plus }}">{{ table }}</a>
    </p>
    {{ super() }}
{% endblock %}

Error: 'database_url' is undefined

That's because database_url is only made available by the BaseView template here:

https://github.com/simonw/datasette/blob/d6e03b04302a0852e7133dc030eab50177c37be7/datasette/views/base.py#L110-L125
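A plugin can work around this today by re-creating the helper itself — here as the body of a hypothetical extra_template_vars hook (in a real plugin the function would be decorated with datasette's hookimpl; the config("base_url") lookup and the URL shape are assumptions about Datasette 0.45 internals, not a stable API):

```python
def make_template_vars(datasette):
    # Body of a hypothetical extra_template_vars plugin hook: expose a
    # database_url helper to the plugin's own templates. The base_url
    # lookup and URL shape are assumptions, not guaranteed internals.
    def database_url(database):
        return datasette.config("base_url") + database
    return {"database_url": database_url}
```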

datasette 107914493 issue    
663145122 MDU6SXNzdWU2NjMxNDUxMjI= 903 Add temporary plugin testing pattern to the testing docs simonw 9599 open 0     0 2020-07-21T16:22:34Z 2020-07-21T16:22:45Z   OWNER  

https://github.com/simonw/til/blob/master/pytest/registering-plugins-in-tests.md

Would be useful to include this pattern on https://datasette.readthedocs.io/en/stable/testing_plugins.html

datasette 107914493 issue    
442402832 MDExOlB1bGxSZXF1ZXN0Mjc3NTI0MDcy 458 setup: add tests to package exclusion hellerve 7725188 closed 0     1 2019-05-09T19:47:21Z 2020-07-21T01:14:42Z 2019-05-10T01:54:51Z CONTRIBUTOR simonw/datasette/pulls/458

This PR fixes #456 by adding tests to the package exclusion list.

Cheers

datasette 107914493 pull    


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
, [active_lock_reason] TEXT, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issues_repo]
                ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
                ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
                ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
                ON [issues] ([user]);
Powered by Datasette · Query took 221.208ms · About: github-to-sqlite