issues

1,332 rows sorted by updated_at descending

id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association pull_request body repo type active_lock_reason performed_via_github_app
712368432 MDU6SXNzdWU3MTIzNjg0MzI= 984 Review accessibility of new column action menus simonw 9599 open 0     1 2020-09-30T23:56:44Z 2020-10-01T00:01:36Z   OWNER  

Feature added in #981

datasette 107914493 issue    
711627628 MDU6SXNzdWU3MTE2Mjc2Mjg= 981 Action menu for table columns simonw 9599 closed 0     16 2020-09-30T04:45:38Z 2020-09-30T23:58:18Z 2020-09-30T23:58:17Z OWNER  

At the very least I'd like a menu on each table column that lets me select sort-asc vs. sort-desc without having to click twice.

I'd also like to be able to indicate that a column should be used for faceting (possibly only for columns that are not floating point and do not have a unique index on them).

This needs to be built with accessibility in mind - I don't want screenreaders to read out the contents of a menu as the "th" label for any given cell.

Related: #690

datasette 107914493 issue    
712316959 MDU6SXNzdWU3MTIzMTY5NTk= 183 Try out GitHub code scanning simonw 9599 closed 0     1 2020-09-30T22:16:14Z 2020-09-30T22:23:44Z 2020-09-30T22:23:44Z OWNER  

https://github.blog/2020-09-30-code-scanning-is-now-available/

sqlite-utils 140912432 issue    
326783670 MDU6SXNzdWUzMjY3ODM2NzA= 291 Avoid plugins accidentally loading dependencies twice simonw 9599 closed 0     3 2018-05-27T03:15:21Z 2020-09-30T20:36:12Z 2018-05-28T20:42:02Z OWNER  

Plugins that include JavaScript files risk loading the same code twice. In particular: I want to build a second plugin that uses the Leaflet mapping library (the first was datasette-cluster-map). But I don't want the two plugins to load duplicate copies of Leaflet.

datasette 107914493 issue    
542553350 MDU6SXNzdWU1NDI1NTMzNTA= 655 Copy and paste doesn't work reliably on iPhone for SQL editor simonw 9599 closed 0   Datasette 1.0 3268330 3 2019-12-26T13:15:10Z 2020-09-30T20:36:12Z 2020-08-30T17:51:40Z OWNER  

I'm having a lot of trouble copying and pasting from the codemirror editor on my iPhone.

datasette 107914493 issue    
679700269 MDU6SXNzdWU2Nzk3MDAyNjk= 938 Pass columns to extra CSS/JS/etc plugin hooks simonw 9599 closed 0     3 2020-08-16T06:37:47Z 2020-09-30T20:36:12Z 2020-08-16T18:09:59Z OWNER  

I'd like datasette-cluster-map to only add links to JavaScript on pages that have tables with latitude and longitude columns.

Passing the names of the columns to the plugin hook can support this and will be backwards compatible thanks to pluggy.

datasette 107914493 issue    
684925907 MDU6SXNzdWU2ODQ5MjU5MDc= 948 Upgrade CodeMirror simonw 9599 closed 0   Datasette 0.49 5818042 8 2020-08-24T19:55:33Z 2020-09-30T20:36:12Z 2020-08-30T18:03:07Z OWNER  

Datasette currently bundles 5.31.0 (from October 2017) - latest version is 5.57.0 (August 2020). https://codemirror.net/doc/releases.html

datasette 107914493 issue    
314506446 MDU6SXNzdWUzMTQ1MDY0NDY= 214 Ability for plugins to define extra JavaScript and CSS simonw 9599 closed 0     6 2018-04-16T05:29:34Z 2020-09-30T20:36:11Z 2018-04-18T03:13:03Z OWNER  

This can hook in to the existing extra_css_urls and extra_js_urls mechanism:

https://github.com/simonw/datasette/blob/b2955d9065ea019500c7d072bcd9d49d1967f051/datasette/app.py#L304-L305

The plugins should be able to bundle their own assets though, so it will also have to integrate with the /static/ static mounts mechanism somehow:

https://github.com/simonw/datasette/blob/b2955d9065ea019500c7d072bcd9d49d1967f051/datasette/app.py#L1255-L1257

Refs #14

datasette 107914493 issue    
460540321 MDU6SXNzdWU0NjA1NDAzMjE= 530 Extract codemirror SQL editor out into a plugin simonw 9599 open 0     0 2019-06-25T17:07:51Z 2020-09-30T20:35:25Z   OWNER  

Right now codemirror (used for the SQL editor on https://latest.datasette.io/fixtures?sql=select+*+from+%5B123_starts_with_digits%5D ) is the only JavaScript in Datasette.

It's also the only vendored dependency.

I'd like to move it out to a plugin. But... ideally I would like that plugin to be part of the default "pip install datasette" experience.

I don't know what the best pattern for optional dependencies is. I don't want to have to tell people to run pip install datasette[full]

datasette 107914493 issue    
712260429 MDU6SXNzdWU3MTIyNjA0Mjk= 983 JavaScript plugin hooks mechanism similar to pluggy simonw 9599 open 0     1 2020-09-30T20:32:43Z 2020-09-30T20:35:25Z   OWNER  

It would be neat to provide a JavaScript plugin hook that plugins can use to add their own options to this menu. No idea what that would look like though.

_Originally posted by @simonw in https://github.com/simonw/datasette/issues/981#issuecomment-701616922_

datasette 107914493 issue    
709043182 MDExOlB1bGxSZXF1ZXN0NDkzMTYyNzY3 178 Update README.md shakeel 19921 closed 0     1 2020-09-25T15:52:11Z 2020-09-30T20:29:28Z 2020-09-30T20:29:28Z CONTRIBUTOR simonw/sqlite-utils/pulls/178

The "sqlite-utils insert releases.db releases - --pk" example is missing the pk field name; added "id" to fix it.

sqlite-utils 140912432 pull    
711649325 MDU6SXNzdWU3MTE2NDkzMjU= 182 Better handling of encodings other than utf-8 for "sqlite-utils insert" kaihendry 765871 open 0     1 2020-09-30T05:43:48Z 2020-09-30T20:28:23Z   NONE  

Makefile:

data.db:
        curl -O http://maps.natalian.org/data.txt
        go run csv-write.go > data.csv
        sqlite-utils insert data.db travels data.csv --csv

clean:
        rm data*

csv-write.go

Error message is:

sqlite-utils insert data.db travels data.csv --csv
Traceback (most recent call last):
  File "/home/hendry/.local/bin/sqlite-utils", line 8, in <module>
    sys.exit(cli())
  File "/home/hendry/.local/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/hendry/.local/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/hendry/.local/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/hendry/.local/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/hendry/.local/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/hendry/.local/lib/python3.8/site-packages/sqlite_utils/cli.py", line 614, in insert
    insert_upsert_implementation(
  File "/home/hendry/.local/lib/python3.8/site-packages/sqlite_utils/cli.py", line 553, in insert_upsert_implementation
    headers = next(reader)
  File "/usr/lib/python3.8/codecs.py", line 322, in decode
    (result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe3 in position 1234: invalid continuation byte
make: *** [Makefile:4: data.db] Error 1
[hendry@t14s datasette-map]$ sqlite-utils --version
sqlite-utils, version 2.19

Little bit surprised if Go is spewing out bad Unicode, but I'm not sure how to grok position 1234.
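
As a stopgap until insert grows better encoding handling, re-encoding the file first works; a minimal sketch, assuming the data is actually Latin-1 (the 0xe3 byte suggests a single-byte encoding rather than UTF-8):

# re-encode data.csv so "sqlite-utils insert ... --csv" can read it as utf-8
with open("data.csv", "r", encoding="latin-1") as source:
    text = source.read()
with open("data-utf8.csv", "w", encoding="utf-8") as dest:
    dest.write(text)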

sqlite-utils 140912432 issue    
712202333 MDU6SXNzdWU3MTIyMDIzMzM= 982 SQL editor should allow execution of write queries, if you have permission simonw 9599 open 0     2 2020-09-30T19:04:35Z 2020-09-30T19:06:29Z   OWNER  

The datasette-write plugin provides this at the moment https://github.com/simonw/datasette-write - but it feels like it should be a built-in capability, protected by a default permission.

UI concept: if you have write permission then the existing SQL editor gets an "execute write" checkbox underneath it.

JavaScript can spot if you appear to be trying to execute an UPDATE or INSERT or DELETE query and check that checkbox for you.

If you link to a query page with a non-SELECT then that query will be displayed in the box ready for you to POST submit it. The page will also then get "cannot be embedded" headers to protect against clickjacking.
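
The detection itself is a simple prefix check; a sketch of the idea in Python (the real check would live in the client-side JavaScript):

import re

# does this SQL look like a write query? (sketch - not shipped code)
WRITE_RE = re.compile(r"^\s*(insert|update|delete|replace|create|drop|alter)\b", re.I)

def looks_like_write(sql):
    return bool(WRITE_RE.match(sql))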

datasette 107914493 issue    
710819020 MDU6SXNzdWU3MTA4MTkwMjA= 980 Another rendering glitch with column headers on mobile simonw 9599 closed 0     2 2020-09-29T06:53:13Z 2020-09-29T19:21:51Z 2020-09-29T19:21:50Z OWNER  

Similar to #978.

https://latest-with-plugins.datasette.io/fixtures?sql=select%0D%0A++dateutil_rrule(%27FREQ%3DHOURLY%3BCOUNT%3D5%27)%2C%0D%0A++dateutil_rrule_date(%0D%0A++++%27FREQ%3DDAILY%3BCOUNT%3D3%27%2C%0D%0A++++%271st+jan+2020%27%0D%0A++)%3B

https://user-images.githubusercontent.com/9599/94523237-c0e05d00-01e5-11eb-880d-5535f43f07a5.png

datasette 107914493 issue    
710650633 MDU6SXNzdWU3MTA2NTA2MzM= 979 Default table view JSON should include CREATE TABLE simonw 9599 open 0     2 2020-09-28T23:54:58Z 2020-09-28T23:56:28Z   OWNER  

https://latest.datasette.io/fixtures/facetable.json doesn't currently include the CREATE TABLE statement for the page, even though it's available on the HTML version at https://latest.datasette.io/fixtures/facetable

datasette 107914493 issue    
710506708 MDU6SXNzdWU3MTA1MDY3MDg= 978 Rendering glitch with column headings on mobile simonw 9599 closed 0     6 2020-09-28T19:04:45Z 2020-09-28T22:43:01Z 2020-09-28T22:43:01Z OWNER  

https://latest-with-plugins.datasette.io/fixtures?sql=select%0D%0A++dateutil_parse%28%2210+october+2020+3pm%22%29%2C%0D%0A++dateutil_easter%28%222020%22%29%2C%0D%0A++dateutil_parse_fuzzy%28%22This+is+due+10+september%22%29%2C%0D%0A++dateutil_parse%28%221%2F2%2F2020%22%29%2C%0D%0A++dateutil_parse%28%222020-03-04%22%29%2C%0D%0A++dateutil_parse_dayfirst%28%222020-03-04%22%29%2C%0D%0A++dateutil_easter%282020%29

datasette 107914493 issue    
710269200 MDExOlB1bGxSZXF1ZXN0NDk0MTQ2MDQz 977 Update pytest requirement from <6.1.0,>=5.2.2 to >=5.2.2,<6.2.0 dependabot-preview[bot] 27856297 closed 0     1 2020-09-28T13:33:05Z 2020-09-28T22:16:36Z 2020-09-28T22:16:35Z CONTRIBUTOR simonw/datasette/pulls/977

Updates the requirements on pytest to permit the latest version.


Release notes

Sourced from pytest's releases (https://github.com/pytest-dev/pytest/releases):

pytest 6.1.0 (2020-09-26)

Breaking Changes

  • #5585 (https://github-redirect.dependabot.com/pytest-dev/pytest/issues/5585): As per our policy, the following features which have been deprecated in the 5.X series are now removed:

    • The funcargnames read-only property of FixtureRequest, Metafunc, and Function classes. Use fixturenames attribute.

    • @pytest.fixture no longer supports positional arguments, pass all arguments by keyword instead.

    • Direct construction of Node subclasses now raise an error, use from_parent instead.

    • The default value for junit_family has changed to xunit2. If you require the old format, add junit_family=xunit1 to your configuration file.

    • The TerminalReporter no longer has a writer attribute. Plugin authors may use the public functions of the TerminalReporter instead of accessing the TerminalWriter object directly.

    • The --result-log option has been removed. Users are recommended to use the pytest-reportlog plugin (https://github.com/pytest-dev/pytest-reportlog) instead.

    For more information consult Deprecations and Removals in the docs: https://docs.pytest.org/en/stable/deprecations.html

datasette 107914493 pull    
709920027 MDU6SXNzdWU3MDk5MjAwMjc= 181 pk=["id"] should have same effect as pk="id" simonw 9599 open 0     1 2020-09-28T04:28:07Z 2020-09-28T04:29:23Z   OWNER  
In [11]: db['one'].insert({"id": 1, "name": "oentuh"}, pk="id")
Out[11]: <Table one (id, name)>

In [12]: db['two'].insert({"id": 1, "name": "oentuh"}, pk=["id"])
Out[12]: <Table two (id, name)>

In [13]: db['one'].schema
Out[13]: 'CREATE TABLE [one] (\n   [id] INTEGER PRIMARY KEY,\n   [name] TEXT\n)'

In [14]: db['two'].schema
Out[14]: 'CREATE TABLE [two] (\n   [id] INTEGER,\n   [name] TEXT\n)'
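
A minimal sketch of the normalization this would need (illustrative, not the library's actual code):

def normalize_pk(pk):
    # treat a single-element list like the bare column name,
    # so pk=["id"] creates the same PRIMARY KEY as pk="id"
    if isinstance(pk, (list, tuple)) and len(pk) == 1:
        return pk[0]
    return pk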
sqlite-utils 140912432 issue    
709861194 MDU6SXNzdWU3MDk4NjExOTQ= 180 Try running some tests using Hypothesis simonw 9599 open 0     1 2020-09-28T01:11:30Z 2020-09-28T01:11:45Z   OWNER  

Inspired by this Twitter conversation: https://twitter.com/simonw/status/1310386009465479168

sqlite-utils 140912432 issue    
642388564 MDU6SXNzdWU2NDIzODg1NjQ= 858 publish heroku does not work on Windows 10 simonlau 870912 open 0     2 2020-06-20T14:40:28Z 2020-09-27T21:23:05Z   NONE  

When executing "datasette publish heroku schools.db" on Windows 10, I get the following error

  File "c:\users\dell\.virtualenvs\sec-schools-jn-cwk8z\lib\site-packages\datasette\publish\heroku.py", line 54, in heroku
    line.split()[0] for line in check_output(["heroku", "plugins"]).splitlines()
  File "c:\python38\lib\subprocess.py", line 411, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "c:\python38\lib\subprocess.py", line 489, in run
    with Popen(*popenargs, **kwargs) as process:
  File "c:\python38\lib\subprocess.py", line 854, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "c:\python38\lib\subprocess.py", line 1307, in _execute_child
    hp, ht, pid, tid = _winapi.CreateProcess(executable, args,
FileNotFoundError: [WinError 2] The system cannot find the file specified

Changing https://github.com/simonw/datasette/blob/55a6ffb93c57680e71a070416baae1129a0243b8/datasette/publish/heroku.py#L54

to

line.split()[0] for line in check_output(["heroku", "plugins"], shell=True).splitlines()

as well as the other check_output() and call() invocations within the same file, leads me to another recursive error about temp files.

datasette 107914493 issue    
709577625 MDU6SXNzdWU3MDk1Nzc2MjU= 179 sqlite-utils transform/insert --detect-types simonw 9599 open 0     3 2020-09-26T17:28:55Z 2020-09-27T20:31:50Z   OWNER  

Idea from https://github.com/simonw/datasette-edit-tables/issues/13 - provide Python utility methods and accompanying CLI options for detecting the likely types of TEXT columns.

So if you have a text column that actually contained exclusively integer string values, it can let you know and let you run transform against it.
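
A rough sketch of what the detection could look like (purely illustrative; values arrive as strings from TEXT columns):

def detect_type(values):
    # return the narrowest SQLite type that every non-empty value fits
    def all_parse(cast):
        non_empty = [v for v in values if v not in (None, "")]
        if not non_empty:
            return False
        for value in non_empty:
            try:
                cast(value)
            except ValueError:
                return False
        return True

    if all_parse(int):
        return "INTEGER"
    if all_parse(float):
        return "FLOAT"
    return "TEXT"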

sqlite-utils 140912432 issue    
709789634 MDU6SXNzdWU3MDk3ODk2MzQ= 27 Sort order is not persisted by facet filter links simonw 9599 open 0     0 2020-09-27T18:22:07Z 2020-09-27T18:22:07Z   MEMBER  

A link to /-/beta?category=1&timestamp__date=2018-08-01&q=swedish should be to /-/beta?category=1&timestamp__date=2018-08-01&q=swedish&sort=newest

dogsheep-beta 197431109 issue    
708185405 MDU6SXNzdWU3MDgxODU0MDU= 975 Dependabot couldn't authenticate with https://pypi.python.org/simple/ dependabot-preview[bot] 27856297 closed 0     0 2020-09-24T13:44:40Z 2020-09-25T13:34:34Z 2020-09-25T13:34:34Z CONTRIBUTOR  

Dependabot couldn't authenticate with https://pypi.python.org/simple/.

You can provide authentication details in your Dependabot dashboard by clicking into the account menu (in the top right) and selecting 'Config variables'.

View the update logs.

datasette 107914493 issue    
675753042 MDU6SXNzdWU2NzU3NTMwNDI= 131 "insert" command options for column types simonw 9599 open 0     1 2020-08-09T18:59:11Z 2020-09-24T22:48:32Z   OWNER  

The insert command currently results in string types for every column - at least when used against CSV or TSV inputs.

It would be useful if you could do the following:

  • automatically detects the column types based on eg the first 1000 records
  • explicitly state the rule for specific columns

--detect-types could work for the former - or it could do that by default and allow opt-out using --no-detect-types

For specific columns maybe this:

sqlite-utils insert db.db images images.tsv \
  --tsv \
  -c id int \
  -c score float
sqlite-utils 140912432 issue    
683812642 MDU6SXNzdWU2ODM4MTI2NDI= 136 --spatialite option for sqlite-utils query simonw 9599 open 0     2 2020-08-21T20:31:25Z 2020-09-24T22:47:29Z   OWNER  

In conjunction with #135 - this would do the same thing as --load-extension=path-to-spatialite (see #134)

sqlite-utils 140912432 issue    
683830416 MDU6SXNzdWU2ODM4MzA0MTY= 137 --load-extension for other sqlite-utils commands simonw 9599 open 0     1 2020-08-21T21:12:56Z 2020-09-24T22:47:29Z   OWNER  

e.g. for this:

calands-datasette % sqlite-utils tables calands.db --counts
[{"table": "spatial_ref_sys", "count": 4924},
 {"table": "spatialite_history", "count": 14},
 {"table": "sqlite_sequence", "count": 1},
 {"table": "geometry_columns", "count": 2},
 {"table": "spatial_ref_sys_aux", "count": 4873},
 {"table": "views_geometry_columns", "count": 0},
 {"table": "virts_geometry_columns", "count": 0},
 {"table": "geometry_columns_statistics", "count": 2},
 {"table": "views_geometry_columns_statistics", "count": 0},
 {"table": "virts_geometry_columns_statistics", "count": 0},
 {"table": "geometry_columns_field_infos", "count": 0},
 {"table": "views_geometry_columns_field_infos", "count": 0},
 {"table": "virts_geometry_columns_field_infos", "count": 0},
 {"table": "geometry_columns_time", "count": 2},
 {"table": "geometry_columns_auth", "count": 2},
 {"table": "views_geometry_columns_auth", "count": 0},
 {"table": "virts_geometry_columns_auth", "count": 0},
Traceback (most recent call last):
  File "/usr/local/bin/sqlite-utils", line 8, in <module>
    sys.exit(cli())
  File "/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/cli.py", line 143, in tables
    for line in output_rows(_iter(), headers, nl, arrays, json_cols):
  File "/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/cli.py", line 922, in output_rows
    for row, next_row in itertools.zip_longest(current_iter, next_iter):
  File "/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/cli.py", line 123, in _iter
    row.append(db[name].count)
  File "/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/db.py", line 458, in count
    return self.db.conn.execute(
sqlite3.OperationalError: no such module: VirtualSpatialIndex

The tables command could take --load-extension too - as could rows and other similar commands.

Follow-on from #134

sqlite-utils 140912432 issue    
684118950 MDU6SXNzdWU2ODQxMTg5NTA= 138 extracts= doesn't configure foreign keys simonw 9599 closed 0     2 2020-08-23T05:21:15Z 2020-09-24T22:47:01Z 2020-09-24T22:46:52Z OWNER  

In using extracts= for shapefiles-to-sqlite in https://github.com/simonw/shapefile-to-sqlite/issues/9 I've run into a couple of pretty serious flaws:

  • The columns in the original table are still TEXT even when the foreign key they are supposed to reference is an INTEGER - which means Datasette foreign key features don't actually work
  • Those foreign key relationships aren't setup automatically - creating them is left as an exercise for the developer
sqlite-utils 140912432 issue    
707478649 MDU6SXNzdWU3MDc0Nzg2NDk= 173 Progress bar for sqlite-utils insert simonw 9599 open 0     4 2020-09-23T15:43:56Z 2020-09-24T20:50:19Z   OWNER  

It would be nice if sqlite-utils insert had a progress bar, for when it's churning through huge CSV files.

sqlite-utils 140912432 issue    
652700770 MDU6SXNzdWU2NTI3MDA3NzA= 119 Ability to remove a foreign key simonw 9599 closed 0     3 2020-07-07T22:31:37Z 2020-09-24T20:36:59Z 2020-09-24T20:36:59Z OWNER  

Useful if you add one but make a mistake and need to undo it without recreating the database from scratch.

sqlite-utils 140912432 issue    
581795570 MDU6SXNzdWU1ODE3OTU1NzA= 93 Support more string values for types in .add_column() simonw 9599 open 0     0 2020-03-15T19:32:49Z 2020-09-24T20:36:46Z   OWNER  

https://sqlite-utils.readthedocs.io/en/2.4.2/python-api.html#adding-columns says:

SQLite types you can specify are "TEXT", "INTEGER", "FLOAT" or "BLOB".

As discovered in #92 this isn't the right list of values. I should expand this to match https://www.sqlite.org/datatype3.html

sqlite-utils 140912432 issue    
652961907 MDU6SXNzdWU2NTI5NjE5MDc= 121 Improved (and better documented) support for transactions simonw 9599 open 0     3 2020-07-08T04:56:51Z 2020-09-24T20:36:46Z   OWNER  

_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/pull/118#issuecomment-655283393_

We should put some thought into how this library supports and encourages smart use of transactions.

sqlite-utils 140912432 issue    
573578548 MDU6SXNzdWU1NzM1Nzg1NDg= 89 Ability to customize columns used by extracts= feature simonw 9599 open 0     2 2020-03-01T16:54:48Z 2020-09-24T20:36:45Z   OWNER  

@simonw any thoughts on allow extracts to specify the lookup column name? If I'm understanding the documentation right, .lookup() allows you to define the "value" column (the documentation uses name), but when you use extracts keyword as part of .insert(), .upsert() etc. the lookup must be done against a column named "value". I have an existing lookup table that I've populated with columns "id" and "name" as opposed to "id" and "value", and seems I can't use extracts=, unless I'm missing something...

Initial thought on how to do this would be to allow the dictionary value to be a (table name, column name) tuple... so:

table = db.table("trees", extracts={"species_id": ("Species", "name")})

I haven't dug too much into the existing code yet, but does this make sense? Worth doing?

_Originally posted by @chrishas35 in https://github.com/simonw/sqlite-utils/issues/46#issuecomment-592999503_

sqlite-utils 140912432 issue    
688352145 MDU6SXNzdWU2ODgzNTIxNDU= 141 insert-files support for compressed values simonw 9599 open 0     0 2020-08-28T20:59:46Z 2020-09-24T20:36:08Z   OWNER  

The sqlar format supports this, it would be useful if insert-files could support this too.

https://www.sqlite.org/sqlar.html

sqlite-utils 140912432 issue    
688351054 MDU6SXNzdWU2ODgzNTEwNTQ= 140 Idea: insert-files mechanism for adding extra columns with fixed values simonw 9599 open 0     0 2020-08-28T20:57:36Z 2020-09-24T20:36:07Z   OWNER  

Say for example you want to populate a file_type column with the value gif. That could work like this:

sqlite-utils insert-files gifs.db images *.gif \
    -c path -c md5 -c last_modified:mtime \
    -c file_type:text:gif --pk=path

So a column defined as a text column with a value that follows a second colon.
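
A sketch of how that -c value could be parsed (hypothetical helper name, not shipped code):

def parse_column_spec(spec):
    # "file_type:text:gif" -> ("file_type", "text", "gif")
    # "md5"                -> ("md5", None, None)
    parts = spec.split(":", 2)
    name = parts[0]
    column_type = parts[1] if len(parts) > 1 else None
    fixed_value = parts[2] if len(parts) > 2 else None
    return name, column_type, fixed_value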

sqlite-utils 140912432 issue    
708261775 MDU6SXNzdWU3MDgyNjE3NzU= 175 Add docs for .transform(column_order=) simonw 9599 closed 0     3 2020-09-24T15:19:04Z 2020-09-24T20:35:48Z 2020-09-24T16:00:56Z OWNER  

Need to update docs for .transform() now that column_order= is available.
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/pull/174#discussion_r494403327_

Maybe also add this as an option to sqlite-utils transform - since reordering columns is actually a pretty nice capability.

sqlite-utils 140912432 issue    
702386948 MDU6SXNzdWU3MDIzODY5NDg= 159 .delete_where() does not auto-commit (unlike .insert() or .upsert()) spdkils 11712349 open 0     6 2020-09-16T01:55:52Z 2020-09-24T20:35:47Z   NONE  

When you use the delete_where() function on a table, it never commits....

Is that intentional?

sqlite-utils 140912432 issue    
705190723 MDU6SXNzdWU3MDUxOTA3MjM= 160 table.enable_fts(..., replace=True) simonw 9599 closed 0   2.19 5896742 1 2020-09-20T21:36:23Z 2020-09-24T20:35:47Z 2020-09-20T22:05:51Z OWNER  

I noticed that https://til.simonwillison.net/ search doesn't use porter stemming. I'd like to add that, but since the build script always operates on an existing database (to avoid re-rendering markdown and re-building image thumbnails) I'd like it to only add porter stemming if it's not there already.

So I'd like to be able to say "set up FTS to look like this, and fix it if it doesn't".

I think the neatest way to do that is with a replace=True argument to .enable_fts(), for consistency with .create_view(name, sql, replace=True).

So the replace=True argument would check and see if the configured FTS exists already with the correct options (columns, stemming, triggers) - and if any of those are incorrect it would call .disable_fts() and then create a new FTS configuration with the correct options.
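
Usage would then look something like this - a sketch of the proposed API, where the tokenize= keyword is an assumption:

db["til"].enable_fts(
    ["title", "body"],
    tokenize="porter",      # assumed keyword for the stemming option
    create_triggers=True,
    replace=True,           # tear down and recreate if the existing config differs
)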

sqlite-utils 140912432 issue    
706001517 MDU6SXNzdWU3MDYwMDE1MTc= 163 Idea: conversions= could take Python functions simonw 9599 open 0     1 2020-09-22T00:37:12Z 2020-09-24T20:35:47Z   OWNER  

Right now you use conversions= like this:

db["example"].insert({
    "name": "The Bigfoot Discovery Museum"
}, conversions={"name": "upper(?)"})

How about if you could optionally provide a Python function (or a lambda) like this?

db["example"].insert({
    "name": "The Bigfoot Discovery Museum"
}, conversions={"name": lambda s: s.upper()})

This would work by creating a random name for that function, registering it (similar to #162), executing the SQL and then un-registering the custom function at the end.
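
A rough sketch of that mechanism (function and variable names are illustrative, not the library's API):

import random
import string

def insert_with_callable_conversion(db, table_name, record, column, fn):
    # register the callable under a throwaway name, reference it from
    # conversions=, then forget it (un-registration would happen here too)
    fn_name = "conv_" + "".join(random.choices(string.ascii_lowercase, k=8))
    db.conn.create_function(fn_name, 1, fn)
    db[table_name].insert(record, conversions={column: "{}(?)".format(fn_name)})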

sqlite-utils 140912432 issue    
706091046 MDU6SXNzdWU3MDYwOTEwNDY= 165 Make .transform() a keyword arguments only function simonw 9599 closed 0   2.20 5897911 0 2020-09-22T05:37:29Z 2020-09-24T20:35:47Z 2020-09-22T06:39:12Z OWNER  

And rename the first argument from columns= to types=

sqlite-utils 140912432 issue    
695377804 MDU6SXNzdWU2OTUzNzc4MDQ= 153 table.optimize() should delete junk rows from *_fts_docsize simonw 9599 closed 0     3 2020-09-07T20:31:09Z 2020-09-24T20:35:46Z 2020-09-07T21:16:33Z OWNER  

The second challenge here is cleaning up all of those junk rows in existing *_fts_docsize tables. Doing that just to the demo database from https://github-to-sqlite.dogsheep.net/github.db dropped its size from 22MB to 16MB! Here's the SQL:
DELETE FROM [licenses_fts_docsize] WHERE id NOT IN (
    SELECT rowid FROM [licenses_fts]);
I can do that as part of the existing table.optimize() method, which optimizes FTS tables.
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688501064_

sqlite-utils 140912432 issue    
696045581 MDU6SXNzdWU2OTYwNDU1ODE= 155 rebuild-fts command and table.rebuild_fts() method simonw 9599 closed 0     2 2020-09-08T17:19:26Z 2020-09-24T20:35:46Z 2020-09-08T23:16:10Z OWNER  

https://sqlite.org/forum/forumpost/fa777fff86

Easiest thing would be to run a 'rebuild' to rebuild the FTS index from scratch based on the contents of the content table. i.e.

INSERT INTO licenses_fts(licenses_fts) VALUES('rebuild');

https://www.sqlite.org/fts5.html#the_rebuild_command

sqlite-utils 140912432 issue    
708301810 MDU6SXNzdWU3MDgzMDE4MTA= 177 Simplify .transform(drop_foreign_keys=) and sqlite-transform --drop-foreign-key simonw 9599 closed 0     1 2020-09-24T16:13:50Z 2020-09-24T20:35:03Z 2020-09-24T16:19:13Z OWNER  

These both currently require you to provide three strings, for column, other_table, other_column.

Just providing column should be enough information.

sqlite-utils 140912432 issue    
708293114 MDU6SXNzdWU3MDgyOTMxMTQ= 176 sqlite-utils transform column order option simonw 9599 closed 0     2 2020-09-24T16:01:21Z 2020-09-24T20:34:51Z 2020-09-24T16:11:59Z OWNER  

Split from #175

sqlite-utils 140912432 issue    
697179806 MDU6SXNzdWU2OTcxNzk4MDY= 157 sqlite-utils add-foreign-keys command simonw 9599 closed 0   2.19 5896742 2 2020-09-09T21:44:30Z 2020-09-24T20:34:50Z 2020-09-20T20:14:30Z OWNER  

Like add-foreign-key but can do multiple foreign keys at once. Inspired by https://github.com/simonw/calands-datasette/blob/99de39dd80a906f5c1f16724467b0cd55ba4ef36/build.sh which does this:

sqlite-utils add-foreign-key calands.db units_with_maps ACCESS_TYP
sqlite-utils add-foreign-key calands.db units_with_maps AGNCY_NAME
sqlite-utils add-foreign-key calands.db units_with_maps AGNCY_LEV
sqlite-utils add-foreign-key calands.db units_with_maps AGNCY_TYP
sqlite-utils add-foreign-key calands.db units_with_maps LAYER
sqlite-utils add-foreign-key calands.db units_with_maps MNG_AGENCY
sqlite-utils add-foreign-key calands.db units_with_maps MNG_AG_LEV
sqlite-utils add-foreign-key calands.db units_with_maps MNG_AG_TYP
sqlite-utils add-foreign-key calands.db units_with_maps COUNTY
sqlite-utils add-foreign-key calands.db units_with_maps DES_TP
sqlite-utils 140912432 issue    
706017416 MDU6SXNzdWU3MDYwMTc0MTY= 164 sqlite-utils transform sub-command simonw 9599 closed 0   2.20 5897911 4 2020-09-22T01:32:20Z 2020-09-24T20:34:50Z 2020-09-22T07:48:05Z OWNER  

The .transform() method in #114 warrants an equivalent CLI tool.

sqlite-utils 140912432 issue    
706757891 MDU6SXNzdWU3MDY3NTc4OTE= 169 Progress bar for "sqlite-utils extract" simonw 9599 closed 0   2.20 5897911 0 2020-09-22T23:40:21Z 2020-09-24T20:34:50Z 2020-09-23T00:02:40Z OWNER  

Since these operations could take a long time against large tables, it would be neat if there was a progress bar option for the CLI command.

The operations are full table scans so calculating progress shouldn't be too difficult.
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/42#issuecomment-513246831_
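
Since the CLI is built on Click, click.progressbar over the row iterator would fit; a sketch with illustrative names:

import click

def copy_rows_with_progress(rows, total, handle_row):
    # rows: iterable of source rows; total: result of a COUNT(*) query
    with click.progressbar(rows, length=total, label="Extracting") as bar:
        for row in bar:
            handle_row(row)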

sqlite-utils 140912432 issue    
708289783 MDU6SXNzdWU3MDgyODk3ODM= 976 Idea: -o could open to a more convenient location simonw 9599 open 0     1 2020-09-24T15:56:35Z 2020-09-24T17:42:35Z   OWNER  

Idea: if a database only has a single table, this could open straight to /db/table. If it has multiple tables but a single database it could open straight to /db.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/970#issuecomment-698434236_
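
A sketch of the suggested routing logic (function name and data shape are hypothetical):

def browser_path(databases):
    # databases: mapping of database name -> list of table names
    if len(databases) == 1:
        name, tables = next(iter(databases.items()))
        if len(tables) == 1:
            return "/{}/{}".format(name, tables[0])
        return "/{}".format(name)
    return "/"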

datasette 107914493 issue    
705108492 MDU6SXNzdWU3MDUxMDg0OTI= 970 request an "-o" option on "datasette server" to open the default browser at the running url secretGeek 2861690 closed 0     4 2020-09-20T13:16:34Z 2020-09-24T15:56:50Z 2020-09-22T14:27:04Z NONE  

This is a request for a "convenience" feature, and only a nice to have. It's based on seeing this feature in several little command line hypertext server apps.

If you run, for example:

datasette.exe serve --open "mydb.s3db"

I would like it if the default browser is launched, at the URL that is being served.

The angular cli does this, for example

ng serve <project> --open #see https://angular.io/cli/serve

...as does my usual mini web server of choice when inspecting local static files....

npx http-server -o # see https://www.npmjs.com/package/http-server

Just a tiny thing. Love your work!

datasette 107914493 issue    
707944044 MDExOlB1bGxSZXF1ZXN0NDkyMjU3NDA1 174 Much, much faster extract() implementation simonw 9599 closed 0     7 2020-09-24T07:52:31Z 2020-09-24T15:44:00Z 2020-09-24T15:43:56Z OWNER simonw/sqlite-utils/pulls/174

Takes my test down from ten minutes to four seconds. Refs #172.

sqlite-utils 140912432 pull    
707427200 MDU6SXNzdWU3MDc0MjcyMDA= 172 Improve performance of extract operations simonw 9599 closed 0     9 2020-09-23T14:40:50Z 2020-09-24T15:43:57Z 2020-09-24T15:43:57Z OWNER  

This command took about 12 minutes (against a 150MB file with 680,000 rows in it):

sqlite-utils extract salaries.db salaries \
   'Organization Group Code' 'Organization Group' \
  --table 'organization_groups' \
  --fk-column 'organization_group_id' \
  --rename 'Organization Group Code' code \
  --rename 'Organization Group' name

I'm pretty confident we can do better than that.

sqlite-utils 140912432 issue    
275125561 MDU6SXNzdWUyNzUxMjU1NjE= 123 Datasette serve should accept paths/URLs to CSVs and other file formats simonw 9599 open 0     6 2017-11-19T02:05:48Z 2020-09-24T07:42:05Z   OWNER  

This would remove the csvs-to-sqlite step which I end up using for almost everything.

I'm hesitant to introduce pandas as a required dependency though since it requires compiling numpy. Could build it so this option is only available if you have pandas installed.

datasette 107914493 issue    
707849175 MDU6SXNzdWU3MDc4NDkxNzU= 974 static assets and favicon aren't cached by the browser obra 45416 open 0     1 2020-09-24T04:44:55Z 2020-09-24T04:52:58Z   NONE  

Using datasette to solve some frustrating problems with our fulfillment provider today, I was surprised to see repeated requests for assets under /-/static and the favicon. While it won't likely be a huge performance bottleneck, I bet datasette would feel a bit zippier if you had Uvicorn serving up some caching-related headers telling the browser it was safe to cache static assets.

datasette 107914493 issue    
520655983 MDU6SXNzdWU1MjA2NTU5ODM= 619 "Invalid SQL" page should let you edit the SQL simonw 9599 open 0     7 2019-11-10T20:54:12Z 2020-09-23T23:31:46Z   OWNER  

https://latest.datasette.io/fixtures?sql=select%0D%0A++*%0D%0Afrom%0D%0A++%5Bfoo%5D

Would be useful if this page showed you the invalid SQL you entered so you can edit it and try again.

datasette 107914493 issue    
274615452 MDU6SXNzdWUyNzQ2MTU0NTI= 111 Add “last_updated” to metadata simonw 9599 open 0     5 2017-11-16T18:22:20Z 2020-09-23T15:29:12Z   OWNER  

To give an indication as to when the data was last updated.

This should be a field in the metadata that is then shown on the index page and in the footer, if it is set.

Also support setting it using an option to “datasette publish” and “datasette package” - which can either be a string or can be the magic string “today” to set it to today’s date:

datasette publish file.db --last_updated=today
datasette 107914493 issue    
707407567 MDU6SXNzdWU3MDc0MDc1Njc= 171 Idea: transitive closure tables for tree structures mhalle 649467 open 0     0 2020-09-23T14:17:33Z 2020-09-23T14:17:33Z   NONE  

I just read that sqlite has a transitive closure table extension using a virtual table in order to represent trees:

https://charlesleifer.com/blog/querying-tree-structures-in-sqlite-using-python-and-the-transitive-closure-extension/

Even without this extension, though, a util to build a transitive closure table would allow trees to be queried easily. Since it relies on self-referential foreign keys, the relationships might even be able to be automatically detected.
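
Even without the extension, a recursive CTE can materialize such a table; a sketch assuming a nodes table with a self-referential parent_id column:

import sqlite3

def build_closure_table(conn):
    # every node is its own ancestor at depth 0; each recursive step
    # extends an ancestor/descendant pair one level down the tree
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS node_closure (ancestor, descendant, depth);
        DELETE FROM node_closure;
        WITH RECURSIVE closure (ancestor, descendant, depth) AS (
            SELECT id, id, 0 FROM nodes
            UNION ALL
            SELECT closure.ancestor, nodes.id, closure.depth + 1
            FROM nodes JOIN closure ON nodes.parent_id = closure.descendant
        )
        INSERT INTO node_closure SELECT ancestor, descendant, depth FROM closure;
    """)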

sqlite-utils 140912432 issue    
706768798 MDU6SXNzdWU3MDY3Njg3OTg= 170 Release notes for 2.20 simonw 9599 closed 0   2.20 5897911 1 2020-09-23T00:13:22Z 2020-09-23T00:31:25Z 2020-09-23T00:31:25Z OWNER  

https://github.com/simonw/sqlite-utils/compare/2.19...b8e004

sqlite-utils 140912432 issue    
706098005 MDU6SXNzdWU3MDYwOTgwMDU= 167 Review the foreign key pragma stuff simonw 9599 closed 0   2.20 5897911 1 2020-09-22T05:55:20Z 2020-09-23T00:13:02Z 2020-09-23T00:13:02Z OWNER  

It is not possible to enable or disable foreign key constraints in the middle of a multi-statement transaction (when SQLite is not in autocommit mode). Attempting to do so does not return an error; it simply has no effect.

https://sqlite.org/foreignkeys.html
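
A quick demonstration of that behavior using the sqlite3 module directly:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # manage transactions manually
conn.execute("BEGIN")
conn.execute("PRAGMA foreign_keys = ON")  # silently ignored mid-transaction
print(conn.execute("PRAGMA foreign_keys").fetchone())  # (0,) - still off
conn.execute("COMMIT")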

sqlite-utils 140912432 issue    
470345929 MDU6SXNzdWU0NzAzNDU5Mjk= 42 table.extract(...) method and "sqlite-utils extract" command simonw 9599 closed 0   2.20 5897911 21 2019-07-19T14:09:36Z 2020-09-22T23:39:31Z 2020-09-22T23:37:49Z OWNER  

One of my favourite features of csvs-to-sqlite is that it can "extract" columns into a separate lookup table - for example:

csvs-to-sqlite big_csv_file.csv -c country output.db

This will turn the country column in the resulting table into an integer foreign key against a new country table. You can see an example of what that looks like here: https://san-francisco.datasettes.com/registered-business-locations-3d50679/Business+Corridor was extracted from https://san-francisco.datasettes.com/registered-business-locations-3d50679/Registered_Business_Locations_-_San_Francisco?Business%20Corridor=1

I'd like to have the same capability in sqlite-utils - but with the ability to run it against an existing SQLite table rather than just against a CSV.

sqlite-utils 140912432 issue    
706486323 MDU6SXNzdWU3MDY0ODYzMjM= 973 'bool' object is not callable error simonw 9599 closed 0     2 2020-09-22T15:30:54Z 2020-09-22T15:40:35Z 2020-09-22T15:40:35Z OWNER  

I'm getting this when latest is deployed to Cloud Run:

Traceback (most recent call last):
  File "/usr/local/bin/datasette", line 8, in <module>
    sys.exit(cli())
  File "/usr/local/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/datasette/cli.py", line 406, in serve
    inspect_data = json.load(open(inspect_file))
TypeError: 'bool' object is not callable

I think I may have broken things in #970 - a980199e61fe7ccf02c2123849d86172d2ae54ff

datasette 107914493 issue    
705057955 MDU6SXNzdWU3MDUwNTc5NTU= 969 Passing additional flags to tools used during publishing betatim 1448859 open 0     1 2020-09-20T06:54:53Z 2020-09-22T15:15:14Z   NONE  

This issue is about how best to pass additional options to tools used for publishing datasettes. A concrete example is wanting to pass the --tar flag to the heroku CLI tool. I think there are at least two options for doing this: documentation for each publishing tool to explain how to set flags via env variables (if possible) or building a mechanism that lets users pass additional flags through datasette.

When using datasette publish heroku binder-launches.db --extra-options="--config facet_time_limit_ms:35000 --config sql_time_limit_ms:35000" --name=binderlytics --install=datasette-vega to publish https://binderlytics.herokuapp.com/ the following error happens:

 ›   Warning: heroku update available from 7.42.1 to 7.43.0.
 ›   Warning: heroku update available from 7.42.1 to 7.43.0.
 ›   Warning: heroku update available from 7.42.1 to 7.43.0.
Setting WEB_CONCURRENCY and restarting ⬢ binderlytics... done, v13
WEB_CONCURRENCY: 1
 ›   Warning: heroku update available from 7.42.1 to 7.43.0.
 ▸    Couldn't detect GNU tar. Builds could fail due to decompression errors
 ▸    See https://devcenter.heroku.com/articles/platform-api-deploying-slugs#create-slug-archive
 ▸    Please install it, or specify the '--tar' option
 ▸    Falling back to node's built-in compressor
buffer.js:358
    throw new ERR_INVALID_OPT_VALUE.RangeError('size', size);
    ^

RangeError [ERR_INVALID_OPT_VALUE]: The value "3303763968" is invalid for option "size"
    at Function.alloc (buffer.js:367:3)
    at new Buffer (buffer.js:281:19)
    at Readable.<anonymous> (/Users/thead/.local/share/heroku/node_modules/archiver-utils/index.js:39:15)
    at Readable.emit (events.js:322:22)
    at endReadableNT (/Users/thead/.local/share/heroku/node_modules/readable-stream/lib/_stream_readable.js:1010:12)
    at processTicksAndRejections (internal/process/task_queues.js:84:21) {
  code: 'ERR_INVALID_OPT_VALUE'
}

After installing GNU tar with brew install gnu-tar and modifying datasette/publish/heroku.py to include the --tar=/path/to/gnu-tar publishing works.

I think the problem occurs once your heroku slug reaches a certain size. At least when I add only a few 100 entries to the datasette then the error does not occur.

datasette version 0.49.1
OSX 10.14.6 (18G103)

datasette 107914493 issue    
681375466 MDU6SXNzdWU2ODEzNzU0NjY= 943 await datasette.client.get(path) mechanism for executing internal requests simonw 9599 open 0     31 2020-08-18T22:17:42Z 2020-09-22T15:00:39Z   OWNER  

datasette-graphql works by making internal requests to the TableView class (in order to take advantage of existing pagination logic, plus options like ?_search= and ?_where=) - see #915

I want to support a mod_rewrite style mechanism for putting nicer URLs on top of Datasette pages - I botched that together for a project here using an internal ASGI proxying trick: https://github.com/natbat/tidepools_near_me/commit/ec102c6da5a5d86f17628740d90b6365b671b5e1

If the datasette object provided a documented method for executing internal requests (in a way that makes sense with logging etc - i.e. doesn't get logged as a separate request) both of these use-cases would be much neater.

datasette 107914493 issue    
706167456 MDU6SXNzdWU3MDYxNjc0NTY= 168 Automate (as much as possible) updates published to Homebrew simonw 9599 open 0     1 2020-09-22T08:08:37Z 2020-09-22T08:11:30Z   OWNER  

I'd like to get new sqlite-utils (and Datasette) releases submitted to Homebrew as painlessly as possible.

sqlite-utils 140912432 issue    
455486286 MDU6SXNzdWU0NTU0ODYyODY= 26 Mechanism for turning nested JSON into foreign keys / many-to-many simonw 9599 open 0     10 2019-06-13T00:52:06Z 2020-09-22T07:56:07Z   OWNER  

The GitHub JSON APIs have a really interesting convention with respect to related objects.

Consider https://api.github.com/repos/simonw/sqlite-utils/issues - here's a truncated subset:

  {
    "id": 449818897,
    "node_id": "MDU6SXNzdWU0NDk4MTg4OTc=",
    "number": 24,
    "title": "Additional Column Constraints?",
    "user": {
      "login": "IgnoredAmbience",
      "id": 98555,
      "node_id": "MDQ6VXNlcjk4NTU1",
      "avatar_url": "https://avatars0.githubusercontent.com/u/98555?v=4",
      "gravatar_id": ""
    },
    "labels": [
      {
        "id": 993377884,
        "node_id": "MDU6TGFiZWw5OTMzNzc4ODQ=",
        "url": "https://api.github.com/repos/simonw/sqlite-utils/labels/enhancement",
        "name": "enhancement",
        "color": "a2eeef",
        "default": true
      }
    ],
    "state": "open"
  }

The user column lists a complete user. The labels column has a list of labels.

Since both user and label have a populated id field, this is actually enough information for us to create records for them AND set up the corresponding foreign key (for user) and m2m relationships (for labels).

It would be really neat if sqlite-utils had some kind of mechanism for correctly processing these kind of patterns.

Thanks to jq there's not much need for extra customization of the shape here - if we support a narrowly defined structure users can use jq to reshape arbitrary JSON to match.
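
For reference, the existing primitives can already express this manually; a hedged sketch using .insert(...).m2m(), assuming db is a sqlite_utils.Database and issue is a record shaped like the one above:

user = issue.pop("user")
labels = issue.pop("labels")
# extract the nested user into its own table, keeping a foreign key column
issue["user"] = db["users"].insert(user, pk="id", replace=True).last_pk
# insert the issue, then create the many-to-many records for its labels
db["issues"].insert(issue, pk="id", replace=True).m2m("labels", labels, pk="id")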

sqlite-utils 140912432 issue    
621989740 MDU6SXNzdWU2MjE5ODk3NDA= 114 table.transform() method for advanced alter table simonw 9599 closed 0   2.20 5897911 26 2020-05-20T18:20:46Z 2020-09-22T07:51:37Z 2020-09-22T04:20:02Z OWNER  

SQLite's ALTER TABLE can only do the following:

  • Rename a table
  • Rename a column
  • Add a column

Notably, it cannot drop columns - so tricks like "add a float version of this text column, populate it, then drop the old one and rename" won't work.

The docs here https://www.sqlite.org/lang_altertable.html#making_other_kinds_of_table_schema_changes describe a way of implementing full alters safely within a transaction, but it's fiddly.

  1. Create new table
  2. Copy data
  3. Drop old table
  4. Rename new into old

It would be great if sqlite-utils provided an abstraction to help make these kinds of changes safely.
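
A minimal sketch of that pattern using the sqlite3 module, with illustrative table and column names (converting a text column to a float):

import sqlite3

conn = sqlite3.connect("example.db")
with conn:  # run the whole dance inside a single transaction
    conn.execute("CREATE TABLE points_new (id INTEGER PRIMARY KEY, score FLOAT)")
    conn.execute("INSERT INTO points_new SELECT id, CAST(score AS FLOAT) FROM points")
    conn.execute("DROP TABLE points")
    conn.execute("ALTER TABLE points_new RENAME TO points")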

sqlite-utils 140912432 issue    
557830332 MDExOlB1bGxSZXF1ZXN0MzY5MzQ4MDg0 78 New conversions= feature, refs #77 simonw 9599 closed 0     0 2020-01-31T00:02:33Z 2020-09-22T07:48:29Z 2020-01-31T00:24:31Z OWNER simonw/sqlite-utils/pulls/78 sqlite-utils 140912432 pull    
705975133 MDExOlB1bGxSZXF1ZXN0NDkwNjA3OTQ5 161 table.transform() method simonw 9599 closed 0   2.20 5897911 13 2020-09-21T23:16:59Z 2020-09-22T07:48:24Z 2020-09-22T04:20:02Z OWNER simonw/sqlite-utils/pulls/161

Refs #114

  • Ability to change the primary key
  • Support for changing default value for columns
  • Support for changing NOT NULL status of columns
  • Support for copying existing foreign keys and removing them
  • Support for conversions= parameter (dropped)
  • Detailed documentation
  • PRAGMA foreign_keys stuff
sqlite-utils 140912432 pull    
706092617 MDExOlB1bGxSZXF1ZXN0NDkwNzAzMTcz 166 Keyword only arguments for transform() simonw 9599 closed 0     0 2020-09-22T05:41:44Z 2020-09-22T06:39:11Z 2020-09-22T06:39:11Z OWNER simonw/sqlite-utils/pulls/166

Refs #165

sqlite-utils 140912432 pull    
705995722 MDU6SXNzdWU3MDU5OTU3MjI= 162 A decorator for registering custom SQL functions simonw 9599 closed 0     2 2020-09-22T00:18:32Z 2020-09-22T00:40:44Z 2020-09-22T00:32:17Z OWNER  

Syntactic sugar for db.conn.create_function - it would work something like this:

db = sqlite_utils.Database("mydb.db")

@db.register_function
def scramble(text):
    chars = list(text)
    random.shuffle(chars)
    return "".join(chars)

The decorator would inspect the function to find its name and arity (number of arguments). Having run the above you could then do:

db.execute("select scramble('hello')").fetchall()
sqlite-utils 140912432 issue    
705840673 MDU6SXNzdWU3MDU4NDA2NzM= 972 Support faceting against arbitrary SQL queries simonw 9599 open 0     1 2020-09-21T19:00:43Z 2020-09-21T19:01:25Z   OWNER  

... support for running facets against arbitrary custom SQL queries is half-done in that facets now execute against wrapped subqueries as-of ea66c45df96479ef66a89caa71fff1a97a862646

https://github.com/simonw/datasette/blob/ea66c45df96479ef66a89caa71fff1a97a862646/datasette/facets.py#L192-L200
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/971#issuecomment-696307922_

datasette 107914493 issue    
705827457 MDU6SXNzdWU3MDU4Mjc0NTc= 971 Support the dbstat table simonw 9599 closed 0     7 2020-09-21T18:38:53Z 2020-09-21T19:00:02Z 2020-09-21T18:59:52Z OWNER  

dbstat is a table that is usually available on SQLite giving statistics about the database. For example:

https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=SELECT+*+FROM+%22dbstat%22+WHERE+name%3D%27bachelorette%2Fbachelorette%27%3B

name                       path   pageno  pagetype  ncell  payload  unused  mx_payload  pgoffset  pgsize
bachelorette/bachelorette  /      89      internal  13     0        3981    0           360448    4096
bachelorette/bachelorette  /000/  91      leaf      66     3792     32      74          368640    4096
bachelorette/bachelorette  /001/  92      leaf      67     3800     14      74          372736    4096
bachelorette/bachelorette  /002/  93      leaf      65     3717     46      70          376832    4096
bachelorette/bachelorette  /003/  94      leaf      68     3742     6       71          380928    4096
bachelorette/bachelorette  /004/  95      leaf      70     3696     42      66          385024    4096
bachelorette/bachelorette  /005/  96      leaf      69     3721     22      71          389120    4096
bachelorette/bachelorette  /006/  97      leaf      70     3737     1       72          393216    4096
bachelorette/bachelorette  /007/  98      leaf      69     3728     15      69          397312    4096
bachelorette/bachelorette  /008/  99      leaf      73     3715     8       64          401408    4096
bachelorette/bachelorette  /009/  100     leaf      73     3705     18      62          405504    4096
bachelorette/bachelorette  /00a/  101     leaf      75     3681     32      62          409600    4096
bachelorette/bachelorette  /00b/  102     leaf      77     3694     9       62          413696    4096
bachelorette/bachelorette  /00c/  103     leaf      74     3673     45      62          417792    4096
bachelorette/bachelorette  /00d/  104     leaf      5      228      3835    48          421888    4096

Other than direct select * from dbstat queries it is completely invisible.

It would be cool if https://fivethirtyeight.datasettes.com/fivethirtyeight/dbstat didn't 404 (on databases for which that table was available).

datasette 107914493 issue    
564833696 MDU6SXNzdWU1NjQ4MzM2OTY= 670 Prototoype for Datasette on PostgreSQL simonw 9599 open 0     10 2020-02-13T17:17:55Z 2020-09-21T14:46:10Z   OWNER  

I thought this would never happen, but now that I'm deep in the weeds of running SQLite in production for Datasette Cloud I'm starting to reconsider my policy of only supporting SQLite.

Some of the factors making me think PostgreSQL support could be worth the effort:
- Serverless. I'm getting increasingly excited about writable-database use-cases for Datasette. If it could talk to PostgreSQL then users could easily deploy it on Heroku or other serverless providers that can talk to a managed RDS-style PostgreSQL.
- Existing databases. Plenty of organizations have PostgreSQL databases. They can export to SQLite using db-to-sqlite but that's a pretty big barrier to getting started - being able to run datasette postgresql://connection-string and start trying it out would be a massively better experience.
- Data size. I keep running into use-cases where I want to run Datasette against many GBs of data. SQLite can do this but PostgreSQL is much more optimized for large data, especially given the existence of tools like Citus.
- Marketing. Convincing people to trust their data to SQLite is potentially a big barrier to adoption. Even if I've convinced myself it's trustworthy I still have to convince everyone else.
- It might not be that hard? If this required a ground-up rewrite it wouldn't be worth the effort, but I have a hunch that it may not be too hard - most of the SQL in Datasette should work on both databases since it's almost all portable SELECT statements. If Datasette did DML this would be a lot harder, but it doesn't.
- Plugins! This feels like a natural surface for a plugin - at which point people could add MySQL support and suchlike in the future.

The above reasons feel strong enough to justify a prototype.

datasette 107914493 issue    
705215230 MDU6SXNzdWU3MDUyMTUyMzA= 26 Pagination simonw 9599 open 0     7 2020-09-21T00:14:37Z 2020-09-21T02:55:54Z   MEMBER  

Useful for #16 (timeline view) since you can now filter to just the items on a specific day - but if there are more than 50 items you can't see them all.

dogsheep-beta 197431109 issue    
694493566 MDU6SXNzdWU2OTQ0OTM1NjY= 16 Timeline view simonw 9599 open 0     3 2020-09-06T19:13:58Z 2020-09-21T02:42:29Z   MEMBER  

Ability to browse (and facet) by date.

dogsheep-beta 197431109 issue    
616271236 MDU6SXNzdWU2MTYyNzEyMzY= 112 add_foreign_key(...., ignore=True) simonw 9599 closed 0   2.19 5896742 4 2020-05-12T00:24:00Z 2020-09-20T22:17:34Z 2020-09-20T22:17:34Z OWNER  

When using this library I often find myself wanting to "add this foreign key, but only if it doesn't exist yet". The ignore=True parameter is increasingly being used for this elsewhere in the library (e.g. in create_view()).

sqlite-utils 140912432 issue    
531583658 MDU6SXNzdWU1MzE1ODM2NTg= 68 Add support for porter stemming in FTS simonw 9599 closed 0     1 2019-12-02T22:35:52Z 2020-09-20T04:25:53Z 2020-09-20T04:25:47Z OWNER  

FTS5 can have porter stemming enabled.
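
For reference, FTS5 accepts the stemmer via its tokenize option; a minimal sketch with raw SQL:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE VIRTUAL TABLE docs_fts USING fts5(title, body, tokenize='porter')"
)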

sqlite-utils 140912432 issue    
694136490 MDU6SXNzdWU2OTQxMzY0OTA= 15 Add a bunch of config examples simonw 9599 open 0     1 2020-09-05T17:58:43Z 2020-09-18T23:17:39Z   MEMBER  

I can bring these over from my personal Dogsheep.

dogsheep-beta 197431109 issue    
703970713 MDU6SXNzdWU3MDM5NzA3MTM= 23 Sort option should persist between multiple searches simonw 9599 closed 0     0 2020-09-17T23:21:26Z 2020-09-18T22:39:12Z 2020-09-18T22:39:12Z MEMBER  

Following #21

dogsheep-beta 197431109 issue    
703970814 MDU6SXNzdWU3MDM5NzA4MTQ= 24 the JSON object must be str, bytes or bytearray, not 'Undefined' simonw 9599 closed 0     8 2020-09-17T23:21:41Z 2020-09-18T22:33:32Z 2020-09-18T22:33:32Z MEMBER  

Got this on a search results page.

dogsheep-beta 197431109 issue    
704685890 MDU6SXNzdWU3MDQ2ODU4OTA= 25 template_debug mechanism simonw 9599 closed 0     2 2020-09-18T22:11:09Z 2020-09-18T22:12:21Z 2020-09-18T22:12:03Z MEMBER  

I'd prefer it if errors in these template fragments were displayed as errors inline where the fragment should have been inserted, rather than 500ing the whole page - especially since the template fragments are user-provided and could have all kinds of odd errors in them which should be as easy to debug as possible.
_Originally posted by @simonw in https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694554584_

dogsheep-beta 197431109 issue    
703962917 MDU6SXNzdWU3MDM5NjI5MTc= 22 Bug: UI says sorted by relevance in timeline view simonw 9599 closed 0     0 2020-09-17T23:02:07Z 2020-09-17T23:13:14Z 2020-09-17T23:13:14Z MEMBER  

In regular timeline view sort defaults to newest, not relevance - so this UI is incorrect:

Screenshot: https://user-images.githubusercontent.com/9599/93536956-1facf900-f8ff-11ea-889b-bc8356e366df.png

dogsheep-beta 197431109 issue    
703951918 MDU6SXNzdWU3MDM5NTE5MTg= 21 Option to sort search results by date simonw 9599 closed 0     0 2020-09-17T22:32:39Z 2020-09-17T22:55:35Z 2020-09-17T22:55:35Z MEMBER  

Sometimes I want to sort by date, not by relevance.

dogsheep-beta 197431109 issue    
703218756 MDU6SXNzdWU3MDMyMTg3NTY= 50 Commands for making authenticated API calls simonw 9599 open 0     6 2020-09-17T02:39:07Z 2020-09-17T04:02:39Z   MEMBER  

Similar to twitter-to-sqlite fetch, see https://github.com/dogsheep/twitter-to-sqlite/issues/51

github-to-sqlite 207052882 issue    
703246031 MDU6SXNzdWU3MDMyNDYwMzE= 51 github-to-sqlite get should follow rate limits simonw 9599 open 0     0 2020-09-17T04:01:50Z 2020-09-17T04:01:50Z   MEMBER  

From #50 - right now it will crash with an error if it hits the rate limit. Since the rate limit information (including reset time) is available in the headers, it could automatically sleep and try again instead.
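
A sketch of that behaviour, using the X-RateLimit-Remaining and X-RateLimit-Reset headers GitHub sends (illustrative - not the current github-to-sqlite code):

    import time
    import requests

    def github_get(url, headers=None):
        # If GitHub reports the rate limit is exhausted, sleep until the
        # reset time it advertises, then retry instead of crashing.
        while True:
            response = requests.get(url, headers=headers)
            if (
                response.status_code == 403
                and response.headers.get("X-RateLimit-Remaining") == "0"
            ):
                reset_at = int(response.headers["X-RateLimit-Reset"])
                time.sleep(max(0, reset_at - time.time()) + 1)
                continue
            return response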

github-to-sqlite 207052882 issue    
455852801 MDU6SXNzdWU0NTU4NTI4MDE= 507 Every datasette plugin on the ecosystem page should have a screenshot simonw 9599 open 0     4 2019-06-13T17:02:51Z 2020-09-17T02:47:35Z   OWNER  

https://github.com/simonw/datasette/blob/master/docs/ecosystem.rst

datasette 107914493 issue    
703218448 MDU6SXNzdWU3MDMyMTg0NDg= 51 Documentation for twitter-to-sqlite fetch simonw 9599 open 0     0 2020-09-17T02:38:10Z 2020-09-17T02:38:10Z   MEMBER  

It's mentioned in passing in the README but it deserves its own section:

$ twitter-to-sqlite fetch \
    "https://api.twitter.com/1.1/account/verify_credentials.json" \
    | grep '"id"' | head -n 1
twitter-to-sqlite 206156866 issue    
703216044 MDU6SXNzdWU3MDMyMTYwNDQ= 49 Feature: gists and starred gists simonw 9599 open 0     0 2020-09-17T02:30:52Z 2020-09-17T02:30:52Z   MEMBER  

https://developer.github.com/v3/gists/#list-starred-gists

github-to-sqlite 207052882 issue    
697203800 MDExOlB1bGxSZXF1ZXN0NDgzMTc1NTA5 158 Fix accidental mega long line in docs tomviner 167319 closed 0     1 2020-09-09T22:31:23Z 2020-09-16T06:21:43Z 2020-09-16T06:21:43Z CONTRIBUTOR simonw/sqlite-utils/pulls/158 sqlite-utils 140912432 pull    
653529088 MDU6SXNzdWU2NTM1MjkwODg= 891 Consider using enable_callback_tracebacks(True) simonw 9599 closed 0     5 2020-07-08T19:07:16Z 2020-09-15T21:59:27Z 2020-09-15T21:59:27Z OWNER  

From https://docs.python.org/3/library/sqlite3.html#sqlite3.enable_callback_tracebacks

sqlite3.enable_callback_tracebacks(flag)

By default you will not get any tracebacks in user-defined functions, aggregates, converters, authorizer callbacks etc. If you want to debug them, you can call this function with flag set to True. Afterwards, you will get tracebacks from callbacks on sys.stderr. Use False to disable the feature again.

Maybe turn this on for all of Datasette? Are there any disadvantages to doing that?
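
For reference, a minimal self-contained demo of what the flag does (a sketch, not Datasette code):

    import sqlite3

    sqlite3.enable_callback_tracebacks(True)

    db = sqlite3.connect(":memory:")
    db.create_function("broken", 1, lambda value: 1 / 0)
    try:
        db.execute("select broken(1)")
    except sqlite3.OperationalError:
        # Without the flag this generic error is all you get; with it, the
        # underlying ZeroDivisionError traceback is printed to sys.stderr.
        pass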

datasette 107914493 issue    
688427751 MDU6SXNzdWU2ODg0Mjc3NTE= 956 Push to Docker Hub failed - but it shouldn't run for alpha releases anyway simonw 9599 closed 0     7 2020-08-29T01:09:12Z 2020-09-15T20:46:41Z 2020-09-15T20:36:34Z OWNER  

https://github.com/simonw/datasette/runs/1043709494?check_suite_focus=true

Screenshot: https://user-images.githubusercontent.com/9599/91625110-80c55a80-e959-11ea-8fea-70508c53fcfb.png

  • This step should not run if a release is an alpha or beta
  • When it DOES run it should work
  • See it work for both an alpha and a non-alpha release, then close this ticket
datasette 107914493 issue    
648421105 MDU6SXNzdWU2NDg0MjExMDU= 877 Consider dropping explicit CSRF protection entirely? simonw 9599 closed 0     9 2020-06-30T19:00:55Z 2020-09-15T20:42:05Z 2020-09-15T20:42:04Z OWNER  

https://scotthelme.co.uk/csrf-is-dead/ from Feb 2017 has background here. The SameSite=lax cookie property effectively eliminates CSRF in modern browsers. https://caniuse.com/#search=SameSite shows 92.13% global support for it.

Datasette already uses SameSite=lax when it sets cookies by default: https://github.com/simonw/datasette/blob/af350ba4571b8e3f9708c40f2ddb48fea7ac1084/datasette/utils/asgi.py#L327-L341

A few options then. I could ditch CSRF protection entirely. I could make it optional - turn it off by default, but let anyone who cares about the remaining 7.87% of global users opt back into it.

One catch is login CSRF: I don't see how SameSite=lax protects against that attack.
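
For illustration, the relevant API - set_cookie already defaults samesite to "lax", so callers get this behaviour without asking for it:

    from datasette.utils.asgi import Response

    response = Response.html("Logged in")
    # samesite defaults to "lax", so even without passing it explicitly the
    # cookie is withheld from cross-site POSTs - the classic CSRF vector.
    response.set_cookie("ds_actor", "signed-token-here")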

datasette 107914493 issue    
649907676 MDU6SXNzdWU2NDk5MDc2NzY= 889 asgi_wrapper plugin hook is crashing at startup amjith 49260 closed 0     3 2020-07-02T12:53:13Z 2020-09-15T20:40:52Z 2020-09-15T20:40:52Z CONTRIBUTOR  

Steps to reproduce:

  1. Install datasette-media plugin
    pip install datasette-media
  2. Launch datasette
    datasette databasename.db
  3. Error
INFO:     Started server process [927704]
INFO:     Waiting for application startup.
ERROR:    Exception in 'lifespan' protocol
Traceback (most recent call last):
  File "/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/uvicorn/lifespan/on.py", line 48, in main
    await app(scope, self.receive, self.send)
  File "/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in __call__
    return await self.app(scope, receive, send)
  File "/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/datasette_media/__init__.py", line 9, in wrapped_app
    path = scope["path"]
KeyError: 'path'
ERROR:    Application startup failed. Exiting.
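
The underlying problem: lifespan scopes carry no "path" key. A sketch of the kind of guard the plugin needs (an assumption about the fix, not the shipped patch):

    def asgi_wrapper_fix(app):
        async def wrapped_app(scope, receive, send):
            # Only read "path" for HTTP scopes; lifespan scopes don't have it.
            if scope["type"] == "http":
                path = scope["path"]
                # ... the plugin's path-based dispatch would go here ...
            await app(scope, receive, send)

        return wrapped_app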
datasette 107914493 issue    
657747959 MDU6SXNzdWU2NTc3NDc5NTk= 895 SQL query output should show numeric values in a different colour simonw 9599 closed 0     1 2020-07-16T00:28:03Z 2020-09-15T20:40:08Z 2020-09-15T20:40:08Z OWNER  

Compare https://latest.datasette.io/fixtures/sortable with https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+content%2C+sortable%2C+sortable_with_nulls%2C+sortable_with_nulls_2%2C+text+from+sortable+order+by+pk1%2C+pk2+limit+101

Screenshot: https://user-images.githubusercontent.com/9599/87612845-82e09c00-c6c0-11ea-806e-93764ca468c4.png

datasette 107914493 issue    
649702801 MDU6SXNzdWU2NDk3MDI4MDE= 888 URLs in release notes point to 127.0.0.1 abdusco 3243482 closed 0     1 2020-07-02T07:28:04Z 2020-09-15T20:39:50Z 2020-09-15T20:39:49Z CONTRIBUTOR  

Just a quick heads up:

Release notes for 0.45 include urls that point to localhost.

https://github.com/simonw/datasette/releases/tag/0.45

datasette 107914493 issue    
522352520 MDU6SXNzdWU1MjIzNTI1MjA= 634 Don't run tests twice when releasing a tag simonw 9599 closed 0     2 2019-11-13T17:02:42Z 2020-09-15T20:37:58Z 2020-09-15T20:37:58Z OWNER  

Shipping a release currently runs the tests twice: https://travis-ci.org/simonw/datasette/builds/611463728

It does a regular test run on Python 3.6/7/8 - then the "Release tagged version" step runs the tests again before publishing to PyPI! This second run is not necessary.

datasette 107914493 issue    
639072811 MDU6SXNzdWU2MzkwNzI4MTE= 849 Rename master branch to main simonw 9599 closed 0   Datasette 1.0 3268330 10 2020-06-15T19:05:54Z 2020-09-15T20:37:14Z 2020-09-15T20:37:14Z OWNER  

I was waiting for consensus to form around this (and kind-of hoping for trunk since I like the tree metaphor) and it looks like main is it.

I've seen convincing arguments against trunk too - it indicates that the branch has some special significance like in Subversion (where all branches come from trunk) when it doesn't. So main is better anyway.

datasette 107914493 issue    
682184050 MDU6SXNzdWU2ODIxODQwNTA= 946 Exception in tracing code simonw 9599 closed 0     1 2020-08-19T21:12:27Z 2020-09-15T20:16:50Z 2020-09-15T20:16:50Z OWNER  

When using ?_trace=1:

Traceback (most recent call last):
  File "/Users/simon/.local/share/virtualenvs/rockybeaches-09H592sC/lib/python3.8/site-packages/uvicorn/protocols/http/httptools_impl.py", line 390, in run_asgi
    result = await app(self.scope, self.receive, self.send)
  File "/Users/simon/.local/share/virtualenvs/rockybeaches-09H592sC/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in __call__
    return await self.app(scope, receive, send)
  File "/Users/simon/.local/share/virtualenvs/rockybeaches-09H592sC/lib/python3.8/site-packages/datasette/utils/asgi.py", line 150, in __call__
    await self.app(scope, receive, send)
  File "/Users/simon/.local/share/virtualenvs/rockybeaches-09H592sC/lib/python3.8/site-packages/datasette/tracer.py", line 137, in __call__
    await self.app(scope, receive, wrapped_send)
  File "/usr/local/opt/python@3.8/Frameworks/Python.framework/Versions/3.8/lib/python3.8/contextlib.py", line 120, in __exit__
    next(self.gen)
  File "/Users/simon/.local/share/virtualenvs/rockybeaches-09H592sC/lib/python3.8/site-packages/datasette/tracer.py", line 63, in capture_traces
    del tracers[task_id]
KeyError: 4575365856
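
The cleanup at tracer.py line 63 uses del tracers[task_id], which raises KeyError if the entry is already gone. A more defensive variant (a guess at the fix, not the committed patch):

    # pop() with a default is a no-op when the key has already been removed:
    tracers.pop(task_id, None)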
datasette 107914493 issue    
702069429 MDU6SXNzdWU3MDIwNjk0Mjk= 967 Writable canned queries with magic parameters fail if POST body is empty simonw 9599 closed 0     11 2020-09-15T16:14:43Z 2020-09-15T20:13:10Z 2020-09-15T20:13:10Z OWNER  

When I try to use the new ?_json=1 feature from #880 with magic parameters from #842 I get this error:

Incorrect number of bindings supplied. The current statement uses 1, and there are 0 supplied

datasette 107914493 issue    
449854604 MDU6SXNzdWU0NDk4NTQ2MDQ= 492 Facets not correctly persisted in hidden form fields simonw 9599 closed 0   Datasette 1.0 3268330 4 2019-05-29T14:49:39Z 2020-09-15T20:12:29Z 2020-09-15T20:12:29Z OWNER  

Steps to reproduce: visit https://2a4b892.datasette.io/fixtures/roadside_attractions?_facet_m2m=attraction_characteristic and click "Apply"

Result is a 500: no such column: attraction_characteristic

The error occurs because of this hidden HTML input:

<input type="hidden" name="_facet" value="attraction_characteristic">

This should be:

<input type="hidden" name="_facet_m2m" value="attraction_characteristic">
datasette 107914493 issue    
701584448 MDU6SXNzdWU3MDE1ODQ0NDg= 966 Remove _request_ip example from canned queries documentation simonw 9599 closed 0     0 2020-09-15T03:51:33Z 2020-09-15T03:52:45Z 2020-09-15T03:52:45Z OWNER  

_request_ip isn't valid, so it shouldn't be in the example: https://github.com/simonw/datasette/blob/cb515a9d75430adaf5e545a840bbc111648e8bfd/docs/sql_queries.rst#L320-L322

datasette 107914493 issue    
688622148 MDU6SXNzdWU2ODg2MjIxNDg= 957 Simplify imports of common classes simonw 9599 open 0   Datasette 1.0 3268330 5 2020-08-29T23:44:04Z 2020-09-14T22:20:13Z   OWNER  

There are only a few classes that plugins need to import. It would be nice if these imports were as short and memorable as possible.

For example:

from datasette.app import Datasette
from datasette.utils.asgi import Response

Could both become:

from datasette import Datasette
from datasette import Response
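
That would only need re-exports in the package's __init__.py - a sketch of the change (an assumption, not the file's current contents):

    # datasette/__init__.py
    from datasette.app import Datasette
    from datasette.utils.asgi import Response

    __all__ = ["Datasette", "Response"]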
datasette 107914493 issue    


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
, [active_lock_reason] TEXT, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issues_repo]
                ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
                ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
                ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
                ON [issues] ([user]);