issue_comments

592 rows where author_association = "CONTRIBUTOR" sorted by updated_at descending

user >30

  • fgregg 79
  • eyeseast 74
  • russss 39
  • dependabot[bot] 34
  • abdusco 26
  • psychemedia 24
  • bgrins 24
  • asg017 24
  • mroswell 22
  • chapmanjacobd 20
  • cldellow 18
  • brandonrobertz 15
  • jacobian 14
  • RhetTbull 14
  • wragge 12
  • rixx 11
  • bobwhitelock 9
  • rgieseke 7
  • amjith 6
  • jefftriplett 6
  • tsibley 6
  • simonwiles 6
  • mcarpenter 6
  • davidbgk 5
  • jaywgraves 5
  • dependabot-preview[bot] 5
  • bollwyvl 4
  • ctb 4
  • r4vi 4
  • jsfenfen 4
  • …

issue >30

  • Upgrade to CodeMirror 6, add SQL autocomplete 21
  • Database page loads too slowly with many large tables (due to table counts) 13
  • Stream all results for arbitrary SQL and canned queries 10
  • docker image is duplicating db files somehow 10
  • Handle spatialite geometry columns better 7
  • base_url configuration setting 7
  • Exceeding Cloud Run memory limits when deploying a 4.8G database 7
  • create-index should run analyze after creating index 7
  • Add new spatialite helper methods 7
  • Add register_output_renderer hook 6
  • Helper methods for working with SpatiaLite 6
  • Plugin hook for dynamic metadata 6
  • clean checkout & clean environment has test failures 6
  • feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 6
  • [WIP] Add publish to heroku support 5
  • Scripted exports 5
  • datasette publish lambda plugin 4
  • Build Dockerfile with recent Sqlite + Spatialite 4
  • Documentation with recommendations on running Datasette in production without using Docker 4
  • bpylist.archiver.CircularReference: archive has a cycle with uid(13) 4
  • Add insert --truncate option 4
  • Make it easier to insert geometries, with documentation and maybe code 4
  • Advanced class-based `conversions=` mechanism 4
  • Proposal: datasette query 4
  • Writable canned queries fail with useless non-error against immutable databases 4
  • Ability to merge databases and tables 4
  • API to insert a single record into an existing table 4
  • Exclude virtual tables from datasette inspect 4
  • array facet: don't materialize unnecessary columns 4
  • Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File 4
  • …

author_association 1

  • CONTRIBUTOR · 592 ✖
Columns: id, html_url, issue_url, node_id, user, created_at, updated_at ▲, author_association, body, reactions, issue, performed_via_github_app
1729961503 https://github.com/simonw/datasette/pull/2190#issuecomment-1729961503 https://api.github.com/repos/simonw/datasette/issues/2190 IC_kwDOBm6k_c5nHR4f asg017 15178711 2023-09-21T16:56:57Z 2023-09-21T16:56:57Z CONTRIBUTOR

TODO: add similar checks for permissions/allow/canned queries

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Raise an exception if a "plugins" block exists in metadata.json 1901483874  
1722848454 https://github.com/simonw/datasette/issues/2188#issuecomment-1722848454 https://api.github.com/repos/simonw/datasette/issues/2188 IC_kwDOBm6k_c5msJTG asg017 15178711 2023-09-18T06:58:53Z 2023-09-18T06:58:53Z CONTRIBUTOR

Thinking about this more, here's a list of things I imagine a "compile-to-sql" plugin would want to do:

  1. Attach itself to the SQL code editor (switch from SQL -> PRQL/Logica, additional syntax highlighting)
  2. Add "Query using PRQL" buttons in various parts of Datasette's UI, like /dbname page
  3. Use $LANGUAGE= instead of sql= in the JSON API and the SQL results pages
  4. Have their own dedicated code editor page

1) and 2) would be difficult to do with current plugin hooks, unless we add the concept of "slots" and get the JS plugin support in. 3) could maybe be done with the asgi_wrapper(datasette) hook? And 4) can be done easily with the register_routes() hook.
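
For (4), a minimal sketch of the register_routes() approach, assuming a made-up /-/prql path and template name:

```python
from datasette import hookimpl
from datasette.utils.asgi import Response


@hookimpl
def register_routes():
    # Hypothetical dedicated editor page for a compile-to-SQL plugin;
    # the /-/prql path and prql_editor.html template are illustrative only.
    async def prql_editor(datasette, request):
        return Response.html(
            await datasette.render_template("prql_editor.html", request=request)
        )

    return [(r"^/-/prql$", prql_editor)]
```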

So it really only sounds like extending the SQL editor will be the hard part. In #2094 I want to add JavaScript plugin hooks for extending the SQL editor, which may work here.

If I get the time/motivation, I might try out a datasette-prql extension, just because I like playing with it. It'd be really cool if I can get the asgi_wrapper() hook to work right there...

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Plugin Hooks for "compile to SQL" languages 1900026059  
1722845490 https://github.com/simonw/datasette/issues/1191#issuecomment-1722845490 https://api.github.com/repos/simonw/datasette/issues/1191 IC_kwDOBm6k_c5msIky asg017 15178711 2023-09-18T06:55:52Z 2023-09-18T06:55:52Z CONTRIBUTOR

One note here: this feature could be called "slots", similar to Layout Slots in Vitepress.

In Vitepress, you can add custom components/widgets/gadgets into predetermined named "slots", like so:

doc-top doc-bottom doc-footer-before doc-before doc-after ...

Would be great to do in both Python and Javascript, with the upcoming JavaScript API #2052. In datasette-write-ui, all we do is add a few "Insert row" and "edit this row" buttons and that required completely capturing the table.html template, which isn't great for other plugins. But having "slots" like table-footer-before or table-row-id or something would be great to work with.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Ability for plugins to collaborate when adding extra HTML to blocks in default templates 787098345  
1719451803 https://github.com/simonw/datasette/pull/2182#issuecomment-1719451803 https://api.github.com/repos/simonw/datasette/issues/2182 IC_kwDOBm6k_c5mfMCb dependabot[bot] 49699333 2023-09-14T13:27:26Z 2023-09-14T13:27:26Z CONTRIBUTOR

Looks like these dependencies are updatable in another way, so this is no longer needed.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump the python-packages group with 2 updates 1890593563  
1716801971 https://github.com/simonw/datasette/pull/2183#issuecomment-1716801971 https://api.github.com/repos/simonw/datasette/issues/2183 IC_kwDOBm6k_c5mVFGz asg017 15178711 2023-09-13T01:34:01Z 2023-09-13T01:34:01Z CONTRIBUTOR

@simonw docs are finished, this is ready for review!

One thing: I added "Configuration" as a top-level item in the documentation site, at the very bottom. Not sure if this is the best, maybe it can be named "datasette.yaml Configuration" or something similar?

Mostly because "Configuration" by itself can mean many things, but adding "datasette.yaml" would make it pretty clear it's about that specific file, and is easier to scan. I'd also be fine with using "datasette.yaml" instead of "datasette.json", since writing in YAML is much more forgiving (and advanced users will know JSON is also supported)

Also, maybe this is a chance to consolidate the docs a bit? I think "Settings", "Configuration", "Metadata", and "Authentication and permissions" should possibly be under the same section. Maybe even consolidate the different Plugin pages that exist?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette.yaml` plugin support 1891212159  
1700291967 https://github.com/simonw/datasette/issues/2157#issuecomment-1700291967 https://api.github.com/repos/simonw/datasette/issues/2157 IC_kwDOBm6k_c5lWGV_ asg017 15178711 2023-08-31T02:45:56Z 2023-08-31T02:45:56Z CONTRIBUTOR

@simonw what do you think about adding a DATASETTE_INTERNAL_DB_PATH env variable which, when defined, is the default location of the internal DB? This means when the --internal flag is NOT provided, Datasette would check to see if DATASETTE_INTERNAL_DB_PATH exists, and if so, use that as the internal database (and would fall back to an ephemeral memory database otherwise)

My rationale: some plugins may require, or strongly encourage, a persistent internal database (datasette-comments, datasette-bookmarks, datasette-link-shortener, etc.). However, for users that have a global installation of Datasette (say from brew install or a global pip install), it would be annoying to have to specify --internal every time. So instead, they can just add export DATASETTE_INTERNAL_DB_PATH="/path/to/internal.db" to their bashrc/zshrc/wherever to not have to worry about --internal
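
A minimal sketch of the proposed fallback order (DATASETTE_INTERNAL_DB_PATH is only a proposal here, not an existing setting):

```python
import os


def resolve_internal_db_path(internal_flag=None):
    # Hypothetical helper: --internal wins, then the proposed env variable,
    # then None (meaning an ephemeral in-memory internal database).
    if internal_flag:
        return internal_flag
    return os.environ.get("DATASETTE_INTERNAL_DB_PATH")
```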

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Proposal: Make the `_internal` database persistent, customizable, and hidden 1865869205  
1696591957 https://github.com/simonw/datasette/pull/2148#issuecomment-1696591957 https://api.github.com/repos/simonw/datasette/issues/2148 IC_kwDOBm6k_c5lH_BV dependabot[bot] 49699333 2023-08-29T00:15:29Z 2023-08-29T00:15:29Z CONTRIBUTOR

This pull request was built based on a group rule. Closing it will not ignore any of these versions in future pull requests.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump sphinx, furo, blacken-docs dependencies 1859415334  
1695736691 https://github.com/simonw/datasette/pull/2152#issuecomment-1695736691 https://api.github.com/repos/simonw/datasette/issues/2152 IC_kwDOBm6k_c5lEuNz dependabot[bot] 49699333 2023-08-28T13:49:35Z 2023-08-28T13:49:35Z CONTRIBUTOR

Superseded by #2160.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump the python-packages group with 3 updates 1865174661  
1689198413 https://github.com/simonw/datasette/pull/2148#issuecomment-1689198413 https://api.github.com/repos/simonw/datasette/issues/2148 IC_kwDOBm6k_c5krx9N dependabot[bot] 49699333 2023-08-23T02:57:55Z 2023-08-23T02:57:55Z CONTRIBUTOR

Looks like this PR has been edited by someone other than Dependabot. That means Dependabot can't rebase it - sorry!

If you're happy for Dependabot to recreate it from scratch, overwriting any edits, you can request @dependabot recreate.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump sphinx, furo, blacken-docs dependencies 1859415334  
1688532012 https://github.com/simonw/datasette/issues/2093#issuecomment-1688532012 https://api.github.com/repos/simonw/datasette/issues/2093 IC_kwDOBm6k_c5kpPQs asg017 15178711 2023-08-22T16:21:40Z 2023-08-22T16:21:40Z CONTRIBUTOR

OK Here's the gameplan for this, which is closely tied to #2143 :

  • We will add a new datasette.json/datasette.yaml configuration file to datasette, which combines settings/plugin config/permissions/canned queries into a new file format
  • Metadata will NOT be a part of this file
  • TOML support is not planned, but maybe we can create a separate issue for supporting TOML alongside JSON/YAML
  • The settings.json file will be deprecated, and the --config arg will be brought back.
  • Command line arguments can still be used to overwrite values (ex --setting will overwrite settings in datasette.yaml)

The format of datasette.json will follow what Simon listed here: https://github.com/simonw/datasette/issues/2143#issuecomment-1684484426

Here's the current implementation plan:

  1. Add a new --config flag and port over "settings" into a new datasette.json config file, remove settings.json
  2. Add top-level plugin config support to datasette.json
  3. Figure out database/table structure of config datasette.json
  4. Port over database/table level plugin config support datasette.json
  5. Port over permissions/auth settings to datasette.json
  6. Deprecate non-metadata values in metadata.json
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File 1781530343  
1686745094 https://github.com/simonw/datasette/issues/2145#issuecomment-1686745094 https://api.github.com/repos/simonw/datasette/issues/2145 IC_kwDOBm6k_c5kibAG asg017 15178711 2023-08-21T17:30:01Z 2023-08-21T17:30:01Z CONTRIBUTOR

Another point: The new Datasette write API should refuse to insert a row with a NULL primary key. That will likely decrease the likelihood that someone finds themselves with NULLs in their primary keys, at least among Datasette users, and especially with buggy code that uses the write API, like our datasette-write-ui bug that led to this issue.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
If a row has a primary key of `null` various things break 1857234285  
1686366557 https://github.com/simonw/datasette/pull/2144#issuecomment-1686366557 https://api.github.com/repos/simonw/datasette/issues/2144 IC_kwDOBm6k_c5kg-ld dependabot[bot] 49699333 2023-08-21T13:48:15Z 2023-08-21T13:48:15Z CONTRIBUTOR

Superseded by #2148.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump the python-packages group with 3 updates 1856760386  
1684496274 https://github.com/simonw/datasette/issues/2143#issuecomment-1684496274 https://api.github.com/repos/simonw/datasette/issues/2143 IC_kwDOBm6k_c5kZ1-S asg017 15178711 2023-08-18T22:30:45Z 2023-08-18T22:30:45Z CONTRIBUTOR

That said, I do really like a bias towards settings that can be changed at runtime

Does this include things like --settings values or plugin config? I can totally see being able to update metadata without restarting, but not sure if that would work well with --setting, plugin config, or auth/permissions stuff.

Well it could work with --setting and auth/permissions, with a lot of core changes. But changing plugin config on the fly could be challenging, for plugin authors.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
De-tangling Metadata before Datasette 1.0 1855885427  
1684205563 https://github.com/simonw/datasette/issues/2143#issuecomment-1684205563 https://api.github.com/repos/simonw/datasette/issues/2143 IC_kwDOBm6k_c5kYu_7 asg017 15178711 2023-08-18T17:12:54Z 2023-08-18T17:12:54Z CONTRIBUTOR

Another option would be, instead of flat datasette.json/datasette.yaml files, we could instead use a Python file, like datasette_config.py. That way one could dynamically generate config (ex dev vs prod, auto-discover credentials, etc.). Kinda like Django settings.

Though I imagine Python imports might make this complex to do, and json/yaml is already supported and pretty easy to write

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
De-tangling Metadata before Datasette 1.0 1855885427  
1684202932 https://github.com/simonw/datasette/issues/2143#issuecomment-1684202932 https://api.github.com/repos/simonw/datasette/issues/2143 IC_kwDOBm6k_c5kYuW0 asg017 15178711 2023-08-18T17:10:21Z 2023-08-18T17:10:21Z CONTRIBUTOR

I agree with all your points!

I think the best solution would be having a datasette.json config file, where you "configure" your datasette instances, with settings, permissions/auth, plugin configuration, and table settings (sortable column, label columns, etc.). Which #2093 would do.

Then optionally, you have a metadata.json, or use datasette_metadata, or some other plugin to define metadata (ex the future sqlite-docs plugin).

Everything in datasette.json could also be overwritten by CLI flags, like --setting key value, --plugin xxxx key value.

We could even completely remove settings.json in favor of just datasette.json. Mostly because I think the fewer files the better, especially if they have generic names like settings.json or config.json.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
De-tangling Metadata before Datasette 1.0 1855885427  
1683950031 https://github.com/simonw/datasette/pull/2142#issuecomment-1683950031 https://api.github.com/repos/simonw/datasette/issues/2142 IC_kwDOBm6k_c5kXwnP dependabot[bot] 49699333 2023-08-18T13:49:24Z 2023-08-18T13:49:24Z CONTRIBUTOR

Looks like these dependencies are updatable in another way, so this is no longer needed.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump the python-packages group with 2 updates 1854970601  
1682256251 https://github.com/simonw/datasette/pull/2141#issuecomment-1682256251 https://api.github.com/repos/simonw/datasette/issues/2141 IC_kwDOBm6k_c5kRTF7 dependabot[bot] 49699333 2023-08-17T13:07:43Z 2023-08-17T13:07:43Z CONTRIBUTOR

Looks like blacken-docs is updatable in another way, so this is no longer needed.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump the python-packages group with 1 update 1853289039  
1668187546 https://github.com/simonw/datasette/pull/2125#issuecomment-1668187546 https://api.github.com/repos/simonw/datasette/issues/2125 IC_kwDOBm6k_c5jboWa dependabot[bot] 49699333 2023-08-07T16:20:26Z 2023-08-07T16:20:26Z CONTRIBUTOR

Looks like sphinx is up-to-date now, so this is no longer needed.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump sphinx from 6.1.3 to 7.1.2 1833193570  
1668186872 https://github.com/simonw/datasette/pull/2121#issuecomment-1668186872 https://api.github.com/repos/simonw/datasette/issues/2121 IC_kwDOBm6k_c5jboL4 dependabot[bot] 49699333 2023-08-07T16:20:19Z 2023-08-07T16:20:19Z CONTRIBUTOR

Looks like furo is up-to-date now, so this is no longer needed.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump furo from 2023.3.27 to 2023.7.26 1824399610  
1668186815 https://github.com/simonw/datasette/pull/2098#issuecomment-1668186815 https://api.github.com/repos/simonw/datasette/issues/2098 IC_kwDOBm6k_c5jboK_ dependabot[bot] 49699333 2023-08-07T16:20:18Z 2023-08-07T16:20:18Z CONTRIBUTOR

Looks like blacken-docs is up-to-date now, so this is no longer needed.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump blacken-docs from 1.14.0 to 1.15.0 1796830110  
1668113177 https://github.com/simonw/sqlite-utils/issues/578#issuecomment-1668113177 https://api.github.com/repos/simonw/sqlite-utils/issues/578 IC_kwDOCGYnMM5jbWMZ eyeseast 25778 2023-08-07T15:41:49Z 2023-08-07T15:41:49Z CONTRIBUTOR

I wonder if this should be two hooks: input and output. The current --csv (and --tsv) options apply to both. Haven't looked at how it's implemented. Or maybe it's one hook that returns a format for reading and for writing.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Plugin hook for adding new output formats 1818838294  
1662215579 https://github.com/simonw/datasette/pull/2124#issuecomment-1662215579 https://api.github.com/repos/simonw/datasette/issues/2124 IC_kwDOBm6k_c5jE2Wb dependabot[bot] 49699333 2023-08-02T13:28:43Z 2023-08-02T13:28:43Z CONTRIBUTOR

Superseded by #2125.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump sphinx from 6.1.3 to 7.1.1 1826424151  
1655678215 https://github.com/simonw/datasette/pull/2107#issuecomment-1655678215 https://api.github.com/repos/simonw/datasette/issues/2107 IC_kwDOBm6k_c5ir6UH dependabot[bot] 49699333 2023-07-28T13:23:16Z 2023-07-28T13:23:16Z CONTRIBUTOR

Superseded by #2124.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump sphinx from 6.1.3 to 7.1.0 1820346348  
1653652665 https://github.com/simonw/datasette/pull/2077#issuecomment-1653652665 https://api.github.com/repos/simonw/datasette/issues/2077 IC_kwDOBm6k_c5ikLy5 dependabot[bot] 49699333 2023-07-27T13:40:52Z 2023-07-27T13:40:52Z CONTRIBUTOR

Superseded by #2121.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump furo from 2023.3.27 to 2023.5.20 1719759468  
1649849249 https://github.com/simonw/datasette/pull/2075#issuecomment-1649849249 https://api.github.com/repos/simonw/datasette/issues/2075 IC_kwDOBm6k_c5iVrOh dependabot[bot] 49699333 2023-07-25T13:28:35Z 2023-07-25T13:28:35Z CONTRIBUTOR

Superseded by #2107.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump sphinx from 6.1.3 to 7.0.1 1710164693  
1648339661 https://github.com/simonw/sqlite-utils/issues/578#issuecomment-1648339661 https://api.github.com/repos/simonw/sqlite-utils/issues/578 IC_kwDOCGYnMM5iP6rN eyeseast 25778 2023-07-24T17:44:30Z 2023-07-24T17:44:30Z CONTRIBUTOR

A related feature would be support for plugins to add new ways of ingesting data - currently sqlite-utils insert works against JSON, newline-JSON, CSV and TSV.

This is my goal, to have one plugin that handles input and output symmetrically. I'd like to be able to do something like this:

```sh
sqlite-utils insert data.db table file.geojson --format geojson

# ... explore and manipulate in Datasette

sqlite-utils query data.db ... --format geojson > output.geojson
```

This would work especially well with datasette-query-files, since I already have the queries I need saved in standalone SQL files.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Plugin hook for adding new output formats 1818838294  
1646656283 https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646656283 https://api.github.com/repos/simonw/sqlite-utils/issues/567 IC_kwDOCGYnMM5iJfsb eyeseast 25778 2023-07-22T19:32:24Z 2023-07-22T19:32:24Z CONTRIBUTOR

Cool. I might try to add a geojson plugin that handles both input and output. That would help me out a lot.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Plugin system 1801394744  
1642808866 https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1642808866 https://api.github.com/repos/simonw/sqlite-utils/issues/567 IC_kwDOCGYnMM5h60Yi eyeseast 25778 2023-07-19T21:54:27Z 2023-07-19T21:54:27Z CONTRIBUTOR

Would this possibly make a bunch of x-to-sqlite tools obsolete? Or nudge some to become plugins?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Plugin system 1801394744  
1641082395 https://github.com/simonw/datasette/issues/2104#issuecomment-1641082395 https://api.github.com/repos/simonw/datasette/issues/2104 IC_kwDOBm6k_c5h0O4b asg017 15178711 2023-07-18T22:41:37Z 2023-07-18T22:41:37Z CONTRIBUTOR

For filtering virtual tables' "shadow tables" (ex the FTS5 _content and most of the SpatiaLite tables), you can use pragma_table_list (first appeared in SQLite 3.37 (2021-11-27)), which has a type column that calls out type="shadow" tables: https://www.sqlite.org/pragma.html#pragma_table_list
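
A quick way to try this from Python (a sketch; it needs an SQLite build of 3.37 or newer for pragma_table_list, and the database path is just an example):

```python
import sqlite3

conn = sqlite3.connect("fixtures.db")  # example path
shadow_tables = [
    name
    for (name,) in conn.execute(
        "SELECT name FROM pragma_table_list WHERE type = 'shadow'"
    )
]
print(shadow_tables)  # e.g. FTS5 *_content/*_idx tables, SpatiaLite internals
```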

{
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 1
}
Tables starting with an underscore should be treated as hidden 1808215339  
1638910473 https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1638910473 https://api.github.com/repos/simonw/sqlite-utils/issues/567 IC_kwDOCGYnMM5hr8oJ asg017 15178711 2023-07-17T21:27:41Z 2023-07-17T21:27:41Z CONTRIBUTOR

Another use-case: I want to make a sqlite-utils plugin that'll help me insert data into Datasette.

```bash
sqlite-utils insert-datasette \
  --token $DATASETTE_API_KEY \
  https://latest.datasette.io/fixtures/my-table \
  'select ...'
```

This could also be a datasette plugin (ex datasette upload-data ...), but you can also think of sqlite-utils plugins that upload to S3, a postgres table, other DBMS's, etc.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Plugin system 1801394744  
1616853644 https://github.com/simonw/datasette/issues/2087#issuecomment-1616853644 https://api.github.com/repos/simonw/datasette/issues/2087 IC_kwDOBm6k_c5gXzqM asg017 15178711 2023-07-02T22:00:48Z 2023-07-02T22:00:48Z CONTRIBUTOR

I just saw in the docs that Datasette auto-detects settings.json:

settings.json - settings that would normally be passed using --setting - here they should be stored as a JSON object of key/value pairs Source

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`--settings settings.json` option 1765870617  
1616286848 https://github.com/simonw/datasette/issues/2093#issuecomment-1616286848 https://api.github.com/repos/simonw/datasette/issues/2093 IC_kwDOBm6k_c5gVpSA asg017 15178711 2023-07-02T02:17:46Z 2023-07-02T02:17:46Z CONTRIBUTOR

Storing metadata in the database won't be required. I imagine there'll be many different ways to store metadata, including any possible datasette_metadata or sqlite-docs, or the older metadata.json way.

The next question will be how precedence should work - I'd imagine metadata.json > plugins > datasette_metadata > sqlite-docs

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File 1781530343  
1616095810 https://github.com/simonw/datasette/pull/2052#issuecomment-1616095810 https://api.github.com/repos/simonw/datasette/issues/2052 IC_kwDOBm6k_c5gU6pC asg017 15178711 2023-07-01T20:31:31Z 2023-07-01T20:31:31Z CONTRIBUTOR

Just curious, is there a query that can be used to compile this programmatically, or did you identify these through memory?

I just did a github search for user:simonw "def extra_js_urls(" ! Though I'm sure other plugins made by people other than Simon also exist out there https://github.com/search?q=user%3Asimonw+%22def+extra_js_urls%28%22&type=code

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 1651082214  
1613896210 https://github.com/simonw/datasette/issues/2093#issuecomment-1613896210 https://api.github.com/repos/simonw/datasette/issues/2093 IC_kwDOBm6k_c5gMhoS asg017 15178711 2023-06-29T22:53:33Z 2023-06-29T22:53:33Z CONTRIBUTOR

Maybe we can have a separate issue for revamping metadata.json? A datasette_metadata table or the sqlite-docs extension seem like two reasonable additions that we can work through. Storing metadata inside a SQLite database makes sense, but I don't think storing datasette.* style config (ex ports, settings, etc.) inside a SQLite DB makes sense, since it's very environment-dependent

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File 1781530343  
1613895188 https://github.com/simonw/datasette/issues/2093#issuecomment-1613895188 https://api.github.com/repos/simonw/datasette/issues/2093 IC_kwDOBm6k_c5gMhYU asg017 15178711 2023-06-29T22:51:53Z 2023-06-29T22:51:53Z CONTRIBUTOR

I agree with not liking metadata.json stuff in a datasette.* config file. Editing description of a table/column in a file like datasette.* seems odd to me.

Though since plugin configuration currently lives in metadata.json, I think it should be removed from there and placed in datasette.*, at least for top-level config like datasette-auth-github's config. Keeping metadata.json strictly for documentation/licensing/column units makes sense to me, but anything plugin related should be in some config file, like datasette.*.

And ya, supporting both datasette.* and CLI flags makes a lot of sense to me. Any --setting flag should override anything in datasette.* for easier debugging, with possibly a warning message so people don't get confused. Same with --port and a port defined in datasette.*

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File 1781530343  
1613778296 https://github.com/simonw/datasette/pull/2052#issuecomment-1613778296 https://api.github.com/repos/simonw/datasette/issues/2052 IC_kwDOBm6k_c5gME14 asg017 15178711 2023-06-29T20:36:09Z 2023-06-29T20:36:09Z CONTRIBUTOR

Ok @hydrosquall a couple things before this PR should be good to go:

  • Can we move datasette/static/table-example-plugins.js into demos/plugins/static?
  • For datasetteManager.VERSION, can we fill that in or just comment it out for now? Not sure how difficult it'll be to inject it server-side. I imagine we could also have a small build process with esbuild/rollup that just injects a version string into manager.js directly, so we don't have to worry about server-rendering (but that can be a future PR)

In terms of how to integrate this into Datasette, a few options I can see working:

  • Push this as-is and figure it out before the next release
  • Hide this feature behind a settings flag (--setting unstable-js-plugins on) and use that setting to hide/show <script src="{{ urls.static('datasette-manager.js') }}" defer></script> in base.html

I'll let @simonw decide which one to work with. I kind of like the idea of having an "unstable" opt-in process to enable JS plugins, to give us time to try it out with a wide variety of plugins until we feel it's ready.

I'm also curious to see how "plugins for a plugin" would work, like #1542. For example, if the leaflet plugin showed default markers, but also included its own hook for other plugins to add more markers/styling. I imagine that the individual plugin would re-create its own plugin system compared to this, since handling "plugins of plugins" at the top with Datasette seems really convoluted.

Also for posterity, here's a list of Simon's Datasette plugins that use "extra_js_urls()", which probably means they can be ported/re-written to use this new plugin system:

  • datasette-vega
  • datasette-cluster-map
  • datasette-leaflet-geojson
  • datasette-pretty-traces
  • datasette-youtube-embed
  • datasette-leaflet-freedraw
  • datasette-hovercards
  • datasette-mp3-audio
  • datasette-geojson-map
{
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 1
}
feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 1651082214  
1606352600 https://github.com/simonw/datasette/pull/2052#issuecomment-1606352600 https://api.github.com/repos/simonw/datasette/issues/2052 IC_kwDOBm6k_c5fvv7Y asg017 15178711 2023-06-26T00:17:04Z 2023-06-26T00:17:04Z CONTRIBUTOR

:wave: would love to see this get merged soon! I want to make a javascript plugin on top of the code-mirror editor to make a few things nicer (function auto-complete, table/column descriptions, etc.), and this would help out a bunch

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 1651082214  
1592110694 https://github.com/simonw/sqlite-utils/issues/529#issuecomment-1592110694 https://api.github.com/repos/simonw/sqlite-utils/issues/529 IC_kwDOCGYnMM5e5a5m chapmanjacobd 7908073 2023-06-14T23:11:47Z 2023-06-14T23:12:12Z CONTRIBUTOR

sorry i was wrong. sqlite-utils --raw-lines works correctly

```
sqlite-utils --raw-lines :memory: "SELECT * FROM (VALUES ('test'), ('line2'))" | cat -A
test$
line2$

sqlite-utils --csv --no-headers :memory: "SELECT * FROM (VALUES ('test'), ('line2'))" | cat -A
test$
line2$
```

I think this was fixed somewhat recently

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Microsoft line endings 1581090327  
1264218914 https://github.com/simonw/sqlite-utils/issues/491#issuecomment-1264218914 https://api.github.com/repos/simonw/sqlite-utils/issues/491 IC_kwDOCGYnMM5LWnMi chapmanjacobd 7908073 2022-10-01T03:18:36Z 2023-06-14T22:14:24Z CONTRIBUTOR

some good concrete use-cases in mind

I actually found myself wanting something like this the past couple days. The use-case was databases with slightly different schema but same table names.

here is a full script:

```
import argparse
from pathlib import Path

from sqlite_utils import Database


def connect(args, conn=None, **kwargs) -> Database:
    db = Database(conn or args.database, **kwargs)
    with db.conn:
        db.conn.execute("PRAGMA main.cache_size = 8000")
    return db


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser()
    parser.add_argument("database")
    parser.add_argument("dbs_folder")
    parser.add_argument("--db", "-db", help=argparse.SUPPRESS)
    parser.add_argument("--verbose", "-v", action="count", default=0)
    args = parser.parse_args()

    if args.db:
        args.database = args.db
    Path(args.database).touch()
    args.db = connect(args)

    return args


def merge_db(args, source_db):
    source_db = str(Path(source_db).resolve())

    s_db = connect(argparse.Namespace(database=source_db, verbose=args.verbose))
    for table in s_db.table_names():
        data = s_db[table].rows
        args.db[table].insert_all(data, alter=True, replace=True)

    args.db.conn.commit()


def merge_directory():
    args = parse_args()
    source_dbs = list(Path(args.dbs_folder).glob('*.db'))
    for s_db in source_dbs:
        merge_db(args, s_db)


if __name__ == '__main__':
    merge_directory()
```

edit: I've made some improvements to this and put it on PyPI:

```
$ pip install xklb
$ lb merge-db -h
usage: library merge-dbs DEST_DB SOURCE_DB ... [--only-target-columns] [--only-new-rows] [--upsert] [--pk PK ...] [--table TABLE ...]

Merge-DBs will insert new rows from source dbs to target db, table by table. If primary key(s) are provided,
and there is an existing row with the same PK, the default action is to delete the existing row and insert the new row
replacing all existing fields.

Upsert mode will update matching PK rows such that if a source row has a NULL field and
the destination row has a value then the value will be preserved instead of changed to the source row's NULL value.

Ignore mode (--only-new-rows) will insert only rows which don't already exist in the destination db

Test first by using temp databases as the destination db.
Try out different modes / flags until you are satisfied with the behavior of the program

    library merge-dbs --pk path (mktemp --suffix .db) tv.db movies.db

Merge database data and tables

    library merge-dbs --upsert --pk path video.db tv.db movies.db
    library merge-dbs --only-target-columns --only-new-rows --table media,playlists --pk path audio-fts.db audio.db

    library merge-dbs --pk id --only-tables subreddits reddit/81_New_Music.db audio.db
    library merge-dbs --only-new-rows --pk subreddit,path --only-tables reddit_posts reddit/81_New_Music.db audio.db -v

positional arguments:
  database
  source_dbs
```

Also if you want to dedupe a table based on a "business key" which isn't explicitly your primary key(s) you can run this:

```
$ lb dedupe-db -h
usage: library dedupe-dbs DATABASE TABLE --bk BUSINESS_KEYS [--pk PRIMARY_KEYS] [--only-columns COLUMNS]

Dedupe your database (not to be confused with the dedupe subcommand)

It should not need to be said but *backup* your database before trying this tool!

Dedupe-DB will help remove duplicate rows based on non-primary-key business keys

    library dedupe-db ./video.db media --bk path

If --primary-keys is not provided table metadata primary keys will be used
If --only-columns is not provided all non-primary and non-business key columns will be upserted

positional arguments:
  database
  table

options:
  -h, --help            show this help message and exit
  --skip-0
  --only-columns ONLY_COLUMNS
                        Comma separated column names to upsert
  --primary-keys PRIMARY_KEYS, --pk PRIMARY_KEYS
                        Comma separated primary keys
  --business-keys BUSINESS_KEYS, --bk BUSINESS_KEYS
                        Comma separated business keys
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Ability to merge databases and tables 1383646615  
1592052320 https://github.com/simonw/sqlite-utils/issues/535#issuecomment-1592052320 https://api.github.com/repos/simonw/sqlite-utils/issues/535 IC_kwDOCGYnMM5e5Mpg chapmanjacobd 7908073 2023-06-14T22:05:28Z 2023-06-14T22:05:28Z CONTRIBUTOR

piping to jq is good enough usually

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
rows: --transpose or psql extended view-like functionality 1655860104  
1592047502 https://github.com/simonw/sqlite-utils/issues/555#issuecomment-1592047502 https://api.github.com/repos/simonw/sqlite-utils/issues/555 IC_kwDOCGYnMM5e5LeO chapmanjacobd 7908073 2023-06-14T22:00:10Z 2023-06-14T22:01:57Z CONTRIBUTOR

You may want to try doing a performance comparison between this and just selecting all the ids with few constraints and then doing the filtering within python.

That might seem like a lazy-programmer, inefficient way, but queries with large result sets are a different profile than what databases like SQLite are designed for. That is not to say that SQLite is slow or that Python is always faster, but when you start reading >20% of an index there is an equilibrium that is reached, especially when adding in writing extra temp tables and stuff to memory/disk, and especially given the NOT IN style of query...

You may also try chunking like this:

```py
def chunks(lst, n) -> Generator:
    for i in range(0, len(lst), n):
        yield lst[i : i + n]


SQLITE_PARAM_LIMIT = 32765

data = []
chunked = chunks(video_ids, consts.SQLITE_PARAM_LIMIT)
for ids in chunked:
    data.extend(
        list(
            db.query(
                f"""SELECT * from videos WHERE id in ("""
                + ",".join(["?"] * len(ids))
                + ")",
                (*ids,),
            )
        )
    )
```

but that actually won't work with your NOT IN requirements. You need to query the full resultset to check any row.

Since you are doing stuff with files/videos in SQLITE you might be interested in my side project: https://github.com/chapmanjacobd/library

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Filter table by a large bunch of ids 1733198948  
1590531892 https://github.com/simonw/sqlite-utils/issues/557#issuecomment-1590531892 https://api.github.com/repos/simonw/sqlite-utils/issues/557 IC_kwDOCGYnMM5ezZc0 chapmanjacobd 7908073 2023-06-14T06:09:21Z 2023-06-14T06:09:21Z CONTRIBUTOR

I put together a simple script to upsert and remove duplicate rows based on business keys. If anyone has similar problems with the above, this might help

```
CREATE TABLE my_table (
    id INTEGER PRIMARY KEY,
    column1 TEXT,
    column2 TEXT,
    column3 TEXT
);

INSERT INTO my_table (column1, column2, column3)
VALUES
    ('Value 1', 'Duplicate 1', 'Duplicate A'),
    ('Value 2', 'Duplicate 2', 'Duplicate B'),
    ('Value 3', 'Duplicate 2', 'Duplicate C'),
    ('Value 4', 'Duplicate 3', 'Duplicate D'),
    ('Value 5', 'Duplicate 3', 'Duplicate E'),
    ('Value 6', 'Duplicate 3', 'Duplicate F');
```

library dedupe-db test.db my_table --bk column2

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Aliased ROWID option for tables created from alter=True commands 1740150327  
1577355134 https://github.com/simonw/sqlite-utils/issues/557#issuecomment-1577355134 https://api.github.com/repos/simonw/sqlite-utils/issues/557 IC_kwDOCGYnMM5eBId- chapmanjacobd 7908073 2023-06-05T19:26:26Z 2023-06-05T19:26:26Z CONTRIBUTOR

this isn't really actionable... I'm just being a whiny baby. I have tasted the milk of being able to use upsert_all, insert_all, etc. without having to write DDL to create tables. The meat of the issue is that SQLite doesn't make rowid stable between vacuums so it is not possible to take shortcuts

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Aliased ROWID option for tables created from alter=True commands 1740150327  
1575310378 https://github.com/simonw/sqlite-utils/issues/556#issuecomment-1575310378 https://api.github.com/repos/simonw/sqlite-utils/issues/556 IC_kwDOCGYnMM5d5VQq mcint 601708 2023-06-04T01:21:15Z 2023-06-04T01:21:15Z CONTRIBUTOR

I've resolved my use case with line-buffered output and a while read loop for line-buffered input, but I leave this here so the incremental-saving or line-buffered use case can be explicitly handled or rejected (or deferred).

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Support storing incrementally piped values 1740026046  
1548617257 https://github.com/simonw/datasette/pull/2052#issuecomment-1548617257 https://api.github.com/repos/simonw/datasette/issues/2052 IC_kwDOBm6k_c5cTgYp cldellow 193185 2023-05-15T21:32:20Z 2023-05-15T21:32:20Z CONTRIBUTOR

Were you picturing that the whole plugin config object could be returned as a promise, or that the individual hooks (like makeColumnActions or makeAboveTablePanelConfigs supported returning a promise of arrays instead only returning plain arrays?

The latter - that you could return a promise of arrays, so it parallels the "await me maybe" pattern in Datasette, where you can return either a value, a callable or an awaitable.
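
For reference, the Python side of that pattern is roughly this (a simplified sketch, not Datasette's exact implementation):

```python
import inspect


async def await_me_maybe(value):
    # Accept a plain value, a callable returning a value, or an awaitable.
    if callable(value):
        value = value()
    if inspect.isawaitable(value):
        value = await value
    return value
```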

I have a hunch that what you're describing might be achievable without adding Promises to the API with something

Oops, I did a poor job explaining. Yes, this would work - but it requires me to continue to communicate the column names out of band (in order to fetch the facet data per-column before registering my plugin), vs being able to re-use them from the plugin implementation.

This isn't that big of a deal - it'd be a nice ergonomic improvement, but nowhere near as big of an improvement as having an officially sanctioned way to add stuff to the column menus in the first place.

This could also be layered on in a future commit without breaking v1 users, too, so it's not at all urgent.

especially if those lines are encapsulated by a function we provide (maybe something that's available on the window provided by Datasette as an inline script tag

Ah, this is maybe the key point. Since it's all hosted inside Datasette, Datasette can provide some arbitrary sugar to make it easier to work with.

My experience with async scripts in JS is that people sometimes don't understand the race conditions inherent to them. If they copy/paste from a tutorial, it does just work. But then they'll delete half the code, and by chance it still works on their machine/Datasette templates, and now someone's headed for an annoying debugging session -- maybe them, maybe someone else who tries to re-use their plugin.

Again, a fairly minor thing, though.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 1651082214  
1547911570 https://github.com/simonw/datasette/pull/2068#issuecomment-1547911570 https://api.github.com/repos/simonw/datasette/issues/2068 IC_kwDOBm6k_c5cQ0GS dependabot[bot] 49699333 2023-05-15T13:59:35Z 2023-05-15T13:59:35Z CONTRIBUTOR

Superseded by #2075.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump sphinx from 6.1.3 to 7.0.0 1690842199  
1540900733 https://github.com/simonw/sqlite-utils/issues/527#issuecomment-1540900733 https://api.github.com/repos/simonw/sqlite-utils/issues/527 IC_kwDOCGYnMM5b2Ed9 mcarpenter 167893 2023-05-09T21:15:05Z 2023-05-09T21:15:05Z CONTRIBUTOR

Sorry, I completely missed your first comment whilst on Easter break.

This looks like a good practical compromise before v4. Thanks!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`Table.convert()` skips falsey values 1578790070  
1530822437 https://github.com/simonw/datasette/pull/2052#issuecomment-1530822437 https://api.github.com/repos/simonw/datasette/issues/2052 IC_kwDOBm6k_c5bPn8l cldellow 193185 2023-05-02T03:35:30Z 2023-05-02T16:02:38Z CONTRIBUTOR

Also, just checking - is this how I'd write bulletproof plugin registration code that is robust against the order in which the script tags load (eg if both my code and the Datasette code are loaded via a <script async src='...'/> tag)?

```js
if (window.DATASETTE) go(window.DATASETTE);
else document.addEventListener("datasette_init", (evt) => go(evt.detail));

function go(manager) {
    manager.registerPlugin(...)
}
```

I don't know if it'd make sense, but you could also consider the asynchronous queuing pattern that Google Analytics uses (see this Stack Overflow post for more details):

```js
DATASETTE = DATASETTE || [];
DATASETTE.push(go);

function go(manager) {
    manager.registerPlugin(...);
}
```

{
    "total_count": 2,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 1
}
feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 1651082214  
1530817667 https://github.com/simonw/datasette/pull/2052#issuecomment-1530817667 https://api.github.com/repos/simonw/datasette/issues/2052 IC_kwDOBm6k_c5bPmyD cldellow 193185 2023-05-02T03:24:53Z 2023-05-02T03:24:53Z CONTRIBUTOR

Thanks for putting this together! I've been slammed with work/personal stuff so haven't been able to actually prototype anything with this. :(

tl;dr: I think this would be useful immediately as is. It might also be nice if the plugins could return Promises.

The long version: I read the design notes and example plugin. I think I'd be able to use this in datasette-ui-extras for my lazy-facets feature.

The lazy-facets feature tries to provide a snappier user experience. It does this by altering how suggested facets work.

First, at page render time:

  • (A) it lies to Datasette and claims that no columns support facets; this avoids the lengthy delays/timeouts that can happen if the dataset is large
  • (B) there's a python plugin that implements the extra_body_script hook, to write out the list of column names for future use by JavaScript (a rough sketch of this follows below)

Second, at page load time, there is some JavaScript that:

  • (C) makes AJAX requests to suggest facets for each column - it makes 1 request per column, using the data from (B)
  • (D) wires up the column menus to add Facet-by-this options for each facet
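
A rough sketch of the (B) step, assuming the hook's columns and view_name arguments and a made-up window.lazyFacetColumns global:

```python
from datasette import hookimpl
import json


@hookimpl
def extra_body_script(columns, view_name):
    # Hypothetical: expose the table's column names to the page so the
    # client-side code can request facet suggestions per column later.
    if view_name != "table" or not columns:
        return ""
    return "window.lazyFacetColumns = {};".format(json.dumps(columns))
```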

With the currently proposed plugin scheme, I think (D) could be moved into the plugin. I'd do the ajax requests, then register the plugin.

If the plugin scheme also supported promises, I think (B) and (C) could also be moved into the plugin.

Does that make sense? Sorry for the wall of text!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 1651082214  
1529737426 https://github.com/simonw/datasette/pull/2064#issuecomment-1529737426 https://api.github.com/repos/simonw/datasette/issues/2064 IC_kwDOBm6k_c5bLfDS dependabot[bot] 49699333 2023-05-01T13:58:50Z 2023-05-01T13:58:50Z CONTRIBUTOR

Superseded by #2068.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump sphinx from 6.1.3 to 6.2.1 1683229834  
1521837780 https://github.com/simonw/datasette/pull/2063#issuecomment-1521837780 https://api.github.com/repos/simonw/datasette/issues/2063 IC_kwDOBm6k_c5atWbU dependabot[bot] 49699333 2023-04-25T13:57:52Z 2023-04-25T13:57:52Z CONTRIBUTOR

Superseded by #2064.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump sphinx from 6.1.3 to 6.2.0 1681339696  
1501017004 https://github.com/simonw/sqlite-utils/pull/531#issuecomment-1501017004 https://api.github.com/repos/simonw/sqlite-utils/issues/531 IC_kwDOCGYnMM5Zd7Os eyeseast 25778 2023-04-09T01:49:43Z 2023-04-09T01:49:43Z CONTRIBUTOR

I'm going to close this in favor of #536. Will try a cleaner approach to custom paths once that one is merged.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Add paths for homebrew on Apple silicon 1620164673  
1487999503 https://github.com/simonw/datasette/pull/2014#issuecomment-1487999503 https://api.github.com/repos/simonw/datasette/issues/2014 IC_kwDOBm6k_c5YsRIP dependabot[bot] 49699333 2023-03-29T06:09:11Z 2023-03-29T06:09:11Z CONTRIBUTOR

Superseded by #2047.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump black from 22.12.0 to 23.1.0 1566081801  
1486944644 https://github.com/simonw/datasette/pull/2043#issuecomment-1486944644 https://api.github.com/repos/simonw/datasette/issues/2043 IC_kwDOBm6k_c5YoPmE dependabot[bot] 49699333 2023-03-28T13:58:20Z 2023-03-28T13:58:20Z CONTRIBUTOR

Superseded by #2046.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump furo from 2022.12.7 to 2023.3.23 1639446870  
1465315726 https://github.com/simonw/sqlite-utils/pull/531#issuecomment-1465315726 https://api.github.com/repos/simonw/sqlite-utils/issues/531 IC_kwDOCGYnMM5XVvGO eyeseast 25778 2023-03-12T22:21:56Z 2023-03-12T22:21:56Z CONTRIBUTOR

Exactly, that's what I was running into. On my M2 MacBook, SpatiaLite ends up in what is -- for the moment -- a non-standard location, so even when I passed in the location with --load-extension, I still hit an error on create-spatial-index.

What I learned doing this originally is that SQLite needs to load the extension for each connection, even if all the SpatiaLite stuff is already in the database. So that's why init_spatialite() gets called again.

Here's the code where I hit the error: https://github.com/eyeseast/boston-parcels/blob/main/Makefile#L30 It works using this branch.

I'm not attached to this solution if you can think of something better. And I'm not sure, TBH, my test would actually catch what I'm after here.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Add paths for homebrew on Apple silicon 1620164673  
1457172180 https://github.com/simonw/datasette/issues/2033#issuecomment-1457172180 https://api.github.com/repos/simonw/datasette/issues/2033 IC_kwDOBm6k_c5W2q7U eyeseast 25778 2023-03-06T22:54:52Z 2023-03-06T22:54:52Z CONTRIBUTOR

This would be a nice feature to have with datasette publish too.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette install -r requirements.txt` 1612296210  
1444474487 https://github.com/simonw/sqlite-utils/issues/433#issuecomment-1444474487 https://api.github.com/repos/simonw/sqlite-utils/issues/433 IC_kwDOCGYnMM5WGO53 mcarpenter 167893 2023-02-24T20:57:43Z 2023-02-24T22:22:18Z CONTRIBUTOR

I think I see what is happening here, although I haven't quite worked out a fix yet. Usually:

  • click.progressbar.render_progress() renders the cursor invisible on each invocation (update of the bar)
  • When the progress bar goes out of scope, the __exit__() method is invoked, which calls render_finish() to make the cursor re-appear.

(See terminal escape sequences BEFORE_BAR and AFTER_BAR in click).

However the sqlite-utils utils.file_progress context manager wraps click.progressbar and yields an instance of a helper class:

```python
@contextlib.contextmanager
def file_progress(file, silent=False, **kwargs):
    ...
    with click.progressbar(length=file_length, **kwargs) as bar:
        yield UpdateWrapper(file, bar.update)
```

The yielded UpdateWrapper goes out of scope quickly and click.progressbar.__exit__() is called. The cursor is made visible again. However, bar is still live and so when the caller iterates on the yielded wrapper this invokes the bar's update method, calling render_progress(), each time printing the "make cursor invisible" escape code. The progressbar.__exit__ function is not called again, so the cursor doesn't re-appear.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
CLI eats my cursor 1239034903  
1437671409 https://github.com/simonw/datasette/issues/1258#issuecomment-1437671409 https://api.github.com/repos/simonw/datasette/issues/1258 IC_kwDOBm6k_c5VsR_x brandonrobertz 2670795 2023-02-20T23:39:58Z 2023-02-20T23:39:58Z CONTRIBUTOR

This is pretty annoying for FTS because sqlite throws an error instead of just doing something like returning all or no results. This makes users who are unfamiliar with SQL and Datasette think the canned query page is broken and is a frequent source of confusion.

To anyone dealing with this: My solution is to modify the canned query so that it returns no results which cues people to fill in the blank parameters.

So instead of emails_fts match escape_fts(:search))

My canned queries now look like this:

emails_fts match escape_fts(iif(:search=="", "*", :search))

There are no asterisks in my data so the result is always blank.

Ultimately it would be nice to be able to handle this in the metadata. Either making some named parameters required or setting some default values.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Allow canned query params to specify default values 828858421  
1435318713 https://github.com/simonw/sqlite-utils/issues/525#issuecomment-1435318713 https://api.github.com/repos/simonw/sqlite-utils/issues/525 IC_kwDOCGYnMM5VjTm5 mcarpenter 167893 2023-02-17T21:55:01Z 2023-02-17T21:55:01Z CONTRIBUTOR

Meanwhile, a cheap workaround is to invalidate the registered function cache:

```python
table.convert(...)
db._registered_functions = set()
table.convert(...)
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Repeated calls to `Table.convert()` fail 1575131737  
1425974877 https://github.com/simonw/datasette/issues/2023#issuecomment-1425974877 https://api.github.com/repos/simonw/datasette/issues/2023 IC_kwDOBm6k_c5U_qZd cldellow 193185 2023-02-10T15:32:41Z 2023-02-10T15:32:41Z CONTRIBUTOR

I think this feature was removed in Datasette 0.61 and moved to a plugin. People who want hashed URLs can use the datasette-hashed-urls plugin to achieve the same effect.

It looks like you're trying to disable hashed urls, so I think you can just remove that config setting and things will work.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Error: Invalid setting 'hash_urls' in settings.json in 0.64.1 1579695809  
1423387341 https://github.com/simonw/sqlite-utils/issues/525#issuecomment-1423387341 https://api.github.com/repos/simonw/sqlite-utils/issues/525 IC_kwDOCGYnMM5U1yrN mcarpenter 167893 2023-02-08T23:48:52Z 2023-02-09T00:17:30Z CONTRIBUTOR

PR below

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Repeated calls to `Table.convert()` fail 1575131737  
1421571810 https://github.com/simonw/sqlite-utils/issues/520#issuecomment-1421571810 https://api.github.com/repos/simonw/sqlite-utils/issues/520 IC_kwDOCGYnMM5Uu3bi mcarpenter 167893 2023-02-07T22:43:09Z 2023-02-07T22:43:09Z CONTRIBUTOR

Hey, isn't this essentially the same issue as #448 ?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
rows_from_file() raises confusing error if file-like object is not in binary mode 1516644980  
1420941334 https://github.com/simonw/datasette/pull/564#issuecomment-1420941334 https://api.github.com/repos/simonw/datasette/issues/564 IC_kwDOBm6k_c5UsdgW psychemedia 82988 2023-02-07T15:14:10Z 2023-02-07T15:14:10Z CONTRIBUTOR

Is this feature covered by any more recent updates to datasette, or via any plugins that you're aware of?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
First proof-of-concept of Datasette Library 473288428  
1419357290 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1419357290 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5Umaxq eyeseast 25778 2023-02-06T16:21:44Z 2023-02-06T16:21:44Z CONTRIBUTOR

SQLite doesn't have a native DATETIME type. It stores dates internally as strings and then has functions to work with date-like strings. Yes, it's weird.
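
A quick demonstration of this, using a hypothetical table: a column declared DATETIME still stores the value as TEXT, and the date functions simply parse the string.

```python
import sqlite3

db = sqlite3.connect(":memory:")
# DATETIME here is just a type name; SQLite has no DATETIME storage class.
db.execute("CREATE TABLE events (happened_at DATETIME)")
db.execute("INSERT INTO events VALUES ('2023-02-06 16:21:44')")

print(db.execute(
    "SELECT typeof(happened_at), strftime('%Y', happened_at) FROM events"
).fetchone())
# ('text', '2023')
```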

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1407767434 https://github.com/simonw/datasette/issues/1696#issuecomment-1407767434 https://api.github.com/repos/simonw/datasette/issues/1696 IC_kwDOBm6k_c5T6NOK cldellow 193185 2023-01-29T20:56:20Z 2023-01-29T20:56:20Z CONTRIBUTOR

I did some horrible things in https://github.com/cldellow/datasette-ui-extras/issues/2 to enable this in my plugin -- example here: https://dux-demo.fly.dev/cooking/posts?_facet=owner_user_id&owner_user_id=67

The implementation relies on two things:

  • a filters_from_request hook that adds a good human description (unfortunately, without the benefit of the CSS styling you mention)
  • doing something evil to hijack the exact and not operators in the Filters class. We can't leave them as is, or we'll get 2 human descriptions -- the built-in Datasette one and the one from my plugin. We can't remove them, or the filters UI will stop supporting the = and != operators

This got me thinking: it'd be neat if the list of operators that the filters UI supported wasn't a closed set.

A motivating example: adding a geospatial NEAR operator. Ideally it'd take two arguments - a target point and a radius, so you could express a filter like find me all rows whose lat/lng are within 10km of 43.4516° N, 80.4925° W. (Optionally, the UI could be enhanced if the geonames database was loaded and queried, so a user could say find me all rows whose lat/lng are within 10km of Kitchener, ON, and the city gets translated to a lat/lng for them)
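
For illustration, a rough, untested sketch of what such a plugin-provided "near" filter might look like with the existing filters_from_request hook. The latitude/longitude column names and the bounding-box approximation are made up, and the FilterArguments call reflects my reading of the hook's documentation rather than anything from this thread:

```python
from datasette import hookimpl
from datasette.filters import FilterArguments


@hookimpl
def filters_from_request(request, database, table, datasette):
    # e.g. ?_near=43.4516,-80.4925&_radius_km=10
    near = request.args.get("_near")
    if not near:
        return None
    radius_km = float(request.args.get("_radius_km", "10"))
    lat, lng = (float(part) for part in near.split(","))
    delta = radius_km / 111.0  # ~111 km per degree; crude, ignores longitude scaling

    return FilterArguments(
        [
            "latitude BETWEEN :lat_lo AND :lat_hi",
            "longitude BETWEEN :lng_lo AND :lng_hi",
        ],
        params={
            "lat_lo": lat - delta,
            "lat_hi": lat + delta,
            "lng_lo": lng - delta,
            "lng_hi": lng + delta,
        },
        human_descriptions=[f"within ~{radius_km} km of {lat}, {lng}"],
    )
```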

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Show foreign key label when filtering 1186696202  
1407716963 https://github.com/simonw/datasette/pull/2008#issuecomment-1407716963 https://api.github.com/repos/simonw/datasette/issues/2008 IC_kwDOBm6k_c5T6A5j cldellow 193185 2023-01-29T17:04:03Z 2023-01-29T17:04:03Z CONTRIBUTOR

Performance tests - I think most places don't have them as a formal gate enforced by CI. TypeScript and scalac seem to have tests that run to capture timings. The timings are included by a bot as a comment or build check, and also stored in a database so you can graph changes over time to spot regressions. Probably overkill for Datasette!

Window functions - oh, good point. Looks like Ubuntu shipped JSON1 support as far back as sqlite 3.11. I'll let this PR linger until there's a way to run against different SQLite versions. For now, I'm shipping this with datasette-ui-extras, since I think it's OK for a plugin to enforce a higher minimum requirement.

Tests - there actually did end up being test changes to capture the undercount bug of the current implementation, so the current implementation would fail against the new tests.

Perhaps a non-window function version could be written that uses random() instead of row_number() over () in order to get a unique key. It's technically not unique, but in practice, I imagine it'll work well.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
array facet: don't materialize unnecessary columns 1560982210  
1407561308 https://github.com/simonw/datasette/pull/2008#issuecomment-1407561308 https://api.github.com/repos/simonw/datasette/issues/2008 IC_kwDOBm6k_c5T5a5c cldellow 193185 2023-01-29T04:50:50Z 2023-01-29T04:50:50Z CONTRIBUTOR

I pushed a revised version which ends up being faster -- the example which currently takes 4 seconds now runs in 500ms.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
array facet: don't materialize unnecessary columns 1560982210  
1407558284 https://github.com/simonw/datasette/pull/2008#issuecomment-1407558284 https://api.github.com/repos/simonw/datasette/issues/2008 IC_kwDOBm6k_c5T5aKM cldellow 193185 2023-01-29T04:23:58Z 2023-01-29T04:24:27Z CONTRIBUTOR

Ack, this PR is broken. I see now that the inner.* is necessary for ensuring the correct count in the face of rows having duplicate values in views.

That fixes the overcounting, but I think it can undercount when the rows have the same data, e.g. a view like:

sql SELECT '["bar"]' tags UNION ALL SELECT '["bar"]'

will produce a count of {"bar": 1 }, when it should be {"bar": 2}. In fact, this could apply in tables without primary keys, too.

If inner came from a base table that had a primary key or a rowid, we could use those column(s) to solve that case.

I guess a general solution would be to compute a window function so we have a distinct ID for each row. Will fiddle to see if I can get that working.
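
To illustrate the idea, here is a minimal, hypothetical sketch (not the PR's actual facet SQL) showing how duplicate rows in a view get collapsed and undercounted, and how tagging each row with row_number() over () (SQLite 3.25+, with JSON1) restores the correct count:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE VIEW v AS SELECT '["bar"]' AS tags UNION ALL SELECT '["bar"]'""")

# Deduplicating the source rows collapses the two identical rows into one,
# so "bar" is counted once instead of twice. (inner_ stands in for the inner subquery.)
naive = db.execute("""
    SELECT j.value, count(*)
    FROM (SELECT DISTINCT tags FROM v) AS inner_, json_each(inner_.tags) AS j
    GROUP BY j.value
""").fetchall()

# row_number() over () gives every row a distinct key, so both copies survive.
windowed = db.execute("""
    SELECT j.value, count(*)
    FROM (SELECT tags, row_number() over () AS rn FROM v) AS inner_,
         json_each(inner_.tags) AS j
    GROUP BY j.value
""").fetchall()

print(naive)     # [('bar', 1)]
print(windowed)  # [('bar', 2)]
```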

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
array facet: don't materialize unnecessary columns 1560982210  
1407523547 https://github.com/simonw/datasette/issues/1973#issuecomment-1407523547 https://api.github.com/repos/simonw/datasette/issues/1973 IC_kwDOBm6k_c5T5Rrb cldellow 193185 2023-01-29T00:40:31Z 2023-01-29T00:40:31Z CONTRIBUTOR

A +1 for switching to CustomRow: I think you currently only get a CustomRow if the result set had a column that was an fkey (this code)

Otherwise you get vanilla sqlite3.Rows, which will fail if you try to access .columns or look up the cell by name, which surprised me recently.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
render_cell plugin hook's row object is not a sqlite.Row 1515815014  
1407470429 https://github.com/simonw/datasette/pull/2008#issuecomment-1407470429 https://api.github.com/repos/simonw/datasette/issues/2008 IC_kwDOBm6k_c5T5Etd cldellow 193185 2023-01-28T19:34:29Z 2023-01-28T19:34:29Z CONTRIBUTOR

I don't know how/if you do automated tests for performance, so I haven't changed any of the tests.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
array facet: don't materialize unnecessary columns 1560982210  
1407264466 https://github.com/simonw/sqlite-utils/issues/523#issuecomment-1407264466 https://api.github.com/repos/simonw/sqlite-utils/issues/523 IC_kwDOCGYnMM5T4SbS fgregg 536941 2023-01-28T02:41:14Z 2023-01-28T02:41:14Z CONTRIBUTOR

I also often then run another little script to cast all empty strings to null, but I save that for another issue if this gets accepted.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Feature request: trim all leading and trailing white space for all columns for all tables in a database 1560651350  
1404070841 https://github.com/simonw/sqlite-utils/pull/203#issuecomment-1404070841 https://api.github.com/repos/simonw/sqlite-utils/issues/203 IC_kwDOCGYnMM5TsGu5 fgregg 536941 2023-01-25T18:47:18Z 2023-01-25T18:47:18Z CONTRIBUTOR

i'll adopt this PR to make the changes @simonw suggested https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567932

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
changes to allow for compound foreign keys 743384829  
1404065571 https://github.com/simonw/datasette/pull/2003#issuecomment-1404065571 https://api.github.com/repos/simonw/datasette/issues/2003 IC_kwDOBm6k_c5TsFcj fgregg 536941 2023-01-25T18:44:42Z 2023-01-25T18:44:42Z CONTRIBUTOR

see this related discussion to a change in API in sqlite-utils https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567932

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Show referring tables and rows when the referring foreign key is compound 1555701851  
1403084856 https://github.com/simonw/datasette/issues/2001#issuecomment-1403084856 https://api.github.com/repos/simonw/datasette/issues/2001 IC_kwDOBm6k_c5ToWA4 cldellow 193185 2023-01-25T04:31:02Z 2023-01-25T04:31:02Z CONTRIBUTOR

Aha, it's user error on my part.

Adding

```python
sqlite3_db_config.argtypes = [ctypes.c_void_p, ctypes.c_int, ctypes.c_int, ctypes.c_int]
```

makes it work reliably both on the CLI and from datasette, and now I can reproduce the errors you mentioned in the issue description.
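
For context, a minimal sketch of the declaration in isolation (assuming a system where ctypes.util can locate libsqlite3; the db-handle plumbing from the gist above is omitted):

```python
import ctypes
import ctypes.util

libsqlite3 = ctypes.CDLL(ctypes.util.find_library("sqlite3"))
sqlite3_db_config = libsqlite3.sqlite3_db_config

# Without argtypes, ctypes guesses each argument's C type (Python ints become
# C int), which is not safe for the 64-bit sqlite3* handle in the first slot.
sqlite3_db_config.argtypes = [
    ctypes.c_void_p,  # sqlite3 *db
    ctypes.c_int,     # int op, e.g. one of the SQLITE_DBCONFIG_* constants
    ctypes.c_int,     # int onoff
    ctypes.c_int,     # trailing out-parameter slot, passed as 0 in the gist
]
sqlite3_db_config.restype = ctypes.c_int
```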

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Datasette is not compatible with SQLite's strict quoting compilation option 1553615704  
1403078134 https://github.com/simonw/datasette/issues/2001#issuecomment-1403078134 https://api.github.com/repos/simonw/datasette/issues/2001 IC_kwDOBm6k_c5ToUX2 cldellow 193185 2023-01-25T04:20:43Z 2023-01-25T04:22:28Z CONTRIBUTOR

I'm on Ubuntu, unfortunately. :( Would it still be relevant?

I think I've narrowed things down a bit more.

Even sqlite3_free(sqlite3_malloc(128)) segfaults -- this suggests to me that it's something about the sqlite3 library that was loaded, vs, say, getting the wrong db handle when I go spelunking in the Connection object.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Datasette is not compatible with SQLite's strict quoting compilation option 1553615704  
1403053144 https://github.com/simonw/datasette/issues/2001#issuecomment-1403053144 https://api.github.com/repos/simonw/datasette/issues/2001 IC_kwDOBm6k_c5ToORY cldellow 193185 2023-01-25T03:34:53Z 2023-01-25T03:34:53Z CONTRIBUTOR

Your comment introduced me to this issue in sqlite and to the ctypes module - thanks!

I also hope that the datasette developers will enable this mode in a test environment [...] perhaps we could figure out how to invoke it using ctypes

I'm not a Datasette developer, but I am curious to learn more about getting unholy access to the sqlite C APIs inside of Datasette. (Such access could also help #1293, and if done without grovelling inside of pysqlite's Connection object for the db handle, could even be relatively safe.)

I experimented a bit. I came up with https://gist.github.com/cldellow/85bba507c314b127f85563869cd94820

If you run python3 enable-strict-quoting-sqlite3.py, it seems to set those flags correctly -- SELECT "foo" fails where it would normally succeed.

But if you put it in a plugins/ dir and run datasette --plugins-dir plugins/, it segfaults when it tries to call sqlite3_db_config on the connections created by Datasette.

I am... confused. I'm pretty sure I'm using the same python and the same libsqlite3 in both scenarios, so I would expect it to work.

@gwk do you know anything that might help me debug the segfault? I gather that my approach of going grovelling inside of a PyObject is particularly dangerous, but I was thinking (a) it's necessary in order to test Datasette's use of the sqlite3 library and (b) even if it's not portable, it'd be good enough for running the tests on a single machine.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Datasette is not compatible with SQLite's strict quoting compilation option 1553615704  
1402900354 https://github.com/simonw/datasette/issues/1099#issuecomment-1402900354 https://api.github.com/repos/simonw/datasette/issues/1099 IC_kwDOBm6k_c5Tno-C fgregg 536941 2023-01-25T00:58:26Z 2023-01-25T00:58:26Z CONTRIBUTOR

My original idea for compound foreign keys was to turn both of those columns into links, but that doesn't fit here because database_name is already part of a different foreign key.

it's pretty hard to know what the right thing to do is if a field is part of multiple foreign keys.

but, if that's not the case, what about making each of the columns a link. seems like an improvement over the status quo.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Support linking to compound foreign keys 743371103  
1402898291 https://github.com/simonw/datasette/issues/1099#issuecomment-1402898291 https://api.github.com/repos/simonw/datasette/issues/1099 IC_kwDOBm6k_c5Tnodz fgregg 536941 2023-01-25T00:55:06Z 2023-01-25T00:55:06Z CONTRIBUTOR

I went ahead and spiked something together, in #2003

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Support linking to compound foreign keys 743371103  
1402898033 https://github.com/simonw/datasette/pull/2003#issuecomment-1402898033 https://api.github.com/repos/simonw/datasette/issues/2003 IC_kwDOBm6k_c5TnoZx fgregg 536941 2023-01-25T00:54:41Z 2023-01-25T00:54:41Z CONTRIBUTOR

@simonw, let me know what you think about this approach!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Show referring tables and rows when the referring foreign key is compound 1555701851  
1402563930 https://github.com/simonw/datasette/issues/1099#issuecomment-1402563930 https://api.github.com/repos/simonw/datasette/issues/1099 IC_kwDOBm6k_c5TmW1a fgregg 536941 2023-01-24T20:11:11Z 2023-01-24T20:11:11Z CONTRIBUTOR

hi @simonw, this bug bit me today.

the UX for linking from a table to the foreign key seems tough!

the design in the other direction seems a lot easier, for a given primary key detail page, add links back to the tables that refer to the row.

would you be open to a PR that solved the second problem but not the first?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Support linking to compound foreign keys 743371103  
1399847946 https://github.com/simonw/datasette/issues/2000#issuecomment-1399847946 https://api.github.com/repos/simonw/datasette/issues/2000 IC_kwDOBm6k_c5Tb_wK cldellow 193185 2023-01-23T06:08:00Z 2023-01-23T06:08:00Z CONTRIBUTOR

Actually, I discovered your post showing how a plugin can add a Datasette hook. That's wild! I've released datasette-rewrite-sql that adds this ability, albeit via monkey patching.

I had hoped to be able to expose request to the hook (or, even better, actor) when the SQL was being run as a result of a user's HTTP request.

But some spelunking in the code makes me suspect that would actually require co-operation from Datasette itself. I'd be happy to be wrong and pointed in the right direction, though!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
rewrite_sql hook 1552368054  
1399589414 https://github.com/simonw/datasette/pull/1159#issuecomment-1399589414 https://api.github.com/repos/simonw/datasette/issues/1159 IC_kwDOBm6k_c5TbAom cldellow 193185 2023-01-22T19:48:41Z 2023-01-22T19:48:41Z CONTRIBUTOR

Hey @lovasoa, I hope you don't mind - I pulled this PR into datasette-ui-extras, a plugin I'm making that collects UI tweaks to Datasette.

You can apply it to your own Datasette instance by running datasette install datasette-ui-extras

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Improve the display of facets information 774332247  
1376620851 https://github.com/simonw/datasette/pull/1982#issuecomment-1376620851 https://api.github.com/repos/simonw/datasette/issues/1982 IC_kwDOBm6k_c5SDZEz dependabot[bot] 49699333 2023-01-10T02:03:18Z 2023-01-10T02:03:18Z CONTRIBUTOR

Looks like sphinx is up-to-date now, so this is no longer needed.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump sphinx from 5.3.0 to 6.1.2 1525560504  
1375810027 https://github.com/simonw/datasette/issues/1983#issuecomment-1375810027 https://api.github.com/repos/simonw/datasette/issues/1983 IC_kwDOBm6k_c5SATHr eyeseast 25778 2023-01-09T15:35:58Z 2023-01-09T15:35:58Z CONTRIBUTOR

Yes please, and thank you. I realized I was maybe getting myself in trouble using that, but I think it's a good way to standardize JSON handling.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Make CustomJSONEncoder a documented public API 1525815985  
1375708725 https://github.com/simonw/datasette/issues/1978#issuecomment-1375708725 https://api.github.com/repos/simonw/datasette/issues/1978 IC_kwDOBm6k_c5R_6Y1 eyeseast 25778 2023-01-09T14:30:00Z 2023-01-09T14:30:00Z CONTRIBUTOR

Totally missed that issue. I can close this as a duplicate.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Document datasette.urls.row and row_blob 1522778923  
1375596856 https://github.com/simonw/datasette/pull/1977#issuecomment-1375596856 https://api.github.com/repos/simonw/datasette/issues/1977 IC_kwDOBm6k_c5R_fE4 dependabot[bot] 49699333 2023-01-09T13:06:14Z 2023-01-09T13:06:14Z CONTRIBUTOR

Superseded by #1982.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump sphinx from 5.3.0 to 6.1.1 1522552817  
1373592231 https://github.com/simonw/datasette/pull/1976#issuecomment-1373592231 https://api.github.com/repos/simonw/datasette/issues/1976 IC_kwDOBm6k_c5R31qn dependabot[bot] 49699333 2023-01-06T13:02:15Z 2023-01-06T13:02:15Z CONTRIBUTOR

Superseded by #1977.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump sphinx from 5.3.0 to 6.1.0 1520712722  
1372188571 https://github.com/simonw/datasette/pull/1974#issuecomment-1372188571 https://api.github.com/repos/simonw/datasette/issues/1974 IC_kwDOBm6k_c5Rye-b dependabot[bot] 49699333 2023-01-05T13:02:40Z 2023-01-05T13:02:40Z CONTRIBUTOR

Superseded by #1976.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump sphinx from 5.3.0 to 6.0.0 1516376583  
1369044959 https://github.com/simonw/datasette/issues/1973#issuecomment-1369044959 https://api.github.com/repos/simonw/datasette/issues/1973 IC_kwDOBm6k_c5Rmfff cldellow 193185 2023-01-02T15:41:40Z 2023-01-02T15:41:40Z CONTRIBUTOR

Thanks for the response!

Yes, it does seem like a pretty nice developer experience--both the automagical labelling of fkeys, and the ability to index the row by column name in addition to column index.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
render_cell plugin hook's row object is not a sqlite.Row 1515815014  
1364345119 https://github.com/simonw/datasette/issues/1614#issuecomment-1364345119 https://api.github.com/repos/simonw/datasette/issues/1614 IC_kwDOBm6k_c5RUkEf fgregg 536941 2022-12-23T21:27:10Z 2022-12-23T21:27:10Z CONTRIBUTOR

is this issue closed by #1893?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Try again with SQLite codemirror support 1115435536  
1364345071 https://github.com/simonw/datasette/issues/1796#issuecomment-1364345071 https://api.github.com/repos/simonw/datasette/issues/1796 IC_kwDOBm6k_c5RUkDv fgregg 536941 2022-12-23T21:27:02Z 2022-12-23T21:27:02Z CONTRIBUTOR

@simonw is this issue closed by #1893?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Research an upgrade to CodeMirror 6 1355148385  
1339916064 https://github.com/simonw/datasette/pull/1931#issuecomment-1339916064 https://api.github.com/repos/simonw/datasette/issues/1931 IC_kwDOBm6k_c5P3X8g davidbgk 3556 2022-12-06T19:42:45Z 2022-12-06T19:42:45Z CONTRIBUTOR

The "return": true option is really nice!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
/db/table/-/upsert 1473814539  
1339906969 https://github.com/simonw/datasette/issues/1929#issuecomment-1339906969 https://api.github.com/repos/simonw/datasette/issues/1929 IC_kwDOBm6k_c5P3VuZ davidbgk 3556 2022-12-06T19:34:20Z 2022-12-06T19:34:20Z CONTRIBUTOR

I confirm that it works 👍

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Incorrect link from the API explorer to the JSON API documentation 1473659191  
1332310772 https://github.com/simonw/datasette/issues/1605#issuecomment-1332310772 https://api.github.com/repos/simonw/datasette/issues/1605 IC_kwDOBm6k_c5PaXL0 eyeseast 25778 2022-11-30T15:06:37Z 2022-11-30T15:06:37Z CONTRIBUTOR

I'll add issues for both and do a documentation PR.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Scripted exports 1108671952  
1331187551 https://github.com/simonw/datasette/issues/1605#issuecomment-1331187551 https://api.github.com/repos/simonw/datasette/issues/1605 IC_kwDOBm6k_c5PWE9f eyeseast 25778 2022-11-29T19:29:42Z 2022-11-29T19:29:42Z CONTRIBUTOR

Interesting. I started a version using metadata like I outlined up top, but I realized that there's no documented way for a plugin to access either metadata or canned queries. Or at least, I couldn't find a way.

There is this method: https://github.com/simonw/datasette/blob/main/datasette/app.py#L472 but I don't want to rely on it if it's not documented. Same with this: https://github.com/simonw/datasette/blob/main/datasette/app.py#L544

If those are safe, I'll build on them. I'm also happy to document them, if that greases the wheels.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Scripted exports 1108671952  
1321460293 https://github.com/simonw/datasette/issues/1884#issuecomment-1321460293 https://api.github.com/repos/simonw/datasette/issues/1884 IC_kwDOBm6k_c5Ow-JF asg017 15178711 2022-11-21T04:40:55Z 2022-11-21T04:40:55Z CONTRIBUTOR

Counting any virtual tables can be pretty tricky. On one hand, counting a CSV virtual table would return the number of rows in the CSV, which is helpful (but can be I/O intensive). Counting an FTS5 virtual table would return the number of entries in the FTS index, which is kind of helpful, but can be misleading in some cases.

On the other hand, arbitrarily running COUNT(*) on some virtual tables can be incredibly expensive. SQLite offers no shortcuts/pushdowns on COUNT(*) queries for virtual tables, and instead calls the underlying vtab implementation and iterates through all rows in the table without discretion. For example, a virtual table that's backed by a Postgres table would call select * from pg_table, which would use up a lot of network and CPU calls. Or a virtual table backed by a Google Sheet would make network/API requests to get all the rows from the sheet just to make a count.

The pragma_table_list pragma tells you whether a table is a regular table or virtual (in the type column), but was only added in version 3.37.0 (2021-11-27).

Personally, I wouldn't try to COUNT(*) virtual tables - it depends on how the virtual table is implemented, it requires that the connection has the proper extensions loaded, and it may accidentally cause perf issues for new-age extensions. A few extensions that I'm writing have virtual tables that wouldn't benefit much from COUNT(*), and the fact that SQLite iterates through all rows in a table to count just makes things worse.
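
For illustration, a rough sketch (assuming SQLite 3.37+ and a local file named data.db, both hypothetical here) of skipping virtual tables when gathering counts:

```python
import sqlite3

db = sqlite3.connect("data.db")

# pragma_table_list reports "table", "view", "virtual" or "shadow" in its type column.
for name, table_type in db.execute(
    "SELECT name, type FROM pragma_table_list WHERE schema = 'main'"
):
    if table_type != "table":
        print(f"skipping {table_type} {name}")
        continue
    (count,) = db.execute(f"SELECT count(*) FROM [{name}]").fetchone()
    print(name, count)
```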

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Exclude virtual tables from datasette inspect 1439009231  
1321241426 https://github.com/simonw/datasette/issues/1886#issuecomment-1321241426 https://api.github.com/repos/simonw/datasette/issues/1886 IC_kwDOBm6k_c5OwItS fgregg 536941 2022-11-20T20:58:54Z 2022-11-20T20:58:54Z CONTRIBUTOR

i wrote up a blog post of how i'm using it! https://bunkum.us/2022/11/20/mgdo-stack.html

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Call for birthday presents: if you're using Datasette, let us know how you're using it here 1447050738  
1319533445 https://github.com/simonw/datasette/issues/1897#issuecomment-1319533445 https://api.github.com/repos/simonw/datasette/issues/1897 IC_kwDOBm6k_c5OpnuF bgrins 95570 2022-11-18T04:38:03Z 2022-11-18T04:38:03Z CONTRIBUTOR

Are you tracking the change to send the JSON over to the frontend separately or was that part of this? Something like this is probably pretty close https://github.com/bgrins/datasette/commit/8431c98850c7a552dbcde2a4dd0c3dc942a97d25#diff-0c93232bfd5477eeac96382e52769108b41433d960d5277ffcccf2f464e60abdR9

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Serve schema JSON to the SQL editor to enable autocomplete 1452457263  
1318897922 https://github.com/simonw/datasette/issues/1899#issuecomment-1318897922 https://api.github.com/repos/simonw/datasette/issues/1899 IC_kwDOBm6k_c5OnMkC bgrins 95570 2022-11-17T16:32:42Z 2022-11-17T16:32:42Z CONTRIBUTOR

Another idea would be to just not set a min-height and allow the 1 line input to be 1 line high.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Clicking within the CodeMirror area below the SQL (i.e. when there's only a single line) doesn't cause the editor to get focused  1452495049  
1297788531 https://github.com/simonw/sqlite-utils/pull/508#issuecomment-1297788531 https://api.github.com/repos/simonw/sqlite-utils/issues/508 IC_kwDOCGYnMM5NWq5z chapmanjacobd 7908073 2022-10-31T22:54:33Z 2022-11-17T15:11:16Z CONTRIBUTOR

Maybe this is actually a problem in the Python sqlite bindings. Given SQLite's stance on this they should probably use encode('utf-8', 'surrogatepass'). As far as I understand, the error here won't actually be resolved by this PR as-is. We would need to modify the data with surrogateescape... :/ or modify the sqlite3 module to use surrogatepass
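
A small illustration of the difference between the two error handlers: "\udcff" is the lone surrogate that errors="surrogateescape" produces when decoding the raw byte 0xff.

```python
s = "bad \udcff byte"

try:
    s.encode("utf-8")  # strict encoding refuses lone surrogates
except UnicodeEncodeError as err:
    print(err)

print(s.encode("utf-8", "surrogatepass"))    # b'bad \xed\xb3\xbf byte' - surrogate kept as-is
print(s.encode("utf-8", "surrogateescape"))  # b'bad \xff byte' - original raw byte restored
```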

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Allow surrogates in parameters 1430563092  

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);