issue_comments


9,673 rows sorted by updated_at descending


user >30

  • simonw 8,166
  • codecov[bot] 189
  • fgregg 79
  • eyeseast 68
  • russss 39
  • psychemedia 35
  • abdusco 26
  • bgrins 24
  • mroswell 22
  • cldellow 21
  • aborruso 19
  • chrismp 18
  • chapmanjacobd 17
  • brandonrobertz 15
  • dependabot[bot] 15
  • jacobian 14
  • carlmjohnson 14
  • RhetTbull 14
  • tballison 13
  • wragge 12
  • tsibley 11
  • rixx 11
  • stonebig 11
  • frafra 10
  • terrycojones 10
  • rayvoelker 10
  • maxhawkins 9
  • clausjuhl 9
  • bobwhitelock 9
  • 20after4 8
  • …

issue >30

  • Redesign default .json format 54
  • Show column metadata plus links for foreign keys on arbitrary query results 51
  • Rethink how .ext formats (v.s. ?_format=) works before 1.0 48
  • Upgrade to CodeMirror 6, add SQL autocomplete 48
  • JavaScript plugin hooks mechanism similar to pluggy 47
  • Updated Dockerfile with SpatiaLite version 5.0 45
  • Complete refactor of TableView and table.html template 45
  • Port Datasette to ASGI 42
  • Authentication (and permissions) as a core concept 40
  • invoke_startup() is not run in some conditions, e.g. gunicorn/uvicorn workers, breaking lots of things 36
  • Deploy a live instance of demos/apache-proxy 34
  • await datasette.client.get(path) mechanism for executing internal requests 33
  • Maintain an in-memory SQLite table of connected databases and their tables 32
  • Research: demonstrate if parallel SQL queries are worthwhile 32
  • Ability to sort (and paginate) by column 31
  • Default API token authentication mechanism 30
  • Port as many tests as possible to async def tests against ds_client 29
  • link_or_copy_directory() error - Invalid cross-device link 28
  • Export to CSV 27
  • base_url configuration setting 27
  • Documentation with recommendations on running Datasette in production without using Docker 27
  • Optimize all those calls to index_list and foreign_key_list 27
  • Support cross-database joins 26
  • Ability for a canned query to write to the database 26
  • table.transform() method for advanced alter table 26
  • New pattern for views that return either JSON or HTML, available for plugins 26
  • Add ?_extra= mechanism for requesting extra properties in JSON 25
  • Proof of concept for Datasette on AWS Lambda with EFS 25
  • WIP: Add Gmail takeout mbox import 25
  • Redesign register_output_renderer callback 24
  • …
id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions issue performed_via_github_app
1461047607 https://github.com/simonw/datasette/pull/1999#issuecomment-1461047607 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5XFdE3 simonw 9599 2023-03-08T23:51:46Z 2023-03-08T23:51:46Z OWNER

This feels quite nice:

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1461044477 https://github.com/simonw/datasette/pull/1999#issuecomment-1461044477 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5XFcT9 simonw 9599 2023-03-08T23:47:26Z 2023-03-08T23:47:26Z OWNER

I want to package together all of the extras that are needed for the HTML format. A few options for doing that:

  • Introduce ?_extra=_html where the leading underscore indicates that this is a "bundle" of extras, then define a bundle that's everything needed for the HTML renderer
  • Have some other mechanism whereby different renderers can request a bundle of extras.

I'm leaning towards the first option. I'll try that and see what it looks like.
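
A minimal sketch of how that bundle idea could work, assuming a simple mapping from bundle names to their component extras (all names here are hypothetical, not Datasette's actual registry):

```python
# Hypothetical sketch only - the bundle name and the extras inside it are assumptions.
EXTRA_BUNDLES = {
    "_html": ["query", "display_columns_and_rows", "context"],
}

def expand_extras(requested):
    expanded = []
    for name in requested:
        if name.startswith("_"):
            # A leading underscore marks a "bundle" of extras
            expanded.extend(EXTRA_BUNDLES.get(name, []))
        else:
            expanded.append(name)
    # De-duplicate while preserving order
    return list(dict.fromkeys(expanded))

print(expand_extras(["_html", "count"]))
# ['query', 'display_columns_and_rows', 'context', 'count']
```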

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1461023559 https://github.com/simonw/datasette/pull/1999#issuecomment-1461023559 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5XFXNH simonw 9599 2023-03-08T23:23:02Z 2023-03-08T23:23:02Z OWNER

To get this unblocked, I'm going to allow myself to pass non-JSON-serializable objects to the HTML template version of things. If I can get that working (and get the existing tests to pass) I can consider a later change that makes those JSON serializable - or admit that it's OK for the templates to have non-JSON data passed to them and figure out how best to document those variables independently from the JSON documentation.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1461002039 https://github.com/simonw/datasette/pull/1999#issuecomment-1461002039 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5XFR83 simonw 9599 2023-03-08T22:58:16Z 2023-03-08T23:02:09Z OWNER

The reason for that Row thing is that it allows custom templates that do things like this:

https://docs.datasette.io/en/stable/changelog.html#easier-custom-templates-for-table-rows

```html+jinja
{% for row in display_rows %}
    <div>
        <h2>{{ row["title"] }}</h2>
        <p>{{ row["description"] }}</p>
        <p>Category: {{ row.display("category_id") }}</p>
    </div>
{% endfor %}
```

Is that a good design? The .display() thing feels weird - I wonder if anyone has ever actually used that.

It's documented here: https://docs.datasette.io/en/0.64.2/custom_templates.html#custom-templates

If you want to output the rendered HTML version of a column, including any links to foreign keys, you can use {{ row.display("column_name") }}.

I can't see any examples of anyone using it in this code search: https://cs.github.com/?scopeName=All+repos&scope=&q=datasette+row.display

It is however useful to have some kind of abstraction layer here that insulates the SQLite Row object, since having an extra layer will help if Datasette ever grows support for alternative database backends such as DuckDB or PostgreSQL.
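
A rough sketch of what such an insulating layer might look like (class name and rendering details are assumptions, not Datasette's actual code):

```python
# Hypothetical wrapper that hides the underlying sqlite3.Row from templates.
class DisplayRow:
    def __init__(self, raw_row, rendered_cells):
        self._raw = raw_row              # e.g. a sqlite3.Row or a plain dict
        self._rendered = rendered_cells  # column name -> rendered HTML string

    def __getitem__(self, column):
        # Plain value access: row["title"]
        return self._raw[column]

    def display(self, column):
        # Rendered value access: row.display("category_id"),
        # e.g. a foreign key value wrapped in a link
        return self._rendered[column]
```

Swapping in DuckDB or PostgreSQL would then only mean changing what gets passed in as raw_row.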

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1460988975 https://github.com/simonw/datasette/pull/1999#issuecomment-1460988975 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5XFOwv simonw 9599 2023-03-08T22:42:57Z 2023-03-08T22:42:57Z OWNER

Aside idea: it might be interesting if there were "lazy" template variables available in the context: things that are not actually executed unless a template author requests them.

Imagine if metadata was a lazy template reference, such that custom templates that don't display any metadata don't trigger it to be resolved (which might involve additional database queries some day).
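
One way to sketch that idea (purely illustrative; a real version would presumably need to be async-aware):

```python
# Hypothetical lazy context value: the work only happens if a template renders it.
class Lazy:
    def __init__(self, compute):
        self._compute = compute
        self._resolved = False
        self._value = None

    def resolve(self):
        if not self._resolved:
            self._value = self._compute()
            self._resolved = True
        return self._value

    def __str__(self):
        # Rendering {{ metadata }} in a template triggers the computation
        return str(self.resolve())

def expensive_metadata_lookup():
    # Stand-in for something that might run extra database queries
    return {"title": "My database"}

context = {"metadata": Lazy(expensive_metadata_lookup)}
```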

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1460986533 https://github.com/simonw/datasette/pull/1999#issuecomment-1460986533 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5XFOKl simonw 9599 2023-03-08T22:40:28Z 2023-03-08T22:40:28Z OWNER

Figuring out what to do with display_columns_and_rows() is hard. That returns rows as this special kind of object, which is designed to be accessed from the HTML templates:

https://github.com/simonw/datasette/blob/96e94f9b7b2db53865e61390bcce6761727f26d8/datasette/views/table.py#L45-L71

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1460970807 https://github.com/simonw/datasette/pull/1999#issuecomment-1460970807 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5XFKU3 simonw 9599 2023-03-08T22:31:49Z 2023-03-08T22:33:03Z OWNER

For the HTML version, I need to decide where all of the stuff that happens in async def extra_template() is going to live.

I think it's another one of those extra functions, triggered for ?_extra=context.

https://github.com/simonw/datasette/blob/96e94f9b7b2db53865e61390bcce6761727f26d8/datasette/views/table.py#L813-L912

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1460943097 https://github.com/simonw/datasette/pull/1999#issuecomment-1460943097 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5XFDj5 simonw 9599 2023-03-08T22:09:24Z 2023-03-08T22:09:47Z OWNER

The ease with which I added that ?_extra=query feature in https://github.com/simonw/datasette/pull/1999/commits/96e94f9b7b2db53865e61390bcce6761727f26d8 made me feel really confident that this architecture is going in the right direction.

```diff
diff --git a/datasette/views/table.py b/datasette/views/table.py
index 8d3bb2c930..3e1db9c85f 100644
--- a/datasette/views/table.py
+++ b/datasette/views/table.py
@@ -1913,6 +1913,13 @@ async def extra_request():
                 "args": request.args._data,
             }

+        async def extra_query():
+            "Details of the underlying SQL query"
+            return {
+                "sql": sql,
+                "params": params,
+            }
+
         async def extra_extras():
             "Available ?_extra= blocks"
             return {
@@ -1938,6 +1945,7 @@ async def extra_extras():
             extra_primary_keys,
             extra_debug,
             extra_request,
+            extra_query,
             extra_extras,
         )
```
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1460916405 https://github.com/simonw/datasette/pull/1999#issuecomment-1460916405 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5XE9C1 simonw 9599 2023-03-08T21:43:27Z 2023-03-08T21:43:27Z OWNER

Just noticed that _json=colname is not working, and that's because it's handled by the renderer here:

https://github.com/simonw/datasette/blob/56b0758a5fbf85d01ff80a40c9b028469d7bb65f/datasette/renderer.py#L29-L40

But that's not currently being called by my new code.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1460907148 https://github.com/simonw/datasette/pull/1999#issuecomment-1460907148 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5XE6yM simonw 9599 2023-03-08T21:34:30Z 2023-03-08T21:34:30Z OWNER

I'm going to hold off on that refactor until later, when I have tests to show me if the refactor works or not.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1460906741 https://github.com/simonw/datasette/pull/1999#issuecomment-1460906741 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5XE6r1 simonw 9599 2023-03-08T21:34:08Z 2023-03-08T21:34:08Z OWNER

So maybe I can refactor it to look a bit more like this:

https://github.com/simonw/datasette/blob/db1a88f4e17a1f50bdaa681e8beddb2276503e7c/datasette/views/table.py#L1602-L1604

One thing that's useful here is that is_view is handled early, like this:

https://github.com/simonw/datasette/blob/db1a88f4e17a1f50bdaa681e8beddb2276503e7c/datasette/views/table.py#L466-L472

So if I omit the is_view bit from the extracted function I can simplify more.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1460905469 https://github.com/simonw/datasette/pull/1999#issuecomment-1460905469 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5XE6X9 simonw 9599 2023-03-08T21:32:56Z 2023-03-08T21:32:56Z OWNER

I'd really like to extract this ugly logic out into a helper function:

https://github.com/simonw/datasette/blob/db1a88f4e17a1f50bdaa681e8beddb2276503e7c/datasette/views/table.py#L1617-L1684

I copied it in and asked ChatGPT to "List all of the variable that are either modified or assigned to by the above code":

Modified variables:

offset
where_clauses
params
order_by

Assigned variables:

sort_value
next_by_pk_clauses
order_by (when sort or sort_desc is not None)

Then I asked which variables were used as inputs, and argued with it a bit about whether it should be counting functions. Eventually got to this:

My apologies for the oversight. Here are the variables needed as input by the above code, excluding any functions:

_next
is_view
sort
sort_desc
use_rowid
pks
order_by

Note that use_rowid is actually defined earlier in terms of two of those other variables: https://github.com/simonw/datasette/blob/db1a88f4e17a1f50bdaa681e8beddb2276503e7c/datasette/views/table.py#L1540

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1460866243 https://github.com/simonw/datasette/issues/2036#issuecomment-1460866243 https://api.github.com/repos/simonw/datasette/issues/2036 IC_kwDOBm6k_c5XEwzD simonw 9599 2023-03-08T20:57:34Z 2023-03-08T20:57:34Z OWNER

This fix is released in 0.64.2 https://docs.datasette.io/en/stable/changelog.html#v0-64-2

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`publish cloudrun` reuses image tags, which can lead to very surprising deploy problems 1615862295  
1460848869 https://github.com/simonw/datasette/issues/2036#issuecomment-1460848869 https://api.github.com/repos/simonw/datasette/issues/2036 IC_kwDOBm6k_c5XEsjl simonw 9599 2023-03-08T20:40:55Z 2023-03-08T20:40:55Z OWNER

Here's the https://latest.datasette.io/ deployment that just went out, further demonstrating that this change is working correctly:

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`publish cloudrun` reuses image tags, which can lead to very surprising deploy problems 1615862295  
1460840620 https://github.com/simonw/datasette/issues/2037#issuecomment-1460840620 https://api.github.com/repos/simonw/datasette/issues/2037 IC_kwDOBm6k_c5XEqis simonw 9599 2023-03-08T20:33:00Z 2023-03-08T20:33:00Z OWNER

Got the same failure again for a recent commit: https://github.com/simonw/datasette/actions/runs/4368239376/jobs/7640567282

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Test failure: FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError 1615891776  
1460838797 https://github.com/simonw/datasette/issues/2037#issuecomment-1460838797 https://api.github.com/repos/simonw/datasette/issues/2037 IC_kwDOBm6k_c5XEqGN simonw 9599 2023-03-08T20:31:15Z 2023-03-08T20:31:15Z OWNER

It's this test here:

https://github.com/simonw/datasette/blob/1ad92a1d87d79084ebe524ed186c900ff042328c/tests/test_cli.py#L181-L189

Added in: - #2033

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Test failure: FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError 1615891776  
1460838109 https://github.com/simonw/datasette/issues/2037#issuecomment-1460838109 https://api.github.com/repos/simonw/datasette/issues/2037 IC_kwDOBm6k_c5XEp7d simonw 9599 2023-03-08T20:30:36Z 2023-03-08T20:30:36Z OWNER

Instead of using isolated_filesystem() I could use a tmpdir fixture instead.
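
A rough sketch of what the tmp_path version of that test might look like - the patch target and requirements file contents here are assumptions:

```python
# Sketch only: assumes the install command can be mocked at datasette.cli.run_module.
from unittest import mock

from click.testing import CliRunner

from datasette.cli import cli


@mock.patch("datasette.cli.run_module")
def test_install_requirements(run_module, tmp_path):
    requirements = tmp_path / "requirements.txt"
    requirements.write_text("datasette-mock-plugin\ndatasette-mock-plugin-2\n")
    runner = CliRunner()
    result = runner.invoke(cli, ["install", "-r", str(requirements)])
    assert result.exit_code == 0
    run_module.assert_called_once()
```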

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Test failure: FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError 1615891776  
1460827178 https://github.com/simonw/datasette/issues/2036#issuecomment-1460827178 https://api.github.com/repos/simonw/datasette/issues/2036 IC_kwDOBm6k_c5XEnQq simonw 9599 2023-03-08T20:25:10Z 2023-03-08T20:25:10Z OWNER

https://console.cloud.google.com/run/detail/us-central1/new-service/revisions?project=datasette-222320 confirms that the image deployed is:

Compared to https://console.cloud.google.com/run/detail/us-central1/datasette-io/revisions?project=datasette-222320 which shows that datasette.io is running:

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`publish cloudrun` reuses image tags, which can lead to very surprising deploy problems 1615862295  
1460816528 https://github.com/simonw/datasette/issues/2036#issuecomment-1460816528 https://api.github.com/repos/simonw/datasette/issues/2036 IC_kwDOBm6k_c5XEkqQ simonw 9599 2023-03-08T20:22:50Z 2023-03-08T20:23:20Z OWNER

Testing this manually:

```
% datasette publish cloudrun content.db --service new-service
Creating temporary tarball archive of 2 file(s) totalling 13.8 MiB before compression.
Uploading tarball of [.] to [gs://datasette-222320_cloudbuild/source/1678306859.271661-805303f364144b6094cc9c8532ab5133.tgz]
Created [https://cloudbuild.googleapis.com/v1/projects/datasette-222320/locations/global/builds/290f41a4-e29a-443c-a1e5-c54513c6143d].
Logs are available at [ https://console.cloud.google.com/cloud-build/builds/290f41a4-e29a-443c-a1e5-c54513c6143d?project=99025868001 ].
---- REMOTE BUILD OUTPUT ----
starting build "290f41a4-e29a-443c-a1e5-c54513c6143d"

FETCHSOURCE
Fetching storage object: gs://datasette-222320_cloudbuild/source/1678306859.271661-805303f364144b6094cc9c8532ab5133.tgz#1678306862810483
Copying gs://datasette-222320_cloudbuild/source/1678306859.271661-805303f364144b6094cc9c8532ab5133.tgz#1678306862810483...
/ [1 files][  3.9 MiB/  3.9 MiB]
Operation completed over 1 objects/3.9 MiB.
BUILD
Already have image (with digest): gcr.io/cloud-builders/docker
Sending build context to Docker daemon  14.52MB
Step 1/9 : FROM python:3.11.0-slim-bullseye
...
Installing collected packages: rfc3986, typing-extensions, sniffio, PyYAML, python-multipart, pluggy, pint, mergedeep, MarkupSafe, itsdangerous, idna, hupper, h11, click, certifi, asgiref, aiofiles, uvicorn, Jinja2, janus, click-default-group-wheel, asgi-csrf, anyio, httpcore, httpx, datasette
Successfully installed Jinja2-3.1.2 MarkupSafe-2.1.2 PyYAML-6.0 aiofiles-23.1.0 anyio-3.6.2 asgi-csrf-0.9 asgiref-3.6.0 certifi-2022.12.7 click-8.1.3 click-default-group-wheel-1.2.2 datasette-0.64.1 h11-0.14.0 httpcore-0.16.3 httpx-0.23.3 hupper-1.11 idna-3.4 itsdangerous-2.1.2 janus-1.0.0 mergedeep-1.3.4 pint-0.20.1 pluggy-1.0.0 python-multipart-0.0.6 rfc3986-1.5.0 sniffio-1.3.0 typing-extensions-4.5.0 uvicorn-0.20.0
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv

[notice] A new release of pip available: 22.3 -> 23.0.1
[notice] To update, run: pip install --upgrade pip
Removing intermediate container 8ccebfebebc9
 ---> b972c85b38bb
...
Successfully built 606b7c286d7f
Successfully tagged gcr.io/datasette-222320/datasette-new-service:latest
PUSH
Pushing gcr.io/datasette-222320/datasette-new-service
The push refers to repository [gcr.io/datasette-222320/datasette-new-service]
667b1dc69e5e: Preparing
...
d8ddfcff216f: Pushed
latest: digest: sha256:452daffb2d3d7a8579c2ab39854be285155252c9428b4c1c50caac6a3a269e3f size: 2004
DONE

ID                                    CREATE_TIME                DURATION  SOURCE                                                                                           IMAGES                                                   STATUS
290f41a4-e29a-443c-a1e5-c54513c6143d  2023-03-08T20:21:03+00:00  39S       gs://datasette-222320_cloudbuild/source/1678306859.271661-805303f364144b6094cc9c8532ab5133.tgz  gcr.io/datasette-222320/datasette-new-service (+1 more)  SUCCESS

Deploying container to Cloud Run service [new-service] in project [datasette-222320] region [us-central1]
✓ Deploying new service... Done.
  ✓ Creating Revision...
  ✓ Routing traffic...
  ✓ Setting IAM Policy...
Done.
Service [new-service] revision [new-service-00001-zon] has been deployed and is serving 100 percent of traffic.
Service URL: https://new-service-j7hipcg4aq-uc.a.run.app
```

https://new-service-j7hipcg4aq-uc.a.run.app/ was deployed successfully.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`publish cloudrun` reuses image tags, which can lead to very surprising deploy problems 1615862295  
1460810523 https://github.com/simonw/datasette/issues/2036#issuecomment-1460810523 https://api.github.com/repos/simonw/datasette/issues/2036 IC_kwDOBm6k_c5XEjMb simonw 9599 2023-03-08T20:17:01Z 2023-03-08T20:17:01Z OWNER

I'm going to solve this by using the service name in that image_id instead:

```python
image_id = f"gcr.io/{project}/{service_name}"
```

This is a nasty bug, so I'm going to backport it to a 0.64.2 release as well.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`publish cloudrun` reuses image tags, which can lead to very surprising deploy problems 1615862295  
1460809643 https://github.com/simonw/datasette/issues/2036#issuecomment-1460809643 https://api.github.com/repos/simonw/datasette/issues/2036 IC_kwDOBm6k_c5XEi-r simonw 9599 2023-03-08T20:16:10Z 2023-03-08T20:16:10Z OWNER

I think the code at fault is here:

https://github.com/simonw/datasette/blob/1ad92a1d87d79084ebe524ed186c900ff042328c/datasette/publish/cloudrun.py#L176-L182

That name ends up defaulting to datasette - so multiple different projects may end up deploying to the same image_id.

What I think happened in the datasette.io bug is that this workflow: https://github.com/simonw/simonwillisonblog-backup/blob/bfb573e96d8622ab52b22fdcd54724fe6e59fd24/.github/workflows/backup.yml and this workflow: https://github.com/simonw/datasette.io/blob/4676db5bf4a3fc9f792ee270ec0c59eb902cd2c3/.github/workflows/deploy.yml both happened to run at the exact same time.

And so the image that was pushed to gcr.io/datasette-222320/datasette:latest by the simonw/simonwillisonblog-backup action was then deployed by the simonw/datasette.io/ action, which broke the site.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`publish cloudrun` reuses image tags, which can lead to very surprising deploy problems 1615862295  
1460808028 https://github.com/simonw/datasette/issues/2035#issuecomment-1460808028 https://api.github.com/repos/simonw/datasette/issues/2035 IC_kwDOBm6k_c5XEilc ar-jan 1176293 2023-03-08T20:14:47Z 2023-03-08T20:14:47Z NONE

+1, I have been wishing for this feature (also for use with template-sql). It was requested before here #1304.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Potential feature: special support for `?a=1&a=2` on the query page 1615692818  
1460760116 https://github.com/simonw/datasette/pull/1999#issuecomment-1460760116 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5XEW40 simonw 9599 2023-03-08T19:48:52Z 2023-03-08T19:48:52Z OWNER

I'm trying to get http://127.0.0.1:8001/fixtures/compound_three_primary_keys?_next=a,d,v to return the correct results.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1460682625 https://github.com/simonw/datasette/issues/2035#issuecomment-1460682625 https://api.github.com/repos/simonw/datasette/issues/2035 IC_kwDOBm6k_c5XED-B simonw 9599 2023-03-08T18:40:57Z 2023-03-08T18:40:57Z OWNER

Pushed that prototype to a branch: https://github.com/simonw/datasette/commit/0fe844e9adb006a0138e83102ced1329d9155c59 / https://github.com/simonw/datasette/compare/sql-list-parameters

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Potential feature: special support for `?a=1&a=2` on the query page 1615692818  
1460679434 https://github.com/simonw/datasette/issues/2035#issuecomment-1460679434 https://api.github.com/repos/simonw/datasette/issues/2035 IC_kwDOBm6k_c5XEDMK simonw 9599 2023-03-08T18:39:35Z 2023-03-08T18:39:35Z OWNER

I should consider the existing design of magic parameters here: https://docs.datasette.io/en/stable/sql_queries.html#magic-parameters

  • _actor_*
  • _header_*
  • _cookie_*
  • _now_epoch
  • _now_date_utc
  • _now_datetime_utc
  • _random_chars_*

Should this new id__list syntax look more like those magic parameters, or is it OK to use name__magic syntax here instead?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Potential feature: special support for `?a=1&a=2` on the query page 1615692818  
1460668431 https://github.com/simonw/datasette/issues/2035#issuecomment-1460668431 https://api.github.com/repos/simonw/datasette/issues/2035 IC_kwDOBm6k_c5XEAgP simonw 9599 2023-03-08T18:35:34Z 2023-03-08T18:35:34Z OWNER

To implement this properly I need to do the following:

  • Get the page to display multiple id: [ text input here ] fields such that re-submission works
  • Figure out how this should work for canned queries and for writable canned queries
  • Tests that cover queries, canned queries, writable canned queries

And a bonus feature: what if the Datasette UI layer spotted :id__list parameters and used them to add a bit of JavaScript that allowed users to click a + button next to an id form field to add another one?

Also, when a page is re-displayed for one of these queries it could potentially add an extra form field allowing people to add another value.

Though this has an annoying problem: how do you tell the difference between an additional id input field that the user chose not to populate, vs. one that is supposed to represent an empty string?

Maybe only support multiple id fields for users with JavaScript, in order to avoid this problem.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Potential feature: special support for `?a=1&a=2` on the query page 1615692818  
1460664619 https://github.com/simonw/datasette/issues/2035#issuecomment-1460664619 https://api.github.com/repos/simonw/datasette/issues/2035 IC_kwDOBm6k_c5XD_kr simonw 9599 2023-03-08T18:32:29Z 2023-03-08T18:32:29Z OWNER

Got a prototype working:

```diff
diff --git a/datasette/views/database.py b/datasette/views/database.py
index 8d289105..6f9d8a44 100644
--- a/datasette/views/database.py
+++ b/datasette/views/database.py
@@ -226,6 +226,12 @@ class QueryView(DataView):
         ):
             db = await self.ds.resolve_database(request)
             database = db.name
+            # Disallow x__list query string parameters
+            invalid_params = [k for k in request.args if k.endswith("__list")]
+            if invalid_params:
+                raise DatasetteError(
+                    "Invalid query string parameters: {}".format(", ".join(invalid_params))
+                )
             params = {key: request.args.get(key) for key in request.args}
             if "sql" in params:
                 params.pop("sql")
@@ -258,6 +264,11 @@ class QueryView(DataView):
                 for named_parameter in named_parameters
                 if not named_parameter.startswith("_")
             }
+            # Handle any __list parameters
+            for named_parameter in named_parameters:
+                if named_parameter.endswith("__list"):
+                    list_values = request.args.getlist(named_parameter[:-6])
+                    params[named_parameter] = json.dumps(list_values)

             # Set to blank string if missing from params
             for named_parameter in named_parameters:
```

This isn't yet doing the right thing on form re-submission: it breaks because it attempts to pass through the invalid `?id__list=` parameter. But I did manage to get it to do this through careful editing of the URL:

That was this URL: http://127.0.0.1:8034/content?sql=select+%3Aid__list%2C*+from+releases+where+id+in+(select+value+from+json_each(%3Aid__list))&id=62642726&id=18402901&id=38714866

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Potential feature: special support for `?a=1&a=2` on the query page 1615692818  
1460659382 https://github.com/simonw/datasette/issues/2035#issuecomment-1460659382 https://api.github.com/repos/simonw/datasette/issues/2035 IC_kwDOBm6k_c5XD-S2 simonw 9599 2023-03-08T18:28:00Z 2023-03-08T18:28:00Z OWNER

Also: datasette-explain may need to be updated to understand how to handle this:

ERROR: conn=<sqlite3.Connection object at 0x102834940>, sql = 'explain select * from releases where id in (select id from json_each(:id__list))', params = None: You did not supply a value for binding parameter :id__list.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Potential feature: special support for `?a=1&a=2` on the query page 1615692818  
1460654136 https://github.com/simonw/datasette/issues/2035#issuecomment-1460654136 https://api.github.com/repos/simonw/datasette/issues/2035 IC_kwDOBm6k_c5XD9A4 simonw 9599 2023-03-08T18:25:46Z 2023-03-08T18:25:46Z OWNER

Trickiest part of the implementation here is that it needs to know to output three id HTML form fields on the page, such that their values are persisted when the form is submitted a second time.
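
A tiny sketch of that idea, assuming a request object with an args.getlist() method (as used elsewhere in this thread); the helper name is made up:

```python
# Hypothetical helper: emit one <input> per submitted value so the form
# round-trips all of them when re-submitted.
from html import escape

def repeated_inputs(request, name):
    values = request.args.getlist(name) or [""]
    return "\n".join(
        '<input type="text" name="{}" value="{}">'.format(escape(name), escape(value))
        for value in values
    )
```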

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Potential feature: special support for `?a=1&a=2` on the query page 1615692818  
1460639749 https://github.com/simonw/datasette/issues/2035#issuecomment-1460639749 https://api.github.com/repos/simonw/datasette/issues/2035 IC_kwDOBm6k_c5XD5gF simonw 9599 2023-03-08T18:17:31Z 2023-03-08T18:17:31Z OWNER

Since we are pre-1.0 it's still OK to implement a feature that disallows ?id__list= in the URL, but allows :id__list in SQL queries to reference the JSON list of parameters.

So I'm going to prototype this as the :id__list feature and see how it feels.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Potential feature: special support for `?a=1&a=2` on the query page 1615692818  
1460637906 https://github.com/simonw/datasette/issues/2035#issuecomment-1460637906 https://api.github.com/repos/simonw/datasette/issues/2035 IC_kwDOBm6k_c5XD5DS simonw 9599 2023-03-08T18:16:31Z 2023-03-08T18:16:31Z OWNER

I'm pretty sold on this as a feature now. The main question I have is which of these options to implement:

  1. ?id=1&id=2 results in :id in the query being ["1", "2"] - no additional syntax required
  2. :id in the query continues to reference just the first of those parameters - but :id__list (or some other custom syntax) instead gets ["1", "2"] - or, if the URL is ?id=1 - gets ["1"]

Actually on writing these out I realize that option 2 is the ONLY valid option. It's no good building a query that works against a JSON list if the user might pass just a single ID, ?id=1, resulting in their query breaking.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Potential feature: special support for `?a=1&a=2` on the query page 1615692818  
1460632758 https://github.com/simonw/datasette/issues/2035#issuecomment-1460632758 https://api.github.com/repos/simonw/datasette/issues/2035 IC_kwDOBm6k_c5XD3y2 simonw 9599 2023-03-08T18:13:49Z 2023-03-08T18:13:49Z OWNER

https://github.com/rclement/datasette-dashboards/issues/54 makes the excellent point that the <select multiple> default HTML widget produces this exact format of query string:

```html
<form action="https://www.example.com/">
  <select multiple name="id">
    <option>21</option>
    <option>32</option>
    <option>15</option>
    <option>63</option>
  </select>
</form>
```

Submitting that form with the middle two options selected navigates to: `https://www.example.com/?id=32&id=15`

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Potential feature: special support for `?a=1&a=2` on the query page 1615692818  
1460628199 https://github.com/simonw/datasette/issues/2035#issuecomment-1460628199 https://api.github.com/repos/simonw/datasette/issues/2035 IC_kwDOBm6k_c5XD2rn simonw 9599 2023-03-08T18:11:31Z 2023-03-08T18:11:31Z OWNER

One variant on this idea: maybe you have to specify in your query that you want it to be the JSON list version, not the single item (first ?id= parameter version)? Maybe with syntax like this:

where id in (select value from json_each(:id__list))

Datasette would automatically pass {"id": "11", "id__list": '["11", "32", "62"]'} as arguments to the db.execute() method, if the page was called with ?id=11&id=32&id=62.

This is more explicit, though the syntax is a bit uglier (maybe there's a nicer design for this?). I also worry about ?id__list= conflicting with this, but I think that's a risk I can take - tell people not to do that, or even block ?id__list= style parameters entirely.
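
A small sketch of that parameter-building step (hypothetical helper, not the real implementation):

```python
import json

def build_params(args):
    # args maps a parameter name to its list of query string values,
    # e.g. ?id=11&id=32&id=62 -> {"id": ["11", "32", "62"]}
    params = {}
    for name, values in args.items():
        params[name] = values[0]                      # :id stays the first value
        params[name + "__list"] = json.dumps(values)  # :id__list is the JSON array
    return params

print(build_params({"id": ["11", "32", "62"]}))
# {'id': '11', 'id__list': '["11", "32", "62"]'}
```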

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Potential feature: special support for `?a=1&a=2` on the query page 1615692818  
1460621871 https://github.com/simonw/datasette/issues/2035#issuecomment-1460621871 https://api.github.com/repos/simonw/datasette/issues/2035 IC_kwDOBm6k_c5XD1Iv simonw 9599 2023-03-08T18:08:25Z 2023-03-08T18:09:04Z OWNER

My current preferred solution is to lean into SQLite's JSON support.

What if the query page spotted ?id=11&id=32&id=62 and turned that into a JSON string called :id with a value of ["11", "32", "62"]?

Note that this is still a string, not a list. This avoids a nasty problem that occurred in PHP world, where ?id[]=1&id[]=2 would result in an actual PHP array object, which often broke underlying code that had expected $_GET["id"] to be a string, not an array.

So in a query you'd be able to do this:

where id in (select value from json_each(:id))

And then call it with ?id=11&id=32&id=62.
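
That pattern can be tried directly against SQLite; a minimal standalone example (the table and values here are made up):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table facetable (id integer primary key, name text)")
conn.executemany(
    "insert into facetable (id, name) values (?, ?)",
    [(11, "a"), (32, "b"), (62, "c"), (99, "d")],
)

# ?id=11&id=32&id=62 becomes a single JSON string parameter
ids = json.dumps(["11", "32", "62"])
rows = conn.execute(
    "select id, name from facetable where id in (select value from json_each(:id))",
    {"id": ids},
).fetchall()
print(rows)  # [(11, 'a'), (32, 'b'), (62, 'c')]
```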

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Potential feature: special support for `?a=1&a=2` on the query page 1615692818  
1460618433 https://github.com/simonw/datasette/issues/2035#issuecomment-1460618433 https://api.github.com/repos/simonw/datasette/issues/2035 IC_kwDOBm6k_c5XD0TB simonw 9599 2023-03-08T18:06:34Z 2023-03-08T18:06:34Z OWNER

One way to do this would be to dynamically generate the where id in (?, ?, ?) with the correct number of question marks, then feed in a list from request.args.getlist("id") - but that would require rewriting the SQL query text to add those question marks.
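
For comparison, a tiny sketch of that placeholder-rewriting approach (illustrative only, with a made-up table):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table facetable (id integer primary key)")
conn.executemany("insert into facetable (id) values (?)", [(11,), (32,), (62,), (99,)])

ids = ["11", "32", "62"]  # e.g. request.args.getlist("id")
# The SQL text itself has to be rewritten to contain the right number of ?s
placeholders = ", ".join("?" for _ in ids)
sql = "select id from facetable where id in ({})".format(placeholders)
print(conn.execute(sql, ids).fetchall())  # [(11,), (32,), (62,)]
```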

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Potential feature: special support for `?a=1&a=2` on the query page 1615692818  
1459455356 https://github.com/simonw/datasette/issues/2027#issuecomment-1459455356 https://api.github.com/repos/simonw/datasette/issues/2027 IC_kwDOBm6k_c5W_YV8 dmick 1350673 2023-03-08T04:42:22Z 2023-03-08T04:42:22Z NONE

I managed to make it work by using nginx's 'exact match' (=) combined with 'prefix match'; that is, match explicitly on /, and redirect to /<db>/<table>, and then have the normal ProxyPath for the unadorned (prefix-matching) /.

```
location = / {
    return 302 /<db>/<table>;
}
location / {
    proxy_pass http://127.0.0.1:8001/;
    proxy_set_header Host $host;
}
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
How to redirect from "/" to a specific db/table 1590183272  
1457243738 https://github.com/simonw/datasette/pull/2031#issuecomment-1457243738 https://api.github.com/repos/simonw/datasette/issues/2031 IC_kwDOBm6k_c5W28Za tmcl-it 82332573 2023-03-07T00:05:25Z 2023-03-07T00:12:09Z NONE

I've implemented the test (thanks for pointing me in the right direction!).

At tmcl-it/datasette:0.64.1+row-view-expand-labels I also have a variant of this patch that applies to the 0.64.x branch. Please let me know if you'd be interested in merging that as well and I'll open another PR.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Expand foreign key references in row view as well 1605481359  
1457172180 https://github.com/simonw/datasette/issues/2033#issuecomment-1457172180 https://api.github.com/repos/simonw/datasette/issues/2033 IC_kwDOBm6k_c5W2q7U eyeseast 25778 2023-03-06T22:54:52Z 2023-03-06T22:54:52Z CONTRIBUTOR

This would be a nice feature to have with datasette publish too.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette install -r requirements.txt` 1612296210  
1457117383 https://github.com/simonw/datasette/issues/2033#issuecomment-1457117383 https://api.github.com/repos/simonw/datasette/issues/2033 IC_kwDOBm6k_c5W2djH simonw 9599 2023-03-06T22:28:55Z 2023-03-06T22:28:55Z OWNER

Documentation: https://docs.datasette.io/en/latest/plugins.html#installing-plugins

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`datasette install -r requirements.txt` 1612296210  
1456997425 https://github.com/simonw/datasette/pull/2031#issuecomment-1456997425 https://api.github.com/repos/simonw/datasette/issues/2031 IC_kwDOBm6k_c5W2AQx simonw 9599 2023-03-06T21:04:27Z 2023-03-06T21:06:34Z OWNER

This is a very neat fix, for something I've been wanting for a while.

Add a unit test for the row HTML page - I suggest against this page: https://latest.datasette.io/fixtures/foreign_key_references/1 - and I'll land this PR.

You can model it on this test here: https://github.com/simonw/datasette/blob/a53b893c46453f35decc8c145c138671cee6140c/tests/test_table_html.py#L609-L632

I think adding it to test_table_html.py is OK, even though it's technically for the row page and not the table page.

Thanks!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Expand foreign key references in row view as well 1605481359  
1456925875 https://github.com/simonw/datasette/pull/2028#issuecomment-1456925875 https://api.github.com/repos/simonw/datasette/issues/2028 IC_kwDOBm6k_c5W1uyz codecov[bot] 22429695 2023-03-06T20:26:53Z 2023-03-06T20:26:53Z NONE

Codecov Report

Patch and project coverage have no change.

Comparison is base (0b4a286) 92.11% compared to head (a8dde13) 92.11%.

Additional details and impacted files

```diff
@@           Coverage Diff           @@
##             main    #2028   +/-   ##
=======================================
  Coverage   92.11%   92.11%
=======================================
  Files          38       38
  Lines        5555     5555
=======================================
  Hits         5117     5117
  Misses        438      438
```

Help us with your feedback. Take ten seconds to tell us [how you rate us](https://about.codecov.io/nps?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Have a feature suggestion? [Share it here.](https://app.codecov.io/gh/feedback/?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)

:umbrella: View full report at Codecov.
:loudspeaker: Do you have feedback about the report comment? Let us know in this issue.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
add Python 3.11 classifier 1590839187  
1456914694 https://github.com/simonw/datasette/pull/2028#issuecomment-1456914694 https://api.github.com/repos/simonw/datasette/issues/2028 IC_kwDOBm6k_c5W1sEG simonw 9599 2023-03-06T20:19:37Z 2023-03-06T20:19:37Z OWNER

Thanks!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
add Python 3.11 classifier 1590839187  
1455196849 https://github.com/simonw/datasette/issues/1619#issuecomment-1455196849 https://api.github.com/repos/simonw/datasette/issues/1619 IC_kwDOBm6k_c5WvIqx BryantD 969875 2023-03-05T20:29:55Z 2023-03-05T20:30:14Z NONE

I have this same issue, which is happening with both json links and facets. It is not happening with column sort links in the gear popup menus, but it is happening with the sort arrow that results after you use one of those links. I'm using Apache as a proxy to Datasette; the relevant configs are:

```
ProxyPass /datasette/ http://127.0.0.1:8000/datasette/ nocanon
ProxyPreserveHost on
```

```json
{ "base_url": "/datasette/" }
```

If it would be useful to get a look at the running installation via the Web, Simon, let me know.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JSON link on row page is 404 if base_url setting is used 1121583414  
1444474487 https://github.com/simonw/sqlite-utils/issues/433#issuecomment-1444474487 https://api.github.com/repos/simonw/sqlite-utils/issues/433 IC_kwDOCGYnMM5WGO53 mcarpenter 167893 2023-02-24T20:57:43Z 2023-02-24T22:22:18Z NONE

I think I see what is happening here, although I haven't quite work out a fix yet. Usually:

  • click.progressbar.render_progress() renders the cursor invisible on each invocation (update of the bar)
  • When the progress bar goes out of scope, the __exit()__ method is invoked, which calls render_finish() to make the cursor re-appear.

(See terminal escape sequences BEFORE_BAR and AFTER_BAR in click).

However the sqlite-utils utils.file_progress context manager wraps click.progressbar and yields an instance of a helper class:

```python
@contextlib.contextmanager
def file_progress(file, silent=False, **kwargs):
    ...
    with click.progressbar(length=file_length, **kwargs) as bar:
        yield UpdateWrapper(file, bar.update)
```

The yielded UpdateWrapper goes out of scope quickly and click.progressbar.__exit__() is called. The cursor is made visible again. However, bar is still live, so when the caller iterates on the yielded wrapper this invokes the bar's update method, calling render_progress(), each time printing the "make cursor invisible" escape code. The progressbar.__exit__ function is not called again, so the cursor doesn't re-appear.
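
Until that's fixed, the cursor can be restored by hand with the standard "show cursor" escape (the counterpart of the hide-cursor code the bar keeps printing); a quick sketch:

```python
import sys

def show_cursor():
    # DECTCEM "show cursor"; undoes the "\x1b[?25l" hide-cursor escape
    sys.stdout.write("\x1b[?25h")
    sys.stdout.flush()

# e.g. call this once the iteration driving the wrapped progress bar finishes
show_cursor()
```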

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
CLI eats my cursor 1239034903  
1440854834 https://github.com/simonw/datasette/issues/2030#issuecomment-1440854834 https://api.github.com/repos/simonw/datasette/issues/2030 IC_kwDOBm6k_c5V4bMy gk7279 19700859 2023-02-22T21:54:39Z 2023-02-22T21:54:39Z NONE

Thanks @dmick . I chose to create a firewall rule under my GCP to open the port of interest and datasette works.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
How to use Datasette with apache webserver on GCP? 1594383280  
1440814680 https://github.com/simonw/datasette/issues/2030#issuecomment-1440814680 https://api.github.com/repos/simonw/datasette/issues/2030 IC_kwDOBm6k_c5V4RZY dmick 1350673 2023-02-22T21:22:42Z 2023-02-22T21:22:42Z NONE

@gk7279, you had asked in a separate bug about how to redirect web servers in general. The datasette docs actually have pretty good information on this for both nginx and apache2: https://docs.datasette.io/en/stable/deploying.html#running-datasette-behind-a-proxy

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
How to use Datasette with apache webserver on GCP? 1594383280  
1440811364 https://github.com/simonw/datasette/issues/2027#issuecomment-1440811364 https://api.github.com/repos/simonw/datasette/issues/2027 IC_kwDOBm6k_c5V4Qlk gk7279 19700859 2023-02-22T21:19:47Z 2023-02-22T21:19:47Z NONE

yes @dmick . How did you make your public IP redirect to your uvicorn server?

Instead of nginx, I have apache2 on my GCP VM. Any pointers here are helpful too.

Thanks.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
How to redirect from "/" to a specific db/table 1590183272  
1440762383 https://github.com/simonw/datasette/issues/2027#issuecomment-1440762383 https://api.github.com/repos/simonw/datasette/issues/2027 IC_kwDOBm6k_c5V4EoP dmick 1350673 2023-02-22T20:35:16Z 2023-02-22T20:35:16Z NONE

Was that query to me, @gk7279?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
How to redirect from "/" to a specific db/table 1590183272  
1440355080 https://github.com/simonw/datasette/issues/2027#issuecomment-1440355080 https://api.github.com/repos/simonw/datasette/issues/2027 IC_kwDOBm6k_c5V2hMI gk7279 19700859 2023-02-22T16:26:41Z 2023-02-22T16:26:41Z NONE

Can you please help or share your expertise with #2030 ?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
How to redirect from "/" to a specific db/table 1590183272  
1437671409 https://github.com/simonw/datasette/issues/1258#issuecomment-1437671409 https://api.github.com/repos/simonw/datasette/issues/1258 IC_kwDOBm6k_c5VsR_x brandonrobertz 2670795 2023-02-20T23:39:58Z 2023-02-20T23:39:58Z CONTRIBUTOR

This is pretty annoying for FTS because sqlite throws an error instead of just doing something like returning all or no results. This makes users who are unfamiliar with SQL and Datasette think the canned query page is broken and is a frequent source of confusion.

To anyone dealing with this: My solution is to modify the canned query so that it returns no results which cues people to fill in the blank parameters.

So instead of emails_fts match escape_fts(:search))

My canned queries now look like this:

emails_fts match escape_fts(iif(:search=="", "*", :search))

There are no asterisks in my data so the result is always blank.

Ultimately it would be nice to be able to handle this in the metadata. Either making some named parameters required or setting some default values.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Allow canned query params to specify default values 828858421  
1435318713 https://github.com/simonw/sqlite-utils/issues/525#issuecomment-1435318713 https://api.github.com/repos/simonw/sqlite-utils/issues/525 IC_kwDOCGYnMM5VjTm5 mcarpenter 167893 2023-02-17T21:55:01Z 2023-02-17T21:55:01Z NONE

Meanwhile, a cheap workaround is to invalidate the registered function cache:

```python
table.convert(...)
db._registered_functions = set()
table.convert(...)
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Repeated calls to `Table.convert()` fail 1575131737  
1426158181 https://github.com/simonw/datasette/issues/1775#issuecomment-1426158181 https://api.github.com/repos/simonw/datasette/issues/1775 IC_kwDOBm6k_c5VAXJl metamoof 805751 2023-02-10T18:04:40Z 2023-02-10T18:04:40Z NONE

Is this where we talk about i18n of results? Or is that a separate thread.

e.g. Having country_long show España in the Spanish version of the global power plants demo site instead of Spain.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
i18n support 1323346408  
1426031395 https://github.com/simonw/datasette/issues/2024#issuecomment-1426031395 https://api.github.com/repos/simonw/datasette/issues/2024 IC_kwDOBm6k_c5U_4Mj simonw 9599 2023-02-10T16:11:53Z 2023-02-10T16:11:53Z OWNER

Relevant: https://til.simonwillison.net/sqlite/enabling-wal-mode

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mention WAL mode in documentation 1579973223  
1425988018 https://github.com/simonw/datasette/issues/2023#issuecomment-1425988018 https://api.github.com/repos/simonw/datasette/issues/2023 IC_kwDOBm6k_c5U_tmy mlaparie 80409402 2023-02-10T15:39:59Z 2023-02-10T15:39:59Z NONE

Thanks for confirming my doubts! I removed it after opening this issue, yup, then had another issue with default_cache_ttl_hashed which I assume was removed at the same time. Sorry for not trying that before opening the issue.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Error: Invalid setting 'hash_urls' in settings.json in 0.64.1 1579695809  
1425974877 https://github.com/simonw/datasette/issues/2023#issuecomment-1425974877 https://api.github.com/repos/simonw/datasette/issues/2023 IC_kwDOBm6k_c5U_qZd cldellow 193185 2023-02-10T15:32:41Z 2023-02-10T15:32:41Z CONTRIBUTOR

I think this feature was removed in Datasette 0.61 and moved to a plugin. People who want hashed URLs can use the datasette-hashed-urls plugin to achieve the same effect.

It looks like you're trying to disable hashed urls, so I think you can just remove that config setting and things will work.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Error: Invalid setting 'hash_urls' in settings.json in 0.64.1 1579695809  
1424848569 https://github.com/simonw/datasette/issues/2022#issuecomment-1424848569 https://api.github.com/repos/simonw/datasette/issues/2022 IC_kwDOBm6k_c5U7Xa5 DavidPratten 1667631 2023-02-09T21:13:50Z 2023-02-09T21:13:50Z NONE

Nulls in primary keys, does it every time.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Error 500 - not clear the cause 1578609658  
1423387341 https://github.com/simonw/sqlite-utils/issues/525#issuecomment-1423387341 https://api.github.com/repos/simonw/sqlite-utils/issues/525 IC_kwDOCGYnMM5U1yrN mcarpenter 167893 2023-02-08T23:48:52Z 2023-02-09T00:17:30Z NONE

PR below

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Repeated calls to `Table.convert()` fail 1575131737  
1423067724 https://github.com/simonw/datasette/issues/262#issuecomment-1423067724 https://api.github.com/repos/simonw/datasette/issues/262 IC_kwDOBm6k_c5U0kpM simonw 9599 2023-02-08T18:33:32Z 2023-02-08T18:36:48Z OWNER

Just realized that it's useful to be able to tell what parameters were used to generate a page... but reflecting things like _next back in the JSON is confusing in the presence of next.

So I'm going to add an extra for that information too.

Not sure what to call it though:

  • params - confusing because in the code that's usually used for params passed to SQL queries
  • query_string - wouldn't that be a string, not params as a dictionary?

I'm going to experiment with a request extra that returns some bits of information about the request.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Add ?_extra= mechanism for requesting extra properties in JSON 323658641  
1422681850 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1422681850 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5UzGb6 4l1fe 21095447 2023-02-08T14:25:50Z 2023-02-08T14:29:09Z NONE

I leave the patch here for others:

original code

```shell
$ which sqlite-utils | xargs cat
```

```python
#!/usr/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from sqlite_utils.cli import cli

if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(cli())
```

patched/sqlite-utils.py

```python
#!/usr/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from sqlite_utils.cli import cli

# New imports
from unittest.mock import patch
from sqlite_utils.cli import VALID_COLUMN_TYPES

if __name__ == '__main__':
    # Choices of the option --type
    cli.commands['transform'].params[2].type.types[1].choices.append('DATETIME')

    # The dicts have to be extended with a new type
    with patch.dict('sqlite_utils.db.COLUMN_TYPE_MAPPING', {'DATETIME': 'DATETIME'}),\
         patch('sqlite_utils.cli.VALID_COLUMN_TYPES', VALID_COLUMN_TYPES + ("DATETIME", )):

        # Command is unchanged
        sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
        sys.exit(cli())
```

And now it's working:

```bash
$ sqlite-utils schema events.sqlite cards.chunk.get
CREATE TABLE "cards.chunk.get" (
   [id] INTEGER PRIMARY KEY NOT NULL,
   [timestamp] TEXT,
)

$ python patched/sqlite-utils.py transform events.sqlite cards.chunk.get --type timestamp DATETIME

$ sqlite-utils schema events.sqlite cards.chunk.get
CREATE TABLE "cards.chunk.get" (
   [id] INTEGER PRIMARY KEY NOT NULL,
   [timestamp] DATETIME,
)
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1421988953 https://github.com/simonw/datasette/pull/1999#issuecomment-1421988953 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5UwdRZ simonw 9599 2023-02-08T04:35:44Z 2023-02-08T05:27:48Z OWNER

Next step: get ?_next=... working (it is ignored at the moment, even though the returned JSON includes the "next" key).

Then... figure out how to render HTML and other requested formats.

Then get the tests to pass!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1421784930 https://github.com/simonw/datasette/issues/2019#issuecomment-1421784930 https://api.github.com/repos/simonw/datasette/issues/2019 IC_kwDOBm6k_c5Uvrdi simonw 9599 2023-02-08T01:28:25Z 2023-02-08T01:40:46Z OWNER

Rather than duplicate this rather awful hack:

https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/views/table.py#L694-L714

I'm tempted to say that the code that calls the new pagination helper needs to ensure that the sort or sort_desc columns are selected. If it wants to ditch them later (e.g. because they were not included in ?_col=) it can do that later once the results have come back.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Refactor out the keyset pagination code 1573424830  
1421600789 https://github.com/simonw/datasette/issues/2019#issuecomment-1421600789 https://api.github.com/repos/simonw/datasette/issues/2019 IC_kwDOBm6k_c5Uu-gV simonw 9599 2023-02-07T23:12:40Z 2023-02-07T23:16:20Z OWNER

Most complicated example of a paginated query: https://latest.datasette.io/fixtures?sql=select%0D%0A++pk1%2C%0D%0A++pk2%2C%0D%0A++content%2C%0D%0A++sortable%2C%0D%0A++sortable_with_nulls%2C%0D%0A++sortable_with_nulls_2%2C%0D%0A++text%0D%0Afrom%0D%0A++sortable%0D%0Awhere%0D%0A++(%0D%0A++++sortable_with_nulls+is+null%0D%0A++++and+(%0D%0A++++++(pk1+%3E+%3Ap0)%0D%0A++++++or+(%0D%0A++++++++pk1+%3D+%3Ap0%0D%0A++++++++and+pk2+%3E+%3Ap1%0D%0A++++++)%0D%0A++++)%0D%0A++)%0D%0Aorder+by%0D%0A++sortable_with_nulls+desc%2C%0D%0A++pk1%2C%0D%0A++pk2%0D%0Alimit%0D%0A++101&p0=h&p1=r

```sql
select
  pk1,
  pk2,
  content,
  sortable,
  sortable_with_nulls,
  sortable_with_nulls_2,
  text
from
  sortable
where
  (
    sortable_with_nulls is null
    and (
      (pk1 > :p0)
      or (
        pk1 = :p0
        and pk2 > :p1
      )
    )
  )
order by
  sortable_with_nulls desc,
  pk1,
  pk2
limit
  101
```

Generated by this page: https://latest.datasette.io/fixtures/sortable?_next=%24null%2Ch%2Cr&_sort_desc=sortable_with_nulls

The `_next=` parameter there decodes as `$null,h,r` - and those components are tilde-encoded, so this can be distinguished from an actual `$null` value which would be represented as `~24null`.
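For reference, tilde-encoding works like percent-encoding but with `~` as the escape character, which is what keeps the `$null` sentinel distinguishable from a literal `$null` value. A minimal sketch of the idea - the exact set of characters left unescaped is an assumption here, not Datasette's actual implementation:

```python
import urllib.parse

# Assumed safe set - the real implementation may differ
SAFE = set("abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789-_.")


def tilde_encode(value: str) -> str:
    out = []
    for ch in value:
        if ch in SAFE:
            out.append(ch)
        else:
            # Escape each UTF-8 byte as ~ followed by two hex digits
            out.extend("~{:02x}".format(b) for b in ch.encode("utf-8"))
    return "".join(out)


def tilde_decode(value: str) -> str:
    # Tilde-encoded strings contain no raw "%", so percent-decoding can be reused
    return urllib.parse.unquote(value.replace("~", "%"))


print(tilde_encode("$null"))    # ~24null - a literal "$null" string
print(tilde_decode("~24null"))  # $null
print(tilde_decode("$null"))    # $null - the null sentinel passes through unchanged
```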

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Refactor out the keyset pagination code 1573424830  
1421571810 https://github.com/simonw/sqlite-utils/issues/520#issuecomment-1421571810 https://api.github.com/repos/simonw/sqlite-utils/issues/520 IC_kwDOCGYnMM5Uu3bi mcarpenter 167893 2023-02-07T22:43:09Z 2023-02-07T22:43:09Z NONE

Hey, isn't this essentially the same issue as #448 ?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
rows_from_file() raises confusing error if file-like object is not in binary mode 1516644980  
1421274434 https://github.com/simonw/datasette/issues/2019#issuecomment-1421274434 https://api.github.com/repos/simonw/datasette/issues/2019 IC_kwDOBm6k_c5Utu1C simonw 9599 2023-02-07T18:42:42Z 2023-02-07T18:42:42Z OWNER

I'm going to build completely separate tests for this in test_pagination.py.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Refactor out the keyset pagination code 1573424830  
1421177666 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1421177666 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5UtXNC 4l1fe 21095447 2023-02-07T17:39:00Z 2023-02-07T17:39:00Z NONE

lets users make schema changes, so it's important to me that the tool work in a non-surprising way -- if you ask for a column of type X, you should get type X. If the column or table previously had CHECK constraints, they shouldn't be silently removed

I get your concern. Let's see if we get a reply on it, and I'll close the issue a bit later.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1421081939 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1421081939 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5Us_1T cldellow 193185 2023-02-07T16:42:25Z 2023-02-07T16:43:42Z NONE

Ha, yes, I might end up making something very niche. That's OK.

I'm building a UI for Datasette that lets users make schema changes, so it's important to me that the tool work in a non-surprising way -- if you ask for a column of type X, you should get type X. If the column or table previously had CHECK constraints, they shouldn't be silently removed. And so on. I had hoped that I could just lean on sqlite-utils, but I think it's a little too surprising.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1421055590 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1421055590 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5Us5Zm 4l1fe 21095447 2023-02-07T16:25:31Z 2023-02-07T16:25:31Z NONE

Ah, it looks like that is controlled by this dict: https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/db.py#L178

I suspect you could overwrite the datetime entry to achieve what you want

And thank you for pointing me to it. At least I can make a monkey patch for my needs...
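For what it's worth, a minimal sketch of that kind of monkey patch against the Python API, assuming `transform()` resolves type strings through the same `COLUMN_TYPE_MAPPING` dict linked above (the patched CLI run elsewhere in this thread suggests it does):

```python
from unittest.mock import patch

import sqlite_utils

# Sketch only: map the string "DATETIME" straight through, so the rebuilt
# table declares a DATETIME column instead of falling back to TEXT.
with patch.dict("sqlite_utils.db.COLUMN_TYPE_MAPPING", {"DATETIME": "DATETIME"}):
    db = sqlite_utils.Database("events.sqlite")
    db["cards.chunk.get"].transform(types={"timestamp": "DATETIME"})
    print(db["cards.chunk.get"].schema)
```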

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1421052195 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1421052195 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5Us4kj 4l1fe 21095447 2023-02-07T16:23:17Z 2023-02-07T16:23:57Z NONE

Isn't your suggestion too fundamental for the utility?

The more flexibility, the more complexity. Your idea definitely makes sense, but how often do you make schema changes? And how many people could benefit from it, what do you think?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1421033725 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1421033725 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5Us0D9 cldellow 193185 2023-02-07T16:12:13Z 2023-02-07T16:12:13Z NONE

I think the bigger issue is that sqlite-utils mixes mechanism (it implements the 12-step way to alter SQLite tables) and policy (it has an opinionated stance on what column types should be used).

That might be a design choice to make it accessible to users by providing a reasonable set of defaults, but it doesn't quite fit my use case.

It might make sense to extract a separate library that provides just the mechanisms, and then sqlite-utils would sit on top of that library with its opinionated set of policies.

That would be a very big change, though.

I might take a stab at extracting the library, but just for the table schema migration piece, not all the other features that sqlite-utils supports. I wouldn't expect sqlite-utils to depend on it.

Part of my motivation is that I want to provide some other abilities, too, like support for CHECK constraints. I see that the issue in this repo (https://github.com/simonw/sqlite-utils/issues/358) proposes a bunch of short-hand constraints, which I wouldn't want to accidentally expose to people -- I want a layer that is a 1:1 mapping to SQLite.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1421022917 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1421022917 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5UsxbF 4l1fe 21095447 2023-02-07T16:06:03Z 2023-02-07T16:08:58Z NONE

Do you see a way to enable it without affecting existing users or bumping the major version number?

I don't see a clean solution, only extending the code with a side variable that tells us we want to apply advanced types instead of the basic ones.

It could be a similar command like `transform-v2 --type column DATETIME`, or a CLI option `transform --adv-type column DATETIME`, along with a dict that contains the advanced types. Then, knowing that we are running an advanced command, we take that dictionary, wrap the current and new dictionaries in a superdict, and work with it everywhere accordingly. This way shouldn't affect users who are on previous versions of the library, and it would have to be merged in the next major one.

But this approach looks like bad design, too messy.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1420992261 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1420992261 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5Usp8F cldellow 193185 2023-02-07T15:45:58Z 2023-02-07T15:45:58Z NONE

I'd support that, but I'm not the author of this library.

One challenge is that would be a breaking change. Do you see a way to enable it without affecting existing users or bumping the major version number?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1420966995 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1420966995 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5UsjxT 4l1fe 21095447 2023-02-07T15:29:28Z 2023-02-07T15:29:28Z NONE

I could, of course.

Is it worth bringing such an improvement to the library?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1420941334 https://github.com/simonw/datasette/pull/564#issuecomment-1420941334 https://api.github.com/repos/simonw/datasette/issues/564 IC_kwDOBm6k_c5UsdgW psychemedia 82988 2023-02-07T15:14:10Z 2023-02-07T15:14:10Z CONTRIBUTOR

Is this feature covered by any more recent updates to datasette, or via any plugins that you're aware of?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
First proof-of-concept of Datasette Library 473288428  
1420809773 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1420809773 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5Ur9Yt cldellow 193185 2023-02-07T13:53:01Z 2023-02-07T13:53:01Z NONE

Ah, it looks like that is controlled by this dict: https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/db.py#L178

I suspect you could overwrite the datetime entry to achieve what you want

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1420496447 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1420496447 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5Uqw4_ 4l1fe 21095447 2023-02-07T09:57:38Z 2023-02-07T09:57:38Z NONE

That said, it looks like the check is only enforced at the CLI level. If you use the API directly, I think it'll work.

It works, but the column becomes TEXT:

```python
In [1]: import sqlite_utils
In [2]: db = sqlite_utils.Database('events.sqlite')
In [3]: table = db['cards.chunk.get']
In [4]: table.columns_dict
Out[4]: {'id': int, 'timestamp': float, 'data_chunk_number': int, 'user_id': str, 'meta_duplication_source_id': int, 'context_sort_attribute': str, 'context_sort_order': str}

In [5]: from datetime import datetime
In [7]: table.transform(types={'timestamp': datetime})
In [8]: table.columns_dict
Out[8]: {'id': int, 'timestamp': str, 'data_chunk_number': int, 'user_id': str, 'meta_duplication_source_id': int, 'context_sort_attribute': str, 'context_sort_order': str}
```

```bash
❯ sqlite-utils schema events.sqlite cards.chunk.get
CREATE TABLE "cards.chunk.get" (
   [id] INTEGER PRIMARY KEY NOT NULL,
   [timestamp] TEXT,
   ...
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1420109153 https://github.com/simonw/datasette/issues/2019#issuecomment-1420109153 https://api.github.com/repos/simonw/datasette/issues/2019 IC_kwDOBm6k_c5UpSVh simonw 9599 2023-02-07T02:32:36Z 2023-02-07T02:32:36Z OWNER

Doing this as a class makes sense to me. There are a few steps:

  • Instantiate the class with the information it needs, which includes sort order, page size, tiebreaker columns and SQL query and parameters
  • Generate the new SQL query that will actually be executed - maybe this takes the optional _next parameter? This returns the SQL and params that should be executed, where the SQL now includes pagination logic plus order by and limit
  • The calling code then gets to execute the SQL query to fetch the rows
  • Last step: those rows are passed to a paginator method which returns (rows, next) - where rows is the fetched rows truncated to the page length (really just with the extra row cut off if one too many came back) and next is either None or a token, depending on whether there should be a next page. A rough sketch of this shape follows below.
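A rough sketch of that shape, with hypothetical names and signatures rather than Datasette's actual code:

```python
class KeysetPaginator:
    # Hypothetical shape for the four steps above; not Datasette's actual code.
    def __init__(self, sql, params, sort_order, tiebreaker_columns, page_size=100):
        self.sql = sql
        self.params = dict(params)
        self.sort_order = sort_order                   # e.g. "sortable_with_nulls desc"
        self.tiebreaker_columns = tiebreaker_columns   # e.g. ["pk1", "pk2"]
        self.page_size = page_size

    def sql_for_page(self, next_token=None):
        "Return (sql, params) with keyset where clause, order by and limit applied."
        where = ""
        if next_token is not None:
            # Real code would decode the token and build the keyset comparison
            # against the sort and tiebreaker columns, adding new params.
            where = " where /* keyset comparison built from next_token */ 1"
        order_by = ", ".join(filter(None, [self.sort_order] + self.tiebreaker_columns))
        sql = "select * from ({}){} order by {} limit {}".format(
            self.sql, where, order_by, self.page_size + 1
        )
        return sql, self.params

    def paginate(self, rows):
        "Truncate fetched rows to the page size and return (rows, next_token)."
        if len(rows) <= self.page_size:
            return rows, None
        rows = rows[: self.page_size]
        last = rows[-1]
        next_token = ",".join(str(last[col]) for col in self.tiebreaker_columns)
        return rows, next_token
```

The caller would run the SQL returned by `sql_for_page()`, then hand the fetched rows to `paginate()` to get the truncated page plus the next token.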
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Refactor out the keyset pagination code 1573424830  
1420106315 https://github.com/simonw/datasette/issues/2019#issuecomment-1420106315 https://api.github.com/repos/simonw/datasette/issues/2019 IC_kwDOBm6k_c5UpRpL simonw 9599 2023-02-07T02:28:03Z 2023-02-07T02:28:36Z OWNER

So I think I can write an abstraction that applies keyset pagination to ANY arbitrary SQL query provided it is given the query, the existing params (so it can pick names for the new params that won't overlap with them), the desired sort order, any existing _next token AND the columns that should be used to tie-break any duplicates.

Those tie breakers will be either the primary key(s) or rowid if none are provided.

What about the case of SQL views, where offset/limit should be used instead? I'm inclined to have that as a separate pagination abstraction entirely, with the calling code deciding which pagination helper to use based on if keyset pagination makes sense or not.

Might be easier to design a class structure for this starting with OffsetPaginator, then using that to inform the design of KeysetPaginator.

Might put these in datasette.utils.pagination to start off with, then maybe extract them out to sqlite-utils later once they've proven themselves.
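For contrast, a minimal sketch of what the offset/limit flavour could look like (again hypothetical names, not actual Datasette code):

```python
class OffsetPaginator:
    # Hypothetical offset/limit counterpart for views and arbitrary queries; a sketch only.
    def __init__(self, sql, params, page_size=100):
        self.sql = sql
        self.params = dict(params)
        self.page_size = page_size

    def sql_for_page(self, next_token=None):
        # The next token is simply the offset of the next page
        offset = int(next_token) if next_token else 0
        sql = "select * from ({}) limit {} offset {}".format(
            self.sql, self.page_size + 1, offset
        )
        return sql, self.params, offset

    def paginate(self, rows, offset):
        if len(rows) <= self.page_size:
            return rows, None
        return rows[: self.page_size], str(offset + self.page_size)
```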

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Refactor out the keyset pagination code 1573424830  
1420104254 https://github.com/simonw/datasette/issues/2019#issuecomment-1420104254 https://api.github.com/repos/simonw/datasette/issues/2019 IC_kwDOBm6k_c5UpRI- simonw 9599 2023-02-07T02:24:46Z 2023-02-07T02:24:46Z OWNER

Even more complicated: https://latest.datasette.io/fixtures/sortable?sortable_with_nulls__notnull=1&_next=0~2E692704598586882%2Ce%2Cr&_sort=sortable_with_nulls_2

The rewritten SQL for that is:

```sql
select * from (
  select pk1, pk2, content, sortable, sortable_with_nulls, sortable_with_nulls_2, text
  from sortable
  where "sortable_with_nulls" is not null
)
where (
  sortable_with_nulls_2 > :p2
  or (sortable_with_nulls_2 = :p2 and ((pk1 > :p0) or (pk1 = :p0 and pk2 > :p1)))
)
order by sortable_with_nulls_2, pk1, pk2
limit 101
```

And it still has the same number of explain steps as the current SQL without the subselect.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Refactor out the keyset pagination code 1573424830  
1420101175 https://github.com/simonw/datasette/issues/2019#issuecomment-1420101175 https://api.github.com/repos/simonw/datasette/issues/2019 IC_kwDOBm6k_c5UpQY3 simonw 9599 2023-02-07T02:22:11Z 2023-02-07T02:22:11Z OWNER

A more complex example: https://latest.datasette.io/fixtures/sortable?_next=0~2E2650566289400591%2Ca%2Cu&_sort=sortable_with_nulls_2

SQL:

```sql
select pk1, pk2, content, sortable, sortable_with_nulls, sortable_with_nulls_2, text
from sortable
where (
  sortable_with_nulls_2 > :p2
  or (sortable_with_nulls_2 = :p2 and ((pk1 > :p0) or (pk1 = :p0 and pk2 > :p1)))
)
order by sortable_with_nulls_2, pk1, pk2
limit 101
```

https://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+content%2C+sortable%2C+sortable_with_nulls%2C+sortable_with_nulls_2%2C+text+from+sortable+where+%28sortable_with_nulls_2+%3E+%3Ap2+or+%28sortable_with_nulls_2+%3D+%3Ap2+and+%28%28pk1+%3E+%3Ap0%29%0A++or%0A%28pk1+%3D+%3Ap0+and+pk2+%3E+%3Ap1%29%29%29%29+order+by+sortable_with_nulls_2%2C+pk1%2C+pk2+limit+101&p0=a&p1=u&p2=0.2650566289400591

Here's the explain: 49 steps long https://latest.datasette.io/fixtures?sql=explain+select+pk1%2C+pk2%2C+content%2C+sortable%2C+sortable_with_nulls%2C+sortable_with_nulls_2%2C+text+from+sortable+where+%28sortable_with_nulls_2+%3E+%3Ap2+or+%28sortable_with_nulls_2+%3D+%3Ap2+and+%28%28pk1+%3E+%3Ap0%29%0D%0A++or%0D%0A%28pk1+%3D+%3Ap0+and+pk2+%3E+%3Ap1%29%29%29%29+order+by+sortable_with_nulls_2%2C+pk1%2C+pk2+limit+101&p2=0.2650566289400591&p0=a&p1=u

Rewritten with a subselect:

```sql
select * from (
  select pk1, pk2, content, sortable, sortable_with_nulls, sortable_with_nulls_2, text
  from sortable
)
where (
  sortable_with_nulls_2 > :p2
  or (sortable_with_nulls_2 = :p2 and ((pk1 > :p0) or (pk1 = :p0 and pk2 > :p1)))
)
order by sortable_with_nulls_2, pk1, pk2
limit 101
```

https://latest.datasette.io/fixtures?sql=select+*+from+(%0D%0A++select+pk1%2C+pk2%2C+content%2C+sortable%2C+sortable_with_nulls%2C+sortable_with_nulls_2%2C+text+from+sortable%0D%0A)%0D%0Awhere+(sortable_with_nulls_2+%3E+%3Ap2+or+(sortable_with_nulls_2+%3D+%3Ap2+and+((pk1+%3E+%3Ap0)%0D%0A++or%0D%0A(pk1+%3D+%3Ap0+and+pk2+%3E+%3Ap1))))+order+by+sortable_with_nulls_2%2C+pk1%2C+pk2+limit+101&p2=0.2650566289400591&p0=a&p1=u

And here's the explain for that - also 49 steps: https://latest.datasette.io/fixtures?sql=explain+select+*+from+%28%0D%0A++select+pk1%2C+pk2%2C+content%2C+sortable%2C+sortable_with_nulls%2C+sortable_with_nulls_2%2C+text+from+sortable%0D%0A%29%0D%0Awhere+%28sortable_with_nulls_2+%3E+%3Ap2+or+%28sortable_with_nulls_2+%3D+%3Ap2+and+%28%28pk1+%3E+%3Ap0%29%0D%0A++or%0D%0A%28pk1+%3D+%3Ap0+and+pk2+%3E+%3Ap1%29%29%29%29+order+by+sortable_with_nulls_2%2C+pk1%2C+pk2+limit+101&p2=0.2650566289400591&p0=a&p1=u

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Refactor out the keyset pagination code 1573424830  
1420094396 https://github.com/simonw/datasette/issues/2019#issuecomment-1420094396 https://api.github.com/repos/simonw/datasette/issues/2019 IC_kwDOBm6k_c5UpOu8 simonw 9599 2023-02-07T02:18:11Z 2023-02-07T02:19:16Z OWNER

For the SQL underlying this page (the second page in that compound primary key paginated sequence): https://latest.datasette.io/fixtures/compound_three_primary_keys?_next=a%2Cd%2Cv

The explain for the default query: https://latest.datasette.io/fixtures?sql=explain+select%0D%0A++pk1%2C%0D%0A++pk2%2C%0D%0A++pk3%2C%0D%0A++content%0D%0Afrom%0D%0A++compound_three_primary_keys%0D%0Awhere%0D%0A++%28%0D%0A++++%28pk1+%3E+%3Ap0%29%0D%0A++++or+%28%0D%0A++++++pk1+%3D+%3Ap0%0D%0A++++++and+pk2+%3E+%3Ap1%0D%0A++++%29%0D%0A++++or+%28%0D%0A++++++pk1+%3D+%3Ap0%0D%0A++++++and+pk2+%3D+%3Ap1%0D%0A++++++and+pk3+%3E+%3Ap2%0D%0A++++%29%0D%0A++%29%0D%0Aorder+by%0D%0A++pk1%2C%0D%0A++pk2%2C%0D%0A++pk3%0D%0Alimit%0D%0A++101&p0=a&p1=d&p2=v

The explain for that query rewritten as this:

```sql
explain select * from (
  select pk1, pk2, pk3, content
  from compound_three_primary_keys
)
where (
  (pk1 > :p0)
  or (pk1 = :p0 and pk2 > :p1)
  or (pk1 = :p0 and pk2 = :p1 and pk3 > :p2)
)
order by pk1, pk2, pk3
limit 101
```

https://latest.datasette.io/fixtures?sql=explain+select+*+from+%28select+%0D%0A++pk1%2C%0D%0A++pk2%2C%0D%0A++pk3%2C%0D%0A++content%0D%0Afrom%0D%0A++compound_three_primary_keys%0D%0A%29%0D%0A++where%0D%0A++%28%0D%0A++++%28pk1+%3E+%3Ap0%29%0D%0A++++or+%28%0D%0A++++++pk1+%3D+%3Ap0%0D%0A++++++and+pk2+%3E+%3Ap1%0D%0A++++%29%0D%0A++++or+%28%0D%0A++++++pk1+%3D+%3Ap0%0D%0A++++++and+pk2+%3D+%3Ap1%0D%0A++++++and+pk3+%3E+%3Ap2%0D%0A++++%29%0D%0A++%29%0D%0Aorder+by%0D%0A++pk1%2C%0D%0A++pk2%2C%0D%0A++pk3%0D%0Alimit%0D%0A++101&p0=a&p1=d&p2=v

Both explains have 31 steps and look pretty much identical.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Refactor out the keyset pagination code 1573424830  
1420088670 https://github.com/simonw/datasette/issues/2019#issuecomment-1420088670 https://api.github.com/repos/simonw/datasette/issues/2019 IC_kwDOBm6k_c5UpNVe simonw 9599 2023-02-07T02:14:35Z 2023-02-07T02:14:35Z OWNER

Maybe the correct level of abstraction here is that pagination is something that happens to a SQL query that is defined as SQL and params, without an order by or limit. That's then wrapped in a sub-select and those things are added to it, plus the necessary where clauses depending on the page.

Need to check that the query plan for pagination of a subquery isn't slower than the plan for pagination as it works today.
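As a toy illustration of that wrapping idea - plain string assembly, ignoring the parameter naming and quoting concerns the real code has to handle:

```python
def wrap_for_pagination(base_sql, where_clauses, order_by, limit):
    # Toy version: the inner query has no order by or limit of its own;
    # keyset where clauses, sort order and limit are added on the outside.
    sql = "select * from ({})".format(base_sql)
    if where_clauses:
        sql += " where {}".format(" and ".join(where_clauses))
    return sql + " order by {} limit {}".format(order_by, limit)


print(wrap_for_pagination(
    "select pk1, pk2, content from sortable",
    ["((pk1 > :p0) or (pk1 = :p0 and pk2 > :p1))"],
    "pk1, pk2",
    101,
))
```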

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Refactor out the keyset pagination code 1573424830  
1419953256 https://github.com/simonw/datasette/issues/2019#issuecomment-1419953256 https://api.github.com/repos/simonw/datasette/issues/2019 IC_kwDOBm6k_c5UosRo simonw 9599 2023-02-06T23:42:56Z 2023-02-06T23:43:10Z OWNER

Relevant issue: - https://github.com/simonw/datasette/issues/1773

Explains this comment: https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/views/table.py#L697

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Refactor out the keyset pagination code 1573424830  
1419928455 https://github.com/simonw/datasette/issues/2019#issuecomment-1419928455 https://api.github.com/repos/simonw/datasette/issues/2019 IC_kwDOBm6k_c5UomOH simonw 9599 2023-02-06T23:21:50Z 2023-02-06T23:21:50Z OWNER

Found more logic relating to this:

https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/views/table.py#L684-L732

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Refactor out the keyset pagination code 1573424830  
1419921228 https://github.com/simonw/datasette/issues/2019#issuecomment-1419921228 https://api.github.com/repos/simonw/datasette/issues/2019 IC_kwDOBm6k_c5UokdM simonw 9599 2023-02-06T23:14:15Z 2023-02-06T23:14:15Z OWNER

Crucial utility function: https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/init.py#L137-L160

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Refactor out the keyset pagination code 1573424830  
1419917661 https://github.com/simonw/datasette/issues/2019#issuecomment-1419917661 https://api.github.com/repos/simonw/datasette/issues/2019 IC_kwDOBm6k_c5Uojld simonw 9599 2023-02-06T23:10:51Z 2023-02-06T23:10:51Z OWNER

I should turn sort and sort_desc into an object representing the sort order earlier in the code.

I should also create something that bundles together pks and use_rowid and maybe is_view as well.
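Something like a pair of small dataclasses is presumably what's meant here; the field names below are guesses, not actual Datasette internals:

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class SortOrder:
    # Replaces the separate sort / sort_desc values (field names are guesses)
    column: Optional[str] = None
    descending: bool = False


@dataclass
class TableMeta:
    # Bundles pks, use_rowid and is_view as suggested above
    pks: List[str] = field(default_factory=list)
    use_rowid: bool = False
    is_view: bool = False
```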

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Refactor out the keyset pagination code 1573424830  
1419916684 https://github.com/simonw/datasette/issues/2019#issuecomment-1419916684 https://api.github.com/repos/simonw/datasette/issues/2019 IC_kwDOBm6k_c5UojWM simonw 9599 2023-02-06T23:09:51Z 2023-02-06T23:10:13Z OWNER

The inputs and outputs for this are pretty complex (a rough signature sketch follows the lists below).

Inputs:

  • ?_next= from the query string
  • is_view - is this for a table or view? If it's a view it uses offset/limit pagination - which could actually work for arbitrary queries too. Also views could have keyset pagination if they are known to be sorted by a particular column.
  • sort and sort_desc reflecting the current sort order
  • use_rowid for if the table is a rowid table with no primary key of its own
  • pks - the primary keys for the table
  • params - the current set of parameters, I think used just to count their length so new params can be added as p5 etc without collisions. This could be handled with a s0, s1 etc naming convention instead.

Outputs:

  • where_clauses - a list of where clauses to add to the query
  • params - additional parameters to use with the query due to the new where clauses
  • order_by - the order by clause
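Pulling those inputs and outputs together, a rough signature sketch (hypothetical name and interface only, not the actual Datasette internals):

```python
from typing import Dict, List, Optional, Tuple


def keyset_pagination_pieces(
    next_token: Optional[str],
    is_view: bool,
    sort: Optional[str],
    sort_desc: Optional[str],
    use_rowid: bool,
    pks: List[str],
    params: Dict[str, object],
) -> Tuple[List[str], Dict[str, object], str]:
    """Hypothetical interface: returns (where_clauses, extra_params, order_by)."""
    raise NotImplementedError  # sketch of the interface only
```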
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Refactor out the keyset pagination code 1573424830  
1399343659 https://github.com/simonw/datasette/pull/1999#issuecomment-1399343659 https://api.github.com/repos/simonw/datasette/issues/1999 IC_kwDOBm6k_c5TaEor simonw 9599 2023-01-21T22:19:20Z 2023-02-06T23:02:12Z OWNER

HTML mode needs a list of renderers so it can show links to .geojson etc - can do that as a hidden extra (maybe called renderers), repeating this code:

https://github.com/simonw/datasette/blob/e4ebef082de90db4e1b8527abc0d582b7ae0bc9d/datasette/views/base.py#L477-L497

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
?_extra= support (draft) 1551694938  
1418288327 https://github.com/simonw/datasette/issues/262#issuecomment-1418288327 https://api.github.com/repos/simonw/datasette/issues/262 IC_kwDOBm6k_c5UiVzH simonw 9599 2023-02-05T22:57:58Z 2023-02-06T23:01:15Z OWNER

I think that does make sense: ?_extra=table perhaps, which would add {"table": "..."}.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Add ?_extra= mechanism for requesting extra properties in JSON 323658641  
1419734229 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1419734229 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5Un2zV cldellow 193185 2023-02-06T20:53:28Z 2023-02-06T21:16:29Z NONE

I think it's not currently possible: sqlite-utils requires that it be one of integer, text, float, blob (see code)

IMO, this is a bit of friction and it would be nice if it was more permissive. SQLite permits developers to use any data type when creating a table. For example, this is a perfectly cromulent sqlite session that creates a table with columns of type baz and bar:

```
sqlite> create table foo(column1 baz, column2 bar);
sqlite> .schema foo
CREATE TABLE foo(column1 baz, column2 bar);
sqlite> select * from pragma_table_info('foo');
cid  name     type  notnull  dflt_value  pk
---  -------  ----  -------  ----------  --
0    column1  baz   0                    0
1    column2  bar   0                    0
```

The idea is that the application developer will know what meaning to ascribe to those types. For example, I'm working on a plugin to Datasette. Dates are tricky to handle. If you have some existing rows, you can look at the values in them to know how a user is serializing the dates -- as an ISO 8601 string? An RFC 3339 string? With millisecond precision? With timezone offset? But if you don't yet have any rows, you have to guess. If the column is of type TEXT, you don't even know that it's meant to hold a date! In this case, my plugin will look to see if the column is of type DATE or DATETIME, and assume a certain representation when writing.
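As an aside, reading the declared type back out is straightforward; a minimal sketch using a throwaway example table (the table and column names are illustrative only):

```python
import sqlite3


def declared_type(conn: sqlite3.Connection, table: str, column: str):
    # pragma_table_info() reports the declared type string (e.g. "DATETIME"),
    # regardless of how SQLite actually stores the values.
    row = conn.execute(
        "select type from pragma_table_info(?) where name = ?", (table, column)
    ).fetchone()
    return row[0] if row else None


conn = sqlite3.connect(":memory:")
conn.execute("create table events (id integer primary key, created DATETIME)")
print(declared_type(conn, "events", "created"))  # DATETIME
```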

Perhaps there is an argument that sqlite-utils is trying to conform to SQLite's strict mode, and that is why it limits the choices. In strict mode, SQLite requires that the data type be one of INT, INTEGER, REAL, TEXT, BLOB, ANY. But that can't be the case -- sqlite-utils supports FLOAT, which is not one of the valid types in strict mode, and it rejects INT, REAL and ANY, which are valid.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1419740776 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1419740776 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5Un4Zo cldellow 193185 2023-02-06T20:59:01Z 2023-02-06T20:59:01Z NONE

That said, it looks like the check is only enforced at the CLI level. If you use the API directly, I think it'll work.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1419390560 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1419390560 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5Umi5g 4l1fe 21095447 2023-02-06T16:43:47Z 2023-02-06T16:43:47Z NONE

SQLite doesn't have a native DATETIME type. It stores dates internally as strings and then has functions to work with date-like strings. Yes it's weird.

That's correct. But my issue is about the application-level libraries that, I suppose, understand the data better if they see a specific type such as DATETIME.

I'm writing data with the dataset library I mentioned. The lib changes its behavior depending on the type. I saw different behavior with the types DATETIME, FLOAT and TEXT. Dataset, for its part, is built upon SQLAlchemy, as you know.

To be honest, I didn't dive into the details of why the behavior changes, but when I manually altered a column's type to DATETIME with another utility, things got back to normal.

On that matter, can I achieve this with sqlite-utils at the moment?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1419357290 https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1419357290 https://api.github.com/repos/simonw/sqlite-utils/issues/524 IC_kwDOCGYnMM5Umaxq eyeseast 25778 2023-02-06T16:21:44Z 2023-02-06T16:21:44Z CONTRIBUTOR

SQLite doesn't have a native DATETIME type. It stores dates internally as strings and then has functions to work with date-like strings. Yes it's weird.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Transformation type `--type DATETIME` 1572766460  
1418288077 https://github.com/simonw/datasette/issues/2016#issuecomment-1418288077 https://api.github.com/repos/simonw/datasette/issues/2016 IC_kwDOBm6k_c5UiVvN simonw 9599 2023-02-05T22:56:43Z 2023-02-05T22:56:43Z OWNER

This absolutely makes sense. One of the biggest goals for Datasette 1.0 is "documented template contexts" - for any default template in Datasette that people might want to over-ride there should be documentation that describes the available context variables, plus tests that ensure they don't accidentally get broken by future changes.

Ensuring description/title/etc are available on the index page feels like it fits well into that bucket.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Database metadata fields like description are not available in the index page template's context 1571207083  
1416486796 https://github.com/simonw/sqlite-utils/issues/433#issuecomment-1416486796 https://api.github.com/repos/simonw/sqlite-utils/issues/433 IC_kwDOCGYnMM5Ubd-M alecstein 16236421 2023-02-03T22:32:10Z 2023-02-03T22:32:10Z NONE

Came here to say that I also have this issue.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
CLI eats my cursor 1239034903  
1410827249 https://github.com/simonw/datasette/issues/2011#issuecomment-1410827249 https://api.github.com/repos/simonw/datasette/issues/2011 IC_kwDOBm6k_c5UF4Px simonw 9599 2023-01-31T17:58:54Z 2023-01-31T17:58:54Z OWNER

I think this is the relevant code:

https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/facets.py#L260-L268

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Applied facet did not result in an "x" icon to dismiss it 1564769997  
1409406327 https://github.com/simonw/datasette/issues/2010#issuecomment-1409406327 https://api.github.com/repos/simonw/datasette/issues/2010 IC_kwDOBm6k_c5UAdV3 simonw 9599 2023-01-30T21:51:58Z 2023-01-30T21:51:58Z OWNER

Here's a quick prototype I knocked up for this:

```diff
diff --git a/datasette/static/app.css b/datasette/static/app.css
index 71437bd4..d763bcff 100644
--- a/datasette/static/app.css
+++ b/datasette/static/app.css
@@ -695,7 +695,48 @@ p.zero-results {
 
+/* Force table to not be like tables anymore */
+body.row table.rows-and-columns,
+body.row .rows-and-columns thead,
+body.row .rows-and-columns tbody,
+body.row .rows-and-columns th,
+body.row .rows-and-columns td,
+body.row .rows-and-columns tr {
+  display: block;
+}
+
+/* Hide table headers (but not display: none;, for accessibility) */
+body.row .rows-and-columns thead tr {
+  position: absolute;
+  top: -9999px;
+  left: -9999px;
+}
+
+body.row .rows-and-columns tr {
+  border: 1px solid #ccc;
+  margin-bottom: 1em;
+  border-radius: 10px;
+  background-color: white;
+  padding: 0.2rem;
+}
+
+body.row .rows-and-columns td {
+  /* Behave like a "row" */
+  border: none;
+  border-bottom: 1px solid #eee;
+  padding: 0;
+  padding-left: 10%;
+  padding-bottom: 0.3em;
+}
+
+body.row .rows-and-columns td:before {
+  display: block;
+  color: black;
+  padding-bottom: 0.2em;
+  font-size: 0.8em;
+  font-weight: bold;
+  background-color: #f5f5f5;
+}
 
 /* Overrides ===============================================================*/
diff --git a/datasette/templates/row.html b/datasette/templates/row.html
index 1d1b0bfd..339eb643 100644
--- a/datasette/templates/row.html
+++ b/datasette/templates/row.html
@@ -5,6 +5,9 @@
 {% block extra_head %}
 {{- super() -}}
 <style>
+{% for column in columns %}
+body.row .rows-and-columns td:nth-of-type({{ loop.index }}):before { content: "{{ column|escape_css_string }}"; }
+{% endfor %}
 @media only screen and (max-width: 576px) {
 {% for column in columns %}
 .rows-and-columns td:nth-of-type({{ loop.index }}):before { content: "{{ column|escape_css_string }}"; }
```

Now the row page looks like this at all page widths:

[screenshot omitted]

I think that's better (could do with a bit of tightening up).

One catch: you can't copy and paste the column labels, since they are added using generated content like this:

https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/static/app.css#L752-L757

https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/templates/row.html#L9-L11

I think the row page should switch to different HTML entirely, rather than continuing to share the `<table>` that's used by the table page. This will be a breaking change for users who customize Datasette, so I should aim to ship it before 1.0.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Row page should default to card view 1563264257  
1407767434 https://github.com/simonw/datasette/issues/1696#issuecomment-1407767434 https://api.github.com/repos/simonw/datasette/issues/1696 IC_kwDOBm6k_c5T6NOK cldellow 193185 2023-01-29T20:56:20Z 2023-01-29T20:56:20Z CONTRIBUTOR

I did some horrible things in https://github.com/cldellow/datasette-ui-extras/issues/2 to enable this in my plugin -- example here: https://dux-demo.fly.dev/cooking/posts?_facet=owner_user_id&owner_user_id=67

The implementation relies on two things:

  • a filters_from_request hook that adds a good human description (unfortunately, without the benefit of the CSS styling you mention)
  • doing something evil to hijack the exact and not operators in the Filters class. We can't leave them as is, or we'll get 2 human descriptions -- the built-in Datasette one and the one from my plugin. We can't remove them, or the filters UI will stop supporting the = and != operators

This got me thinking: it'd be neat if the list of operators that the filters UI supported wasn't a closed set.

A motivating example: adding a geospatial NEAR operator. Ideally it'd take two arguments - a target point and a radius, so you could express a filter like find me all rows whose lat/lng are within 10km of 43.4516° N, 80.4925° W. (Optionally, the UI could be enhanced if the geonames database was loaded and queried, so a user could say find me all rows whose lat/lng are within 10km of Kitchener, ON, and the city gets translated to a lat/lng for them)

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Show foreign key label when filtering 1186696202  
1407733793 https://github.com/simonw/datasette/pull/2008#issuecomment-1407733793 https://api.github.com/repos/simonw/datasette/issues/2008 IC_kwDOBm6k_c5T6FAh simonw 9599 2023-01-29T18:17:40Z 2023-01-29T18:17:40Z OWNER

We don't have any performance tests yet - they would be a useful thing to add. I've not built anything like that before (at least not in CI, I've always done ad-hoc performance testing using something like Locust) so I don't have a great feel for how it could work.

Had an interesting conversation about this just now: https://fedi.simonwillison.net/@simon/109773800944614366

There's a risk that different runs will return different results due to the shared resource nature of GitHub Actions runners, but a good fix for that is to run comparative tests where you run the benchmark against e.g. both main and the incoming PR branch and report back on any differences.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
array facet: don't materialize unnecessary columns 1560982210  
1407716963 https://github.com/simonw/datasette/pull/2008#issuecomment-1407716963 https://api.github.com/repos/simonw/datasette/issues/2008 IC_kwDOBm6k_c5T6A5j cldellow 193185 2023-01-29T17:04:03Z 2023-01-29T17:04:03Z CONTRIBUTOR

Performance tests - I think most places don't have them as a formal gate enforced by CI. TypeScript and scalac seem to have tests that run to capture timings. The timings are included by a bot as a comment or build check, and also stored in a database so you can graph changes over time to spot regressions. Probably overkill for Datasette!

Window functions - oh, good point. Looks like Ubuntu shipped JSON1 support as far back as sqlite 3.11. I'll let this PR linger until there's a way to run against different SQLite versions. For now, I'm shipping this with datasette-ui-extras, since I think it's OK for a plugin to enforce a higher minimum requirement.

Tests - there actually did end up being test changes to capture the undercount bug of the current implementation, so the current implementation would fail against the new tests.

Perhaps a non-window function version could be written that uses random() instead of row_number() over () in order to get a unique key. It's technically not unique, but in practice, I imagine it'll work well.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
array facet: don't materialize unnecessary columns 1560982210  
1407568923 https://github.com/simonw/datasette/pull/2008#issuecomment-1407568923 https://api.github.com/repos/simonw/datasette/issues/2008 IC_kwDOBm6k_c5T5cwb simonw 9599 2023-01-29T05:47:36Z 2023-01-29T05:47:36Z OWNER

I don't know how/if you do automated tests for performance, so I haven't changed any of the tests.

We don't have any performance tests yet - they would be a useful thing to add. I've not built anything like that before (at least not in CI, I've always done ad-hoc performance testing using something like Locust) so I don't have a great feel for how it could work.

I see not having to change the tests at all for this change as a really positive sign. If you find any behaviour differences between this and the previous implementation, that's a sign we should add another test or two specifying the behaviour we want.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
array facet: don't materialize unnecessary columns 1560982210  

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
Powered by Datasette · Queries took 1.2ms · About: github-to-sqlite