html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,issue,performed_via_github_app https://github.com/simonw/datasette/issues/1518#issuecomment-981172801,https://api.github.com/repos/simonw/datasette/issues/1518,981172801,IC_kwDOBm6k_c46e4JB,9599,2021-11-28T23:23:51Z,2021-11-28T23:23:51Z,OWNER,"(I could experiment with merging the two tables by adding a temporary undocumented `?_sql=` parameter to the in-progress table view that sets an alternative query instead of `select cols from table` - added bonus, this will force me to use introspection against the returned columns rather than mixing in the known columns for the specified table)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058072543, https://github.com/simonw/datasette/issues/1518#issuecomment-981172385,https://api.github.com/repos/simonw/datasette/issues/1518,981172385,IC_kwDOBm6k_c46e4Ch,9599,2021-11-28T23:21:26Z,2021-11-28T23:21:26Z,OWNER,"Aside: is there any reason this work can't complete the long-running goal of merging the TableView and QueryView, such that most of the features available for tables become available for arbitrary queries too? I had already mentally committed to implementing facets for queries, but I just realized that filters could work too - using either a CTE or a nested query. Pagination is the one holdout here, since table pagination uses keyset pagination over a known order. But maybe arbitrary queries can only be paginated off you order them first?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058072543, https://github.com/simonw/datasette/issues/1518#issuecomment-981153186,https://api.github.com/repos/simonw/datasette/issues/1518,981153186,IC_kwDOBm6k_c46ezWi,9599,2021-11-28T21:13:50Z,2021-11-28T21:13:50Z,OWNER,"I'm also going to use the new `datasette-table` Web Component to help guide the design of the new API, which relates directly to this issue too: - #1532","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058072543, https://github.com/simonw/datasette/issues/1534#issuecomment-981149531,https://api.github.com/repos/simonw/datasette/issues/1534,981149531,IC_kwDOBm6k_c46eydb,9599,2021-11-28T20:48:54Z,2021-11-28T20:48:54Z,OWNER,"If I'm going to do this, is there value in also spotting `Accept: text/csv` and returning CSV for that? 
I'm pretty sure no client has EVER implemented this though, so it feels like it would be showboating.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1065432388, https://github.com/simonw/datasette/issues/1533#issuecomment-981149039,https://api.github.com/repos/simonw/datasette/issues/1533,981149039,IC_kwDOBm6k_c46eyVv,9599,2021-11-28T20:45:36Z,2021-11-28T20:45:36Z,OWNER,I built an initial prototype of this in a branch: https://github.com/simonw/datasette/commit/e0a84691c2959f2d1d76948574c9c4a910c7556c - which exposed even more flaws in the way `TableView` is structured (adding custom HTTP headers to the response is way harder than it should be) which I should address in the refactor in #617.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1065431383, https://github.com/simonw/sqlite-utils/pull/333#issuecomment-979442854,https://api.github.com/repos/simonw/sqlite-utils/issues/333,979442854,IC_kwDOCGYnMM46YRym,9599,2021-11-25T19:47:26Z,2021-11-25T19:47:26Z,OWNER,"I just remembered that there's one other place that this could fit: as a Datasette ""insert"" plugin. This is vaporware at the moment, but the idea is that Datasette itself could grow a mechanism for importing data, that's driven by plugins. Out of the box Datasette would be able to import CSV and CSV files, similar to `sqlite-utils insert ... --csv` - but plugins would then be able to add support for additional format such as GeoJSON or - in this case - Parquet. The neat thing about having it as a Datasette plugin is that one plugin would enable three different ways of importing data: 1. Via a new `datasette insert ...` CLI option (similar to `sqlite-utils`) 2. Via a web form upload interface, where authenticated Datasette users would be able to upload files 3. Via an API interface, where files could be programatically submitted to a running Datasette server I started fleshing out this idea quite a while ago but didn't make much concrete progress, maybe I should revisit it: - https://github.com/simonw/datasette/issues/1160","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1039037439, https://github.com/simonw/sqlite-utils/pull/333#issuecomment-979345527,https://api.github.com/repos/simonw/sqlite-utils/issues/333,979345527,IC_kwDOCGYnMM46X6B3,2118708,2021-11-25T16:31:47Z,2021-11-25T16:31:47Z,NONE,"Thanks for your reply @simonw . Tbh, my first attempt was actually the `parquet-to-sqlite` package but I already had Makefiles that relied on `SQLite-utils` and it was less intrusive to my workflow. Maybe I'll revisit that decision. FYI: there's a `[sqlite-parquet-vtable](https://github.com/cldellow/sqlite-parquet-vtable)` I don't think plugins make much sense either. 
Probably defeats the purpose of simplicity: simple database along with a pip-able package.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1039037439, https://github.com/simonw/datasette/issues/1522#issuecomment-976117989,https://api.github.com/repos/simonw/datasette/issues/1522,976117989,IC_kwDOBm6k_c46LmDl,813732,2021-11-23T03:00:34Z,2021-11-23T03:00:34Z,CONTRIBUTOR,"I tried deploying the most recent version of the Dockerfile in this thread ([link to comment](https://github.com/simonw/datasette/issues/1522#issuecomment-974605128)), and after trying a few different different combinations, I was only successful when I used `--no-cpu-throttling` (""CPU Is always allocated"" in the UI) Using this method, I got a very similar issue to you: The first time I'd load the site I'd get a 503. But after that first load, I didn't get the issue again. It would re-occur if the service started from cold boot. I suspect this is a race condition in the supervisord configuration. The errors I got were the same `Connection refused: AH00957: http: attempt to connect to 127.0.0.1:8001 (127.0.0.1) failed`, and that seems to indicate that `datasette` hadn't yet started. Looking at the order of logs getting back, the processes reported successfully completing loading after the first 503 was returned, so that makes me think race condition. I can replicate this locally, if I `docker run` and request `localhost:5000/prefix` _before_ I get the `datasette entered RUNNING state` message. Cloud Run wakes up when requests are received, so this test would semi-replicate that, but local docker would be the equivalent of a persistent process, hence it doesn't normally exhibit the same issues. Unfortunately supervisor/supervisor issue 122 (not linking as to prevent cross-project link spam) seems to say that dependency chaining is a feature that's been asked for for a long time, but hasn't been implemented. You could try some suggestions in that thread. ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-976023405,https://api.github.com/repos/simonw/datasette/issues/1522,976023405,IC_kwDOBm6k_c46LO9t,360895,2021-11-23T00:08:07Z,2021-11-23T00:08:07Z,NONE,"If you suspect that Cloud Run throttled CPU could be the cause, you can request to have CPU always allocated with `gcloud beta run deploy --no-cpu-throttling` ([read more](https://cloud.google.com/blog/products/serverless/cloud-run-gets-always-on-cpu-allocation)) It could also be the Cloud Run sandbox that somehow gets in the way here, in which case I recommend testing with the second generation execution environment: `gcloud beta run deploy --execution-environment gen2`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1528#issuecomment-975955589,https://api.github.com/repos/simonw/datasette/issues/1528,975955589,IC_kwDOBm6k_c46K-aF,15178711,2021-11-22T22:00:30Z,2021-11-22T22:00:30Z,CONTRIBUTOR,"Oh, another thing to consider: I believe this would be the first `""_file""` key in datasette's metadata, compared to other `""_url""` keys like `""license_url""` or `""about_url""`. 
Not too sure what considerations to include with this (ex should missing files cause Datasette to stop before starting, should build scripts bundle these sql files somewhere during `datasette package`, etc.)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1060631257, https://github.com/simonw/datasette/issues/1526#issuecomment-975110692,https://api.github.com/repos/simonw/datasette/issues/1526,975110692,IC_kwDOBm6k_c46HwIk,9599,2021-11-22T04:49:44Z,2021-11-22T04:49:44Z,OWNER,Fixed in the 0.12 release of that plugin: https://github.com/simonw/datasette-publish-vercel/releases/tag/0.12,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1059549523, https://github.com/simonw/datasette/issues/1526#issuecomment-975073308,https://api.github.com/repos/simonw/datasette/issues/1526,975073308,IC_kwDOBm6k_c46HnAc,9599,2021-11-22T04:13:46Z,2021-11-22T04:13:46Z,OWNER,"Addressing that over here (hadn't seen that issue yet, thanks for the prod): https://github.com/simonw/datasette-publish-vercel/issues/51#issuecomment-975073026","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1059549523, https://github.com/simonw/datasette/issues/1527#issuecomment-974979785,https://api.github.com/repos/simonw/datasette/issues/1527,974979785,IC_kwDOBm6k_c46HQLJ,9599,2021-11-22T01:02:57Z,2021-11-22T01:03:19Z,OWNER,"I think the root cause is this hidden form field on https://latest.datasette.io/fixtures/facetable?_facet=_neighborhood&_neighborhood__exact=Downtown ```html ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1059555791, https://github.com/simonw/datasette/issues/1525#issuecomment-974913180,https://api.github.com/repos/simonw/datasette/issues/1525,974913180,IC_kwDOBm6k_c46G_6c,9599,2021-11-21T22:57:08Z,2021-11-21T22:57:08Z,OWNER,https://latest.datasette.io/fixtures/facetable can't quite demonstrate the bug because `_neighborhood` isn't a foreign key - I should rename `city_id` to `_city_id`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1059509927, https://github.com/simonw/datasette/issues/1525#issuecomment-974912985,https://api.github.com/repos/simonw/datasette/issues/1525,974912985,IC_kwDOBm6k_c46G_3Z,9599,2021-11-21T22:55:42Z,2021-11-21T22:55:42Z,OWNER,Here's the template: https://github.com/simonw/datasette/blob/48f11998b73350057b74fe6ab464d4ac3071637c/datasette/templates/row.html#L39-L41,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1059509927, https://github.com/simonw/datasette/issues/93#issuecomment-974765825,https://api.github.com/repos/simonw/datasette/issues/93,974765825,IC_kwDOBm6k_c46Gb8B,9599,2021-11-21T07:00:21Z,2021-11-21T07:00:21Z,OWNER,Closing this in favour of Datasette Desktop: https://datasette.io/desktop,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952, https://github.com/simonw/sqlite-utils/pull/333#issuecomment-974754412,https://api.github.com/repos/simonw/sqlite-utils/issues/333,974754412,IC_kwDOCGYnMM46GZJs,9599,2021-11-21T04:35:32Z,2021-11-21T04:35:32Z,OWNER,"Some other recent projects 
(like trying to get this library to work in JupyterLite) have made me much more cautious about adding new dependencies, especially dependencies like `pyarrow` which require custom C/Rust extensions. There are a few ways this could work though: - Have this as an optional dependency feature - so it only works if the user installs `pyarrow` as well - Implement this as a separate tool, `parquet-to-sqlite` - which could itself depend on `sqlite-utils` - Add a concept of ""plugins"" to `sqlite-utils`, similar to how those work in Datasette: https://docs.datasette.io/en/stable/plugins.html My favourite option is `parquet-to-sqlite` because that can be built without any additional changes to `sqlite-utils` at all! I find the concept of plugins for `sqlite-utils` interesting. I've so far not had quite enough potential use-cases to convince me this is worthwhile (especially since it should be very easy to build out separate tools entirely), but I'm ready to be convinced that a plugin mechanism would be worthwhile.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1039037439, https://github.com/simonw/datasette/issues/1524#issuecomment-974725814,https://api.github.com/repos/simonw/datasette/issues/1524,974725814,IC_kwDOBm6k_c46GSK2,9599,2021-11-20T23:24:01Z,2021-11-20T23:24:01Z,OWNER,"I noticed that `http://datasette-apache-proxy-demo.datasette.io/` wasn't redirecting to `https` so I built a new plugin: https://github.com/simonw/datasette-redirect-to-https ``` % curl -i 'http://datasette-apache-proxy-demo.datasette.io/prefix/fixtures/no_primary_key' HTTP/1.1 301 Moved Permanently date: Sat, 20 Nov 2021 23:22:50 GMT server: Fly/51d150d (2021-11-19) location: https://datasette-apache-proxy-demo.datasette.io/fixtures/no_primary_key x-proxied-by: Apache2 Debian transfer-encoding: chunked via: 1.1 fly.io fly-request-id: 01FMZTHTHVPC8BZY0625D7JV4B ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1059219106, https://github.com/simonw/datasette/issues/1524#issuecomment-974721652,https://api.github.com/repos/simonw/datasette/issues/1524,974721652,IC_kwDOBm6k_c46GRJ0,9599,2021-11-20T22:41:03Z,2021-11-20T22:41:03Z,OWNER,New TIL: https://til.simonwillison.net/fly/custom-subdomain-fly,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1059219106, https://github.com/simonw/datasette/issues/1426#issuecomment-974711959,https://api.github.com/repos/simonw/datasette/issues/1426,974711959,IC_kwDOBm6k_c46GOyX,52649,2021-11-20T21:11:51Z,2021-11-20T21:11:51Z,NONE,I think another thing would be to make `/pages/robots.txt` work. That way you can use jinja to generate a desired robots.txt. 
I'm using it to allow the main index and what it links to to be crawled (but not the database pages directly.),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",964322136, https://github.com/simonw/datasette/issues/1524#issuecomment-974707878,https://api.github.com/repos/simonw/datasette/issues/1524,974707878,IC_kwDOBm6k_c46GNym,9599,2021-11-20T20:34:51Z,2021-11-20T20:38:29Z,OWNER,"I pointed `CNAME` of `datasette-apache-proxy-demo.datasette.io` at `datasette-apache-proxy-demo.fly.dev.` using Vercel DNS: Then I asked Fly to issue a LetsEncrypt certificate for that: ``` % flyctl certs create datasette-apache-proxy-demo.datasette.io # About 53 seconds later: % flyctl certs show datasette-apache-proxy-demo.datasette.io The certificate for datasette-apache-proxy-demo.datasette.io has been issued. Hostname = datasette-apache-proxy-demo.datasette.io DNS Provider = constellix Certificate Authority = Let's Encrypt Issued = ecdsa,rsa Added to App = 53 seconds ago Source = fly ``` https://datasette-apache-proxy-demo.datasette.io/ works now - I'll use that in the documentation.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1059219106, https://github.com/simonw/datasette/issues/1524#issuecomment-974704254,https://api.github.com/repos/simonw/datasette/issues/1524,974704254,IC_kwDOBm6k_c46GM5-,9599,2021-11-20T20:03:51Z,2021-11-20T20:22:52Z,OWNER,"I'm also going to extract the Apache config files from https://github.com/simonw/datasette/blob/250db8192cb8aba5eb8cd301ccc2a49525bc3d24/demos/apache-proxy/Dockerfile into a separate file to make it easier to read. (The supervisor config needs to be dynamically constructed to include $DATASETTE_REF so I will leave it where it is.)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1059219106, https://github.com/simonw/datasette/issues/1519#issuecomment-974701788,https://api.github.com/repos/simonw/datasette/issues/1519,974701788,IC_kwDOBm6k_c46GMTc,9599,2021-11-20T19:42:29Z,2021-11-20T19:42:29Z,OWNER,"> I think what's happening here is Apache is actually making a request to `/fixtures` rather than making a request to `/prefix/fixtures` - and Datasette is replying to requests on both the prefixed and the non-prefixed paths. > > This is pretty confusing! I think Datasette should ONLY reply to `/prefix/fixtures` instead and return a 404 for `/fixtures` - this would make things a whole lot easier to debug. > > But shipping that change could break existing deployments. Maybe that should be a breaking change for 1.0. On further thought I'm not going to do this. Having Datasette work behind a proxy the way it does right now is clearly easy for people to deploy (now that I've fixed the bugs) and I trust my improved tests to catch problems in the future.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974697824,https://api.github.com/repos/simonw/datasette/issues/1519,974697824,IC_kwDOBm6k_c46GLVg,9599,2021-11-20T19:11:21Z,2021-11-20T19:11:21Z,OWNER,"OK, i think I got all of them this time! 
The latest demo is now live at https://datasette-apache-proxy-demo.fly.dev/prefix/fixtures/sortable?_facet=pk2 I'm closing this issue, but feel free to re-open it if you spot any that I missed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1522#issuecomment-974695111,https://api.github.com/repos/simonw/datasette/issues/1522,974695111,IC_kwDOBm6k_c46GKrH,9599,2021-11-20T18:52:11Z,2021-11-20T18:52:11Z,OWNER,The demo is now live on https://datasette-apache-proxy-demo.fly.dev/prefix/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974693350,https://api.github.com/repos/simonw/datasette/issues/1522,974693350,IC_kwDOBm6k_c46GKPm,9599,2021-11-20T18:39:27Z,2021-11-20T18:39:27Z,OWNER,"I'm going to go with Fly instead for this, especially as I can keep it within their free tier (and iI want to get more familiar with their platform).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974692546,https://api.github.com/repos/simonw/datasette/issues/1522,974692546,IC_kwDOBm6k_c46GKDC,9599,2021-11-20T18:34:13Z,2021-11-20T18:34:13Z,OWNER,"Here's why it failed - the `fly.toml` file that was generated when I ran `flyctl launch` had this section: ```toml [[services]] http_checks = [] internal_port = 8080 processes = [""app""] protocol = ""tcp"" script_checks = [] ``` But I need `internal_port` to be 80 for Apache, so I changed that and ran `flyctl deploy --build-arg DATASETTE_REF=main` again - and it worked! https://floral-dust-4577.fly.dev/prefix/ - not seeing any 503 errors there.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974685095,https://api.github.com/repos/simonw/datasette/issues/1522,974685095,IC_kwDOBm6k_c46GIOn,9599,2021-11-20T17:42:25Z,2021-11-20T17:42:25Z,OWNER,"I tried to deploy it to Fly - initially using `flyctl launch` but then switching to `flyctl deploy` so I could use the `--build-arg` option (posted [a feature request](https://community.fly.io/t/feature-request-flyctl-launch-build-arg/3209) here). Almost got it working, but it failed the health check: ``` % cd datasette/demos/apache-proxy apache-proxy % flyctl launch Creating app in /Users/simon/Dropbox/Development/datasette/demos/apache-proxy Scanning source code Detected Dockerfile app Automatically selected personal organization: Simon Willison ? Select region: sjc (Sunnyvale, California (US)) Created app floral-dust-4577 in organization personal Wrote config file fly.toml Your app is ready. Deploy with `flyctl deploy` ? Would you like to deploy now? Yes Deploying floral-dust-4577 ==> Validating app configuration --> Validating app configuration done Services TCP 80/443 ⇢ 8080 ==> Creating build context --> Creating build context done ==> Building image with Docker Sending build context to Docker daemon 8.704kB ... 
Error error building: executor failed running [/bin/sh -c pip install https://github.com/simonw/datasette/archive/${DATASETTE_REF}.zip]: exit code: 1 # I didn't pass the build argument, trying again with flyctl deploy apache-proxy % flyctl deploy --build-arg DATASETTE_REF=main Update available 0.0.229 -> v0.0.255 Run ""flyctl version update"" to upgrade Deploying floral-dust-4577 ==> Validating app configuration --> Validating app configuration done Services TCP 80/443 ⇢ 8080 ==> Creating build context --> Creating build context done ==> Building image with Docker Sending build context to Docker daemon 8.704kB [+] Building 15.7s (27/27) ... 0.0s ==> Pushing image to fly The push refers to repository [registry.fly.io/floral-dust-4577] 9bf88c92aa2a: Pushed 3d61728b8391: Pushed ... --> Pushing image done Image: registry.fly.io/floral-dust-4577:deployment-1637429501 Image size: 276 MB ==> Creating release Release v2 created You can detach the terminal anytime without stopping the deployment Monitoring Deployment 1 desired, 1 placed, 0 healthy, 0 unhealthy [health checks: 1 total, 1 critical] 1 desired, 1 placed, 0 healthy, 1 unhealthy [health checks: 1 total, 1 critical] v0 failed - Failed due to unhealthy allocations - no stable job version to auto revert to Failed Instances ==> Failure #1 Instance ID = 36adac86 Version = 0 Region = sjc Desired = run Status = running Health Checks = 1 total, 1 critical Restarts = 0 Created = 4m52s ago Recent Events TIMESTAMP TYPE MESSAGE 2021-11-20T17:32:52Z Received Task received by client 2021-11-20T17:32:52Z Task Setup Building Task Directory 2021-11-20T17:33:02Z Started Task started by client Recent Logs 2021-11-20T17:32:56Z [info] Unpacking image 2021-11-20T17:33:01Z [info] Preparing kernel init 2021-11-20T17:33:01Z [info] Configuring firecracker 2021-11-20T17:33:02Z [info] Starting virtual machine 2021-11-20T17:33:02Z [info] Starting init (commit: 7943db6)... 2021-11-20T17:33:02Z [info] Preparing to run: `/usr/bin/supervisord -c /app/supervisord.conf` as root 2021-11-20T17:33:02Z [info] 2021/11/20 17:33:02 listening on [fdaa:0:4ef:a7b:2295:36ad:ac86:2]:22 (DNS: [fdaa::3]:53) 2021-11-20T17:33:02Z [info] 2021-11-20 17:33:02,374 CRIT Supervisor is running as root. Privileges were not dropped because no user is specified in the config file. If you intend to run as root, you can set user=root in the config file to avoid this message. 
2021-11-20T17:33:02Z [info] 2021-11-20 17:33:02,376 INFO supervisord started with pid 510 2021-11-20T17:33:03Z [info] 2021-11-20 17:33:03,379 INFO spawned: 'apache2' with pid 515 2021-11-20T17:33:03Z [info] 2021-11-20 17:33:03,381 INFO spawned: 'datasette' with pid 516 2021-11-20T17:33:05Z [info] 2021-11-20 17:33:05,068 INFO success: apache2 entered RUNNING state, process has stayed up for > than 1 seconds (startsecs) 2021-11-20T17:33:05Z [info] 2021-11-20 17:33:05,068 INFO success: datasette entered RUNNING state, process has stayed up for > than 1 seconds (startsecs) 2021-11-20T17:33:28Z [error] Health check status changed 'warning' => 'critical' ***v0 failed - Failed due to unhealthy allocations - no stable job version to auto revert to and deploying as v1 ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974683220,https://api.github.com/repos/simonw/datasette/issues/1522,974683220,IC_kwDOBm6k_c46GHxU,9599,2021-11-20T17:29:12Z,2021-11-20T17:29:12Z,OWNER,"> As a a sanity check, would it be worth looking at trying to push the multi-process container on another provider of a knative / cloud run / tekton ? I have a somewhat similar use case for a future proejct, so i'm been very grateful to you sharing all the progress in this issue. That's a great idea. I'll try running on a non-Knative host too (probably Fly - though they actually run containers using Firecracker which ends up being completely different). Cloud Run are the only Knative host I've used, know of any others aside from Scaleway? They look like they're worth getting familiar with.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974682507,https://api.github.com/repos/simonw/datasette/issues/1522,974682507,IC_kwDOBm6k_c46GHmL,9599,2021-11-20T17:24:13Z,2021-11-20T17:24:13Z,OWNER,"I'm going to leave this issue open, tag it as ""help wanted"" and cross my fingers that someone with Cloud Run deep expertise takes an interest in figuring out what's going wrong here!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974607456,https://api.github.com/repos/simonw/datasette/issues/1522,974607456,IC_kwDOBm6k_c46F1Rg,17906,2021-11-20T07:10:11Z,2021-11-20T07:10:11Z,NONE,"As a a sanity check, would it be worth looking at trying to push the multi-process container on another provider of a knative / cloud run / tekton ? I have a somewhat similar use case for a future proejct, so i'm been very grateful to you sharing all the progress in this issue. 
As I understand it, Scaleway also offer a very similar offering using what appear to be many similar components that might at least see if it's an issue with more than one knative based FaaS provider https://www.scaleway.com/en/serverless-containers/ https://developers.scaleway.com/en/products/containers/api/#main-features ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974605529,https://api.github.com/repos/simonw/datasette/issues/1522,974605529,IC_kwDOBm6k_c46F0zZ,9599,2021-11-20T06:52:21Z,2021-11-20T06:52:21Z,OWNER,"I've now tried both Debian and Alpine, and I've tried both `tini` and `supervisord`. Each time I get the same result - I get 503 errors for the first dozen or so refreshes of `/prefix/` followed by it intermittently working. Absolutely stumped.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974605128,https://api.github.com/repos/simonw/datasette/issues/1522,974605128,IC_kwDOBm6k_c46F0tI,9599,2021-11-20T06:47:59Z,2021-11-20T06:47:59Z,OWNER,"I managed to port the whole thing over to Debian - which took a lot of work because their packaged Apache 2 works very differently from the Alpine one. Once again... I got it working fine on my laptop, but the image deployed to Cloud Run throws 503 errors! ```dockerfile FROM python:3.9.7-slim-bullseye RUN apt-get update && \ apt-get install -y apache2 supervisor && \ apt clean && \ rm -rf /var/lib/apt && \ rm -rf /var/lib/dpkg/info/* # Apache environment, copied from # https://github.com/ijklim/laravel-benfords-law-app/blob/e9bf385dcaddb62ea466a7b245ab6e4ef708c313/docker/os/Dockerfile ENV APACHE_DOCUMENT_ROOT=/var/www/html/public ENV APACHE_RUN_USER www-data ENV APACHE_RUN_GROUP www-data ENV APACHE_PID_FILE /var/run/apache2.pid ENV APACHE_RUN_DIR /var/run/apache2 ENV APACHE_LOCK_DIR /var/lock/apache2 ENV APACHE_LOG_DIR /var/log RUN ln -sf /dev/stdout /var/log/apache2-access.log RUN ln -sf /dev/stderr /var/log/apache2-error.log RUN mkdir -p $APACHE_RUN_DIR $APACHE_LOCK_DIR RUN a2enmod proxy RUN a2enmod proxy_http RUN a2enmod headers ARG DATASETTE_REF RUN pip install https://github.com/simonw/datasette/archive/${DATASETTE_REF}.zip # Append this to the end of the default httpd.conf file RUN echo '\n\ \n\ Options Indexes FollowSymLinks\n\ AllowOverride None\n\ Require all granted\n\ \n\ \n\ \n\ ServerName localhost\n\ DocumentRoot /app/html\n\ ProxyPreserveHost On\n\ ProxyPass /prefix/ http://127.0.0.1:8001/\n\ Header add X-Proxied-By ""Apache2""\n\ \n\ ' > /etc/apache2/sites-enabled/000-default.conf WORKDIR /app RUN mkdir -p /app/html RUN echo 'Datasette' > /app/html/index.html ADD https://latest.datasette.io/fixtures.db /app/fixtures.db EXPOSE 80 RUN echo ""[supervisord]"" >> /app/supervisord.conf RUN echo ""nodaemon=true"" >> /app/supervisord.conf RUN echo """" >> /app/supervisord.conf RUN echo ""[program:apache2]"" >> /app/supervisord.conf RUN echo ""command=apache2 -D FOREGROUND"" >> /app/supervisord.conf RUN echo """" >> /app/supervisord.conf RUN echo ""[program:datasette]"" >> /app/supervisord.conf RUN echo ""command=datasette /app/fixtures.db --setting base_url '/prefix/' --version-note '${DATASETTE_REF}' -h 0.0.0.0 -p 8001"" >> /app/supervisord.conf CMD [""/usr/bin/supervisord"", ""-c"", ""/app/supervisord.conf""] 
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974602459,https://api.github.com/repos/simonw/datasette/issues/1522,974602459,IC_kwDOBm6k_c46F0Db,9599,2021-11-20T06:15:58Z,2021-11-20T06:15:58Z,OWNER,First I'm going to try using Debian Buster as the base image instead of Alpine.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974585374,https://api.github.com/repos/simonw/datasette/issues/1522,974585374,IC_kwDOBm6k_c46Fv4e,9599,2021-11-20T03:28:58Z,2021-11-20T03:28:58Z,OWNER,Based on https://medium.com/google-cloud/init-process-for-containers-d03a471fa0cc I might try s6.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974578141,https://api.github.com/repos/simonw/datasette/issues/1522,974578141,IC_kwDOBm6k_c46FuHd,9599,2021-11-20T02:27:23Z,2021-11-20T02:27:23Z,OWNER,"Aha! This could be the clue I was looking for: https://www.reddit.com/r/googlecloud/comments/fmkx63/comment/fl5csty/?utm_source=reddit&utm_medium=web2x&context=3 > Are you processing on a background thread in your container? If so, it's likely your problem, because cloud run will put your app into a low power state between http requests. For long running tasks in cloud run, you need to keep the http connection open, and not return until you are done. Maybe the `datasette &` process is being affected by that in some way?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974577949,https://api.github.com/repos/simonw/datasette/issues/1522,974577949,IC_kwDOBm6k_c46FuEd,9599,2021-11-20T02:26:09Z,2021-11-20T02:26:17Z,OWNER,"So frustrating, that's giving me the same problem after being deployed! 503 errors for the first while, then it starts working.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974577565,https://api.github.com/repos/simonw/datasette/issues/1522,974577565,IC_kwDOBm6k_c46Ft-d,9599,2021-11-20T02:23:07Z,2021-11-20T02:23:07Z,OWNER,"OK, that works on my laptop - and Ctrl+C quits it, which is nice: ``` apache-proxy % docker run -p 5000:80 --rm datasette-build-arg-demo 2021-11-20 02:22:13,925 CRIT Supervisor is running as root. Privileges were not dropped because no user is specified in the config file. If you intend to run as root, you can set user=root in the config file to avoid this message. 
2021-11-20 02:22:13,927 INFO supervisord started with pid 1 2021-11-20 02:22:14,931 INFO spawned: 'datasette' with pid 7 2021-11-20 02:22:14,934 INFO spawned: 'httpd' with pid 8 2021-11-20 02:22:16,484 INFO success: datasette entered RUNNING state, process has stayed up for > than 1 seconds (startsecs) 2021-11-20 02:22:16,484 INFO success: httpd entered RUNNING state, process has stayed up for > than 1 seconds (startsecs) ^C 2021-11-20 02:22:26,285 WARN received SIGINT indicating exit request 2021-11-20 02:22:26,286 INFO waiting for datasette, httpd to die 2021-11-20 02:22:26,315 INFO stopped: httpd (exit status 0) 2021-11-20 02:22:26,540 INFO stopped: datasette (exit status 0) ``` Here's my new Dockerfile: ```dockerfile FROM python:3-alpine RUN apk add --no-cache \ apache2 \ apache2-proxy \ supervisor \ bash ARG DATASETTE_REF RUN pip install https://github.com/simonw/datasette/archive/${DATASETTE_REF}.zip # Append this to the end of the default httpd.conf file RUN echo -e 'ServerName localhost\n\ \n\ \n\ Order deny,allow\n\ Allow from all\n\ \n\ \n\ ProxyPreserveHost On\n\ ProxyPass /prefix/ http://127.0.0.1:8001/\n\ Header add X-Proxied-By ""Apache2""' >> /etc/apache2/httpd.conf RUN echo 'Datasette' > /var/www/localhost/htdocs/index.html WORKDIR /app ADD https://latest.datasette.io/fixtures.db /app/fixtures.db EXPOSE 80 RUN echo ""[supervisord]"" >> /app/supervisord.conf RUN echo ""nodaemon=true"" >> /app/supervisord.conf RUN echo """" >> /app/supervisord.conf RUN echo ""[program:httpd]"" >> /app/supervisord.conf RUN echo ""command=httpd -D FOREGROUND"" >> /app/supervisord.conf RUN echo """" >> /app/supervisord.conf RUN echo ""[program:datasette]"" >> /app/supervisord.conf RUN echo ""command=datasette /app/fixtures.db --setting base_url '/prefix/' --version-note '${DATASETTE_REF}' -h 0.0.0.0 -p 8001"" >> /app/supervisord.conf CMD [""/usr/bin/supervisord"", ""-c"", ""/app/supervisord.conf""] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974577082,https://api.github.com/repos/simonw/datasette/issues/1522,974577082,IC_kwDOBm6k_c46Ft26,9599,2021-11-20T02:19:27Z,2021-11-20T02:19:27Z,OWNER,"https://docs.docker.com/config/containers/multi-service_container/ suggests `supervisord` as a last resort. https://stackoverflow.com/a/49100302/6083 has a neat looking recipe for than in Alpine: > **1.** `Dockerfile` is: > > FROM alpine:latest > RUN apk update && apk add --no-cache supervisor openssh nginx > COPY supervisord.conf /etc/supervisord.conf > CMD [""/usr/bin/supervisord"", ""-c"", ""/etc/supervisord.conf""] > > **2.** `supervisord.conf` is: > > [supervisord] > nodaemon=true > > [program:sshd] > command=/usr/sbin/sshd -D > > [program:nginx] > command=nginx -c /etc/nginx/nginx.conf ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974576624,https://api.github.com/repos/simonw/datasette/issues/1522,974576624,IC_kwDOBm6k_c46Ftvw,9599,2021-11-20T02:16:12Z,2021-11-20T02:16:12Z,OWNER,"Again, that approach worked on my laptop but when deployed to Cloud Run mostly gave me 503 errors for the `/prefix/` page, with the occasional 200. 
I did this: ```Dockerfile RUN echo ""#!/bin/bash"" >> start.sh # Start Datasette running in background with & RUN echo ""datasette /app/fixtures.db --setting base_url '/prefix/' --version-note '${DATASETTE_REF}' -h 0.0.0.0 -p 8001 &"" >> /app/start.sh RUN echo ""httpd -D FOREGROUND"" >> /app/start.sh RUN chmod +x /app/start.sh CMD /app/start.sh ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974576436,https://api.github.com/repos/simonw/datasette/issues/1522,974576436,IC_kwDOBm6k_c46Fts0,9599,2021-11-20T02:14:45Z,2021-11-20T02:14:45Z,OWNER,I'm going to try running Apache with `httpd -D FOREGROUND` while running `datasette &`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974575512,https://api.github.com/repos/simonw/datasette/issues/1522,974575512,IC_kwDOBm6k_c46FteY,9599,2021-11-20T02:09:20Z,2021-11-20T02:09:20Z,OWNER,"> **Waiting for health check to begin** makes it sound like the container didn't start properly. That eventually failed, but I did get these in the build logs: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974573616,https://api.github.com/repos/simonw/datasette/issues/1522,974573616,IC_kwDOBm6k_c46FtAw,9599,2021-11-20T01:58:44Z,2021-11-20T01:58:44Z,OWNER,"Deploy to Cloud Run appears to hang here: ``` Deploying container to Cloud Run service [datasette-apache-proxy-demo] in project [datasette-222320] region [us-central1] ⠧ Deploying... Revision deployment finished. Waiting for health check to begin. ⠧ Creating Revision... . Routing traffic... ✓ Setting IAM Policy... 
``` **Waiting for health check to begin** makes it sound like the container didn't start properly.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974572505,https://api.github.com/repos/simonw/datasette/issues/1522,974572505,IC_kwDOBm6k_c46FsvZ,9599,2021-11-20T01:50:48Z,2021-11-20T01:51:41Z,OWNER,"I figured out a recipe to run `httpd` as a service inside Alpine - works great on my laptop, here's my new `Dockerfile`: ```dockerfile FROM python:3-alpine # openrc gives us rc-service RUN apk add --no-cache \ openrc \ apache2 \ apache2-proxy \ bash ARG DATASETTE_REF RUN pip install https://github.com/simonw/datasette/archive/${DATASETTE_REF}.zip # Append this to the end of the default httpd.conf file RUN echo -e 'ServerName localhost\n\ \n\ \n\ Order deny,allow\n\ Allow from all\n\ \n\ \n\ ProxyPreserveHost On\n\ ProxyPass /prefix/ http://127.0.0.1:8001/\n\ Header add X-Proxied-By ""Apache2""' >> /etc/apache2/httpd.conf RUN echo 'Datasette' > /var/www/localhost/htdocs/index.html WORKDIR /app ADD https://latest.datasette.io/fixtures.db /app/fixtures.db EXPOSE 80 # RUN echo -e ""#!/bin/bash\nopenrc default\nrc-service apache2 start;\ndatasette /app/fixtures.db --setting base_url '/prefix/' --version-note '${DATASETTE_REF}' -h 0.0.0.0 -p 8001"" > /app/start.sh RUN echo ""#!/bin/bash"" >> start.sh RUN echo ""openrc default"" >> start.sh RUN echo ""rc-service apache2 start"" >> start.sh RUN echo ""datasette /app/fixtures.db --setting base_url '/prefix/' --version-note '${DATASETTE_REF}' -h 0.0.0.0 -p 8001"" >> /app/start.sh RUN chmod +x /app/start.sh CMD /app/start.sh ``` I'm going to try this on Cloud Run and see if it fixes the 503s One annoying thing about this: Ctrl+C on my laptop no longer stops the container, I have to `docker ps` and then `docker kill xxx` instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974565816,https://api.github.com/repos/simonw/datasette/issues/1522,974565816,IC_kwDOBm6k_c46FrG4,9599,2021-11-20T01:13:39Z,2021-11-20T01:13:39Z,OWNER,"I have a hunch that running `httpd -D FOREGROUND` doesn't show error logs, which would explain why I can't use the Cloud Run logs to figure out the reason for the 503s.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974565392,https://api.github.com/repos/simonw/datasette/issues/1522,974565392,IC_kwDOBm6k_c46FrAQ,9599,2021-11-20T01:11:20Z,2021-11-20T01:11:20Z,OWNER,"Yup, that fixed it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974564712,https://api.github.com/repos/simonw/datasette/issues/1522,974564712,IC_kwDOBm6k_c46Fq1o,9599,2021-11-20T01:07:49Z,2021-11-20T01:10:48Z,OWNER,https://apache-proxy-demo.datasette.io/prefix/fixtures/compound_three_primary_keys has broken suggested facet links - they go to `https://localhost:8001/prefix/fixtures/compound_three_primary_keys?_facet=pk1#facet-pk1` - but I think that's because I'm missing the `ProxyPreserveHost On` setting.,"{""total_count"": 0, ""+1"": 0, ""-1"": 
0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1519#issuecomment-974562942,https://api.github.com/repos/simonw/datasette/issues/1519,974562942,IC_kwDOBm6k_c46FqZ-,9599,2021-11-20T00:59:32Z,2021-11-20T00:59:32Z,OWNER,"Ouch a nasty bug crept through there - https://datasette-apache-proxy-demo-j7hipcg4aq-uc.a.run.app/prefix/fixtures/compound_three_primary_keys says > 500: name 'ds' is not defined","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974561593,https://api.github.com/repos/simonw/datasette/issues/1519,974561593,IC_kwDOBm6k_c46FqE5,9599,2021-11-20T00:53:19Z,2021-11-20T00:53:19Z,OWNER,Adding that test found (I hope!) all of the remaining `base_url` bugs. There were a bunch! I think I finally get to close #838 too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974559176,https://api.github.com/repos/simonw/datasette/issues/1519,974559176,IC_kwDOBm6k_c46FpfI,9599,2021-11-20T00:42:08Z,2021-11-20T00:42:08Z,OWNER,"> In the meantime I can catch these errors by changing the test to run each path twice, once with and once without the prefix. This should accurately simulate how Apache is working here. This worked, I managed to get the tests to fail! Here's the change I made: ```diff diff --git a/tests/test_html.py b/tests/test_html.py index f24165b..dbdfe59 100644 --- a/tests/test_html.py +++ b/tests/test_html.py @@ -1614,12 +1614,19 @@ def test_metadata_sort_desc(app_client): ""/fixtures/compound_three_primary_keys/a,a,a"", ""/fixtures/paginated_view"", ""/fixtures/facetable"", + ""/fixtures?sql=select+1"", ], ) -def test_base_url_config(app_client_base_url_prefix, path): +@pytest.mark.parametrize(""use_prefix"", (True, False)) +def test_base_url_config(app_client_base_url_prefix, path, use_prefix): client = app_client_base_url_prefix - response = client.get(""/prefix/"" + path.lstrip(""/"")) + path_to_get = path + if use_prefix: + path_to_get = ""/prefix/"" + path.lstrip(""/"") + response = client.get(path_to_get) soup = Soup(response.body, ""html.parser"") + if path == ""/fixtures?sql=select+1"": + assert False for el in soup.findAll([""a"", ""link"", ""script""]): if ""href"" in el.attrs: href = el[""href""] @@ -1642,11 +1649,12 @@ def test_base_url_config(app_client_base_url_prefix, path): # If this has been made absolute it may start http://localhost/ if href.startswith(""http://localhost/""): href = href[len(""http://localost/"") :] - assert href.startswith(""/prefix/""), { + assert href.startswith(""/prefix/""), json.dumps({ ""path"": path, + ""path_to_get"": path_to_get, ""href_or_src"": href, ""element_parent"": str(el.parent), - } + }, indent=4, default=repr) def test_base_url_affects_metadata_extra_css_urls(app_client_base_url_prefix): ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974558267,https://api.github.com/repos/simonw/datasette/issues/1519,974558267,IC_kwDOBm6k_c46FpQ7,9599,2021-11-20T00:37:57Z,2021-11-20T00:37:57Z,OWNER,Thanks to #1522 I have a live demo that exhibits this bug now: 
https://apache-proxy-demo.datasette.io/prefix/fixtures/attraction_characteristic,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1522#issuecomment-974558076,https://api.github.com/repos/simonw/datasette/issues/1522,974558076,IC_kwDOBm6k_c46FpN8,9599,2021-11-20T00:36:56Z,2021-11-20T00:36:56Z,OWNER,That 503 error is _really_ frustrating: I have a deploy running at https://apache-proxy-demo.datasette.io/prefix/ and after a fresh deploy it serves 503 errors for quite a while - then eventually starts working.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974557766,https://api.github.com/repos/simonw/datasette/issues/1522,974557766,IC_kwDOBm6k_c46FpJG,9599,2021-11-20T00:35:25Z,2021-11-20T00:35:25Z,OWNER,Wrote a TIL about `--build-arg` and Cloud Run: https://til.simonwillison.net/cloudrun/using-build-args-with-cloud-run,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974542348,https://api.github.com/repos/simonw/datasette/issues/1522,974542348,IC_kwDOBm6k_c46FlYM,9599,2021-11-19T23:41:47Z,2021-11-19T23:44:07Z,OWNER,Do I have to use `cloudbuild.yml` to specify these? https://stackoverflow.com/a/58327340/6083 and https://stackoverflow.com/a/66232670/6083 suggest I do.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974541971,https://api.github.com/repos/simonw/datasette/issues/1522,974541971,IC_kwDOBm6k_c46FlST,9599,2021-11-19T23:40:32Z,2021-11-19T23:40:32Z,OWNER,"I want to be able to use build arguments to specify which commit version or branch of Datasette to deploy. This is proving hard to work out. I have this in my Dockerfile now: ``` ARG DATASETTE_REF RUN pip install https://github.com/simonw/datasette/archive/${DATASETTE_REF}.zip ``` Which works locally: docker build -t datasette-apache-proxy-demo . \ --build-arg DATASETTE_REF=c617e1769ea27e045b0f2907ef49a9a1244e577d But I can't figure out the right incantation to pass to `gcloud build submit`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974523569,https://api.github.com/repos/simonw/datasette/issues/1522,974523569,IC_kwDOBm6k_c46Fgyx,9599,2021-11-19T22:51:10Z,2021-11-19T22:51:10Z,OWNER,I wan a GitHub Action which I can manually activate to deploy a new version of that demo... 
and I want it to bake in the latest release of Datasette so I can use it to demonstrate bug fixes.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974523297,https://api.github.com/repos/simonw/datasette/issues/1522,974523297,IC_kwDOBm6k_c46Fguh,9599,2021-11-19T22:50:31Z,2021-11-19T22:50:31Z,OWNER,Demo code is now at: https://github.com/simonw/datasette/tree/main/demos/apache-proxy,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974521687,https://api.github.com/repos/simonw/datasette/issues/1522,974521687,IC_kwDOBm6k_c46FgVX,9599,2021-11-19T22:46:26Z,2021-11-19T22:46:26Z,OWNER,"Oh weird, it started working: https://datasette-apache-proxy-demo-j7hipcg4aq-uc.a.run.app/prefix/fixtures/sortable","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1522#issuecomment-974506401,https://api.github.com/repos/simonw/datasette/issues/1522,974506401,IC_kwDOBm6k_c46Fcmh,9599,2021-11-19T22:11:51Z,2021-11-19T22:11:51Z,OWNER,"This is frustrating: I have the following Dockerfile: ```dockerfile FROM python:3-alpine RUN apk add --no-cache \ apache2 \ apache2-proxy \ bash RUN pip install datasette ENV TINI_VERSION v0.18.0 ADD https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini-static /tini RUN chmod +x /tini # Append this to the end of the default httpd.conf file RUN echo $'ServerName localhost\n\ \n\ \n\ Order deny,allow\n\ Allow from all\n\ \n\ \n\ ProxyPass /prefix/ http://localhost:8001/\n\ Header add X-Proxied-By ""Apache2""' >> /etc/apache2/httpd.conf RUN echo $'Datasette' > /var/www/localhost/htdocs/index.html WORKDIR /app ADD https://latest.datasette.io/fixtures.db /app/fixtures.db RUN echo $'#!/usr/bin/env bash\n\ set -e\n\ \n\ httpd -D FOREGROUND &\n\ datasette fixtures.db --setting base_url ""/prefix/"" -h 0.0.0.0 -p 8001 &\n\ \n\ wait -n' > /app/start.sh RUN chmod +x /app/start.sh EXPOSE 80 ENTRYPOINT [""/tini"", ""--"", ""/app/start.sh""] ``` It works fine when I run it locally: ``` docker build -t datasette-apache-proxy-demo . docker run -p 5000:80 datasette-apache-proxy-demo ``` But when I deploy it to Cloud Run with the following script: ```bash #!/bin/bash # https://til.simonwillison.net/cloudrun/ship-dockerfile-to-cloud-run NAME=""datasette-apache-proxy-demo"" PROJECT=$(gcloud config get-value project) IMAGE=""gcr.io/$PROJECT/$NAME"" gcloud builds submit --tag $IMAGE gcloud run deploy \ --allow-unauthenticated \ --platform=managed \ --image $IMAGE $NAME \ --port 80 ``` It serves the `/` page successfully, but hits to `/prefix/` return the following 503 error: > Service Unavailable > > The server is temporarily unable to service your request due to maintenance downtime or capacity problems. Please try again later. 
> > Apache/2.4.51 (Unix) Server at datasette-apache-proxy-demo-j7hipcg4aq-uc.a.run.app Port 80 Cloud Run logs: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1519#issuecomment-974478126,https://api.github.com/repos/simonw/datasette/issues/1519,974478126,IC_kwDOBm6k_c46FVsu,9599,2021-11-19T21:16:36Z,2021-11-19T21:16:36Z,OWNER,"In the meantime I can catch these errors by changing the test to run each path twice, once with and once without the prefix. This should accurately simulate how Apache is working here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974477465,https://api.github.com/repos/simonw/datasette/issues/1519,974477465,IC_kwDOBm6k_c46FViZ,9599,2021-11-19T21:15:30Z,2021-11-19T21:15:30Z,OWNER,"I think what's happening here is Apache is actually making a request to `/fixtures` rather than making a request to `/prefix/fixtures` - and Datasette is replying to requests on both the prefixed and the non-prefixed paths. This is pretty confusing! I think Datasette should ONLY reply to `/prefix/fixtures` instead and return a 404 for `/fixtures` - this would make things a whole lot easier to debug. But shipping that change could break existing deployments. Maybe that should be a breaking change for 1.0.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974450232,https://api.github.com/repos/simonw/datasette/issues/1519,974450232,IC_kwDOBm6k_c46FO44,9599,2021-11-19T20:41:53Z,2021-11-19T20:42:19Z,OWNER,https://docs.datasette.io/en/stable/deploying.html#apache-proxy-configuration says I should use `ProxyPreserveHost on`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974447950,https://api.github.com/repos/simonw/datasette/issues/1519,974447950,IC_kwDOBm6k_c46FOVO,9599,2021-11-19T20:40:19Z,2021-11-19T20:40:19Z,OWNER,"Figured it out! 
The test is not an accurate recreation of what is happening, because it doesn't simulate a request with a path of `/fixtures` that has been redirected by the proxy to `/prefix/fixtures`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1522#issuecomment-974435661,https://api.github.com/repos/simonw/datasette/issues/1522,974435661,IC_kwDOBm6k_c46FLVN,9599,2021-11-19T20:33:42Z,2021-11-19T20:33:42Z,OWNER,"Should just be a case of deploying this `Dockerfile`: ```Dockerfile FROM python:3-alpine RUN apk add --no-cache \ apache2 \ apache2-proxy \ bash RUN pip install datasette ENV TINI_VERSION v0.18.0 ADD https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini-static /tini RUN chmod +x /tini # Append this to the end of the default httpd.conf file RUN echo $'ServerName localhost\n\ \n\ \n\ Order deny,allow\n\ Allow from all\n\ \n\ \n\ ProxyPass /foo/bar/ http://localhost:9000/\n\ Header add X-Proxied-By ""Apache2""' >> /etc/apache2/httpd.conf RUN echo $'Datasette' > /var/www/localhost/htdocs/index.html WORKDIR /app ADD https://latest.datasette.io/fixtures.db /app/fixtures.db RUN echo $'#!/usr/bin/env bash\n\ set -e\n\ \n\ httpd -D FOREGROUND &\n\ datasette fixtures.db --setting base_url ""/foo/bar/"" -p 9000 &\n\ \n\ wait -n' > /app/start.sh RUN chmod +x /app/start.sh EXPOSE 80 ENTRYPOINT [""/tini"", ""--"", ""/app/start.sh""] ``` I can follow this TIL: https://til.simonwillison.net/cloudrun/ship-dockerfile-to-cloud-run","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236, https://github.com/simonw/datasette/issues/1521#issuecomment-974433520,https://api.github.com/repos/simonw/datasette/issues/1521,974433520,IC_kwDOBm6k_c46FKzw,9599,2021-11-19T20:32:29Z,2021-11-19T20:32:29Z,OWNER,This configuration works great.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058815557, https://github.com/simonw/datasette/issues/1519#issuecomment-974433320,https://api.github.com/repos/simonw/datasette/issues/1519,974433320,IC_kwDOBm6k_c46FKwo,9599,2021-11-19T20:32:04Z,2021-11-19T20:32:04Z,OWNER,Still not clear why the tests pass but the live example fails.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974433206,https://api.github.com/repos/simonw/datasette/issues/1519,974433206,IC_kwDOBm6k_c46FKu2,9599,2021-11-19T20:31:52Z,2021-11-19T20:31:52Z,OWNER,"Modified my `Dockerfile` to do this: RUN pip install https://github.com/simonw/datasette/archive/ff0dd4da38d48c2fa9250ecf336002c9ed724e36.zip And now the `request` in that debug `?_context=1` looks like this: ``` ""request"": """" ``` That explains the bug - that request doesn't maintain the original path prefix of `http://localhost:5000/foo/bar/fixtures?sql=` (also it's been rewritten to `localhost:9000` instead of `localhost:5000`).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, 
https://github.com/simonw/datasette/issues/1519#issuecomment-974422829,https://api.github.com/repos/simonw/datasette/issues/1519,974422829,IC_kwDOBm6k_c46FIMt,9599,2021-11-19T20:26:35Z,2021-11-19T20:26:35Z,OWNER,"In the `?_context=` debug view the request looks like this: ``` ""request"": """", ``` I'm going to add a `repr()` to it such that it's a bit more useful.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974420619,https://api.github.com/repos/simonw/datasette/issues/1519,974420619,IC_kwDOBm6k_c46FHqL,9599,2021-11-19T20:25:19Z,2021-11-19T20:25:19Z,OWNER,"The implementations of `path_with_removed_args` and `path_with_format`: https://github.com/simonw/datasette/blob/85849935292e500ab7a99f8fe0f9546e903baad3/datasette/utils/__init__.py#L228-L254 https://github.com/simonw/datasette/blob/85849935292e500ab7a99f8fe0f9546e903baad3/datasette/utils/__init__.py#L710-L729","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974418496,https://api.github.com/repos/simonw/datasette/issues/1519,974418496,IC_kwDOBm6k_c46FHJA,9599,2021-11-19T20:24:16Z,2021-11-19T20:24:16Z,OWNER,"Here's the code that generates `edit_sql_url` correctly: https://github.com/simonw/datasette/blob/85849935292e500ab7a99f8fe0f9546e903baad3/datasette/views/database.py#L416-L420 And here's the code for `show_hide_link`: https://github.com/simonw/datasette/blob/85849935292e500ab7a99f8fe0f9546e903baad3/datasette/views/database.py#L432-L433 And for `url_csv`: https://github.com/simonw/datasette/blob/85849935292e500ab7a99f8fe0f9546e903baad3/datasette/views/base.py#L600-L602","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974398399,https://api.github.com/repos/simonw/datasette/issues/1519,974398399,IC_kwDOBm6k_c46FCO_,9599,2021-11-19T20:08:20Z,2021-11-19T20:22:02Z,OWNER,"The relevant test is this one: https://github.com/simonw/datasette/blob/30255055150d7bc0affc8156adc18295495020ff/tests/test_html.py#L1608-L1649 I modified that test to add `""/fixtures/facetable?sql=select+1""` as one of the tested paths, and dropped in an `assert False` to pause it in the debugger: ``` @pytest.mark.parametrize( ""path"", [ ""/"", ""/fixtures"", ""/fixtures/compound_three_primary_keys"", ""/fixtures/compound_three_primary_keys/a,a,a"", ""/fixtures/paginated_view"", ""/fixtures/facetable"", ""/fixtures?sql=select+1"", ], ) def test_base_url_config(app_client_base_url_prefix, path): client = app_client_base_url_prefix response = client.get(""/prefix/"" + path.lstrip(""/"")) soup = Soup(response.body, ""html.parser"") if path == ""/fixtures?sql=select+1"": > assert False E assert False ``` BUT... in the debugger: ``` (Pdb) print(soup) ...
This data as json, testall, testnone, testresponse, CSV
``` Those all have the correct prefix! But that's not what I'm seeing in my `Dockerfile` reproduction of the issue. Something very weird is going on here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974405016,https://api.github.com/repos/simonw/datasette/issues/1519,974405016,IC_kwDOBm6k_c46FD2Y,9599,2021-11-19T20:14:19Z,2021-11-19T20:15:05Z,OWNER,"I added `template_debug` in the Dockerfile: ``` datasette fixtures.db --setting template_debug 1 --setting base_url ""/foo/bar/"" -p 9000 &\n\ ``` And then hit `http://localhost:5000/foo/bar/fixtures?sql=select+*+from+compound_three_primary_keys+limit+1&_context=1` to view the template context - and it showed the bug, output edited to just show relevant keys: ```json { ""edit_sql_url"": ""/foo/bar/fixtures?sql=select+%2A+from+compound_three_primary_keys+limit+1"", ""settings"": { ""force_https_urls"": false, ""template_debug"": true, ""trace_debug"": false, ""base_url"": ""/foo/bar/"" }, ""show_hide_link"": ""/fixtures?sql=select+%2A+from+compound_three_primary_keys+limit+1&_context=1&_hide_sql=1"", ""show_hide_text"": ""hide"", ""show_hide_hidden"": """", ""renderers"": { ""json"": ""/fixtures.json?sql=select+*+from+compound_three_primary_keys+limit+1&_context=1"" }, ""url_csv"": ""/fixtures.csv?sql=select+*+from+compound_three_primary_keys+limit+1&_context=1&_size=max"", ""url_csv_path"": ""/fixtures.csv"", ""base_url"": ""/foo/bar/"" } ``` This is so strange. `edit_sql_url` and `base_url` are correct, but `show_hide_link` and `url_csv` and `renderers.json` are not. And it's _really strange_ that the bug doesn't show up in the tests.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974391204,https://api.github.com/repos/simonw/datasette/issues/1519,974391204,IC_kwDOBm6k_c46FAek,9599,2021-11-19T20:02:41Z,2021-11-19T20:02:41Z,OWNER,"Bug confirmed: ![proxy-bug](https://user-images.githubusercontent.com/9599/142684666-112136bf-9243-4b6e-8202-339fcfe91bcc.gif) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974389472,https://api.github.com/repos/simonw/datasette/issues/1519,974389472,IC_kwDOBm6k_c46FADg,9599,2021-11-19T20:01:02Z,2021-11-19T20:01:02Z,OWNER,I now have a `Dockerfile` in https://github.com/simonw/datasette/issues/1521#issuecomment-974388295 that I can use to run a local Apache 2 with `mod_proxy` to investigate this class of bugs!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1521#issuecomment-974388295,https://api.github.com/repos/simonw/datasette/issues/1521,974388295,IC_kwDOBm6k_c46E_xH,9599,2021-11-19T20:00:06Z,2021-11-19T20:00:06Z,OWNER,"And this is the version that proxies to a `base_url` of `/foo/bar/`: ```Dockerfile FROM python:3-alpine RUN apk add --no-cache \ apache2 \ apache2-proxy \ bash RUN pip install datasette ENV TINI_VERSION v0.18.0 ADD https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini-static /tini RUN chmod +x /tini # Append this to the end of the default httpd.conf file RUN echo 
$'ServerName localhost\n\ \n\ \n\ Order deny,allow\n\ Allow from all\n\ \n\ \n\ ProxyPass /foo/bar/ http://localhost:9000/\n\ Header add X-Proxied-By ""Apache2""' >> /etc/apache2/httpd.conf RUN echo $'Datasette' > /var/www/localhost/htdocs/index.html WORKDIR /app ADD https://latest.datasette.io/fixtures.db /app/fixtures.db RUN echo $'#!/usr/bin/env bash\n\ set -e\n\ \n\ httpd -D FOREGROUND &\n\ datasette fixtures.db --setting base_url ""/foo/bar/"" -p 9000 &\n\ \n\ wait -n' > /app/start.sh RUN chmod +x /app/start.sh EXPOSE 80 ENTRYPOINT [""/tini"", ""--"", ""/app/start.sh""] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058815557, https://github.com/simonw/datasette/issues/1521#issuecomment-974380798,https://api.github.com/repos/simonw/datasette/issues/1521,974380798,IC_kwDOBm6k_c46E97-,9599,2021-11-19T19:54:26Z,2021-11-19T19:54:26Z,OWNER,"Got it working! Here's a `Dockerfile` which runs completely stand-alone (thanks to using the `echo $'` trick to write out the config files it needs) and successfully serves Datasette behind Apache and `mod_proxy`: ```Dockerfile FROM python:3-alpine RUN apk add --no-cache \ apache2 \ apache2-proxy \ bash RUN pip install datasette ENV TINI_VERSION v0.18.0 ADD https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini-static /tini RUN chmod +x /tini # Append this to the end of the default httpd.conf file RUN echo $'ServerName localhost\n\ \n\ \n\ Order deny,allow\n\ Allow from all\n\ \n\ \n\ ProxyPass / http://localhost:9000/\n\ ProxyPassReverse / http://localhost:9000/\n\ Header add X-Proxied-By ""Apache2""' >> /etc/apache2/httpd.conf WORKDIR /app RUN echo $'#!/usr/bin/env bash\n\ set -e\n\ \n\ httpd -D FOREGROUND &\n\ datasette -p 9000 &\n\ \n\ wait -n' > /app/start.sh RUN chmod +x /app/start.sh EXPOSE 80 ENTRYPOINT [""/tini"", ""--"", ""/app/start.sh""] ``` Run it like this: ``` docker build -t datasette-apache2-proxy . docker run -p 5000:80 --rm datasette-apache2-proxy ``` Then run this to confirm: ``` ~ % curl -i 'http://localhost:5000/-/versions.json' HTTP/1.1 200 OK Date: Fri, 19 Nov 2021 19:54:05 GMT Server: uvicorn content-type: application/json; charset=utf-8 X-Proxied-By: Apache2 Transfer-Encoding: chunked {""python"": {""version"": ""3.10.0"", ""full"": ""3.10.0 (default, Nov 13 2021, 03:23:03) [GCC 10.3.1 20210424]""}, ""datasette"": {""version"": ""0.59.2""}, ""asgi"": ""3.0"", ""uvicorn"": ""0.15.0"", ""sqlite"": {""version"": ""3.35.5"", ""fts_versions"": [""FTS5"", ""FTS4"", ""FTS3""], ""extensions"": {""json1"": null}, ""compile_options"": [""COMPILER=gcc-10.3.1 20210424"", ""ENABLE_COLUMN_METADATA"", ""ENABLE_DBSTAT_VTAB"", ""ENABLE_FTS3"", ""ENABLE_FTS3_PARENTHESIS"", ""ENABLE_FTS4"", ""ENABLE_FTS5"", ""ENABLE_GEOPOLY"", ""ENABLE_JSON1"", ""ENABLE_MATH_FUNCTIONS"", ""ENABLE_RTREE"", ""ENABLE_UNLOCK_NOTIFY"", ""MAX_VARIABLE_NUMBER=250000"", ""SECURE_DELETE"", ""THREADSAFE=1"", ""USE_URI""]}} ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058815557, https://github.com/simonw/datasette/issues/1521#issuecomment-974371116,https://api.github.com/repos/simonw/datasette/issues/1521,974371116,IC_kwDOBm6k_c46E7ks,9599,2021-11-19T19:45:47Z,2021-11-19T19:45:47Z,OWNER,"https://github.com/krallin/tini says: > *NOTE: If you are using Docker 1.13 or greater, Tini is included in Docker itself. This includes all versions of Docker CE. 
To enable Tini, just [pass the `--init` flag to `docker run`](https://docs.docker.com/engine/reference/commandline/run/).*","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058815557, https://github.com/simonw/datasette/issues/1521#issuecomment-974336020,https://api.github.com/repos/simonw/datasette/issues/1521,974336020,IC_kwDOBm6k_c46EzAU,9599,2021-11-19T19:10:48Z,2021-11-19T19:10:48Z,OWNER,"There's a promising looking minimal Apache 2 proxy config here: https://stackoverflow.com/questions/26474476/minimal-configuration-for-apache-reverse-proxy-in-docker-container ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058815557, https://github.com/simonw/datasette/issues/1521#issuecomment-974334278,https://api.github.com/repos/simonw/datasette/issues/1521,974334278,IC_kwDOBm6k_c46EylG,9599,2021-11-19T19:08:09Z,2021-11-19T19:08:09Z,OWNER,"Stripping comments using this StackOverflow recipe: https://unix.stackexchange.com/a/157619 docker run -it --entrypoint sh alpine-apache2-sh \ -c ""cat /etc/apache2/httpd.conf"" | sed '/^[[:blank:]]*#/d;s/#.*//' Result is here: https://gist.github.com/simonw/0a05090df5fcff8e8b3334621fa17976","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058815557, https://github.com/simonw/datasette/issues/1521#issuecomment-974332787,https://api.github.com/repos/simonw/datasette/issues/1521,974332787,IC_kwDOBm6k_c46EyNz,9599,2021-11-19T19:05:52Z,2021-11-19T19:05:52Z,OWNER,"Made myself this Dockerfile to let me explore a bit: ```Dockerfile FROM python:3-alpine RUN apk add --no-cache \ apache2 CMD [""sh""] ``` Then: ``` % docker run alpine-apache2-sh % docker run -it alpine-apache2-sh / # ls /etc/apache2/httpd.conf /etc/apache2/httpd.conf / # cat /etc/apache2/httpd.conf # # This is the main Apache HTTP server configuration file. It contains the # configuration directives that give the server its instructions. ... ``` Copying that into a GIST like so: ``` docker run -it --entrypoint sh alpine-apache2-sh -c ""cat /etc/apache2/httpd.conf"" | pbcopy ``` Gist here: https://gist.github.com/simonw/5ea0db6049192cb9f761fbd6beb3a84a","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058815557, https://github.com/simonw/datasette/issues/1521#issuecomment-974327812,https://api.github.com/repos/simonw/datasette/issues/1521,974327812,IC_kwDOBm6k_c46ExAE,9599,2021-11-19T18:58:49Z,2021-11-19T18:59:55Z,OWNER,"From this example: https://github.com/tigelane/dockerfiles/blob/06cff2ac8cdc920ebd64f50965115eaa3d0afb84/Alpine-Apache2/Dockerfile#L25-L31 it looks like running `apk add apache2` installs a config file at `/etc/apache2/httpd.conf` - so one approach is to then modify that file. ``` # APACHE - Alpine ################# RUN apk --update add apache2 php5-apache2 && \ #apk add openrc --no-cache && \ rm -rf /var/cache/apk/* && \ sed -i 's/#ServerName www.example.com:80/ServerName localhost/' /etc/apache2/httpd.conf && \ mkdir -p /run/apache2/ # Upload our files from folder ""dist"". 
COPY dist /var/www/localhost/htdocs # Manually set up the apache environment variables ENV APACHE_RUN_USER www-data ENV APACHE_RUN_GROUP www-data ENV APACHE_LOG_DIR /var/log/apache2 ENV APACHE_LOCK_DIR /var/lock/apache2 ENV APACHE_PID_FILE /var/run/apache2.pid # Execute apache2 on run ######################## EXPOSE 80 ENTRYPOINT [""httpd""] CMD [""-D"", ""FOREGROUND""] ``` I think I'll create my own separate copy and modify that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058815557, https://github.com/simonw/datasette/issues/1521#issuecomment-974321391,https://api.github.com/repos/simonw/datasette/issues/1521,974321391,IC_kwDOBm6k_c46Evbv,9599,2021-11-19T18:49:15Z,2021-11-19T18:57:18Z,OWNER,"This pattern looks like it can help: https://ahmet.im/blog/cloud-run-multiple-processes-easy-way/ - see example in https://github.com/ahmetb/multi-process-container-lazy-solution I got that demo working locally like this: ```bash cd /tmp git clone https://github.com/ahmetb/multi-process-container-lazy-solution cd multi-process-container-lazy-solution docker build -t multi-process-container-lazy-solution . docker run -p 5000:8080 --rm multi-process-container-lazy-solution ``` I want to use `apache2` rather than `nginx` though. I found a few relevant examples of Apache in Alpine: - https://github.com/Hacking-Lab/alpine-apache2-reverse-proxy/blob/master/Dockerfile - https://www.sentiatechblog.com/running-apache-in-a-docker-container - https://github.com/search?l=Dockerfile&q=alpine+apache2&type=code ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058815557, https://github.com/simonw/datasette/issues/1521#issuecomment-974322178,https://api.github.com/repos/simonw/datasette/issues/1521,974322178,IC_kwDOBm6k_c46EvoC,9599,2021-11-19T18:50:22Z,2021-11-19T18:50:22Z,OWNER,"I'll get this working on my laptop first, but then I want to get it up and running on Cloud Run - maybe with a GitHub Actions workflow in this repo that re-deploys it on manual execution.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058815557, https://github.com/simonw/datasette/issues/1519#issuecomment-974310208,https://api.github.com/repos/simonw/datasette/issues/1519,974310208,IC_kwDOBm6k_c46EstA,9599,2021-11-19T18:32:31Z,2021-11-19T18:32:31Z,OWNER,Having a live demo running on Cloud Run that proxies through Apache and uses `base_url` would be incredibly useful for replicating and debugging this kind of thing. I wonder how hard it is to run Apache and `mod_proxy` in the same Docker container as Datasette?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1519#issuecomment-974309591,https://api.github.com/repos/simonw/datasette/issues/1519,974309591,IC_kwDOBm6k_c46EsjX,9599,2021-11-19T18:31:32Z,2021-11-19T18:31:32Z,OWNER,"`base_url` has been a source of so many bugs like this! 
I often find them quite hard to replicate, likely because I haven't made myself a good Apache `mod_proxy` testing environment yet.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545, https://github.com/simonw/datasette/issues/1520#issuecomment-974308215,https://api.github.com/repos/simonw/datasette/issues/1520,974308215,IC_kwDOBm6k_c46EsN3,9599,2021-11-19T18:29:26Z,2021-11-19T18:29:26Z,OWNER,"The solution that jumps to mind first is that it would be neat if routes could return something that meant ""actually my bad, I can't handle this after all - move to the next one in the list"". A related idea: it might be useful for custom views like my one here to say ""no actually call the default view for this, but give me back the response so I can modify it in some way"". Kind of like Django or ASGI middleware.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058803238, https://github.com/simonw/datasette/issues/1518#issuecomment-974300823,https://api.github.com/repos/simonw/datasette/issues/1518,974300823,IC_kwDOBm6k_c46EqaX,9599,2021-11-19T18:18:32Z,2021-11-19T18:18:32Z,OWNER,"> This may be an argument for continuing to allow non-JSON-objects through to the HTML templates. Need to think about that a bit more. I can definitely support this using pure-JSON - I could make two versions of the row available, one that's an array of cell objects and the other that's an object mapping column names to column raw values.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058072543, https://github.com/simonw/datasette/issues/1518#issuecomment-974285803,https://api.github.com/repos/simonw/datasette/issues/1518,974285803,IC_kwDOBm6k_c46Emvr,9599,2021-11-19T17:56:48Z,2021-11-19T18:14:30Z,OWNER,"Very confused by this piece of code here: https://github.com/simonw/datasette/blob/1c13e1af0664a4dfb1e69714c56523279cae09e4/datasette/views/table.py#L37-L63 I added it in https://github.com/simonw/datasette/commit/754836eef043676e84626c4fd3cb993eed0d2976 - in the new world that should probably be replaced by pure JSON. Aha - this comment explains it: https://github.com/simonw/datasette/issues/521#issuecomment-505279560 > I think the trick is to redefine what a ""cell_row"" is. Each row is currently a list of cells: > > https://github.com/simonw/datasette/blob/6341f8cbc7833022012804dea120b838ec1f6558/datasette/views/table.py#L159-L163 > > I can redefine the row (the `cells` variable in the above example) as a thing-that-iterates-cells (hence behaving like a list) but that also supports `__getitem__` access for looking up cell values if you know the name of the column. The goal was to support neater custom templates like this: ```html+jinja {% for row in display_rows %}
{{ row[""First_Name""] }} {{ row[""Last_Name""] }}
... ``` This may be an argument for continuing to allow non-JSON-objects through to the HTML templates. Need to think about that a bit more.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058072543, https://github.com/simonw/datasette/issues/1518#issuecomment-974287570,https://api.github.com/repos/simonw/datasette/issues/1518,974287570,IC_kwDOBm6k_c46EnLS,9599,2021-11-19T17:59:33Z,2021-11-19T17:59:33Z,OWNER,"I'm going to try leaning into the `asyncinject` mechanism a bit here. One method can execute and return the raw rows. Another can turn that into the default minimal JSON representation. Then a third can take that (or take both) and use it to inflate out the JSON that the HTML template needs, with those extras and with the rendered cells from plugins.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058072543, https://github.com/simonw/datasette/pull/1495#issuecomment-974108455,https://api.github.com/repos/simonw/datasette/issues/1495,974108455,IC_kwDOBm6k_c46D7cn,192568,2021-11-19T14:14:35Z,2021-11-19T14:14:35Z,CONTRIBUTOR,A nudge on this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1033678984, https://github.com/simonw/sqlite-utils/issues/342#issuecomment-973820125,https://api.github.com/repos/simonw/sqlite-utils/issues/342,973820125,IC_kwDOCGYnMM46C1Dd,9599,2021-11-19T07:25:55Z,2021-11-19T07:25:55Z,OWNER,"`alter=True` doesn't make sense to support here either, because `.lookup()` already adds missing columns: https://github.com/simonw/sqlite-utils/blob/3b8abe608796e99e4ffc5f3f4597a85e605c0e9b/sqlite_utils/db.py#L2743-L2746","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058196641, https://github.com/simonw/sqlite-utils/issues/342#issuecomment-973802998,https://api.github.com/repos/simonw/sqlite-utils/issues/342,973802998,IC_kwDOCGYnMM46Cw32,9599,2021-11-19T06:59:22Z,2021-11-19T06:59:32Z,OWNER,"I don't think I need the `DEFAULT` defaults for `.insert()` either, since it just passes through to `.insert()`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058196641, https://github.com/simonw/sqlite-utils/issues/342#issuecomment-973802766,https://api.github.com/repos/simonw/sqlite-utils/issues/342,973802766,IC_kwDOCGYnMM46Cw0O,9599,2021-11-19T06:58:45Z,2021-11-19T06:58:45Z,OWNER,"And neither does `hash_id`. On that basis I'm going to specifically list the ones that DO make sense, and hope that I remember to add any new ones in the future. 
I can add a code comment hint to `.insert()` about that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058196641, https://github.com/simonw/sqlite-utils/issues/342#issuecomment-973802469,https://api.github.com/repos/simonw/sqlite-utils/issues/342,973802469,IC_kwDOCGYnMM46Cwvl,9599,2021-11-19T06:58:03Z,2021-11-19T06:58:03Z,OWNER,Also: I don't think `ignore=` and `replace=` make sense in the context of `lookup()`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058196641, https://github.com/simonw/sqlite-utils/issues/342#issuecomment-973802308,https://api.github.com/repos/simonw/sqlite-utils/issues/342,973802308,IC_kwDOCGYnMM46CwtE,9599,2021-11-19T06:57:37Z,2021-11-19T06:57:37Z,OWNER,"Here's the current full method signature for `.insert()`: https://github.com/simonw/sqlite-utils/blob/3b8abe608796e99e4ffc5f3f4597a85e605c0e9b/sqlite_utils/db.py#L2462-L2477 I could add a test which uses introspection (`inspect.signature(method).parameters`) to confirm that `.lookup()` has a super-set of the arguments accepted by `.insert()`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058196641, https://github.com/simonw/sqlite-utils/issues/342#issuecomment-973801650,https://api.github.com/repos/simonw/sqlite-utils/issues/342,973801650,IC_kwDOCGYnMM46Cwiy,9599,2021-11-19T06:55:56Z,2021-11-19T06:55:56Z,OWNER,"`pk` needs to be an explicit argument to `.lookup()`. The rest could be `**kwargs` passed through to `.insert()`, like this hacked together version (docstring removed for brevity): ```python def lookup( self, lookup_values: Dict[str, Any], extra_values: Optional[Dict[str, Any]] = None, pk=""id"", **insert_kwargs, ): """""" assert isinstance(lookup_values, dict) if extra_values is not None: assert isinstance(extra_values, dict) combined_values = dict(lookup_values) if extra_values is not None: combined_values.update(extra_values) if self.exists(): self.add_missing_columns([combined_values]) unique_column_sets = [set(i.columns) for i in self.indexes] if set(lookup_values.keys()) not in unique_column_sets: self.create_index(lookup_values.keys(), unique=True) wheres = [""[{}] = ?"".format(column) for column in lookup_values] rows = list( self.rows_where( "" and "".join(wheres), [value for _, value in lookup_values.items()] ) ) try: return rows[0][pk] except IndexError: return self.insert(combined_values, pk=pk, **insert_kwargs).last_pk else: pk = self.insert(combined_values, pk=pk, **insert_kwargs).last_pk self.create_index(lookup_values.keys(), unique=True) return pk ``` I think I'll explicitly list the parameters, mainly so they can be typed and covered by automatic documentation. I do worry that I'll add more keyword arguments to `.insert()` in the future and forget to mirror them to `.lookup()` though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058196641, https://github.com/simonw/sqlite-utils/issues/342#issuecomment-973800795,https://api.github.com/repos/simonw/sqlite-utils/issues/342,973800795,IC_kwDOCGYnMM46CwVb,9599,2021-11-19T06:54:08Z,2021-11-19T06:54:08Z,OWNER,"Looking at the code for `lookup()` it currently hard-codes `pk` to `""id""` - but it actually only calls `.insert()` in two places, both of which could be passed extra arguments. 
https://github.com/simonw/sqlite-utils/blob/3b8abe608796e99e4ffc5f3f4597a85e605c0e9b/sqlite_utils/db.py#L2756-L2763","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058196641, https://github.com/simonw/datasette/issues/1518#issuecomment-973700549,https://api.github.com/repos/simonw/datasette/issues/1518,973700549,IC_kwDOBm6k_c46CX3F,9599,2021-11-19T03:31:20Z,2021-11-19T03:31:26Z,OWNER,"... and while I'm doing all of this I can rewrite the templates to not use those cheating magical functions AND document the template context at the same time, refs: - #1510.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058072543, https://github.com/simonw/datasette/issues/1518#issuecomment-973700322,https://api.github.com/repos/simonw/datasette/issues/1518,973700322,IC_kwDOBm6k_c46CXzi,9599,2021-11-19T03:30:30Z,2021-11-19T03:30:30Z,OWNER,"Right now the HTML version gets to cheat - it passes through objects that are not JSON serializable, including custom functions that can then be called by Jinja. I'm interested in maybe removing this cheating - if the HTML version could only request JSON-serializable extras those could be exposed in the API as well. It would also help cleanup the kind-of-nasty pattern I use in the current `BaseView` where everything returns both a bunch of JSON-serializable data AND an awaitable function that then gets to add extra things to the HTML context.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058072543, https://github.com/simonw/datasette/issues/1518#issuecomment-973698917,https://api.github.com/repos/simonw/datasette/issues/1518,973698917,IC_kwDOBm6k_c46CXdl,9599,2021-11-19T03:26:18Z,2021-11-19T03:29:03Z,OWNER,"A (likely incomplete) list of features on the table page: - [ ] Display table/database/instance metadata - [ ] Show count of all results - [ ] Display table of results - [ ] Special table display treatment for URLs, numbers - [ ] Allow plugins to modify table cells - [ ] Respect `?_col=` and `?_nocol=` - [ ] Show interface for filtering by columns and operations - [ ] Show search box, support executing FTS searches - [ ] Sort table by specified column - [ ] Paginate table - [ ] Show facet results - [ ] Show suggested facets - [ ] Link to available exports - [ ] Display schema for table - [ ] Maybe it should show the SQL for the query too? - [ ] Handle various non-obvious querystring options, like `?_where=` and `?_through=`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058072543, https://github.com/simonw/datasette/issues/1518#issuecomment-973699424,https://api.github.com/repos/simonw/datasette/issues/1518,973699424,IC_kwDOBm6k_c46CXlg,9599,2021-11-19T03:27:49Z,2021-11-19T03:27:49Z,OWNER,"My goal is to break up a lot of this functionality into separate methods. These methods can be executed in parallel by `asyncinject`, but more importantly they can be used to build a much better JSON representation, where the default representation is lighter and `?_extra=x` options can be used to execute more expensive portions and add them to the response. 
So the HTML version itself needs to be re-written to use those JSON extras.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058072543,
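The last two comments sketch the direction for the `TableView` refactor: break the page into small awaitable methods, keep the default JSON response cheap, and let `?_extra=` names pull in the more expensive pieces. Here is a toy sketch of that shape using plain asyncio and made-up method names, rather than the actual `asyncinject` API or Datasette internals.

```python
import asyncio


class TableData:
    """Each feature of the table page becomes its own awaitable method."""

    async def rows(self):
        # Cheap default: just the current page of rows
        return [{"id": 1, "name": "one"}, {"id": 2, "name": "two"}]

    async def count(self):
        # Stands in for an expensive select count(*) - only runs when asked for
        await asyncio.sleep(0)
        return 2

    async def suggested_facets(self):
        await asyncio.sleep(0)
        return [{"column": "name"}]

    async def json_response(self, extras=()):
        data = {"rows": await self.rows()}
        # Requested ?_extra= pieces are resolved concurrently and merged in,
        # so the minimal representation stays light by default
        values = await asyncio.gather(*(getattr(self, name)() for name in extras))
        data.update(dict(zip(extras, values)))
        return data


async def demo():
    table = TableData()
    print(await table.json_response())  # minimal default JSON
    print(await table.json_response(extras=("count", "suggested_facets")))


if __name__ == "__main__":
    asyncio.run(demo())
```

Under this shape, the HTML version would call `json_response()` with whichever extras its template needs, rather than receiving non-JSON objects in the template context.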