html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,issue,performed_via_github_app https://github.com/simonw/datasette/issues/1249#issuecomment-804347152,https://api.github.com/repos/simonw/datasette/issues/1249,804347152,MDEyOklzc3VlQ29tbWVudDgwNDM0NzE1Mg==,9599,2021-03-22T19:47:56Z,2021-03-22T19:48:03Z,OWNER,I wrote a bunch of tips on creating smaller Docker images here: https://simonwillison.net/2018/Nov/19/smaller-python-docker-images/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-804344553,https://api.github.com/repos/simonw/datasette/issues/1249,804344553,MDEyOklzc3VlQ29tbWVudDgwNDM0NDU1Mw==,9599,2021-03-22T19:43:25Z,2021-03-22T19:43:25Z,OWNER,Does `--no-install-recommends` make a difference?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-804338678,https://api.github.com/repos/simonw/datasette/issues/1249,804338678,MDEyOklzc3VlQ29tbWVudDgwNDMzODY3OA==,9599,2021-03-22T19:33:43Z,2021-03-22T19:33:43Z,OWNER,"Replacing `rm -rf /var/lib/{apt,dpkg,cache,log}/` with ``` rm -rf /var/lib/apt && \ rm -rf /var/lib/dpkg ``` Got the size down to 305MB.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-804318314,https://api.github.com/repos/simonw/datasette/issues/1249,804318314,MDEyOklzc3VlQ29tbWVudDgwNDMxODMxNA==,9599,2021-03-22T19:04:30Z,2021-03-22T19:04:30Z,OWNER,Considering the image on Docker Hub right now is `383MB` this is actually an improvement.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-804317545,https://api.github.com/repos/simonw/datasette/issues/1249,804317545,MDEyOklzc3VlQ29tbWVudDgwNDMxNzU0NQ==,9599,2021-03-22T19:03:22Z,2021-03-22T19:03:22Z,OWNER,"This Dockerfile: ```dockerfile FROM python:3.9.2-slim-buster as build # software-properties-common provides add-apt-repository RUN apt-get update && \ apt-get -y install software-properties-common && \ add-apt-repository ""deb http://httpredir.debian.org/debian sid main"" && \ apt-get update && \ apt-get -t sid install -y libsqlite3-mod-spatialite && \ apt clean && \ rm -rf /var/lib/{apt,dpkg,cache,log}/ RUN pip install datasette EXPOSE 8001 CMD [""datasette""] ``` Produces a 344MB image that includes a working SpatiaLite 5.0 module. And weirdly... 
it doesn't exhibit the hanging bug!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-804310353,https://api.github.com/repos/simonw/datasette/issues/1249,804310353,MDEyOklzc3VlQ29tbWVudDgwNDMxMDM1Mw==,9599,2021-03-22T18:52:12Z,2021-03-22T18:52:12Z,OWNER,"This Dockerfile: ```dockerfile FROM python:3.9.2-slim-buster as build # Setup build dependencies RUN apt update \ && apt install -y python3-dev build-essential wget libxml2-dev libproj-dev \ libminizip-dev libgeos-dev libsqlite3-dev zlib1g-dev pkg-config git \ && apt clean RUN wget ""https://www.sqlite.org/2021/sqlite-autoconf-3340100.tar.gz"" && tar xzf sqlite-autoconf-3340100.tar.gz \ && cd sqlite-autoconf-3340100 && ./configure --disable-static --enable-fts5 --enable-json1 \ CFLAGS=""-g -O2 -DSQLITE_ENABLE_FTS3=1 -DSQLITE_ENABLE_FTS3_PARENTHESIS -DSQLITE_ENABLE_FTS4=1 -DSQLITE_ENABLE_RTREE=1 -DSQLITE_ENABLE_JSON1"" \ && make && make install RUN wget ""http://www.gaia-gis.it/gaia-sins/freexl-1.0.6.tar.gz"" && tar zxf freexl-1.0.6.tar.gz \ && cd freexl-1.0.6 && ./configure && make && make install RUN wget ""http://www.gaia-gis.it/gaia-sins/libspatialite-5.0.1.tar.gz"" && tar zxf libspatialite-5.0.1.tar.gz \ && cd libspatialite-5.0.1 && ./configure --disable-rttopo && make && make install RUN wget ""http://www.gaia-gis.it/gaia-sins/readosm-sources/readosm-1.1.0.tar.gz"" && tar zxf readosm-1.1.0.tar.gz && cd readosm-1.1.0 && ./configure && make && make install RUN wget ""http://www.gaia-gis.it/gaia-sins/spatialite-tools-5.0.0.tar.gz"" && tar zxf spatialite-tools-5.0.0.tar.gz \ && cd spatialite-tools-5.0.0 && ./configure --disable-rttopo && make && make install # Add local code to the image instead of fetching from pypi. #COPY . 
/datasette #RUN pip install /datasette RUN pip install datasette FROM python:3.9.2-slim-buster # Copy python dependencies and spatialite libraries COPY --from=build /usr/local/lib/ /usr/local/lib/ # Copy executables COPY --from=build /usr/local/bin /usr/local/bin # Copy spatial extensions COPY --from=build /usr/lib/x86_64-linux-gnu /usr/lib/x86_64-linux-gnu ENV LD_LIBRARY_PATH=/usr/local/lib EXPOSE 8001 CMD [""datasette""] ``` Produced a 448MB image.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-804309510,https://api.github.com/repos/simonw/datasette/issues/1249,804309510,MDEyOklzc3VlQ29tbWVudDgwNDMwOTUxMA==,9599,2021-03-22T18:50:50Z,2021-03-22T18:50:50Z,OWNER,"Ideally I'd like to use the Debian stable `python:3.9.2-slim-buster` base image but install SpatiaLite from Debian unstable here: https://packages.debian.org/sid/libspatialite7 This pattern might let me do that: https://github.com/helmesjo/cpp_bash_utils/blob/f031e926249f8e2d7f260f22dc8974c6d5be11fe/docker/images/linux-gcc.dockerfile#L20-L24","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/pull/1271#issuecomment-804265042,https://api.github.com/repos/simonw/datasette/issues/1271,804265042,MDEyOklzc3VlQ29tbWVudDgwNDI2NTA0Mg==,9599,2021-03-22T17:45:45Z,2021-03-22T17:45:45Z,OWNER,"I can remove this code too: https://github.com/simonw/datasette/blob/6f41c8a2bef309a66588b2875c3e24d26adb4850/datasette/database.py#L190-L192","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837956424, https://github.com/simonw/datasette/issues/1249#issuecomment-804263434,https://api.github.com/repos/simonw/datasette/issues/1249,804263434,MDEyOklzc3VlQ29tbWVudDgwNDI2MzQzNA==,9599,2021-03-22T17:43:25Z,2021-03-22T17:43:25Z,OWNER,I figured out the cause of the hang in #1268 - it was caused by `select count(*) from SpatialIndex` interacting badly with the `set_progress_handler()` mechanism I was using to implement query time limits. 
#1271 has a replacement for that using `asyncio.wait_for()` and `conn.interrupt()` which should resolve the SpatiaLite issue too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1268#issuecomment-804261915,https://api.github.com/repos/simonw/datasette/issues/1268,804261915,MDEyOklzc3VlQ29tbWVudDgwNDI2MTkxNQ==,9599,2021-03-22T17:41:12Z,2021-03-22T17:41:12Z,OWNER,"Closing this because I've figured out the root of the problem now, and I have a potential solution.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1269#issuecomment-804261610,https://api.github.com/repos/simonw/datasette/issues/1269,804261610,MDEyOklzc3VlQ29tbWVudDgwNDI2MTYxMA==,9599,2021-03-22T17:40:41Z,2021-03-22T17:40:41Z,OWNER,"#1270 looks promising, and I don't want to leave open a security hole where someone could potentially hang Datasette with a nasty `count(*)` query.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837348479, https://github.com/simonw/datasette/issues/1270#issuecomment-804255633,https://api.github.com/repos/simonw/datasette/issues/1270,804255633,MDEyOklzc3VlQ29tbWVudDgwNDI1NTYzMw==,9599,2021-03-22T17:32:02Z,2021-03-22T17:32:08Z,OWNER,Confirmed that the `interrupt()` based cancellation mechanism fixes the SpatiaLite issue in #1268!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837350092, https://github.com/simonw/datasette/issues/1270#issuecomment-803834784,https://api.github.com/repos/simonw/datasette/issues/1270,803834784,MDEyOklzc3VlQ29tbWVudDgwMzgzNDc4NA==,9599,2021-03-22T07:31:57Z,2021-03-22T16:22:19Z,OWNER,"I think the implementation for this goes here: https://github.com/simonw/datasette/blob/6f41c8a2bef309a66588b2875c3e24d26adb4850/datasette/database.py#L146-L157 I figured out a similar pattern in `datasette-ripgrep` here: https://github.com/simonw/datasette-ripgrep/blob/0.7/datasette_ripgrep/__init__.py#L63-L71","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837350092, https://github.com/simonw/datasette/issues/1268#issuecomment-803802957,https://api.github.com/repos/simonw/datasette/issues/1268,803802957,MDEyOklzc3VlQ29tbWVudDgwMzgwMjk1Nw==,9599,2021-03-22T06:38:14Z,2021-03-22T06:38:14Z,OWNER,"Also worth trying is to change this code: ```python n = 1000 if ms < 50: n = 1 ``` What happens with `n = 10` instead?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1269#issuecomment-803785808,https://api.github.com/repos/simonw/datasette/issues/1269,803785808,MDEyOklzc3VlQ29tbWVudDgwMzc4NTgwOA==,9599,2021-03-22T06:00:53Z,2021-03-22T06:00:53Z,OWNER,This may not be necessary if using `.interrupt() for SQLite timeouts in #1270 works.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837348479, 
https://github.com/simonw/datasette/issues/1268#issuecomment-803784902,https://api.github.com/repos/simonw/datasette/issues/1268,803784902,MDEyOklzc3VlQ29tbWVudDgwMzc4NDkwMg==,9599,2021-03-22T05:59:06Z,2021-03-22T05:59:06Z,OWNER,"Even if I implement that workaround in #1269 I'm concerned that this could still allow users to deliberately crash Datasette (if it's running SpatiaLite 5.0) by executing `select count(*) from SpatialIndex`. That `interrupt` timeout mechanism is worth digging into further.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803782705,https://api.github.com/repos/simonw/datasette/issues/1268,803782705,MDEyOklzc3VlQ29tbWVudDgwMzc4MjcwNQ==,9599,2021-03-22T05:54:19Z,2021-03-22T05:54:19Z,OWNER,"Got two new TILs out of this: * [Tracing every executed Python statement](https://til.simonwillison.net/python/tracing-every-statement) * [Running gdb against a Python process in a running Docker container](https://til.simonwillison.net/docker/gdb-python-docker)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803777724,https://api.github.com/repos/simonw/datasette/issues/1268,803777724,MDEyOklzc3VlQ29tbWVudDgwMzc3NzcyNA==,9599,2021-03-22T05:42:50Z,2021-03-22T05:43:23Z,OWNER," If I want to avoid counting virtual tables, I need to detect which tables are virtual tables. The safest way to do this is probably to pull the `sql` for every table and then, in Python, check for values that start with `create virtual table` after converting to lower case, using any number of spaces. This would catch things like ` CREATE virtual TABLE` which might be missed by a SQL `like` query. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803775121,https://api.github.com/repos/simonw/datasette/issues/1268,803775121,MDEyOklzc3VlQ29tbWVudDgwMzc3NTEyMQ==,9599,2021-03-22T05:36:26Z,2021-03-22T05:36:26Z,OWNER,So one fix could be to avoid running counts for anything that turns out to be a virtual table.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803774926,https://api.github.com/repos/simonw/datasette/issues/1268,803774926,MDEyOklzc3VlQ29tbWVudDgwMzc3NDkyNg==,9599,2021-03-22T05:35:56Z,2021-03-22T05:35:56Z,OWNER,That's in this code here: https://github.com/simonw/datasette/blob/c4f1ec7f33fd7d5b93f0f895dafb5351cc3bfc5b/datasette/database.py#L221-L241,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803774518,https://api.github.com/repos/simonw/datasette/issues/1268,803774518,MDEyOklzc3VlQ29tbWVudDgwMzc3NDUxOA==,9599,2021-03-22T05:34:57Z,2021-03-22T05:34:57Z,OWNER,"... 
and sure enough, adding this code fixed the problem: ```diff diff --git a/datasette/database.py b/datasette/database.py index 3579cce..b466b12 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -224,6 +226,9 @@ class Database: # Try to get counts for each table, $limit timeout for each count counts = {} for table in await self.table_names(): + if table == ""SpatialIndex"": + counts[table] = 0 + continue try: table_count = ( await self.execute( ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803773484,https://api.github.com/repos/simonw/datasette/issues/1268,803773484,MDEyOklzc3VlQ29tbWVudDgwMzc3MzQ4NA==,9599,2021-03-22T05:32:29Z,2021-03-22T05:32:29Z,OWNER,"To figure out which SQL query triggers the problem I added this code to write to a log file: ```python with sqlite_timelimit(conn, time_limit_ms): try: cursor = conn.cursor() with open(""/tmp/sql.log"", ""ab"", buffering=0) as fp: fp.write((""{}: {}\n"".format(sql, params)).encode(""utf-8"")) cursor.execute(sql, params if params is not None else {}) ``` I had to use `ab` binary mode because Python doesn't allow `buffering=0` for non-binary file operations. With the log enabled, I used `docker exec -it 589ae68de943 bash` to attach to the running container and `tail -f /tmp/sql.log` to see the logs. Here's where it broke: ``` select count(*) from [idx_civici_geom_parent]: None select count(*) from [sqlite_stat1]: None select count(*) from [sqlite_stat3]: None select count(*) from [SpatialIndex]: None ``` So attempting to run a `count(*)` against the `SpatialIndex` virtual table is the thing that triggers the bug.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803764919,https://api.github.com/repos/simonw/datasette/issues/1268,803764919,MDEyOklzc3VlQ29tbWVudDgwMzc2NDkxOQ==,9599,2021-03-22T05:11:11Z,2021-03-22T05:11:11Z,OWNER,"Maybe I could implement SQLite query timeouts using the `interrupt()` method instead of the progress handler hack I'm currently using? https://stackoverflow.com/questions/43240496/python-sqlite3-how-to-quickly-and-cleanly-interrupt-long-running-query-with-e has some tips.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803764200,https://api.github.com/repos/simonw/datasette/issues/1268,803764200,MDEyOklzc3VlQ29tbWVudDgwMzc2NDIwMA==,9599,2021-03-22T05:09:13Z,2021-03-22T05:09:13Z,OWNER,"I tried building a container where the `conn.set_progress_handler(handler, n)` line was commented out... 
and it fixed the bug.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803762969,https://api.github.com/repos/simonw/datasette/issues/1268,803762969,MDEyOklzc3VlQ29tbWVudDgwMzc2Mjk2OQ==,9599,2021-03-22T05:05:51Z,2021-03-22T05:05:51Z,OWNER,I had to run `docker kill 16197781a7b5` to kill the broken container - Ctrl+C in the Datasette console window didn't do anything.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803762609,https://api.github.com/repos/simonw/datasette/issues/1268,803762609,MDEyOklzc3VlQ29tbWVudDgwMzc2MjYwOQ==,9599,2021-03-22T05:05:00Z,2021-03-22T05:05:00Z,OWNER,"Using https://til.simonwillison.net/docker/attach-bash-to-running-container - I figured out how to run `gdb`. I had to use `--privileged` here because otherwise `gdb` showed a ""Could not attach to process"" error. ``` docker exec --privileged -it 16197781a7b5 bash # apt-get install gdb python3-dbg # gdb /usr/bin/python3 -p 20 ``` This paused the process. I tried running this: ``` (gdb) py-bt Traceback (most recent call first): File ""/usr/lib/python3.8/asyncio/base_events.py"", line 1845, in _run_once if handle._cancelled: File ""/usr/lib/python3.8/asyncio/base_events.py"", line 570, in run_forever self._run_once() File ""/usr/lib/python3.8/asyncio/base_events.py"", line 603, in run_until_complete self.run_forever() File ""/usr/local/lib/python3.8/dist-packages/uvicorn/server.py"", line 49, in run loop.run_until_complete(self.serve(sockets=sockets)) File ""/usr/local/lib/python3.8/dist-packages/uvicorn/main.py"", line 386, in run server.run() File ""/usr/local/lib/python3.8/dist-packages/datasette/cli.py"", line 575, in serve uvicorn.run(ds.app(), **uvicorn_kwargs) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/usr/local/bin/datasette"", line 8, in sys.exit(cli()) File ""/usr/lib/python3.8/trace.py"", line 450, in runctx exec(cmd, globals, locals) File ""/usr/lib/python3.8/trace.py"", line 6632, in main File ""/usr/lib/python3.8/trace.py"", line 756, in main() File ""/usr/lib/python3.8/runpy.py"", line 343, in _run_code File ""/usr/lib/python3.8/runpy.py"", line 450, in _run_module_as_main ``` Not sure if that's useful or not.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803759051,https://api.github.com/repos/simonw/datasette/issues/1268,803759051,MDEyOklzc3VlQ29tbWVudDgwMzc1OTA1MQ==,9599,2021-03-22T04:55:22Z,2021-03-22T04:55:22Z,OWNER,So I think there's a bug in the way the `set_progress_handler()` mechanism works when used in conjunction with SpatiaLite 5.0 on Linux.,"{""total_count"": 
0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803758793,https://api.github.com/repos/simonw/datasette/issues/1268,803758793,MDEyOklzc3VlQ29tbWVudDgwMzc1ODc5Mw==,9599,2021-03-22T04:54:32Z,2021-03-22T04:54:32Z,OWNER,"Hitting http://localhost:8001/tuscany_housenumbers triggers the bug. It gets stuck in a loop that looks like this: Which looks to me like this code: https://github.com/simonw/datasette/blob/8e18c7943181f228ce5ebcea48deb59ce50bee1f/datasette/utils/__init__.py#L139-L158","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803758182,https://api.github.com/repos/simonw/datasette/issues/1268,803758182,MDEyOklzc3VlQ29tbWVudDgwMzc1ODE4Mg==,9599,2021-03-22T04:52:15Z,2021-03-22T04:52:15Z,OWNER,Hitting http://localhost:8001/ successfully shows the homepage (after a lot more scrolling).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803757746,https://api.github.com/repos/simonw/datasette/issues/1268,803757746,MDEyOklzc3VlQ29tbWVudDgwMzc1Nzc0Ng==,9599,2021-03-22T04:50:40Z,2021-03-22T04:51:52Z,OWNER,"Here's a fun debugging trick: docker run -it -p 8001:8001 -v `pwd`:/mnt datasette-spatialite:latest bash root@16197781a7b5:/# python3 -m trace --trace $(which datasette) \ -p 8001 -h 0.0.0.0 /mnt/tuscany_housenumbers.sqlite \ --load-extension=spatialite A huge amount of stuff scrolls past as Datasette starts up, since we are tracing every executed line of Python. After about a minute it's finished starting and gets to this point: ``` selectors.py(452): if timeout is None: selectors.py(454): elif timeout <= 0: selectors.py(459): timeout = math.ceil(timeout * 1e3) * 1e-3 selectors.py(464): max_ev = max(len(self._fd_to_key), 1) selectors.py(466): ready = [] selectors.py(467): try: selectors.py(468): fd_event_list = self._selector.poll(timeout, max_ev) ``` Now I can make some HTTP requests against it. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1268#issuecomment-803756495,https://api.github.com/repos/simonw/datasette/issues/1268,803756495,MDEyOklzc3VlQ29tbWVudDgwMzc1NjQ5NQ==,9599,2021-03-22T04:46:04Z,2021-03-22T04:46:04Z,OWNER,`gdb` may be able to help debug this: https://www.podoliaka.org/2016/04/10/debugging-cpython-gdb/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837308703, https://github.com/simonw/datasette/issues/1249#issuecomment-803755698,https://api.github.com/repos/simonw/datasette/issues/1249,803755698,MDEyOklzc3VlQ29tbWVudDgwMzc1NTY5OA==,9599,2021-03-22T04:43:02Z,2021-03-22T04:43:02Z,OWNER,I'll spin off a separate ticket to investigate the hang.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1267#issuecomment-803754226,https://api.github.com/repos/simonw/datasette/issues/1267,803754226,MDEyOklzc3VlQ29tbWVudDgwMzc1NDIyNg==,9599,2021-03-22T04:37:26Z,2021-03-22T04:37:26Z,OWNER,"Thanks for doing this - I've used alternativeto.net a bunch in the past, it's great to see Datasette listed there. This does raise some interesting philosophical questions: three years into the project I'm still not entirely sure what Datasette competes with! Could be SQLite desktop packages, could be visualization software like Tableau, could even be something like Airtable (given a few more plugins). It will be interesting to see how the alternativeto listing evolves, maybe it will help me answer that question!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",837208901, https://github.com/simonw/datasette/issues/1249#issuecomment-803753388,https://api.github.com/repos/simonw/datasette/issues/1249,803753388,MDEyOklzc3VlQ29tbWVudDgwMzc1MzM4OA==,9599,2021-03-22T04:34:20Z,2021-03-22T04:35:10Z,OWNER,"Well this is frustrating. I finally found a Dockerfile that worked and installed an Ubuntu pre-compiled SpatiaLite module that would load... ```dockerfile FROM ubuntu:20.10 as install_spatialite RUN apt update && \ apt install -y libsqlite3-mod-spatialite && \ apt clean && \ rm -rf /var/lib/{apt,dpkg,cache,log}/ FROM ubuntu:20.10 RUN apt update && \ apt install -y python3-pip && \ apt clean && \ rm -rf /var/lib/{apt,dpkg,cache,log}/ RUN pip install datasette # Copy spatial extensions COPY --from=install_spatialite /usr/lib/x86_64-linux-gnu /usr/lib/x86_64-linux-gnu/ ENV LD_LIBRARY_PATH=/usr/local/lib EXPOSE 8001 CMD [""datasette""] ``` (Which produced a 550MB image) And when I ran Datasette I got that same error where the database listing page hangs! 
``` docker run -p 8001:8001 -v `pwd`:/mnt datasette-spatialite:latest datasette -p 8001 -h 0.0.0.0 /mnt/tuscany_housenumbers.sqlite --load-extension=spatialite ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803751068,https://api.github.com/repos/simonw/datasette/issues/1249,803751068,MDEyOklzc3VlQ29tbWVudDgwMzc1MTA2OA==,9599,2021-03-22T04:26:45Z,2021-03-22T04:26:45Z,OWNER,"Here's why: ``` datasette % docker run -it -p 8001:8001 -v `pwd`:/mnt datasette-spatialite:latest bash root@3430352ff378:/# datasette bash: /usr/local/bin/datasette: /usr/bin/python3: bad interpreter: No such file or directory ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803750617,https://api.github.com/repos/simonw/datasette/issues/1249,803750617,MDEyOklzc3VlQ29tbWVudDgwMzc1MDYxNw==,9599,2021-03-22T04:25:14Z,2021-03-22T04:25:14Z,OWNER,"Got this error attempting to run Datasette (with or without SpatiaLite): ``` standard_init_linux.go:219: exec user process caused: no such file or directory ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803750399,https://api.github.com/repos/simonw/datasette/issues/1249,803750399,MDEyOklzc3VlQ29tbWVudDgwMzc1MDM5OQ==,9599,2021-03-22T04:24:25Z,2021-03-22T04:24:25Z,OWNER,"I'll try using `ubuntu:20.10` for everything: ```dockerfile FROM ubuntu:20.10 as install_spatialite RUN apt update && \ apt install -y libsqlite3-mod-spatialite && \ apt clean && \ rm -rf /var/lib/{apt,dpkg,cache,log}/ FROM ubuntu:20.10 as build RUN apt update && \ apt install -y python3-pip && \ apt clean && \ rm -rf /var/lib/{apt,dpkg,cache,log}/ RUN pip install datasette #COPY . /datasette #RUN pip install /datasette FROM ubuntu:20.10 # Copy python dependencies and spatialite libraries COPY --from=build /usr/local/lib/ /usr/local/lib/ # Copy executables COPY --from=build /usr/local/bin /usr/local/bin # Copy spatial extensions COPY --from=install_spatialite /usr/lib/x86_64-linux-gnu/mod_spatialite.so /usr/lib/x86_64-linux-gnu/mod_spatialite.so ENV LD_LIBRARY_PATH=/usr/local/lib EXPOSE 8001 CMD [""datasette""] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803749831,https://api.github.com/repos/simonw/datasette/issues/1249,803749831,MDEyOklzc3VlQ29tbWVudDgwMzc0OTgzMQ==,9599,2021-03-22T04:22:35Z,2021-03-22T04:22:35Z,OWNER,"I tried copying just the `mod_spatialite.so` file: ```dockerfile FROM ubuntu:20.10 as install_spatialite RUN apt update && \ apt install -y libsqlite3-mod-spatialite && \ apt clean && \ rm -rf /var/lib/{apt,dpkg,cache,log}/ FROM python:3.9.2-slim as build RUN pip install datasette #COPY . 
/datasette #RUN pip install /datasette FROM python:3.9.2-slim # Copy python dependencies and spatialite libraries COPY --from=build /usr/local/lib/ /usr/local/lib/ # Copy executables COPY --from=build /usr/local/bin /usr/local/bin # Copy spatial extensions COPY --from=install_spatialite /usr/lib/x86_64-linux-gnu/mod_spatialite.so /usr/lib/x86_64-linux-gnu/mod_spatialite.so ENV LD_LIBRARY_PATH=/usr/local/lib EXPOSE 8001 CMD [""datasette""] ``` But when I ran Datasette with `--load-extension=spatialite` I got this: ``` File ""/usr/local/lib/python3.9/site-packages/datasette/database.py"", line 151, in in_thread self.ds._prepare_connection(conn, self.name) File ""/usr/local/lib/python3.9/site-packages/datasette/app.py"", line 502, in _prepare_connection conn.execute(f""SELECT load_extension('{extension}')"") sqlite3.OperationalError: /usr/lib/x86_64-linux-gnu/mod_spatialite.so.so: cannot open shared object file: No such file or directory ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803748469,https://api.github.com/repos/simonw/datasette/issues/1249,803748469,MDEyOklzc3VlQ29tbWVudDgwMzc0ODQ2OQ==,9599,2021-03-22T04:17:51Z,2021-03-22T04:17:51Z,OWNER,"... except my clever image using SpatiaLite installed for Ubuntu doesn't actually work: ``` datasette % docker run -p 8001:8001 -v `pwd`:/mnt datasette-spatialite:latest datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db File ""/usr/local/lib/python3.9/sqlite3/dbapi2.py"", line 27, in from _sqlite3 import * ImportError: /lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.29' not found (required by /usr/lib/x86_64-linux-gnu/libsqlite3.so.0) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803748158,https://api.github.com/repos/simonw/datasette/issues/1249,803748158,MDEyOklzc3VlQ29tbWVudDgwMzc0ODE1OA==,9599,2021-03-22T04:16:57Z,2021-03-22T04:16:57Z,OWNER,"Which is great, because the image on Docker Hub right now is 383MB.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803747701,https://api.github.com/repos/simonw/datasette/issues/1249,803747701,MDEyOklzc3VlQ29tbWVudDgwMzc0NzcwMQ==,9599,2021-03-22T04:15:40Z,2021-03-22T04:15:40Z,OWNER,"Here's a trick: install SpatiaLite in `ubuntu:20.10` and then copy it into the final `python:3.9.2-slim` image. ```dockerfile FROM ubuntu:20.10 as install_spatialite RUN apt update && \ apt install -y libsqlite3-mod-spatialite && \ apt clean && \ rm -rf /var/lib/{apt,dpkg,cache,log}/ FROM python:3.9.2-slim as build RUN pip install datasette #COPY . 
/datasette #RUN pip install /datasette FROM python:3.9.2-slim # Copy python dependencies and spatialite libraries COPY --from=build /usr/local/lib/ /usr/local/lib/ # Copy executables COPY --from=build /usr/local/bin /usr/local/bin # Copy spatial extensions COPY --from=install_spatialite /usr/lib/x86_64-linux-gnu /usr/lib/x86_64-linux-gnu ENV LD_LIBRARY_PATH=/usr/local/lib EXPOSE 8001 CMD [""datasette""] ``` That produced a 265MB image.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803700940,https://api.github.com/repos/simonw/datasette/issues/1249,803700940,MDEyOklzc3VlQ29tbWVudDgwMzcwMDk0MA==,9599,2021-03-22T01:14:24Z,2021-03-22T01:14:24Z,OWNER,I tried that with just `python3-pip` (removing `libsqlite3-mod-spatialite`) and got 435MB.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803700626,https://api.github.com/repos/simonw/datasette/issues/1249,803700626,MDEyOklzc3VlQ29tbWVudDgwMzcwMDYyNg==,9599,2021-03-22T01:13:04Z,2021-03-22T01:13:04Z,OWNER,"Building a Dockerfile containing just `FROM ubuntu:20.10` gave me `79.5MB`. Building this one: ```dockerfile FROM ubuntu:20.10 # Setup build dependencies RUN apt update && \ apt install -y python3-pip libsqlite3-mod-spatialite && \ apt clean && \ rm -rf /var/lib/{apt,dpkg,cache,log}/ ``` Resulted in a 515MB image.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803698983,https://api.github.com/repos/simonw/datasette/issues/1249,803698983,MDEyOklzc3VlQ29tbWVudDgwMzY5ODk4Mw==,9599,2021-03-22T01:05:36Z,2021-03-22T01:06:23Z,OWNER,"It's pretty big though. I tried this version which avoids copying junk from my laptop in: ```dockerfile FROM ubuntu:20.10 # Setup build dependencies RUN apt update && apt install -y python3-pip libsqlite3-mod-spatialite && apt clean RUN pip install datasette EXPOSE 8001 CMD [""datasette""] ``` And got this: ``` datasette % docker images datasette-spatialite REPOSITORY TAG IMAGE ID CREATED SIZE datasette-spatialite latest 0796950653c2 2 seconds ago 528MB ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803698168,https://api.github.com/repos/simonw/datasette/issues/1249,803698168,MDEyOklzc3VlQ29tbWVudDgwMzY5ODE2OA==,9599,2021-03-22T01:02:02Z,2021-03-22T01:02:30Z,OWNER,"This is the shortest Dockerfile that appeared to give me a working SpatiaLite module: ```dockerfile FROM ubuntu:20.10 # Setup build dependencies RUN apt update && apt install -y python3-pip libsqlite3-mod-spatialite && apt clean # Add local code to the image instead of fetching from pypi. COPY . 
/datasette RUN pip install /datasette RUN rm -rf /datasette EXPOSE 8001 CMD [""datasette""] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803697546,https://api.github.com/repos/simonw/datasette/issues/1249,803697546,MDEyOklzc3VlQ29tbWVudDgwMzY5NzU0Ng==,9599,2021-03-22T00:59:47Z,2021-03-22T00:59:47Z,OWNER,"To debug I'm running: docker run -it -p 8001:8001 -v `pwd`:/mnt datasette-spatialite:latest bash This gets me a shell I can use.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803697211,https://api.github.com/repos/simonw/datasette/issues/1249,803697211,MDEyOklzc3VlQ29tbWVudDgwMzY5NzIxMQ==,9599,2021-03-22T00:58:01Z,2021-03-22T00:58:01Z,OWNER,"I'm messing around with the `Dockerfile` and after each change I'm running: docker build . -t datasette-spatialite And then: docker run -p 8001:8001 -v `pwd`:/mnt datasette-spatialite:latest datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803694661,https://api.github.com/repos/simonw/datasette/issues/1249,803694661,MDEyOklzc3VlQ29tbWVudDgwMzY5NDY2MQ==,9599,2021-03-22T00:46:49Z,2021-03-22T00:46:49Z,OWNER,Actually for the loadable module I think I need https://packages.ubuntu.com/groovy/libsqlite3-mod-spatialite,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803694436,https://api.github.com/repos/simonw/datasette/issues/1249,803694436,MDEyOklzc3VlQ29tbWVudDgwMzY5NDQzNg==,9599,2021-03-22T00:46:00Z,2021-03-22T00:46:00Z,OWNER,So I'm going to try `20.10` and see where that gets me.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803694359,https://api.github.com/repos/simonw/datasette/issues/1249,803694359,MDEyOklzc3VlQ29tbWVudDgwMzY5NDM1OQ==,9599,2021-03-22T00:45:47Z,2021-03-22T00:45:47Z,OWNER,https://pythonspeed.com/articles/base-image-python-docker-images/ suggests using `python:3.9-slim-buster` or `ubuntu:20.04` - but 20.04 is focal which still has SpatiaLite `4.3.0a-6build1` - It's `20.10` that has 5.0: https://packages.ubuntu.com/groovy/libspatialite-dev,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803693181,https://api.github.com/repos/simonw/datasette/issues/1249,803693181,MDEyOklzc3VlQ29tbWVudDgwMzY5MzE4MQ==,9599,2021-03-22T00:41:02Z,2021-03-22T00:41:02Z,OWNER,Debian sid has it too: https://packages.debian.org/sid/libspatialite-dev,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, 
https://github.com/simonw/datasette/issues/1249#issuecomment-803692673,https://api.github.com/repos/simonw/datasette/issues/1249,803692673,MDEyOklzc3VlQ29tbWVudDgwMzY5MjY3Mw==,9599,2021-03-22T00:38:42Z,2021-03-22T00:38:42Z,OWNER,Ubuntu Groovy has a package for SpatiaLite 5 - I could try using that instead: https://packages.ubuntu.com/groovy/libspatialite-dev,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-803691236,https://api.github.com/repos/simonw/datasette/issues/1249,803691236,MDEyOklzc3VlQ29tbWVudDgwMzY5MTIzNg==,9599,2021-03-22T00:32:03Z,2021-03-22T00:32:03Z,OWNER,"Here's something odd: when I run `datasette tuscany_housenumbers.sqlite --load-extension=spatialite` on macOS against SpatiaLite installed using Homebrew (which reports `""spatialite"": ""5.0.0""` on the `/-/versions` page) I don't get any weird errors at all, everything works fine. But when I tried compiling SpatiaLite inside the Docker container I had hanging errors on some pages. This is using https://www.gaia-gis.it/gaia-sins/knn/tuscany_housenumbers.7z from the SpatiaLite KNN tutorial at https://www.gaia-gis.it/fossil/libspatialite/wiki?name=KNN","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1259#issuecomment-803674728,https://api.github.com/repos/simonw/datasette/issues/1259,803674728,MDEyOklzc3VlQ29tbWVudDgwMzY3NDcyOA==,9599,2021-03-21T22:55:31Z,2021-03-21T22:55:31Z,OWNER,CTEs were added in 2014-02-03 SQLite 3.8.3 - so I think it's OK to depend on them for Datasette.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",830567275, https://github.com/simonw/datasette/issues/647#issuecomment-803673225,https://api.github.com/repos/simonw/datasette/issues/647,803673225,MDEyOklzc3VlQ29tbWVudDgwMzY3MzIyNQ==,9599,2021-03-21T22:44:19Z,2021-03-21T22:44:19Z,OWNER,"Now that I'm looking at refactoring how views work in #878 it's clear that the gnarliest, most convoluted code I need to deal with relates to this old feature. I'm going to remove it entirely. Any performance enhancement or provides can be achieved just as well by using regular URLs and a caching proxy. I may provide a 404 handling plugin that attempts to rewrite old URLs that used this mechanism, but I won't do any more than that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",531755959, https://github.com/simonw/sqlite-utils/issues/249#issuecomment-803501756,https://api.github.com/repos/simonw/sqlite-utils/issues/249,803501756,MDEyOklzc3VlQ29tbWVudDgwMzUwMTc1Ng==,9599,2021-03-21T02:33:45Z,2021-03-21T02:33:45Z,OWNER,"Did you run `enable-fts` before you inserted the data? If so you'll need to run `populate-fts` after the insert to populate the FTS index. 
A better solution may be to add `--create-triggers` to the `enable-fts` command to add triggers that will automatically keep the index updated as you insert new records.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",836963850, https://github.com/simonw/datasette/issues/878#issuecomment-803473015,https://api.github.com/repos/simonw/datasette/issues/878,803473015,MDEyOklzc3VlQ29tbWVudDgwMzQ3MzAxNQ==,9599,2021-03-20T22:33:05Z,2021-03-20T22:33:05Z,OWNER,"Things this mechanism needs to be able to support: - Returning a default JSON representation - Defining ""extra"" JSON representations blocks, which can be requested using `?_extra=` - Returning rendered HTML, based on the default JSON + one or more extras + a template - Using Datasette output renderers to return e.g. CSV data - Potentially also supporting streaming output renderers for streaming CSV/TSV/JSON-nl etc","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",648435885, https://github.com/simonw/datasette/issues/878#issuecomment-803472595,https://api.github.com/repos/simonw/datasette/issues/878,803472595,MDEyOklzc3VlQ29tbWVudDgwMzQ3MjU5NQ==,9599,2021-03-20T22:28:12Z,2021-03-20T22:28:12Z,OWNER,"Another idea I had: a view is a class that takes the `datasette` instance in its constructor, and defines a `__call__` method that accepts a request and returns a response. Except `await __call__` looks like it might be a bit messy, discussion in https://github.com/encode/starlette/issues/886","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",648435885, https://github.com/simonw/datasette/issues/878#issuecomment-803472278,https://api.github.com/repos/simonw/datasette/issues/878,803472278,MDEyOklzc3VlQ29tbWVudDgwMzQ3MjI3OA==,9599,2021-03-20T22:25:04Z,2021-03-20T22:25:04Z,OWNER,"I came up with a slightly wild idea for this that would involve pytest-style dependency injection. Prototype here: https://gist.github.com/simonw/496b24fdad44f6f8b7237fe394a0ced7 Copying from my private notes: > Use the lazy evaluated DI mechanism to break up table view into different pieces eg for faceting > > Use that to solve JSON vs HTML views > > Oh here's an idea: what if the various components of the table view were each defined as async functions.... and then executed using asyncio.gather in order to run the SQL queries in parallel? Then try benchmarking with different numbers of threads? > > The async_call_with_arguments function could do this automatically for any awaitable dependencies > > This would give me massively parallel dependency injection > > (I could build an entire framework around this and call it c64) > > Idea: arguments called eg ""count"" are executed and the result passed to the function. If called count_fn then a reference to the not-yet-called function is passed instead > > I'm not going to completely combine the views mechanism and the render hooks. 
Instead, the core view will define a bunch of functions used to compose the page and the render hook will have conditional access to those functions - which will otherwise be asyncio.gather executed directly by the HTML page version > > Using asyncio.gather to execute facets and suggest facets in parallel would be VERY interesting > > suggest facets should be VERY cachable - doesn't matter if it's wrong unlike actual facets themselves > > What if all Datasette views were defined in terms of dependency injection - and those dependency functions could themselves depend on others just like pytest fixtures. Everything would become composable and async stuff could execute in parallel > > FURTHER IDEA: use this for the ?_extra= mechanism as well. > > Any view in Datasette can be defined as a collection of named keys. Each of those keys maps to a function or an async function that accepts as input other named keys, using DI to handle them. > > The HTML view is a defined function. So are the other outputs. > > Default original inputs include “request” and “datasette”. > > So… maybe a view function is a class methods that use DI. One of those methods as an .html() method used for the default page. > > Output formats are a bit more complicated because they are supposed to be defined separately in plugins. They are unified across query, row and table though. > > I’m going to try breaking up the TableView to see what happens.","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",648435885, https://github.com/simonw/datasette/issues/878#issuecomment-803471917,https://api.github.com/repos/simonw/datasette/issues/878,803471917,MDEyOklzc3VlQ29tbWVudDgwMzQ3MTkxNw==,9599,2021-03-20T22:21:33Z,2021-03-20T22:21:33Z,OWNER,"This has been blocking things for too long. If this becomes a documented pattern, things like adding a JSON output to https://github.com/dogsheep/dogsheep-beta becomes easier too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",648435885, https://github.com/simonw/datasette/issues/1258#issuecomment-803471702,https://api.github.com/repos/simonw/datasette/issues/1258,803471702,MDEyOklzc3VlQ29tbWVudDgwMzQ3MTcwMg==,9599,2021-03-20T22:19:39Z,2021-03-20T22:19:39Z,OWNER,"This is a good idea. 
I avoided this initially because it should be possible to run a canned query with a parameter set to the empty string, but that view could definitely be smart enough to differentiate between `?sql=...&param=` and `?sql=` with no `param` specified at all.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",828858421, https://github.com/simonw/datasette/issues/782#issuecomment-803469623,https://api.github.com/repos/simonw/datasette/issues/782,803469623,MDEyOklzc3VlQ29tbWVudDgwMzQ2OTYyMw==,9599,2021-03-20T22:01:23Z,2021-03-20T22:01:23Z,OWNER,"I'm going to keep `?_shape=array` working on the assumption that many existing uses of the Datasette API are already using that option, so it would be nice not to break them.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879, https://github.com/simonw/datasette/issues/1261#issuecomment-803468314,https://api.github.com/repos/simonw/datasette/issues/1261,803468314,MDEyOklzc3VlQ29tbWVudDgwMzQ2ODMxNA==,9599,2021-03-20T21:48:48Z,2021-03-20T21:48:48Z,OWNER,That's fixed in this release of `datasette-publish-vercel`: https://github.com/simonw/datasette-publish-vercel/releases/tag/0.9.2,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",832092321, https://github.com/simonw/datasette/issues/1261#issuecomment-803466868,https://api.github.com/repos/simonw/datasette/issues/1261,803466868,MDEyOklzc3VlQ29tbWVudDgwMzQ2Njg2OA==,9599,2021-03-20T21:36:06Z,2021-03-20T21:36:06Z,OWNER,"This isn't a Datasette bug - it's a Vercel bug: https://github.com/simonw/datasette-publish-vercel/issues/28 I'm looking at a fix for that now, so watch that issue for updates.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",832092321, https://github.com/simonw/datasette/issues/1266#issuecomment-803466730,https://api.github.com/repos/simonw/datasette/issues/1266,803466730,MDEyOklzc3VlQ29tbWVudDgwMzQ2NjczMA==,9599,2021-03-20T21:35:00Z,2021-03-20T21:35:00Z,OWNER,https://docs.datasette.io/en/latest/internals.html#returning-a-response-with-asgi-send-send,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",836273891, https://github.com/simonw/datasette/issues/1265#issuecomment-803130332,https://api.github.com/repos/simonw/datasette/issues/1265,803130332,MDEyOklzc3VlQ29tbWVudDgwMzEzMDMzMg==,9599,2021-03-19T21:03:09Z,2021-03-19T21:03:09Z,OWNER,This is now available in `datasette-auth-passwords` 0.4! https://github.com/simonw/datasette-auth-passwords/releases/tag/0.4,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 1, ""eyes"": 0}",836123030, https://github.com/simonw/datasette/issues/1262#issuecomment-802099264,https://api.github.com/repos/simonw/datasette/issues/1262,802099264,MDEyOklzc3VlQ29tbWVudDgwMjA5OTI2NA==,9599,2021-03-18T16:43:09Z,2021-03-18T16:43:09Z,OWNER,"I often find myself wanting this too, when I'm exploring a new dataset. i agree with Bob that this is a good candidate for a plugin. The plugin system isn't quite setup for this yet though - there isn't an obvious mechanism for adding extra sort orders or other interface elements that manipulate the query used by the table view in some way. 
I'm going to promote this issue to status of a plugin hook feature request - I have a hunch that a plugin hook that enables `order by random()` could enable a lot of other useful plugin features too.","{""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",834602299, https://github.com/simonw/sqlite-utils/issues/246#issuecomment-799479175,https://api.github.com/repos/simonw/sqlite-utils/issues/246,799479175,MDEyOklzc3VlQ29tbWVudDc5OTQ3OTE3NQ==,9599,2021-03-15T14:47:31Z,2021-03-15T14:47:31Z,OWNER,"This is a smart feature. I have something that does this in Datasette, extracting it out to `sqlite-utils` makes a lot of sense. https://github.com/simonw/datasette/blob/8e18c7943181f228ce5ebcea48deb59ce50bee1f/datasette/utils/__init__.py#L818-L829","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",831751367, https://github.com/simonw/datasette/issues/236#issuecomment-799066252,https://api.github.com/repos/simonw/datasette/issues/236,799066252,MDEyOklzc3VlQ29tbWVudDc5OTA2NjI1Mg==,9599,2021-03-15T03:34:52Z,2021-03-15T03:34:52Z,OWNER,"Yeah the Lambda Docker stuff is pretty odd - you still don't get to speak HTTP, you have to speak their custom event protocol instead. https://github.com/glassechidna/serverlessish looks interesting here - it adds a proxy inside the container which allows your existing HTTP Docker image to run within Docker-on-Lambda. I've not tried it out yet though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317001500, https://github.com/simonw/datasette/issues/1259#issuecomment-797827038,https://api.github.com/repos/simonw/datasette/issues/1259,797827038,MDEyOklzc3VlQ29tbWVudDc5NzgyNzAzOA==,9599,2021-03-13T00:15:40Z,2021-03-13T00:15:40Z,OWNER,"If all of the facets were being calculated in a single query, I'd be willing to bump the facet time limit up to something a lot higher, maybe even a full second. There's a chance that could work amazingly well with a materialized CTE.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",830567275, https://github.com/simonw/datasette/issues/1259#issuecomment-797804869,https://api.github.com/repos/simonw/datasette/issues/1259,797804869,MDEyOklzc3VlQ29tbWVudDc5NzgwNDg2OQ==,9599,2021-03-12T23:05:05Z,2021-03-12T23:05:05Z,OWNER,"I wonder if I could optimize facet suggestion in the same way? One challenge: the query time limit will apply to the full CTE query, not to the individual columns. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",830567275, https://github.com/simonw/datasette/issues/1259#issuecomment-797801075,https://api.github.com/repos/simonw/datasette/issues/1259,797801075,MDEyOklzc3VlQ29tbWVudDc5NzgwMTA3NQ==,9599,2021-03-12T22:53:56Z,2021-03-12T22:55:16Z,OWNER,"OK, a better comparison: https://global-power-plants.datasettes.com/global-power-plants?sql=WITH+data+as+%28%0D%0A++select%0D%0A++++*%0D%0A++from%0D%0A++++%5Bglobal-power-plants%5D%0D%0A%29%2C%0D%0Acountry_long+as+%28select+%0D%0A++%27country_long%27+as+col%2C+country_long+as+value%2C+count%28*%29+as+c+from+data+group+by+country_long%0D%0A++order+by+c+desc+limit+31%0D%0A%29%2C%0D%0Aprimary_fuel+as+%28%0D%0Aselect%0D%0A++%27primary_fuel%27+as+col%2C+primary_fuel+as+value%2C+count%28*%29+as+c+from+data+group+by+primary_fuel%0D%0A++order+by+c+desc+limit+31%0D%0A%29%2C%0D%0Aowner+as+%28%0D%0Aselect%0D%0A++%27owner%27+as+col%2C+owner+as+value%2C+count%28*%29+as+c+from+data+group+by+owner%0D%0A++order+by+c+desc+limit+31%0D%0A%29%0D%0Aselect+*+from+primary_fuel+union+select+*+from+country_long%0D%0Aunion+select+*+from+owner+order+by+col%2C+c+desc calculates facets against three columns. It takes **78.5ms** (and 34.5ms when I refreshed it, presumably after warming some SQLite caches of some sort). https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_facet=country_long&_facet=primary_fuel&_trace=1&_size=0 shows those facets with size=0 on the SQL query - and shows a SQL trace at the bottom of the page. The country_long facet query takes 45.36ms, owner takes 38.45ms, primary_fuel takes 49.04ms - so a total of 132.85ms That's against https://global-power-plants.datasettes.com/-/versions says SQLite 3.27.3 - so even on a SQLite version that doesn't materialize the CTEs there's a significant performance boost to doing all three facets in a single CTE query.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",830567275, https://github.com/simonw/datasette/issues/1259#issuecomment-797790017,https://api.github.com/repos/simonw/datasette/issues/1259,797790017,MDEyOklzc3VlQ29tbWVudDc5Nzc5MDAxNw==,9599,2021-03-12T22:22:12Z,2021-03-12T22:22:12Z,OWNER,"https://sqlite.org/lang_with.html > Prior to SQLite 3.35.0, all CTEs where treated as if the NOT MATERIALIZED phrase was present It looks like this optimization is completely unavailable on SQLite prior to 3.35.0 (released 12th March 2021). 
But I could still rewrite the faceting to work in this way, using the exact same SQL - it would just be significantly faster on 3.35.0+ (assuming it's actually faster in practice - would need to benchmark).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",830567275, https://github.com/simonw/datasette/issues/1193#issuecomment-797159434,https://api.github.com/repos/simonw/datasette/issues/1193,797159434,MDEyOklzc3VlQ29tbWVudDc5NzE1OTQzNA==,9599,2021-03-12T01:01:54Z,2021-03-12T01:01:54Z,OWNER,"DuckDB has a read-only mechanism: https://duckdb.org/docs/api/python ```python import duckdb con = duckdb.connect(database=""/tmp/blah.db"", read_only=True) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787173276, https://github.com/simonw/datasette/issues/1250#issuecomment-797159221,https://api.github.com/repos/simonw/datasette/issues/1250,797159221,MDEyOklzc3VlQ29tbWVudDc5NzE1OTIyMQ==,9599,2021-03-12T01:01:17Z,2021-03-12T01:01:17Z,OWNER,This is a duplicate of #1193.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824067604, https://github.com/simonw/datasette/issues/670#issuecomment-797158641,https://api.github.com/repos/simonw/datasette/issues/670,797158641,MDEyOklzc3VlQ29tbWVudDc5NzE1ODY0MQ==,9599,2021-03-12T00:59:49Z,2021-03-12T00:59:49Z,OWNER,"> Challenge: what's the equivalent for PostgreSQL of opening a database in read only mode? Will I have to talk users through creating read only credentials? It looks like the answer to this is yes - I'll need users to setup read-only credentials. Here's a TIL about that: https://til.simonwillison.net/postgresql/read-only-postgresql-user","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",564833696, https://github.com/simonw/datasette/pull/1211#issuecomment-796854370,https://api.github.com/repos/simonw/datasette/issues/1211,796854370,MDEyOklzc3VlQ29tbWVudDc5Njg1NDM3MA==,9599,2021-03-11T16:15:29Z,2021-03-11T16:15:29Z,OWNER,Thanks very much for this - it's really comprehensive. I need to bake some of these patterns into my coding habits better!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",797649915, https://github.com/simonw/datasette/issues/838#issuecomment-795918377,https://api.github.com/repos/simonw/datasette/issues/838,795918377,MDEyOklzc3VlQ29tbWVudDc5NTkxODM3Nw==,9599,2021-03-10T19:01:48Z,2021-03-10T19:01:48Z,OWNER,The biggest challenge here I think is to replicate the exact situation here this happens in a Python unit test. 
The fix should be easy once we have a test in place.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",637395097, https://github.com/simonw/datasette/issues/838#issuecomment-795895436,https://api.github.com/repos/simonw/datasette/issues/838,795895436,MDEyOklzc3VlQ29tbWVudDc5NTg5NTQzNg==,9599,2021-03-10T18:44:46Z,2021-03-10T18:44:57Z,OWNER,Let's reopen this.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",637395097, https://github.com/simonw/datasette/pull/1254#issuecomment-795870524,https://api.github.com/repos/simonw/datasette/issues/1254,795870524,MDEyOklzc3VlQ29tbWVudDc5NTg3MDUyNA==,9599,2021-03-10T18:27:45Z,2021-03-10T18:27:45Z,OWNER,What other breaks did you spot?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",826613352, https://github.com/simonw/datasette/pull/1256#issuecomment-795869144,https://api.github.com/repos/simonw/datasette/issues/1256,795869144,MDEyOklzc3VlQ29tbWVudDc5NTg2OTE0NA==,9599,2021-03-10T18:26:46Z,2021-03-10T18:26:46Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",827341657, https://github.com/simonw/datasette/pull/1254#issuecomment-794439632,https://api.github.com/repos/simonw/datasette/issues/1254,794439632,MDEyOklzc3VlQ29tbWVudDc5NDQzOTYzMg==,9599,2021-03-09T20:53:02Z,2021-03-09T20:53:02Z,OWNER,Thanks for catching that documentation update!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",826613352, https://github.com/simonw/datasette/pull/1254#issuecomment-794437715,https://api.github.com/repos/simonw/datasette/issues/1254,794437715,MDEyOklzc3VlQ29tbWVudDc5NDQzNzcxNQ==,9599,2021-03-09T20:51:19Z,2021-03-09T20:51:19Z,OWNER,Did you see my note on https://github.com/simonw/datasette/issues/1249#issuecomment-792384382 about a weird issue I was having with the `/dbname` page hanging the server? Have you seen anything like that in your work here?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",826613352, https://github.com/simonw/datasette/issues/1250#issuecomment-792386484,https://api.github.com/repos/simonw/datasette/issues/1250,792386484,MDEyOklzc3VlQ29tbWVudDc5MjM4NjQ4NA==,9599,2021-03-08T00:29:06Z,2021-03-08T00:29:06Z,OWNER,"DuckDB has a read-only mechanism: https://duckdb.org/docs/api/python ```python import duckdb con = duckdb.connect(database=""/tmp/blah.db"", read_only=True) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824067604, https://github.com/simonw/datasette/issues/1248#issuecomment-792385274,https://api.github.com/repos/simonw/datasette/issues/1248,792385274,MDEyOklzc3VlQ29tbWVudDc5MjM4NTI3NA==,9599,2021-03-08T00:25:10Z,2021-03-08T00:25:10Z,OWNER,"It's not possible yet, unfortunately. This came up on the forums recently: https://github.com/simonw/datasette/discussions/968 I'm leaning further towards making the database connection layer itself work via a plugin hook, which would open up the possibility of supporting DuckDB and other databases as well. 
I've not committed to doing this yet though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",823035080, https://github.com/simonw/datasette/issues/1249#issuecomment-792384854,https://api.github.com/repos/simonw/datasette/issues/1249,792384854,MDEyOklzc3VlQ29tbWVudDc5MjM4NDg1NA==,9599,2021-03-08T00:23:38Z,2021-03-08T00:23:38Z,OWNER,One reason to prioritize this issue: Homebrew upgraded to SpatiaLite 5.0 recently https://formulae.brew.sh/formula/spatialite-tools and as a result SpatiaLite database created on my laptop don't appear to be compatible with Datasette when published using `datasette publish`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-792384382,https://api.github.com/repos/simonw/datasette/issues/1249,792384382,MDEyOklzc3VlQ29tbWVudDc5MjM4NDM4Mg==,9599,2021-03-08T00:22:02Z,2021-03-08T00:22:02Z,OWNER,"I tried this patch against `Dockerfile`: ```diff diff --git a/Dockerfile b/Dockerfile index f4b1414..dd659e1 100644 --- a/Dockerfile +++ b/Dockerfile @@ -1,25 +1,26 @@ -FROM python:3.7.10-slim-stretch as build +FROM python:3.9.2-slim-buster as build # Setup build dependencies RUN apt update \ -&& apt install -y python3-dev build-essential wget libxml2-dev libproj-dev libgeos-dev libsqlite3-dev zlib1g-dev pkg-config git \ - && apt clean + && apt install -y python3-dev build-essential wget libxml2-dev libproj-dev \ + libminizip-dev libgeos-dev libsqlite3-dev zlib1g-dev pkg-config git \ + && apt clean - -RUN wget ""https://www.sqlite.org/2020/sqlite-autoconf-3310100.tar.gz"" && tar xzf sqlite-autoconf-3310100.tar.gz \ - && cd sqlite-autoconf-3310100 && ./configure --disable-static --enable-fts5 --enable-json1 CFLAGS=""-g -O2 -DSQLITE_ENABLE_FTS3=1 -DSQLITE_ENABLE_FTS3_PARENTHESIS -DSQLITE_ENABLE_FTS4=1 -DSQLITE_ENABLE_RTREE=1 -DSQLITE_ENABLE_JSON1"" \ +RUN wget ""https://www.sqlite.org/2021/sqlite-autoconf-3340100.tar.gz"" && tar xzf sqlite-autoconf-3340100.tar.gz \ + && cd sqlite-autoconf-3340100 && ./configure --disable-static --enable-fts5 --enable-json1 \ + CFLAGS=""-g -O2 -DSQLITE_ENABLE_FTS3=1 -DSQLITE_ENABLE_FTS3_PARENTHESIS -DSQLITE_ENABLE_FTS4=1 -DSQLITE_ENABLE_RTREE=1 -DSQLITE_ENABLE_JSON1"" \ && make && make install -RUN wget ""http://www.gaia-gis.it/gaia-sins/freexl-sources/freexl-1.0.5.tar.gz"" && tar zxf freexl-1.0.5.tar.gz \ - && cd freexl-1.0.5 && ./configure && make && make install +RUN wget ""http://www.gaia-gis.it/gaia-sins/freexl-1.0.6.tar.gz"" && tar zxf freexl-1.0.6.tar.gz \ + && cd freexl-1.0.6 && ./configure && make && make install -RUN wget ""http://www.gaia-gis.it/gaia-sins/libspatialite-sources/libspatialite-4.4.0-RC0.tar.gz"" && tar zxf libspatialite-4.4.0-RC0.tar.gz \ - && cd libspatialite-4.4.0-RC0 && ./configure && make && make install +RUN wget ""http://www.gaia-gis.it/gaia-sins/libspatialite-5.0.1.tar.gz"" && tar zxf libspatialite-5.0.1.tar.gz \ + && cd libspatialite-5.0.1 && ./configure --disable-rttopo && make && make install RUN wget ""http://www.gaia-gis.it/gaia-sins/readosm-sources/readosm-1.1.0.tar.gz"" && tar zxf readosm-1.1.0.tar.gz && cd readosm-1.1.0 && ./configure && make && make install -RUN wget ""http://www.gaia-gis.it/gaia-sins/spatialite-tools-sources/spatialite-tools-4.4.0-RC0.tar.gz"" && tar zxf spatialite-tools-4.4.0-RC0.tar.gz \ - && cd spatialite-tools-4.4.0-RC0 && ./configure && make && 
make install +RUN wget ""http://www.gaia-gis.it/gaia-sins/spatialite-tools-5.0.0.tar.gz"" && tar zxf spatialite-tools-5.0.0.tar.gz \ + && cd spatialite-tools-5.0.0 && ./configure --disable-rttopo && make && make install # Add local code to the image instead of fetching from pypi. @@ -27,7 +28,7 @@ COPY . /datasette RUN pip install /datasette -FROM python:3.7.10-slim-stretch +FROM python:3.9.2-slim-buster # Copy python dependencies and spatialite libraries COPY --from=build /usr/local/lib/ /usr/local/lib/ ``` I had to use `--disable-rttopo` from the tip in https://github.com/OSGeo/gdal/pull/3443 and also needed to install `libminizip-dev`. This works, sort of... I'm getting a weird issue where the `/dbname` page is hanging some of the time instead of loading correctly. Other than that it seems to work, but a hanging page is bad!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/issues/1249#issuecomment-792383956,https://api.github.com/repos/simonw/datasette/issues/1249,792383956,MDEyOklzc3VlQ29tbWVudDc5MjM4Mzk1Ng==,9599,2021-03-08T00:20:09Z,2021-03-08T00:20:09Z,OWNER,"Worth noting that the Docker image used by `datasette publish cloudrun` doesn't actually use that Datasette docker image - it does this: https://github.com/simonw/datasette/blob/d0fd833b8cdd97e1b91d0f97a69b494895d82bee/datasette/utils/__init__.py#L349-L353 Where the apt extras for SpatiaLite are: https://github.com/simonw/datasette/blob/d0fd833b8cdd97e1b91d0f97a69b494895d82bee/datasette/utils/__init__.py#L344-L345 `libsqlite3-mod-spatialite` against that official `python:3.8` image doesn't appear to install SpatiaLite 5.0. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",824064069, https://github.com/simonw/datasette/pull/1223#issuecomment-792233255,https://api.github.com/repos/simonw/datasette/issues/1223,792233255,MDEyOklzc3VlQ29tbWVudDc5MjIzMzI1NQ==,9599,2021-03-07T07:41:01Z,2021-03-07T07:41:01Z,OWNER,"This is fantastic, thanks so much for tracking this down.","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",806918878, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790695126,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790695126,MDEyOklzc3VlQ29tbWVudDc5MDY5NTEyNg==,9599,2021-03-04T15:20:42Z,2021-03-04T15:20:42Z,MEMBER,"I'm not sure why but my most recent import, when displayed in Datasette, looks like this: Sorting by `id` in the opposite order gives me the data I would expect - so it looks like a bunch of null/blank messages are being imported at some point and showing up first due to ID ordering.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790693674,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790693674,MDEyOklzc3VlQ29tbWVudDc5MDY5MzY3NA==,9599,2021-03-04T15:18:36Z,2021-03-04T15:18:36Z,MEMBER,"I imported my 10GB mbox with 750,000 emails in it, ran this tool (with a hacked fix for the blob column problem) - and now a search that returns 92 results takes 25.37ms! 
This is fantastic.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790669767,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790669767,MDEyOklzc3VlQ29tbWVudDc5MDY2OTc2Nw==,9599,2021-03-04T14:46:06Z,2021-03-04T14:46:06Z,MEMBER,"Solution could be to pre-process that string by splitting on `(` and dropping everything afterwards, assuming that the `(...)` bit isn't necessary for correctly parsing the date.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790668263,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790668263,MDEyOklzc3VlQ29tbWVudDc5MDY2ODI2Mw==,9599,2021-03-04T14:43:58Z,2021-03-04T14:43:58Z,MEMBER,"I added this code to output a message ID on errors: ```diff print(""Errors: {}"".format(num_errors)) print(traceback.format_exc()) + print(""Message-Id: {}"".format(email.get(""Message-Id"", ""None""))) continue ``` Having found a message ID that had an error, I ran this command to see the context: rg --text --context 20 '44F289B0.000001.02100@SCHWARZE-DWFXMI' ~/gmail.mbox This was for the following error: ``` File ""/Users/simon/Dropbox/Development/google-takeout-to-sqlite/google_takeout_to_sqlite/utils.py"", line 102, in get_mbox message[""date""] = get_message_date(email.get(""Date""), email.get_from()) File ""/Users/simon/Dropbox/Development/google-takeout-to-sqlite/google_takeout_to_sqlite/utils.py"", line 178, in get_message_date datetime_tuple = email.utils.parsedate_tz(mail_date) File ""/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.7/lib/python3.7/email/_parseaddr.py"", line 50, in parsedate_tz res = _parsedate_tz(data) File ""/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.7/lib/python3.7/email/_parseaddr.py"", line 69, in _parsedate_tz data = data.split() AttributeError: 'Header' object has no attribute 'split' ``` Here's what I spotted in the `ripgrep` output: ``` 177133570:Message-Id: <44F289B0.000001.02100@SCHWARZE-DWFXMI> 177133571-Date: Mon, 28 Aug 2006 08:14:08 +0200 (Westeurop�ische Sommerzeit) 177133572-X-Mailer: IncrediMail (5002253) ``` So it could it be that `_parsedate_tz` is having trouble with that `Mon, 28 Aug 2006 08:14:08 +0200 (Westeurop�ische Sommerzeit)` string.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401, https://github.com/dogsheep/google-takeout-to-sqlite/issues/6#issuecomment-790384087,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/6,790384087,MDEyOklzc3VlQ29tbWVudDc5MDM4NDA4Nw==,9599,2021-03-04T07:22:51Z,2021-03-04T07:22:51Z,MEMBER,#3 also mentions the conflicting version with other tools.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",821841046, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790380839,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790380839,MDEyOklzc3VlQ29tbWVudDc5MDM4MDgzOQ==,9599,2021-03-04T07:17:05Z,2021-03-04T07:17:05Z,MEMBER,"Looks like you're doing this: ```python elif message.get_content_type() == ""text/plain"": body = 
message.get_payload(decode=True) ``` So presumably that decodes to a unicode string? I imagine the reason the column is a `BLOB` for me is that `sqlite-utils` determines the column type based on the first batch of items - https://github.com/simonw/sqlite-utils/blob/09c3386f55f766b135b6a1c00295646c4ae29bec/sqlite_utils/db.py#L1927-L1928 - and I got unlucky and had something in my first batch that wasn't a unicode string. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790379629,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790379629,MDEyOklzc3VlQ29tbWVudDc5MDM3OTYyOQ==,9599,2021-03-04T07:14:41Z,2021-03-04T07:14:41Z,MEMBER,"Confirmed: removing the `len()` call does not speed things up, so it's reading through the entire file for some other purpose too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790378658,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790378658,MDEyOklzc3VlQ29tbWVudDc5MDM3ODY1OA==,9599,2021-03-04T07:12:48Z,2021-03-04T07:12:48Z,MEMBER,"It looks like the `body` is being loaded into a BLOB column - so in Datasette default it looks like this: If I `datasette install datasette-render-binary` and then try again I get this: It would be great if we could store the `body` as unicode text instead. May have to do something clever to decode it based on some kind of charset header?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790373024,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790373024,MDEyOklzc3VlQ29tbWVudDc5MDM3MzAyNA==,9599,2021-03-04T07:01:58Z,2021-03-04T07:04:06Z,MEMBER,"I got 9 warnings that look like this: ``` Errors: 1 Traceback (most recent call last): File ""/Users/simon/Dropbox/Development/google-takeout-to-sqlite/google_takeout_to_sqlite/utils.py"", line 103, in get_mbox message[""date""] = get_message_date(email.get(""Date""), email.get_from()) File ""/Users/simon/Dropbox/Development/google-takeout-to-sqlite/google_takeout_to_sqlite/utils.py"", line 167, in get_message_date datetime_tuple = email.utils.parsedate_tz(mail_date) File ""/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.7/lib/python3.7/email/_parseaddr.py"", line 50, in parsedate_tz res = _parsedate_tz(data) File ""/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.7/lib/python3.7/email/_parseaddr.py"", line 69, in _parsedate_tz data = data.split() AttributeError: 'Header' object has no attribute 'split' ``` It would be useful if those warnings told me the message ID (or similar) of the affected message so I could grep for it in the `mbox` and see what was going on. 
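One possible guard - just a sketch, the helper name is made up and I have not tried it against this branch - would be to normalise the value before it reaches `parsedate_tz`:

```python
import email.utils

def parse_message_date(mail_date):
    # Header objects are not plain strings, so coerce to str first; also drop
    # any trailing '(...)' comment in case that is what trips up the parser.
    if mail_date is None:
        return None
    cleaned = str(mail_date).split('(')[0].strip()
    return email.utils.parsedate_tz(cleaned)
```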
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790372621,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790372621,MDEyOklzc3VlQ29tbWVudDc5MDM3MjYyMQ==,9599,2021-03-04T07:01:18Z,2021-03-04T07:01:18Z,MEMBER,"I'm not sure if it would work, but there is an alternative pattern for showing a progress bar against a really large file that I've used in `healthkit-to-sqlite` - you set the progress bar size to the size of the file in bytes, then update a counter as you read the file. https://github.com/dogsheep/healthkit-to-sqlite/blob/3eb2b06bfe3b4faaf10e9cf9dfcb28e3d16c14ff/healthkit_to_sqlite/cli.py#L24-L57 and https://github.com/dogsheep/healthkit-to-sqlite/blob/3eb2b06bfe3b4faaf10e9cf9dfcb28e3d16c14ff/healthkit_to_sqlite/utils.py#L4-L19 (the `progress_callback()` bit) is where that happens. It can be a bit of a convoluted pattern, and I'm not at all sure it would work for `mbox` files since it looks like that library has other reasons it needs to do a file scan rather than streaming it through one chunk of bytes at a time. So I imagine this would not work here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790370485,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790370485,MDEyOklzc3VlQ29tbWVudDc5MDM3MDQ4NQ==,9599,2021-03-04T06:57:25Z,2021-03-04T06:57:48Z,MEMBER,"The command takes quite a while to start running, presumably because this line causes it to have to scan the WHOLE file in order to generate a count: https://github.com/dogsheep/google-takeout-to-sqlite/blob/a3de045eba0fae4b309da21aa3119102b0efc576/google_takeout_to_sqlite/utils.py#L66-L67 I'm fine with waiting though. It's not like this is a command people run every day - and without that count we can't show a progress bar, which seems pretty important for a process that takes this long.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790369076,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790369076,MDEyOklzc3VlQ29tbWVudDc5MDM2OTA3Ng==,9599,2021-03-04T06:54:46Z,2021-03-04T06:54:46Z,MEMBER,"The Rich-powered progress bar is pretty: ![rich](https://user-images.githubusercontent.com/9599/109923307-71f69200-7c73-11eb-9ee2-8f0a240f3994.gif) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401,