issue_comments


732 rows where author_association = "NONE", sorted by updated_at descending


user (more than 30 distinct values; top 30 shown)

  • codecov[bot] 146
  • fgregg 32
  • aborruso 19
  • chrismp 18
  • carlmjohnson 14
  • tballison 13
  • frafra 10
  • psychemedia 10
  • terrycojones 10
  • stonebig 10
  • rayvoelker 10
  • maxhawkins 9
  • clausjuhl 9
  • 20after4 8
  • dracos 8
  • UtahDave 8
  • tomchristie 8
  • bsilverm 8
  • mhalle 7
  • zeluspudding 7
  • cobiadigital 7
  • zaneselvans 6
  • tsibley 5
  • khusmann 5
  • khimaros 5
  • MarkusH 5
  • dazzag24 5
  • SteadBytes 5
  • Btibert3 4
  • dholth 4
  • …

issue (more than 30 distinct values; top 30 shown)

  • link_or_copy_directory() error - Invalid cross-device link 13
  • WIP: Add Gmail takeout mbox import 12
  • .json and .csv exports fail to apply base_url 11
  • base_url configuration setting 10
  • Documentation with recommendations on running Datasette in production without using Docker 9
  • JavaScript plugin hooks mechanism similar to pluggy 9
  • Add GraphQL endpoint 8
  • Full text search of all tables at once? 7
  • Populate "endpoint" key in ASGI scope 7
  • Figure out some interesting example SQL queries 7
  • create-index should run analyze after creating index 7
  • Incorrect URLs when served behind a proxy with base_url set 6
  • publish heroku does not work on Windows 10 6
  • Metadata should be a nested arbitrary KV store 5
  • Windows installation error 5
  • Ways to improve fuzzy search speed on larger data sets? 5
  • Redesign default .json format 5
  • Improve the display of facets information 5
  • Feature Request: Gmail 5
  • Plugin hook for dynamic metadata 5
  • Datasette serve should accept paths/URLs to CSVs and other file formats 4
  • Mechanism for ranking results from SQLite full-text search 4
  • Port Datasette to ASGI 4
  • Wildcard support in query parameters 4
  • Mechanism for turning nested JSON into foreign keys / many-to-many 4
  • Support column descriptions in metadata.json 4
  • "Stream all rows" is not at all obvious 4
  • UNIQUE constraint failed: workouts.id 4
  • Document how to send multiple values for "Named parameters" 4
  • Add support for Jinja2 version 3.0 4
  • …

author_association (1 distinct value)

  • NONE · 732

Columns: id, html_url, issue_url, node_id, user, created_at, updated_at (sorted descending), author_association, body, reactions, issue, performed_via_github_app
1170595021 https://github.com/simonw/sqlite-utils/issues/26#issuecomment-1170595021 https://api.github.com/repos/simonw/sqlite-utils/issues/26 IC_kwDOCGYnMM5FxdzN izzues 60892516 2022-06-29T23:35:29Z 2022-06-29T23:35:29Z NONE

Have you seen MakeTypes? Not the exact same thing but it may be relevant.

And it's inspired by the paper "Types from Data: Making Structured Data First-Class Citizens in F#".

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for turning nested JSON into foreign keys / many-to-many 455486286  
1168715058 https://github.com/simonw/datasette/pull/1763#issuecomment-1168715058 https://api.github.com/repos/simonw/datasette/issues/1763 IC_kwDOBm6k_c5FqS0y codecov[bot] 22429695 2022-06-28T13:19:28Z 2022-06-28T13:19:28Z NONE

Codecov Report

Merging #1763 (fd6a817) into main (00e59ec) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1763   +/-   ##
=======================================
  Coverage   91.67%   91.67%           
=======================================
  Files          36       36           
  Lines        4658     4658           
=======================================
  Hits         4270     4270           
  Misses        388      388           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 00e59ec...fd6a817. Read the comment docs.

Bump black from 22.1.0 to 22.6.0 1287325944  
1164460052 https://github.com/simonw/sqlite-utils/issues/431#issuecomment-1164460052 https://api.github.com/repos/simonw/sqlite-utils/issues/431 IC_kwDOCGYnMM5FaEAU rafguns 738408 2022-06-23T14:12:51Z 2022-06-23T14:12:51Z NONE

Yeah, I think I prefer your suggestion: it seems cleaner than my initial left_name=/right_name= idea. Perhaps one downside is that it's less obvious what the role of each field is: in this example, is people_id_1 a reference to parent or child?

Allow making m2m relation of a table to itself 1227571375  
1163917719 https://github.com/dogsheep/healthkit-to-sqlite/issues/12#issuecomment-1163917719 https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/12 IC_kwDOC8tyDs5FX_mX Mjboothaus 956433 2022-06-23T04:35:02Z 2022-06-23T04:35:02Z NONE

In terms of unique identifiers - could you use values stored in HKMetadataKeySyncIdentifier?

Some workout columns should be float, not text 727848625  
1163097455 https://github.com/simonw/datasette/pull/1760#issuecomment-1163097455 https://api.github.com/repos/simonw/datasette/issues/1760 IC_kwDOBm6k_c5FU3Vv codecov[bot] 22429695 2022-06-22T13:27:08Z 2022-06-22T13:27:08Z NONE

Codecov Report

Merging #1760 (69951ee) into main (00e59ec) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1760   +/-   ##
=======================================
  Coverage   91.67%   91.67%           
=======================================
  Files          36       36           
  Lines        4658     4658           
=======================================
  Hits         4270     4270           
  Misses        388      388           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 00e59ec...69951ee. Read the comment docs.

Bump furo from 2022.4.7 to 2022.6.21 1280136357  
1162500525 https://github.com/simonw/sqlite-utils/issues/448#issuecomment-1162500525 https://api.github.com/repos/simonw/sqlite-utils/issues/448 IC_kwDOCGYnMM5FSlmt mungewell 236907 2022-06-22T00:46:43Z 2022-06-22T00:46:43Z NONE

log.txt

Reading rows from a file => AttributeError: '_io.StringIO' object has no attribute 'readinto' 1279144769  
1162498734 https://github.com/simonw/sqlite-utils/issues/448#issuecomment-1162498734 https://api.github.com/repos/simonw/sqlite-utils/issues/448 IC_kwDOCGYnMM5FSlKu mungewell 236907 2022-06-22T00:43:45Z 2022-06-22T00:43:45Z NONE

Attempted to test on a machine with a new version of Python, but install failed with an error message for the 'click' package.

C:\WINDOWS\system32>"c:\Program Files\Python310\python.exe"
Python 3.10.2 (tags/v3.10.2:a58ebcc, Jan 17 2022, 14:12:15) [MSC v.1929 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> quit()

C:\WINDOWS\system32>cd C:\Users\swood\Downloads\sqlite-utils-main-20220621\sqlite-utils-main

C:\Users\swood\Downloads\sqlite-utils-main-20220621\sqlite-utils-main>"c:\Program Files\Python310\python.exe" setup.py install
running install
running bdist_egg
running egg_info

...

Installed c:\program files\python310\lib\site-packages\click_default_group_wheel-1.2.2-py3.10.egg
Searching for click
Downloading https://files.pythonhosted.org/packages/3d/da/f3bbf30f7e71d881585d598f67f4424b2cc4c68f39849542e81183218017/click-default-group-wheel-1.2.2.tar.gz#sha256=e90da42d92c03e88a12ed0c0b69c8a29afb5d36e3dc8d29c423ba4219e6d7747
Best match: click default-group-wheel-1.2.2
Processing click-default-group-wheel-1.2.2.tar.gz
Writing C:\Users\swood\AppData\Local\Temp\easy_install-aiaj0_eh\click-default-group-wheel-1.2.2\setup.cfg
Running click-default-group-wheel-1.2.2\setup.py -q bdist_egg --dist-dir C:\Users\swood\AppData\Local\Temp\easy_install-aiaj0_eh\click-default-group-wheel-1.2.2\egg-dist-tmp-z61a4h8n
zip_safe flag not set; analyzing archive contents...
removing 'c:\program files\python310\lib\site-packages\click_default_group_wheel-1.2.2-py3.10.egg' (and everything under it)
Copying click_default_group_wheel-1.2.2-py3.10.egg to c:\program files\python310\lib\site-packages
click-default-group-wheel 1.2.2 is already the active version in easy-install.pth

Installed c:\program files\python310\lib\site-packages\click_default_group_wheel-1.2.2-py3.10.egg
error: The 'click' distribution was not found and is required by click-default-group-wheel, sqlite-utils
Reading rows from a file => AttributeError: '_io.StringIO' object has no attribute 'readinto' 1279144769  
1160717784 https://github.com/simonw/datasette/pull/1759#issuecomment-1160717784 https://api.github.com/repos/simonw/datasette/issues/1759 IC_kwDOBm6k_c5FLyXY codecov[bot] 22429695 2022-06-20T18:04:46Z 2022-06-20T18:04:46Z NONE

Codecov Report

Merging #1759 (b901bb0) into main (2e97516) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1759   +/-   ##
=======================================
  Coverage   91.67%   91.67%           
=======================================
  Files          36       36           
  Lines        4658     4658           
=======================================
  Hits         4270     4270           
  Misses        388      388           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 2e97516...b901bb0. Read the comment docs.

Extract facet portions of table.html out into included templates 1275523220  
1155515426 https://github.com/simonw/sqlite-utils/issues/441#issuecomment-1155515426 https://api.github.com/repos/simonw/sqlite-utils/issues/441 IC_kwDOCGYnMM5E38Qi betatim 1448859 2022-06-14T17:53:43Z 2022-06-14T17:53:43Z NONE

That would be handy (additional where filters) but I think the trick with the with statement is already an order of magnitude better than what I had thought of, so my problem is solved by it (plus I got to learn about with today!)

Combining `rows_where()` and `search()` to limit which rows are searched 1257724585  
1147435032 https://github.com/simonw/datasette/pull/1753#issuecomment-1147435032 https://api.github.com/repos/simonw/datasette/issues/1753 IC_kwDOBm6k_c5EZHgY codecov[bot] 22429695 2022-06-06T13:15:11Z 2022-06-06T13:15:11Z NONE

Codecov Report

Merging #1753 (23a8515) into main (2e97516) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1753   +/-   ##
=======================================
  Coverage   91.67%   91.67%           
=======================================
  Files          36       36           
  Lines        4658     4658           
=======================================
  Hits         4270     4270           
  Misses        388      388           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 2e97516...23a8515. Read the comment docs.

Bump furo from 2022.4.7 to 2022.6.4.1 1261826957  
1141711418 https://github.com/simonw/sqlite-utils/issues/26#issuecomment-1141711418 https://api.github.com/repos/simonw/sqlite-utils/issues/26 IC_kwDOCGYnMM5EDSI6 nileshtrivedi 19304 2022-05-31T06:21:15Z 2022-05-31T06:21:15Z NONE

I ran into this. My use case has a JSON file with an array of book objects, each with a key called reviews that is itself an array of objects. My JSON is human-edited and does not specify IDs for either books or reviews. Because sqlite-utils does not support inserting nested objects, I instead have to maintain two separate CSV files, with an id column in books.csv and a book_id column in reviews.csv.

I think the right way to declare the relationship while inserting a JSON might be to describe the relationship:

sqlite-utils insert data.db books mydata.json --hasmany reviews --hasone author --manytomany tags

This is relying on the assumption that foreign keys can point to the rowid primary key.
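The flat-table workaround described in this comment (parent and child tables linked by a foreign key on the rowid primary key) can be sketched with the standard library; the table and column names here are illustrative, not part of sqlite-utils:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT);
CREATE TABLE reviews (
    id INTEGER PRIMARY KEY,
    book_id INTEGER REFERENCES books(id),
    body TEXT
);
""")

# Human-edited data with no explicit IDs, as in the comment above
data = [{"title": "Example Book", "reviews": [{"body": "Great"}, {"body": "OK"}]}]

for book in data:
    cur = db.execute("INSERT INTO books (title) VALUES (?)", (book["title"],))
    book_id = cur.lastrowid  # relies on the rowid primary key, as noted above
    db.executemany(
        "INSERT INTO reviews (book_id, body) VALUES (?, ?)",
        [(book_id, r["body"]) for r in book["reviews"]],
    )
```

The proposed --hasmany / --hasone / --manytomany flags would in effect generate this kind of schema and the lastrowid bookkeeping automatically.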

Mechanism for turning nested JSON into foreign keys / many-to-many 455486286  
1140321380 https://github.com/simonw/datasette/issues/1751#issuecomment-1140321380 https://api.github.com/repos/simonw/datasette/issues/1751 IC_kwDOBm6k_c5D9-xk knutwannheden 408765 2022-05-28T19:52:17Z 2022-05-28T19:52:17Z NONE

Closing in favor of existing issue #1298.

Add scrollbars to table presentation in default layout 1251710928  
1139426398 https://github.com/simonw/sqlite-utils/issues/439#issuecomment-1139426398 https://api.github.com/repos/simonw/sqlite-utils/issues/439 IC_kwDOCGYnMM5D6kRe frafra 4068 2022-05-27T09:04:05Z 2022-05-27T10:44:54Z NONE

This code works:

import csv
import sqlite_utils
db = sqlite_utils.Database("test.db")
reader = csv.DictReader(open("csv", encoding="utf-16-le").read().split("\r\n"), delimiter=";")
db["test"].insert_all(reader, pk="Id")

I used iconv to change the encoding; sqlite-utils can import the resulting file, even though the progress bar stops at 98%:

sqlite-utils insert --csv test test.db clean 
  [------------------------------------]    0%
  [###################################-]   98%  00:00:00
Misleading progress bar against utf-16-le CSV input 1250495688  
1139484453 https://github.com/simonw/sqlite-utils/issues/433#issuecomment-1139484453 https://api.github.com/repos/simonw/sqlite-utils/issues/433 IC_kwDOCGYnMM5D6ycl frafra 4068 2022-05-27T10:20:08Z 2022-05-27T10:20:08Z NONE

I can confirm. This only happens with sqlite-utils. I am using gnome-terminal with bash.

CLI eats my cursor 1239034903  
1139392769 https://github.com/simonw/sqlite-utils/issues/438#issuecomment-1139392769 https://api.github.com/repos/simonw/sqlite-utils/issues/438 IC_kwDOCGYnMM5D6cEB frafra 4068 2022-05-27T08:21:53Z 2022-05-27T08:21:53Z NONE

Arguments were specified in the wrong order. PATH TABLE FILE can be misleading :)

illegal UTF-16 surrogate 1250161887  
1139379923 https://github.com/simonw/sqlite-utils/issues/438#issuecomment-1139379923 https://api.github.com/repos/simonw/sqlite-utils/issues/438 IC_kwDOCGYnMM5D6Y7T frafra 4068 2022-05-27T08:05:01Z 2022-05-27T08:05:01Z NONE

I tried to debug it using pdb, but it looks like sqlite-utils catches the exception, so it is not quick to figure out where the failure is happening.

illegal UTF-16 surrogate 1250161887  
1081861670 https://github.com/simonw/datasette/pull/1693#issuecomment-1081861670 https://api.github.com/repos/simonw/datasette/issues/1693 IC_kwDOBm6k_c5Ae-Ym codecov[bot] 22429695 2022-03-29T13:18:47Z 2022-05-20T20:36:30Z NONE

Codecov Report

Merging #1693 (65a5d5e) into main (1465fea) will not change coverage.
The diff coverage is n/a.

:exclamation: Current head 65a5d5e differs from pull request most recent head ec2d1e4. Consider uploading reports for the commit ec2d1e4 to get more accurate results

@@           Coverage Diff           @@
##             main    #1693   +/-   ##
=======================================
  Coverage   91.67%   91.67%           
=======================================
  Files          36       36           
  Lines        4658     4658           
=======================================
  Hits         4270     4270           
  Misses        388      388           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 1d33fd0...ec2d1e4. Read the comment docs.

Bump black from 22.1.0 to 22.3.0 1184850337  
1129332959 https://github.com/simonw/sqlite-utils/issues/425#issuecomment-1129332959 https://api.github.com/repos/simonw/sqlite-utils/issues/425 IC_kwDOCGYnMM5DUEDf McEazy2700 102771161 2022-05-17T21:27:02Z 2022-05-17T21:27:02Z NONE

Hi, I'm trying to deploy my site using Elastic Beanstalk and I keep getting this same error:
deterministic=True requires SQLite 3.8.3 or higher

I saw your previous solution that involves editing sqlite-utils/sqlite_utils/db.py file, but I'm curious as to how that will work in production.
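One workaround that avoids patching sqlite-utils, sketched here with the standard library (the function name is made up for illustration), is to fall back to a non-deterministic registration when the underlying SQLite is too old:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

def reverse_text(value):
    # Toy user-defined function for demonstration
    return value[::-1] if value is not None else None

try:
    # deterministic=True requires SQLite >= 3.8.3 (and Python >= 3.8)
    conn.create_function("reverse_text", 1, reverse_text, deterministic=True)
except sqlite3.NotSupportedError:
    # Older SQLite: register without the flag; the function still works,
    # SQLite just can't cache or optimize its results
    conn.create_function("reverse_text", 1, reverse_text)

result = conn.execute("SELECT reverse_text('abc')").fetchone()[0]
```

In production the cleaner fix is usually to run on a platform image with a newer bundled SQLite, rather than editing installed library files.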

`sqlite3.NotSupportedError`: deterministic=True requires SQLite 3.8.3 or higher 1203842656  
1126295407 https://github.com/simonw/sqlite-utils/issues/431#issuecomment-1126295407 https://api.github.com/repos/simonw/sqlite-utils/issues/431 IC_kwDOCGYnMM5DIedv rafguns 738408 2022-05-13T17:47:32Z 2022-05-13T17:47:32Z NONE

I'd be happy to write a PR for this, if you think it's worth having.

Allow making m2m relation of a table to itself 1227571375  
1125083348 https://github.com/simonw/datasette/issues/1298#issuecomment-1125083348 https://api.github.com/repos/simonw/datasette/issues/1298 IC_kwDOBm6k_c5DD2jU llimllib 7150 2022-05-12T14:43:51Z 2022-05-12T14:43:51Z NONE

user report: I found this issue because the first time I tried to use datasette for real, I displayed a large table, and thought there was no horizontal scroll bar at all. I didn't even consider that I had to scroll all the way to the end of the page to find it.

Just chipping in to say that this confused me, and I didn't even find the scroll bar until after I saw this issue. I don't know what the right answer is, but IMO the UI should suggest to the user that there is a way to view the data that's hidden to the right.

improve table horizontal scroll experience 855476501  
1116684581 https://github.com/simonw/sqlite-utils/issues/416#issuecomment-1116684581 https://api.github.com/repos/simonw/sqlite-utils/issues/416 IC_kwDOCGYnMM5Cj0El mattkiefer 638427 2022-05-03T21:36:49Z 2022-05-03T21:36:49Z NONE

Thanks for addressing this @simonw! However, I just reinstalled sqlite-utils 3.26.1 and get a ParserError: Unknown string format: None:

sqlite-utils --version
sqlite-utils, version 3.26.1
sqlite-utils convert idfpr.db license "Original Issue Date" "r.parsedate(value)"
Traceback (most recent call last):
  File "/home/matt/.local/lib/python3.9/site-packages/sqlite_utils/db.py", line 2514, in convert_value
    return fn(v)
  File "<string>", line 2, in fn
  File "/home/matt/.local/lib/python3.9/site-packages/sqlite_utils/recipes.py", line 19, in parsedate
    parser.parse(value, dayfirst=dayfirst, yearfirst=yearfirst)
  File "/usr/lib/python3/dist-packages/dateutil/parser/_parser.py", line 1374, in parse
    return DEFAULTPARSER.parse(timestr, **kwargs)
  File "/usr/lib/python3/dist-packages/dateutil/parser/_parser.py", line 649, in parse
    raise ParserError("Unknown string format: %s", timestr)
dateutil.parser._parser.ParserError: Unknown string format: None
Traceback (most recent call last):
  File "/home/matt/.local/bin/sqlite-utils", line 8, in <module>
    sys.exit(cli())
  File "/usr/lib/python3/dist-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/usr/lib/python3/dist-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/lib/python3/dist-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/lib/python3/dist-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/matt/.local/lib/python3.9/site-packages/sqlite_utils/cli.py", line 2707, in convert
    db[table].convert(
  File "/home/matt/.local/lib/python3.9/site-packages/sqlite_utils/db.py", line 2530, in convert
    self.db.execute(sql, where_args or [])
  File "/home/matt/.local/lib/python3.9/site-packages/sqlite_utils/db.py", line 463, in execute
    return self.conn.execute(sql, parameters)
sqlite3.OperationalError: user-defined function raised exception

I definitely have some invalid data in the db. Happy to send a copy if it's helpful.
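One option for handling rows like these, sketched as a hypothetical wrapper (not part of sqlite-utils) around the same dateutil call the recipe uses, is to return None for values the parser rejects so the conversion can continue:

```python
from dateutil import parser


def safe_parsedate(value, dayfirst=False, yearfirst=False):
    # Hypothetical defensive variant of r.parsedate(): swallow None and
    # unparseable strings instead of raising inside the SQL function
    if not value:
        return None
    try:
        return parser.parse(
            value, dayfirst=dayfirst, yearfirst=yearfirst
        ).date().isoformat()
    except (parser.ParserError, OverflowError):
        return None
```

Whether invalid values should become None, be left unchanged, or raise is exactly the design question this issue is about.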

Options for how `r.parsedate()` should handle invalid dates 1173023272  
1116336340 https://github.com/simonw/sqlite-utils/issues/430#issuecomment-1116336340 https://api.github.com/repos/simonw/sqlite-utils/issues/430 IC_kwDOCGYnMM5CifDU rayvoelker 9308268 2022-05-03T17:03:31Z 2022-05-03T17:03:31Z NONE

So, the good news is that setting one of those PRAGMA statements appears to have fixed the issue I described above, where the table.extract() method call on this large database never completed. The bad news is that I'm not sure which one!

I wonder if it's something system / environment specific about SQLite, or maybe something else going on.
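For reference, the PRAGMA named in this issue's title is set per-connection before running the heavy operation; a minimal sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# temp_store: 0 = DEFAULT (compile-time), 1 = FILE (spill temporary
# tables/indexes to disk), 2 = MEMORY (keep them in RAM)
conn.execute("PRAGMA temp_store = MEMORY")
temp_store = conn.execute("PRAGMA temp_store").fetchone()[0]  # 2 after the set
conn.execute("VACUUM")  # now uses in-memory temp storage
```

With MEMORY, operations like VACUUM and large index builds avoid temp files entirely, at the cost of RAM proportional to the temporary data.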

Document how to use `PRAGMA temp_store` to avoid errors when running VACUUM against huge databases 1224112817  
1115542067 https://github.com/simonw/datasette/issues/1732#issuecomment-1115542067 https://api.github.com/repos/simonw/datasette/issues/1732 IC_kwDOBm6k_c5CfdIz tannewt 52649 2022-05-03T01:50:44Z 2022-05-03T01:50:44Z NONE

I haven’t set one up unfortunately. My time is very limited because we just had a baby.

On Mon, May 2, 2022, at 6:42 PM, Simon Willison wrote:

Thanks, this definitely sounds like a bug. Do you have simple steps to reproduce this?

—
Reply to this email directly, view it on GitHub https://github.com/simonw/datasette/issues/1732#issuecomment-1115533820, or unsubscribe https://github.com/notifications/unsubscribe-auth/AAAM3KIY5L6FENZ22XANTHDVICAAXANCNFSM5UYOTKQA.
You are receiving this because you authored the thread.Message ID: @.***>

Custom page variables aren't decoded 1221849746  
1114601882 https://github.com/simonw/datasette/issues/1479#issuecomment-1114601882 https://api.github.com/repos/simonw/datasette/issues/1479 IC_kwDOBm6k_c5Cb3ma Rik-de-Kort 32839123 2022-05-02T08:10:27Z 2022-05-02T11:54:49Z NONE

Also ran into this issue today using datasette package. The stack trace takes up my whole PowerShell history (a RecursionError), but it also concerns the temporary directory.
Our development machines have a very zealous scanner that appears to insert itself between every call to the filesystem. I suspected that was causing a race condition, but this turned out not to be the case: inserting time.sleep(3) on line 451 of datasette/datasette/utils/__init__.py does not make the problem go away. Commenting out the tmp.cleanup() line does.

The next error I get is docker-specific, so that probably does resolve the Datasette error here.
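A possible mitigation for scanner-held files on Windows, assuming Python 3.10+ is available (the ignore_cleanup_errors flag was added in 3.10), is to tolerate cleanup failures instead of crashing:

```python
import sys
import tempfile

# On Python 3.10+, cleanup errors (e.g. a virus scanner still holding a file
# open on Windows) are swallowed instead of raising
if sys.version_info >= (3, 10):
    with tempfile.TemporaryDirectory(ignore_cleanup_errors=True) as tmpdir:
        workdir = tmpdir  # ... build the package in here ...
else:
    with tempfile.TemporaryDirectory() as tmpdir:
        workdir = tmpdir
```

This is a sketch of one direction, not how datasette package currently behaves.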

Win32 "used by another process" error with datasette publish 1010112818  
1111955628 https://github.com/simonw/datasette/issues/1633#issuecomment-1111955628 https://api.github.com/repos/simonw/datasette/issues/1633 IC_kwDOBm6k_c5CRxis henrikek 6613091 2022-04-28T09:12:56Z 2022-04-28T09:12:56Z NONE

I have verified that the problem with base_url still exists in the latest version, 0.61.1. I would need some guidance on whether my code change suggestion is correct, or whether base_url should be handled in some other code?

base_url or prefix does not work with _exact match 1129052172  
1111506339 https://github.com/simonw/sqlite-utils/issues/159#issuecomment-1111506339 https://api.github.com/repos/simonw/sqlite-utils/issues/159 IC_kwDOCGYnMM5CQD2j dracos 154364 2022-04-27T21:35:13Z 2022-04-27T21:35:13Z NONE

Just stumbled across this, wondering why none of my deletes were working.

.delete_where() does not auto-commit (unlike .insert() or .upsert()) 702386948  
1111451790 https://github.com/simonw/datasette/issues/1727#issuecomment-1111451790 https://api.github.com/repos/simonw/datasette/issues/1727 IC_kwDOBm6k_c5CP2iO glyph 716529 2022-04-27T20:30:33Z 2022-04-27T20:30:33Z NONE

I should try seeing what happens with WAL mode enabled.

I've only skimmed above but it looks like you're doing mainly read-only queries? WAL mode is about better interactions between writers & readers, primarily.
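Enabling WAL is a one-line pragma, but note it must be issued against a file-backed database (the mode persists in the database file; in-memory databases report "memory" instead):

```python
import os
import sqlite3
import tempfile

# WAL only applies to file-backed databases, so create one in a temp dir
path = os.path.join(tempfile.mkdtemp(), "test.db")
conn = sqlite3.connect(path)
mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]  # "wal" on success
```

Because the setting is persistent, it only needs to be applied once per database file, not per connection.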

Research: demonstrate if parallel SQL queries are worthwhile 1217759117  
1111448928 https://github.com/simonw/datasette/issues/1727#issuecomment-1111448928 https://api.github.com/repos/simonw/datasette/issues/1727 IC_kwDOBm6k_c5CP11g glyph 716529 2022-04-27T20:27:05Z 2022-04-27T20:27:05Z NONE

You don't want to re-use an SQLite connection from multiple threads anyway: https://www.sqlite.org/threadsafe.html

Multiple connections can operate on the file in parallel, but a single connection can't:

Multi-thread. In this mode, SQLite can be safely used by multiple threads provided that no single database connection is used simultaneously in two or more threads.

(emphasis mine)
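The rule quoted above (safe use provided no single connection is shared across threads) translates to a one-connection-per-thread pattern; a minimal sketch:

```python
import sqlite3
import threading

def query_in_thread(path, results, index):
    # Each thread opens its own connection, satisfying SQLite's
    # "multi-thread" rule quoted above
    conn = sqlite3.connect(path)
    results[index] = conn.execute("SELECT 1 + 1").fetchone()[0]
    conn.close()

results = [None] * 4
threads = [
    threading.Thread(target=query_in_thread, args=(":memory:", results, i))
    for i in range(4)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

(Here each ":memory:" connection is an independent empty database; with a shared on-disk file the same pattern gives parallel readers.)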

Research: demonstrate if parallel SQL queries are worthwhile 1217759117  
1107459446 https://github.com/simonw/datasette/pull/1717#issuecomment-1107459446 https://api.github.com/repos/simonw/datasette/issues/1717 IC_kwDOBm6k_c5CAn12 codecov[bot] 22429695 2022-04-23T11:56:36Z 2022-04-23T11:56:36Z NONE

Codecov Report

Merging #1717 (9b9a314) into main (d57c347) will increase coverage by 0.00%.
The diff coverage is 100.00%.

@@           Coverage Diff           @@
##             main    #1717   +/-   ##
=======================================
  Coverage   91.75%   91.75%           
=======================================
  Files          34       34           
  Lines        4574     4575    +1     
=======================================
+ Hits         4197     4198    +1     
  Misses        377      377           
Impacted Files: datasette/publish/cloudrun.py, coverage 97.05% <100.00%> (+0.04%) :arrow_up:

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update d57c347...9b9a314. Read the comment docs.

Add timeout option to Cloudrun build 1213281044  
1105464661 https://github.com/simonw/datasette/pull/1574#issuecomment-1105464661 https://api.github.com/repos/simonw/datasette/issues/1574 IC_kwDOBm6k_c5B5A1V dholth 208018 2022-04-21T16:51:24Z 2022-04-21T16:51:24Z NONE

tfw you have more ephemeral storage than upstream bandwidth

FROM python:3.10-slim AS base

RUN apt update && apt -y install zstd

ENV DATASETTE_SECRET 'sosecret'
RUN --mount=type=cache,target=/root/.cache/pip \
    pip install -U datasette datasette-pretty-json datasette-graphql

ENV PORT 8080
EXPOSE 8080

FROM base AS pack

COPY . /app
WORKDIR /app

RUN datasette inspect --inspect-file inspect-data.json
RUN zstd --rm *.db

FROM base AS unpack

COPY --from=pack /app /app
WORKDIR /app

CMD ["/bin/bash", "-c", "shopt -s nullglob && zstd --rm -d *.db.zst && datasette serve --host 0.0.0.0 --cors --inspect-file inspect-data.json --metadata metadata.json --create --port $PORT *.db"]
introduce new option for datasette package to use a slim base image 1084193403  
1103312860 https://github.com/simonw/datasette/issues/1713#issuecomment-1103312860 https://api.github.com/repos/simonw/datasette/issues/1713 IC_kwDOBm6k_c5Bwzfc fgregg 536941 2022-04-20T00:52:19Z 2022-04-20T00:52:19Z NONE

feels related to #1402

Datasette feature for publishing snapshots of query results 1203943272  
1100243987 https://github.com/simonw/datasette/pull/1159#issuecomment-1100243987 https://api.github.com/repos/simonw/datasette/issues/1159 IC_kwDOBm6k_c5BlGQT lovasoa 552629 2022-04-15T17:24:43Z 2022-04-15T17:24:43Z NONE

@simonw: do you think this could be merged?

Improve the display of facets information 774332247  
1099443468 https://github.com/simonw/datasette/issues/1713#issuecomment-1099443468 https://api.github.com/repos/simonw/datasette/issues/1713 IC_kwDOBm6k_c5BiC0M rayvoelker 9308268 2022-04-14T17:26:27Z 2022-04-14T17:26:27Z NONE

What would be an awesome feature as a plugin would be to be able to save a query (and possibly even results) to a github gist. Being able to share results that way would be super fantastic. Possibly even in Jupyter Notebook format (since github and github gists nicely render those)!

I know there's the handy datasette-saved-queries plugin, but a button that could export stuff out and then even possibly import stuff back in would be great (I'm sort of thinking of the way that Google Colab allows you to save to github and then pull the notebook back in, which is a really great workflow: https://github.com/cincinnatilibrary/collection-analysis/blob/master/reports/colab_datasette_example.ipynb )

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Datasette feature for publishing snapshots of query results 1203943272  
1092850719 https://github.com/simonw/datasette/pull/1703#issuecomment-1092850719 https://api.github.com/repos/simonw/datasette/issues/1703 IC_kwDOBm6k_c5BI5Qf codecov[bot] 22429695 2022-04-08T13:18:04Z 2022-04-08T13:18:04Z NONE

Codecov Report

Merging #1703 (73aabe6) into main (90d1be9) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1703   +/-   ##
=======================================
  Coverage   91.75%   91.75%           
=======================================
  Files          34       34           
  Lines        4573     4573           
=======================================
  Hits         4196     4196           
  Misses        377      377           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 90d1be9...73aabe6. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Update beautifulsoup4 requirement from <4.11.0,>=4.8.1 to >=4.8.1,<4.12.0 1197298420  
1087428593 https://github.com/simonw/datasette/issues/1549#issuecomment-1087428593 https://api.github.com/repos/simonw/datasette/issues/1549 IC_kwDOBm6k_c5A0Nfx fgregg 536941 2022-04-04T11:17:13Z 2022-04-04T11:17:13Z NONE

another way to get the behavior of downloading the file is to use the download attribute of the anchor tag

https://developer.mozilla.org/en-US/docs/Web/HTML/Element/a#attr-download

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Redesign CSV export to improve usability 1077620955  
1084216224 https://github.com/simonw/datasette/pull/1574#issuecomment-1084216224 https://api.github.com/repos/simonw/datasette/issues/1574 IC_kwDOBm6k_c5An9Og fs111 33631 2022-03-31T07:45:25Z 2022-03-31T07:45:25Z NONE

@simonw I like that you want to go "slim by default". Do you want another PR for that or should I just wait?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
introduce new option for datasette package to use a slim base image 1084193403  
1082476727 https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1082476727 https://api.github.com/repos/simonw/sqlite-utils/issues/420 IC_kwDOCGYnMM5AhUi3 strada 770231 2022-03-29T23:52:38Z 2022-03-29T23:52:38Z NONE

@simonw Thanks for looking into it and documenting the solution!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Document how to use a `--convert` function that runs initialization code first 1178546862  
1081860312 https://github.com/simonw/datasette/pull/1694#issuecomment-1081860312 https://api.github.com/repos/simonw/datasette/issues/1694 IC_kwDOBm6k_c5Ae-DY codecov[bot] 22429695 2022-03-29T13:17:30Z 2022-03-29T13:17:30Z NONE

Codecov Report

Merging #1694 (83ff967) into main (e73fa72) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1694   +/-   ##
=======================================
  Coverage   91.74%   91.74%           
=======================================
  Files          34       34           
  Lines        4565     4565           
=======================================
  Hits         4188     4188           
  Misses        377      377           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update e73fa72...83ff967. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Update click requirement from <8.1.0,>=7.1.1 to >=7.1.1,<8.2.0 1184850675  
1081079506 https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1081079506 https://api.github.com/repos/simonw/sqlite-utils/issues/421 IC_kwDOCGYnMM5Ab_bS learning4life 24938923 2022-03-28T19:58:55Z 2022-03-28T20:05:57Z NONE

Sure, it is from the documentation example:
Extracting columns into a separate table

wget "https://github.com/wri/global-power-plant-database/blob/232a6666/output_database/global_power_plant_database.csv?raw=true"

sqlite-utils insert global.db power_plants \
    'global_power_plant_database.csv?raw=true' --csv
# Extract those columns:
sqlite-utils extract global.db power_plants country country_long \
    --table countries \
    --fk-column country_id \
    --rename country_long name
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
"Error: near "(": syntax error" when using sqlite-utils indexes CLI 1180427792  
1079806857 https://github.com/simonw/datasette/issues/1688#issuecomment-1079806857 https://api.github.com/repos/simonw/datasette/issues/1688 IC_kwDOBm6k_c5AXIuJ hydrosquall 9020979 2022-03-27T01:01:14Z 2022-03-27T01:01:14Z NONE

Thank you! I went through the cookiecutter template, and published my first package here: https://github.com/hydrosquall/datasette-nteract-data-explorer

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins? 1181432624  
1079550754 https://github.com/simonw/datasette/issues/1688#issuecomment-1079550754 https://api.github.com/repos/simonw/datasette/issues/1688 IC_kwDOBm6k_c5AWKMi hydrosquall 9020979 2022-03-26T01:27:27Z 2022-03-26T03:16:29Z NONE

Is there a way to serve static assets when using the plugins/ directory method instead of installing plugins as a new python package?

As a workaround, I found I can serve my statics from a non-plugin specific folder using the --static CLI flag.

datasette ~/Library/Safari/History.db \
  --plugins-dir=plugins/ \
  --static assets:dist/

It's not ideal because it means I'll change the cache pattern path depending on how the plugin is running (via pip install or as a one off script), but it's usable as a workaround.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins? 1181432624  
1079018557 https://github.com/simonw/datasette/pull/1685#issuecomment-1079018557 https://api.github.com/repos/simonw/datasette/issues/1685 IC_kwDOBm6k_c5AUIQ9 codecov[bot] 22429695 2022-03-25T13:16:48Z 2022-03-25T13:16:48Z NONE

Codecov Report

Merging #1685 (933ce47) into main (c496f2b) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1685   +/-   ##
=======================================
  Coverage   91.74%   91.74%           
=======================================
  Files          34       34           
  Lines        4565     4565           
=======================================
  Hits         4188     4188           
  Misses        377      377           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update c496f2b...933ce47. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Update jinja2 requirement from <3.1.0,>=2.10.3 to >=2.10.3,<3.2.0 1180778860  
1078126065 https://github.com/simonw/datasette/issues/1684#issuecomment-1078126065 https://api.github.com/repos/simonw/datasette/issues/1684 IC_kwDOBm6k_c5AQuXx fgregg 536941 2022-03-24T20:08:56Z 2022-03-24T20:13:19Z NONE

would be nice if the behavior was

  1. try to facet all the columns
  2. for bigger tables try to facet the indexed columns
  3. for the biggest tables, turn off autofacetting completely

This is based on my assumption that what determines autofaceting is the rarity of unique values. Which may not be true!
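The three tiers suggested above could be sketched roughly as follows (the thresholds and return conventions are hypothetical illustrations, not actual Datasette settings):

```python
def facet_strategy(row_count, indexed_columns,
                   small=10_000, large=1_000_000):
    """Pick which columns to attempt autofaceting on, based on table size.

    Thresholds are illustrative assumptions, not Datasette defaults."""
    if row_count <= small:
        return "all-columns"            # 1. try to facet every column
    if row_count <= large:
        return sorted(indexed_columns)  # 2. only columns with an index
    return "off"                        # 3. disable autofaceting entirely

# Example tiers:
print(facet_strategy(5_000, {"state"}))                # all-columns
print(facet_strategy(500_000, {"state", "year"}))      # ['state', 'year']
print(facet_strategy(50_000_000, {"state"}))           # off
```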

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for disabling faceting on large tables only 1179998071  
1077047295 https://github.com/simonw/datasette/issues/1581#issuecomment-1077047295 https://api.github.com/repos/simonw/datasette/issues/1581 IC_kwDOBm6k_c5AMm__ fgregg 536941 2022-03-24T04:08:18Z 2022-03-24T04:08:18Z NONE

this has been addressed by the datasette-hashed-urls plugin

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
when hashed urls are turned on, the _memory db has improperly long-lived cache expiry 1089529555  
1077047152 https://github.com/simonw/datasette/pull/1582#issuecomment-1077047152 https://api.github.com/repos/simonw/datasette/issues/1582 IC_kwDOBm6k_c5AMm9w fgregg 536941 2022-03-24T04:07:58Z 2022-03-24T04:07:58Z NONE

this has been obviated by the datasette-hashed-urls plugin

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
don't set far expiry if hash is '000' 1090055810  
1076662556 https://github.com/simonw/sqlite-utils/pull/419#issuecomment-1076662556 https://api.github.com/repos/simonw/sqlite-utils/issues/419 IC_kwDOCGYnMM5ALJEc codecov[bot] 22429695 2022-03-23T18:12:47Z 2022-03-23T18:12:47Z NONE

Codecov Report

Merging #419 (228f736) into main (93fa79d) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main     #419   +/-   ##
=======================================
  Coverage   96.55%   96.55%           
=======================================
  Files           6        6           
  Lines        2498     2498           
=======================================
  Hits         2412     2412           
  Misses         86       86           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 93fa79d...228f736. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Ignore common generated files 1178484369  
1074256603 https://github.com/simonw/sqlite-utils/issues/417#issuecomment-1074256603 https://api.github.com/repos/simonw/sqlite-utils/issues/417 IC_kwDOCGYnMM5AB9rb blaine 9954 2022-03-21T18:19:41Z 2022-03-21T18:19:41Z NONE

That makes sense; just a little hint that points folks towards doing the right thing might be helpful!

fwiw, the reason I was using jq in the first place was just a quick way to extract one attribute from an actual JSON array. When I initially imported it, I got a table with a bunch of embedded JSON values, rather than a native table, because each array entry had two attributes, one with the data I actually wanted. Not sure how common a use-case this is, though (and easily fixed, aside from the jq weirdness!)

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
insert fails on JSONL with whitespace 1175744654  
1073152522 https://github.com/dogsheep/google-takeout-to-sqlite/issues/10#issuecomment-1073152522 https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/10 IC_kwDODFE5qs4_9wIK csusanu 9290214 2022-03-20T02:38:07Z 2022-03-20T02:38:07Z NONE

This line needs to say "MyActivity.json" instead of "My Activity.json". Google must have changed the file name.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
sqlite3.OperationalError: no such table: main.my_activity 1123393829  
1073139067 https://github.com/dogsheep/healthkit-to-sqlite/issues/14#issuecomment-1073139067 https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/14 IC_kwDOC8tyDs4_9s17 lchski 343884 2022-03-20T00:54:18Z 2022-03-20T00:54:18Z NONE

Update: this appears to be because of running the command twice without clearing the DB in between. Tries to insert a Workout that already exists, causing a collision on the (auto-generated) id column. Had a different error with a clean DB, likely due to the workout points format; will make a new issue for that.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
UNIQUE constraint failed: workouts.id 771608692  
1073123231 https://github.com/dogsheep/healthkit-to-sqlite/issues/14#issuecomment-1073123231 https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/14 IC_kwDOC8tyDs4_9o-f lchski 343884 2022-03-19T22:39:29Z 2022-03-19T22:39:29Z NONE

I have this issue, too, with a fresh export. None of my Workout entries in export.xml have an id key, though the sample export.xml in the tests folder doesn’t either, so I don’t think this is the culprit. Indeed, it seems @simonw is using the hash_id function from sqlite_utils, which creates a column (id, in this case) based on a hash of the row’s contents.

When I run the script, a workouts table is created, with one entry: my first workout. No workout_points table is created, as I’d expect from utils.py. I then get essentially the same error as noted in this thread:

Importing from HealthKit [###################################-] 98% 00:00:01
Traceback (most recent call last):
  File "/Users/lchski/.pyenv/versions/3.10.3/bin/healthkit-to-sqlite", line 8, in <module>
    sys.exit(cli())
  File "/Users/lchski/.pyenv/versions/3.10.3/lib/python3.10/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/Users/lchski/.pyenv/versions/3.10.3/lib/python3.10/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/Users/lchski/.pyenv/versions/3.10.3/lib/python3.10/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/lchski/.pyenv/versions/3.10.3/lib/python3.10/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/Users/lchski/.pyenv/versions/3.10.3/lib/python3.10/site-packages/healthkit_to_sqlite/cli.py", line 57, in cli
    convert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf)
  File "/Users/lchski/.pyenv/versions/3.10.3/lib/python3.10/site-packages/healthkit_to_sqlite/utils.py", line 34, in convert_xml_to_sqlite
    workout_to_db(el, db, zipfile)
  File "/Users/lchski/.pyenv/versions/3.10.3/lib/python3.10/site-packages/healthkit_to_sqlite/utils.py", line 57, in workout_to_db
    pk = db["workouts"].insert(record, alter=True, hash_id="id").last_pk
  File "/Users/lchski/.pyenv/versions/3.10.3/lib/python3.10/site-packages/sqlite_utils/db.py", line 2822, in insert
    return self.insert_all(
  File "/Users/lchski/.pyenv/versions/3.10.3/lib/python3.10/site-packages/sqlite_utils/db.py", line 2950, in insert_all
    self.insert_chunk(
  File "/Users/lchski/.pyenv/versions/3.10.3/lib/python3.10/site-packages/sqlite_utils/db.py", line 2715, in insert_chunk
    result = self.db.execute(query, params)
  File "/Users/lchski/.pyenv/versions/3.10.3/lib/python3.10/site-packages/sqlite_utils/db.py", line 458, in execute
    return self.conn.execute(sql, parameters)
sqlite3.IntegrityError: UNIQUE constraint failed: workouts.id

Are there maybe duplicate workouts in the data, which’d cause multiple rows to share the same id? It’s strange, though, that no workout_points is created at all. Export created from iOS 15.3.1.
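Since hash_id derives the primary key from the row's contents, duplicate workouts in the export would indeed collide. A minimal stdlib sketch of that idea (the exact hashing sqlite-utils uses may differ; content_hash here is illustrative):

```python
import hashlib
import json
import sqlite3

def content_hash(record):
    # Mimics the idea behind sqlite-utils' hash_id: the primary key is a
    # hash of the row's contents, so identical rows get identical ids.
    return hashlib.sha1(
        json.dumps(record, sort_keys=True).encode("utf-8")
    ).hexdigest()

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE workouts (id TEXT PRIMARY KEY, type TEXT, duration REAL)")

record = {"type": "Running", "duration": 32.5}
row_id = content_hash(record)
db.execute("INSERT INTO workouts VALUES (?, ?, ?)",
           (row_id, record["type"], record["duration"]))

try:
    # A second pass over the same export produces the same hash: collision
    db.execute("INSERT INTO workouts VALUES (?, ?, ?)",
               (row_id, record["type"], record["duration"]))
except sqlite3.IntegrityError as e:
    print(e)  # UNIQUE constraint failed: workouts.id
```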

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
UNIQUE constraint failed: workouts.id 771608692  
1072954795 https://github.com/simonw/datasette/issues/1228#issuecomment-1072954795 https://api.github.com/repos/simonw/datasette/issues/1228 IC_kwDOBm6k_c4_8_2r Kabouik 7107523 2022-03-19T06:44:40Z 2022-03-19T06:44:40Z NONE

... unless your data had a column called n?

Exactly, that's highly likely even though I can't double check from this computer just now. Thanks!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
500 error caused by faceting if a column called `n` exists 810397025  
1068193035 https://github.com/simonw/datasette/pull/1659#issuecomment-1068193035 https://api.github.com/repos/simonw/datasette/issues/1659 IC_kwDOBm6k_c4_q1UL codecov[bot] 22429695 2022-03-15T16:28:25Z 2022-03-15T17:56:09Z NONE

Codecov Report

Merging #1659 (85dde28) into main (c10cd48) will increase coverage by 0.03%.
The diff coverage is 100.00%.

:exclamation: Current head 85dde28 differs from pull request most recent head 99b8263. Consider uploading reports for the commit 99b8263 to get more accurate results

@@            Coverage Diff             @@
##             main    #1659      +/-   ##
==========================================
+ Coverage   92.06%   92.10%   +0.03%     
==========================================
  Files          34       34              
  Lines        4576     4584       +8     
==========================================
+ Hits         4213     4222       +9     
+ Misses        363      362       -1     
Impacted Files                 Coverage Δ
datasette/app.py               94.36% <100.00%> (ø)
datasette/url_builder.py       100.00% <100.00%> (ø)
datasette/utils/__init__.py    94.84% <100.00%> (-0.13%) :arrow_down:
datasette/views/base.py        96.07% <100.00%> (+0.58%) :arrow_up:
datasette/views/table.py       96.21% <100.00%> (+0.01%) :arrow_up:

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update c10cd48...99b8263. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Tilde encoding 1169895600  
1068154183 https://github.com/simonw/datasette/pull/1656#issuecomment-1068154183 https://api.github.com/repos/simonw/datasette/issues/1656 IC_kwDOBm6k_c4_qr1H codecov[bot] 22429695 2022-03-15T15:55:34Z 2022-03-15T15:55:34Z NONE

Codecov Report

Merging #1656 (5d9883f) into main (c10cd48) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1656   +/-   ##
=======================================
  Coverage   92.06%   92.06%           
=======================================
  Files          34       34           
  Lines        4576     4576           
=======================================
  Hits         4213     4213           
  Misses        363      363           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update c10cd48...5d9883f. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Update pytest requirement from <7.1.0,>=5.2.2 to >=5.2.2,<7.2.0 1168357113  
1066194130 https://github.com/simonw/datasette/issues/1384#issuecomment-1066194130 https://api.github.com/repos/simonw/datasette/issues/1384 IC_kwDOBm6k_c4_jNTS khusmann 167160 2022-03-13T22:23:04Z 2022-03-13T22:23:04Z NONE

Ah, sorry, I didn't get what you were saying the first time. Using _metadata_local in that way makes total sense -- I agree, refreshing metadata for each cell seemed quite excessive. Now I'm on the same page! :)

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Plugin hook for dynamic metadata 930807135  
1066143991 https://github.com/simonw/datasette/issues/1384#issuecomment-1066143991 https://api.github.com/repos/simonw/datasette/issues/1384 IC_kwDOBm6k_c4_jBD3 khusmann 167160 2022-03-13T17:13:09Z 2022-03-13T17:13:09Z NONE

Thanks for taking the time to reply @brandonrobertz , this is really helpful info.

See "Many small queries are efficient in sqlite" for more information on the rationale here. Also note that in the datasette-live-config reference plugin, the DB connection is cached, so that eliminated most of the performance worries we had.

Ah, that's nifty! Yeah, then caching on the python side is likely a waste :) I'm new to working with sqlite, so it's super good to know that many small queries are a common pattern.

I tested on very large Datasette deployments (hundreds of DBs, millions of rows).

For my reference, did you include a render_cell plugin calling get_metadata in those tests? I'm less concerned now that I know a little more about sqlite's caching, but that special situation will jump you to a few orders of magnitude above what the sqlite article describes (e.g. 200 vs 20,000 queries+metadata merges for a page displaying 100 rows of a 200 column table). It wouldn't scale with db size as much as # of visible cells being rendered on the page, although they would be identical queries I suppose so will cache well.

(If you didn't test this specific situation, no worries -- I'm just trying to calibrate my intuition on this and can do my own benchmarks at some point.)
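The cell-count arithmetic in the paragraph above works out like this (page size and column count taken from the comment):

```python
# If every rendered cell triggers its own get_metadata lookup, the number of
# lookups scales with visible cells on the page, not with database size:
rows_per_page = 100
columns = 200
lookups_per_page = rows_per_page * columns
print(lookups_per_page)  # 20000
```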

Simon talked about eventually making something like this a standard feature of Datasette

Yeah, getting metadata (and static pages as well for that matter) from internal tables definitely has my vote for including as a standard feature! Its really nice to be able to distribute a single *.db with all the metadata and static pages bundled. My metadata are sufficiently complex/domain specific that it makes sense to continue on my own plugin for now, but I'll be thinking about more general parts I can spin off as possible contributions to liveconfig (if you're open to them) or other plugins in this ecosystem.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Plugin hook for dynamic metadata 930807135  
1066139147 https://github.com/simonw/sqlite-utils/issues/408#issuecomment-1066139147 https://api.github.com/repos/simonw/sqlite-utils/issues/408 IC_kwDOCGYnMM4_i_4L learning4life 24938923 2022-03-13T16:45:00Z 2022-03-13T16:54:09Z NONE

@simonw

Now I get this:

(app-root) sqlite-utils indexes global.db --table
Error: near "(": syntax error
(app-root) sqlite-utils --version
sqlite-utils, version 3.25.1
(app-root) sqlite3 --version
3.36.0 2021-06-18 18:36:39
(app-root) python --version
Python 3.8.11

Dockerfile

FROM centos/python-38-centos7

USER root

RUN yum update -y
RUN yum upgrade -y


# epel
RUN yum -y install epel-release && yum clean all

# SQLite
RUN yum -y install zlib-devel geos geos-devel proj proj-devel freexl freexl-devel libxml2-devel 

WORKDIR /build/
COPY sqlite-autoconf-3360000.tar.gz ./
RUN tar -zxf sqlite-autoconf-3360000.tar.gz
WORKDIR /build/sqlite-autoconf-3360000
RUN ./configure
RUN make
RUN make install

# 
RUN /opt/app-root/bin/python3.8 -m pip install --upgrade pip
RUN pip install sqlite-utils
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`deterministic=True` fails on versions of SQLite prior to 3.8.3 1145882578  
1065951744 https://github.com/simonw/datasette/issues/1384#issuecomment-1065951744 https://api.github.com/repos/simonw/datasette/issues/1384 IC_kwDOBm6k_c4_iSIA khusmann 167160 2022-03-12T19:47:17Z 2022-03-12T19:47:17Z NONE

Awesome, thanks @brandonrobertz !

The plugin is close, but looks like it only grabs remote metadata, is that right? Instead what I'm wanting is to grab metadata embedded in the attached databases. Rather than extending that plugin, at this point I've realized I need a lot more flexibility in metadata for my data model (esp around formatting cell values and custom file exports) so rather than extending that I'll continue working on a plugin specific to my app.

If I'm understanding your plugin code correctly, you query the db using the sync handle every time get_metadata is called, right? Won't this become a pretty big bottleneck if a hook into render_cell is trying to read metadata / plugin config?

Making the get_metadata async won't improve the situation by itself as only some of the code paths accessing metadata use that hook. The other paths use the internal metadata dict.

I agree -- because things like render_cell will potentially want to read metadata/config, get_metadata should really remain sync and lightweight, which we can do with something like the remote-metadata plugin that could also poll metadata tables in attached databases.

That leaves your app, where it sounds like you want changes made by the user in the browser in to be immediately reflected, rather than have to wait for the next metadata refresh. In this case I wonder if you could have your app make a sync write to the datasette object so the change would have the immediate effect, but then have a separate async polling mechanism to eventually write that change out to the database for long-term persistence. Then you'd have the best of both worlds, I think? But probably not worth the trouble if your use cases are small (and/or you're not reading metadata/config from tight loops like render_cell).
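The sync-write-plus-async-persistence idea above could be sketched like this (the persist callable standing in for the real database write, and the class name, are assumptions for illustration):

```python
import asyncio

class WriteBehindConfig:
    """Writes hit the in-memory dict immediately (so readers like
    render_cell see them right away); a background task periodically
    persists dirty state for long-term durability."""

    def __init__(self, persist, interval=5.0):
        self._data = {}
        self._dirty = False
        self._persist = persist      # assumed stand-in for the db write
        self._interval = interval

    def set(self, key, value):       # sync: immediate effect for readers
        self._data[key] = value
        self._dirty = True

    def get(self, key, default=None):  # sync: cheap for tight loops
        return self._data.get(key, default)

    async def flush_loop(self):      # async: eventual persistence
        while True:
            await asyncio.sleep(self._interval)
            if self._dirty:
                self._persist(dict(self._data))
                self._dirty = False

# Readers see a change as soon as it is set, before any flush happens:
saved = []
config = WriteBehindConfig(saved.append, interval=0.01)
config.set("title", "My database")
print(config.get("title"))  # My database
```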

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Plugin hook for dynamic metadata 930807135  
1065929510 https://github.com/simonw/datasette/issues/1384#issuecomment-1065929510 https://api.github.com/repos/simonw/datasette/issues/1384 IC_kwDOBm6k_c4_iMsm khusmann 167160 2022-03-12T17:49:59Z 2022-03-12T17:49:59Z NONE

Ok, I'm taking a slightly different approach, which I think is sort of close to the in-memory _metadata table idea.

I'm using a startup hook to load metadata / other info from the database, which I store in the datasette object for later:

from datasette import hookimpl

@hookimpl
def startup(datasette):
    async def inner():
        # load_metadata is a placeholder for the actual db query
        datasette._mypluginmetadata = await load_metadata(datasette)
    return inner

Then, I can use this in other plugins:

@hookimpl
def render_cell(value, column, table, database, datasette):
    # look up display rules in datasette._mypluginmetadata
    ...

For my app I don't need anything to update dynamically so it's fine to pre-populate everything on startup. It's also good to have things precached especially for a hook like render_cell, which would otherwise require a ton of redundant db queries.

Makes me wonder if we could take a sort of similar caching approach with the internal _metadata table. Like have a little watchdog that could query all of the attached dbs for their _metadata tables every 5min or so, which then could be merged into the in memory _metadata table which then could be accessed sync by the plugins, or something like that.

For most of the use cases I can think of, live updates don't need to take effect immediately; refreshing a cache every 5min or on some other trigger (adjustable with a config setting) would be just fine.
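The refresh-on-interval idea described above could be sketched like this (the loader callable, standing in for the query against the attached _metadata tables, and the class name are assumptions):

```python
import time

class MetadataCache:
    """Serve metadata from memory, reloading from the database at most
    once every `ttl` seconds -- a sketch of the watchdog/refresh approach,
    not part of Datasette's API."""

    def __init__(self, loader, ttl=300):
        self._loader = loader    # assumed stand-in for the _metadata query
        self._ttl = ttl
        self._cached = None
        self._loaded_at = 0.0

    def get(self):
        now = time.monotonic()
        if self._cached is None or now - self._loaded_at > self._ttl:
            self._cached = self._loader()
            self._loaded_at = now
        return self._cached

# Plugins calling get() between refreshes pay no query cost:
cache = MetadataCache(lambda: {"title": "My database"}, ttl=300)
print(cache.get())  # {'title': 'My database'}
```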

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Plugin hook for dynamic metadata 930807135  
1065334891 https://github.com/simonw/datasette/issues/1634#issuecomment-1065334891 https://api.github.com/repos/simonw/datasette/issues/1634 IC_kwDOBm6k_c4_f7hr dholth 208018 2022-03-11T17:38:08Z 2022-03-11T17:38:08Z NONE

I noticed the image was large when using fly. Is it possible to use a -slim base?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Update Dockerfile generated by `datasette publish` 1131295060  
1062450649 https://github.com/simonw/datasette/issues/1655#issuecomment-1062450649 https://api.github.com/repos/simonw/datasette/issues/1655 IC_kwDOBm6k_c4_U7XZ fgregg 536941 2022-03-09T01:10:46Z 2022-03-09T01:10:46Z NONE

I increased max_returned_rows because I have some scripts that get CSVs from this site, and this makes paginating CSVs less annoying in many cases. I know that streaming CSVs is something you are hoping to address in 1.0; let me know if there's anything I can do to help with that.

As for whether anything can be done about the size of the DOM, I don't have any ideas right now, but I'll poke around.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
query result page is using 400mb of browser memory 40x size of html page and 400x size of csv data 1163369515  
1062124485 https://github.com/simonw/datasette/issues/1384#issuecomment-1062124485 https://api.github.com/repos/simonw/datasette/issues/1384 IC_kwDOBm6k_c4_TrvF khusmann 167160 2022-03-08T19:26:32Z 2022-03-08T19:26:32Z NONE

Looks like I'm late to the party here, but wanted to join the convo if there's still time before this interface is solidified in v1.0. My plugin use case is for education / social science data, which is meta-data heavy in the documentation of measurement scales, instruments, collection procedures, etc. that I want to connect to columns, tables, and dbs (and render in static pages, but looks like I can do that with the jinja plugin hook). I'm still digging in and I think @brandonrobertz 's approach will work for me at least for now, but I want to bump this thread in the meantime -- are there still plans for an async metadata hook at some point in the future? (or are you considering other directions?)

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Plugin hook for dynamic metadata 930807135  
1059823151 https://github.com/simonw/datasette/pull/1648#issuecomment-1059823151 https://api.github.com/repos/simonw/datasette/issues/1648 IC_kwDOBm6k_c4_K54v codecov[bot] 22429695 2022-03-05T19:56:41Z 2022-03-07T15:38:08Z NONE

Codecov Report

Merging #1648 (32548b8) into main (7d24fd4) will increase coverage by 0.02%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##             main    #1648      +/-   ##
==========================================
+ Coverage   92.03%   92.05%   +0.02%     
==========================================
  Files          34       34              
  Lines        4557     4570      +13     
==========================================
+ Hits         4194     4207      +13     
  Misses        363      363              
<table> <thead> <tr> <th>Impacted Files</th> <th>Coverage Δ</th> <th></th> </tr> </thead> <tbody> <tr> <td>datasette/url_builder.py</td> <td>100.00% <100.00%> (ø)</td> <td></td> </tr> <tr> <td>datasette/utils/__init__.py</td> <td>94.97% <100.00%> (+0.10%)</td> <td>:arrow_up:</td> </tr> <tr> <td>datasette/views/base.py</td> <td>95.49% <100.00%> (ø)</td> <td></td> </tr> <tr> <td>datasette/views/table.py</td> <td>96.19% <100.00%> (ø)</td> <td></td> </tr> </tbody> </table>

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 7d24fd4...32548b8. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Use dash encoding for table names and row primary keys in URLs 1160432941  
1060021753 https://github.com/simonw/datasette/pull/1649#issuecomment-1060021753 https://api.github.com/repos/simonw/datasette/issues/1649 IC_kwDOBm6k_c4_LqX5 codecov[bot] 22429695 2022-03-06T19:13:09Z 2022-03-06T19:13:09Z NONE

Codecov Report

Merging #1649 (59b2c16) into main (0499f17) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1649   +/-   ##
=======================================
  Coverage   92.03%   92.03%           
=======================================
  Files          34       34           
  Lines        4557     4557           
=======================================
  Hits         4194     4194           
  Misses        363      363           
<table> <thead> <tr> <th>Impacted Files</th> <th>Coverage Δ</th> <th></th> </tr> </thead> <tbody> <tr> <td>datasette/utils/__init__.py</td> <td>94.86% <ø> (ø)</td> <td></td> </tr> </tbody> </table>

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 0499f17...59b2c16. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Add /opt/homebrew to where spatialite extension can be found 1160677684  
1016456784 https://github.com/simonw/datasette/pull/1602#issuecomment-1016456784 https://api.github.com/repos/simonw/datasette/issues/1602 IC_kwDOBm6k_c48leZQ codecov[bot] 22429695 2022-01-19T13:17:24Z 2022-03-06T01:30:46Z NONE

Codecov Report

Merging #1602 (9eb0bdf) into main (5010d13) will increase coverage by 0.13%.
The diff coverage is n/a.

:exclamation: Current head 9eb0bdf differs from pull request most recent head a9c69dc. Consider uploading reports for the commit a9c69dc to get more accurate results

@@            Coverage Diff             @@
##             main    #1602      +/-   ##
==========================================
+ Coverage   92.03%   92.16%   +0.13%     
==========================================
  Files          34       34              
  Lines        4557     4531      -26     
==========================================
- Hits         4194     4176      -18     
+ Misses        363      355       -8     
<table> <thead> <tr> <th>Impacted Files</th> <th>Coverage Δ</th> <th></th> </tr> </thead> <tbody> <tr> <td>datasette/tracer.py</td> <td>82.95% <0.00%> (-1.09%)</td> <td>:arrow_down:</td> </tr> <tr> <td>datasette/cli.py</td> <td>77.85% <0.00%> (-0.09%)</td> <td>:arrow_down:</td> </tr> <tr> <td>datasette/utils/__init__.py</td> <td>94.79% <0.00%> (-0.07%)</td> <td>:arrow_down:</td> </tr> <tr> <td>datasette/__init__.py</td> <td>100.00% <0.00%> (ø)</td> <td></td> </tr> <tr> <td>datasette/app.py</td> <td>95.37% <0.00%> (+1.05%)</td> <td>:arrow_up:</td> </tr> </tbody> </table>

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 73f2d25...a9c69dc. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Update pytest-timeout requirement from <2.1,>=1.4.2 to >=1.4.2,<2.2 1108084641  
1059863997 https://github.com/simonw/datasette/issues/1439#issuecomment-1059863997 https://api.github.com/repos/simonw/datasette/issues/1439 IC_kwDOBm6k_c4_LD29 karlcow 505230 2022-03-06T00:57:57Z 2022-03-06T00:57:57Z NONE

Probably too late… but I have just seen this because
http://simonwillison.net/2022/Mar/5/dash-encoding/#atom-everything

And it reminded me of comma tools at W3C.
http://www.w3.org/,tools

Example, the text version of W3C homepage
https://www.w3.org/,text

The challenge comes down to telling the difference between the following (the comma-encoded equivalent is shown under each case):

* `/db/table` - an HTML table page

  `/db/table`

* `/db/table.csv` - the CSV version of `/db/table`

  `/db/table,csv`

* `/db/table.csv` - no, this one is actually a database table called `table.csv`

  `/db/table.csv`

* `/db/table.csv.csv` - the CSV version of `/db/table.csv`

  `/db/table.csv,csv`

* `/db/table.csv.csv.csv` - and so on...

  `/db/table.csv.csv,csv`

I haven't checked all the cases in the thread.
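The comma trick can be sketched as a tiny parser. This `split_format` helper is purely hypothetical (neither Datasette nor the W3C tools ship it); the point is that the output format is whatever follows the last comma, so dots in table names stay unambiguous:

```python
def split_format(path):
    """Split a comma-encoded path into (table_path, format).

    Hypothetical illustration of the W3C ",tools" convention:
    everything after the LAST comma is the requested format,
    so a table literally named "table.csv" never collides with
    the CSV rendering of a table named "table".
    """
    if "," in path:
        table, fmt = path.rsplit(",", 1)
        return table, fmt
    return path, "html"  # no comma: default HTML page

# "/db/table,csv"     -> CSV rendering of the table "table"
# "/db/table.csv"     -> HTML page for a table named "table.csv"
# "/db/table.csv,csv" -> CSV rendering of that same table
```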

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Rethink how .ext formats (v.s. ?_format=) works before 1.0 973139047  
1059652834 https://github.com/simonw/sqlite-utils/issues/412#issuecomment-1059652834 https://api.github.com/repos/simonw/sqlite-utils/issues/412 IC_kwDOCGYnMM4_KQTi zaneselvans 596279 2022-03-05T02:14:40Z 2022-03-05T02:14:40Z NONE

We do a lot of `df.to_sql()` to write into sqlite, mostly in this module

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Optional Pandas integration 1160182768  
1059097969 https://github.com/simonw/sqlite-utils/issues/408#issuecomment-1059097969 https://api.github.com/repos/simonw/sqlite-utils/issues/408 IC_kwDOCGYnMM4_II1x learning4life 24938923 2022-03-04T11:55:21Z 2022-03-04T11:55:21Z NONE

Thanks @simonw

I will test it after my vacation 👍

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`deterministic=True` fails on versions of SQLite prior to 3.8.3 1145882578  
1051473892 https://github.com/simonw/datasette/issues/260#issuecomment-1051473892 https://api.github.com/repos/simonw/datasette/issues/260 IC_kwDOBm6k_c4-rDfk zaneselvans 596279 2022-02-26T02:24:15Z 2022-02-26T02:24:15Z NONE

Is there already functionality that can be used to validate the metadata.json file? Is there a JSON Schema that defines it? Or a validation that's available via Datasette with Python? We're working on automatically building the metadata in CI and when we deploy to Cloud Run, and it would be nice to be able to check in our tests whether the metadata we're outputting is valid.
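For what it's worth, a minimal stdlib-only check can run in CI in the meantime. `validate_metadata` below is a hypothetical helper, not a Datasette API, and it only asserts the top-level nesting (`databases` keyed by name, each with a `tables` object), not every documented key:

```python
import json

def validate_metadata(text):
    # Hypothetical sketch: checks only the broad shape of metadata.json.
    meta = json.loads(text)
    if not isinstance(meta, dict):
        raise ValueError("metadata must be a JSON object")
    databases = meta.get("databases", {})
    if not isinstance(databases, dict):
        raise ValueError("'databases' must map database names to objects")
    for db_name, db in databases.items():
        tables = db.get("tables", {})
        if not isinstance(tables, dict):
            raise ValueError(f"'tables' in {db_name!r} must map table names to objects")
    return meta
```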

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Validate metadata.json on startup 323223872  
1050123919 https://github.com/dogsheep/twitter-to-sqlite/issues/62#issuecomment-1050123919 https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62 IC_kwDODEm0Qs4-l56P sw-yx 6764957 2022-02-24T18:10:18Z 2022-02-24T18:10:18Z NONE

Gonna close this for now since I'm not actively working on it.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
KeyError: 'created_at' for private accounts? 1088816961  
1049879118 https://github.com/simonw/datasette/issues/1641#issuecomment-1049879118 https://api.github.com/repos/simonw/datasette/issues/1641 IC_kwDOBm6k_c4-k-JO fgregg 536941 2022-02-24T13:49:26Z 2022-02-24T13:49:26Z NONE

maybe worth considering adding buttons for paren, asterisk, etc. under the input text box on mobile?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Tweak mobile keyboard settings 1149310456  
1049775451 https://github.com/dogsheep/twitter-to-sqlite/issues/62#issuecomment-1049775451 https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62 IC_kwDODEm0Qs4-kk1b miuku 43036882 2022-02-24T11:43:31Z 2022-02-24T11:43:31Z NONE

I seem to have fixed this issue by applying for elevated API access.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
KeyError: 'created_at' for private accounts? 1088816961  
1043626870 https://github.com/simonw/datasette/issues/327#issuecomment-1043626870 https://api.github.com/repos/simonw/datasette/issues/327 IC_kwDOBm6k_c4-NHt2 dholth 208018 2022-02-17T23:37:24Z 2022-02-17T23:37:24Z NONE

On second thought, any kind of quick-to-decompress-on-startup format could be helpful if we're paying for the container registry and deployment bandwidth but not ephemeral storage.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Explore if SquashFS can be used to shrink size of packaged Docker containers 335200136  
1043609198 https://github.com/simonw/datasette/issues/327#issuecomment-1043609198 https://api.github.com/repos/simonw/datasette/issues/327 IC_kwDOBm6k_c4-NDZu dholth 208018 2022-02-17T23:21:36Z 2022-02-17T23:33:01Z NONE

On fly.io. This particular database goes from 1.4 GB to 200 MB. It's slower; part of that might be having no `--inspect-file`?

$ datasette publish fly ...   --generate-dir /tmp/deploy-this
...
$ mksquashfs large.db large.squashfs
$ rm large.db # don't accidentally put it in the image
$ cat Dockerfile
FROM python:3.8
COPY . /app
WORKDIR /app

ENV DATASETTE_SECRET 'xyzzy'
RUN pip install -U datasette
# RUN datasette inspect large.db --inspect-file inspect-data.json
ENV PORT 8080
EXPOSE 8080
CMD mount -o loop -t squashfs large.squashfs /mnt; datasette serve --host 0.0.0.0 -i /mnt/large.db --cors --port $PORT

It would also be possible to copy the file onto the ~6GB available on the ephemeral container filesystem on startup. A little against the spirit of the thing? On this example the whole docker image is 2.42 GB and the squashfs version is 1.14 GB.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Explore if SquashFS can be used to shrink size of packaged Docker containers 335200136  
1041363433 https://github.com/simonw/sqlite-utils/issues/406#issuecomment-1041363433 https://api.github.com/repos/simonw/sqlite-utils/issues/406 IC_kwDOCGYnMM4-EfHp psychemedia 82988 2022-02-16T10:57:03Z 2022-02-16T10:57:19Z NONE

Wondering if this actually relates to https://github.com/simonw/sqlite-utils/issues/402 ?

I also wonder if this would be a sensible approach for, e.g., registering pint-based quantity conversions into and out of the db, perhaps storing the quantity as a serialised magnitude-and-unit string in a single column?
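A toy illustration of that single-column idea, with the stdlib standing in for pint - the `TO_METRES` table and both helpers are made up for this sketch:

```python
# Serialise a quantity as "magnitude unit" in one text column,
# and convert on the way back out. A real version would delegate
# the conversion table to a package like pint.
TO_METRES = {"m": 1.0, "km": 1000.0, "mi": 1609.344}

def dump_quantity(magnitude, unit):
    return f"{magnitude} {unit}"

def load_quantity(text, target_unit="m"):
    mag, unit = text.split(" ", 1)
    metres = float(mag) * TO_METRES[unit]
    return metres / TO_METRES[target_unit]
```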

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Creating tables with custom datatypes 1128466114  
1041325398 https://github.com/simonw/sqlite-utils/issues/402#issuecomment-1041325398 https://api.github.com/repos/simonw/sqlite-utils/issues/402 IC_kwDOCGYnMM4-EV1W psychemedia 82988 2022-02-16T10:12:48Z 2022-02-16T10:18:55Z NONE

My hunch is that the case where you want to consider input from more than one column will actually be pretty rare - the only case I can think of where I would want to do that is for latitude/longitude columns

Other possible pairs: unconventional date/datetime and timezone pairs (eg `2022-02-16::17.00, London`); or, more generally, numerical value and unit-of-measurement pairs (eg if you want to cast into and out of different measurement units using packages like pint), or currencies, etc. Actually, in that case, I guess you may be presented with things that are unit-typed already, and so a conversion would need to parse them into an appropriate, possibly two-column (value, unit) format.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Advanced class-based `conversions=` mechanism 1125297737  
1041313679 https://github.com/simonw/sqlite-utils/issues/406#issuecomment-1041313679 https://api.github.com/repos/simonw/sqlite-utils/issues/406 IC_kwDOCGYnMM4-ES-P psychemedia 82988 2022-02-16T09:59:51Z 2022-02-16T10:00:10Z NONE

The CustomColumnType() approach looks good. This pushes you into the mindspace that you are defining and working with a custom column type.

When creating the table, you could then error, or at least warn, if someone wasn't setting a column to a type or a custom column type, which I guess is where mypy comes in?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Creating tables with custom datatypes 1128466114  
1040519196 https://github.com/simonw/sqlite-utils/pull/407#issuecomment-1040519196 https://api.github.com/repos/simonw/sqlite-utils/issues/407 IC_kwDOCGYnMM4-BRAc codecov[bot] 22429695 2022-02-15T16:52:21Z 2022-02-15T18:12:03Z NONE

Codecov Report

Merging #407 (a974da5) into main (e7f0401) will increase coverage by 0.71%.
The diff coverage is 85.00%.

@@            Coverage Diff             @@
##             main     #407      +/-   ##
==========================================
+ Coverage   95.91%   96.62%   +0.71%     
==========================================
  Files           6        6              
  Lines        2421     2460      +39     
==========================================
+ Hits         2322     2377      +55     
+ Misses         99       83      -16     
<table> <thead> <tr> <th>Impacted Files</th> <th>Coverage Δ</th> <th></th> </tr> </thead> <tbody> <tr> <td>sqlite_utils/cli.py</td> <td>95.76% <85.00%> (+0.06%)</td> <td>:arrow_up:</td> </tr> <tr> <td>sqlite_utils/utils.py</td> <td>94.59% <0.00%> (ø)</td> <td></td> </tr> <tr> <td>sqlite_utils/db.py</td> <td>97.72% <0.00%> (+1.43%)</td> <td>:arrow_up:</td> </tr> </tbody> </table>

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update e7f0401...a974da5. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Add SpatiaLite helpers to CLI 1138948786  
1035717429 https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-1035717429 https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31 IC_kwDOD079W849u8s1 harperreed 18504 2022-02-11T01:55:38Z 2022-02-11T01:55:38Z NONE

I would love this merged!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Update for Big Sur 771511344  
1034222709 https://github.com/simonw/datasette/issues/1633#issuecomment-1034222709 https://api.github.com/repos/simonw/datasette/issues/1633 IC_kwDOBm6k_c49pPx1 henrikek 6613091 2022-02-09T21:47:02Z 2022-02-09T21:47:02Z NONE

Is this the correct solution to add the base_url row to url_builder.py?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
base_url or prefix does not work with _exact match 1129052172  
1033772902 https://github.com/simonw/datasette/issues/236#issuecomment-1033772902 https://api.github.com/repos/simonw/datasette/issues/236 IC_kwDOBm6k_c49nh9m jordaneremieff 1376648 2022-02-09T13:40:52Z 2022-02-09T13:40:52Z NONE

Hi @simonw,

I've received some inquiries over the last year or so about Datasette and how it might be supported by Mangum. I maintain Mangum which is, as far as I know, the only project that provides support for ASGI applications in AWS Lambda.

If there is anything that I can help with here, please let me know because I think what Datasette provides to the community (even beyond OSS) is noble and worthy of special consideration.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
datasette publish lambda plugin 317001500  
1033641009 https://github.com/simonw/sqlite-utils/pull/203#issuecomment-1033641009 https://api.github.com/repos/simonw/sqlite-utils/issues/203 IC_kwDOCGYnMM49nBwx psychemedia 82988 2022-02-09T11:06:18Z 2022-02-09T11:06:18Z NONE

Is there any progress elsewhere on the handling of compound / composite foreign keys, or is this PR still effectively open?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
changes to allow for compound foreign keys 743384829  
1033332570 https://github.com/simonw/sqlite-utils/issues/403#issuecomment-1033332570 https://api.github.com/repos/simonw/sqlite-utils/issues/403 IC_kwDOCGYnMM49l2da fgregg 536941 2022-02-09T04:22:43Z 2022-02-09T04:22:43Z NONE

dddoooope

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Document how to add a primary key to a rowid table using `sqlite-utils transform --pk` 1126692066  
1032126353 https://github.com/simonw/sqlite-utils/issues/403#issuecomment-1032126353 https://api.github.com/repos/simonw/sqlite-utils/issues/403 IC_kwDOCGYnMM49hP-R fgregg 536941 2022-02-08T01:45:15Z 2022-02-08T01:45:31Z NONE

you can hack something like this to achieve this result:

sqlite-utils convert my_database my_table rowid "{'id': value}" --multi
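Under the hood, what `sqlite-utils transform --pk` automates is a copy-and-swap. A rough stdlib `sqlite3` illustration of that move (not sqlite-utils' actual implementation):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (name TEXT)")  # rowid table, no explicit pk
conn.executemany("INSERT INTO my_table (name) VALUES (?)", [("a",), ("b",)])

# Copy into a new table that promotes rowid to a real INTEGER PRIMARY KEY,
# then swap the tables into place.
conn.executescript("""
CREATE TABLE my_table_new (id INTEGER PRIMARY KEY, name TEXT);
INSERT INTO my_table_new (id, name) SELECT rowid, name FROM my_table;
DROP TABLE my_table;
ALTER TABLE my_table_new RENAME TO my_table;
""")
rows = conn.execute("SELECT id, name FROM my_table ORDER BY id").fetchall()
```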

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Document how to add a primary key to a rowid table using `sqlite-utils transform --pk` 1126692066  
1032120014 https://github.com/simonw/sqlite-utils/issues/26#issuecomment-1032120014 https://api.github.com/repos/simonw/sqlite-utils/issues/26 IC_kwDOCGYnMM49hObO fgregg 536941 2022-02-08T01:32:34Z 2022-02-08T01:32:34Z NONE

if you are curious about prior art, https://github.com/jsnell/json-to-multicsv is really good!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for turning nested JSON into foreign keys / many-to-many 455486286  
1031463789 https://github.com/simonw/datasette/pull/1631#issuecomment-1031463789 https://api.github.com/repos/simonw/datasette/issues/1631 IC_kwDOBm6k_c49euNt codecov[bot] 22429695 2022-02-07T13:21:48Z 2022-02-07T13:21:48Z NONE

Codecov Report

Merging #1631 (62eed84) into main (03305ea) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1631   +/-   ##
=======================================
  Coverage   92.19%   92.19%           
=======================================
  Files          34       34           
  Lines        4546     4546           
=======================================
  Hits         4191     4191           
  Misses        355      355           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 03305ea...62eed84. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.19 1125973221  
1030807433 https://github.com/simonw/sqlite-utils/issues/399#issuecomment-1030807433 https://api.github.com/repos/simonw/sqlite-utils/issues/399 IC_kwDOCGYnMM49cN-J chris48s 6025893 2022-02-06T10:54:09Z 2022-02-06T10:54:09Z NONE

Interesting that some accept an SRID and others do not - presumably GeomFromGeoJSON() always uses SRID=4326?

The ewkt/ewkb ones don't accept an SRID because ewkt encodes the SRID in the string, so you would do this with a wkt string:

GeomFromText('POINT(529090 179645)', 27700)

but for ewkt it would be

GeomFromEWKT('SRID=27700;POINT(529090 179645)')

The specs for KML and GeoJSON specify a Coordinate Reference System for the format

  • https://datatracker.ietf.org/doc/html/rfc7946#section-4
  • https://docs.opengeospatial.org/is/12-007r2/12-007r2.html#1274

GML can specify the SRID in the XML at feature level e.g:

<gml:Point srsName="EPSG:27700">
  <gml:coordinates>529090, 179645</gml:coordinates>
</gml:Point>

There's a few more obscure formats in there, but broadly I think it is safe to assume an SRID param exists on the function for cases where the SRID is not implied by or specified in the input format.
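That rule of thumb could be captured in a small lookup. Purely illustrative - the format names and defaults below are assumptions for this sketch, not SpatiaLite's API:

```python
# True where the input format embeds or fixes its own CRS,
# so the conversion function needs no separate SRID argument.
SRID_IN_FORMAT = {
    "wkt": False,     # GeomFromText(wkt, srid)
    "wkb": False,
    "ewkt": True,     # 'SRID=27700;POINT(...)'
    "ewkb": True,
    "geojson": True,  # RFC 7946 fixes the CRS
    "kml": True,
    "gml": True,      # srsName attribute on the geometry element
}

def needs_srid_param(fmt):
    # Unknown formats default to needing an explicit SRID.
    return not SRID_IN_FORMAT.get(fmt.lower(), False)
```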

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Make it easier to insert geometries, with documentation and maybe code 1124731464  
1029980337 https://github.com/simonw/datasette/pull/1629#issuecomment-1029980337 https://api.github.com/repos/simonw/datasette/issues/1629 IC_kwDOBm6k_c49ZECx codecov[bot] 22429695 2022-02-04T13:21:09Z 2022-02-04T13:21:09Z NONE

Codecov Report

Merging #1629 (1c0d848) into main (1af1041) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1629   +/-   ##
=======================================
  Coverage   92.16%   92.16%           
=======================================
  Files          34       34           
  Lines        4531     4531           
=======================================
  Hits         4176     4176           
  Misses        355      355           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 1af1041...1c0d848. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Update pytest requirement from <6.3.0,>=5.2.2 to >=5.2.2,<7.1.0 1124191982  
1029177700 https://github.com/simonw/sqlite-utils/pull/385#issuecomment-1029177700 https://api.github.com/repos/simonw/sqlite-utils/issues/385 IC_kwDOCGYnMM49WAFk codecov[bot] 22429695 2022-02-03T16:38:45Z 2022-02-04T05:52:39Z NONE

Codecov Report

Merging #385 (af86b17) into main (74586d3) will decrease coverage by 0.61%.
The diff coverage is 28.00%.

@@            Coverage Diff             @@
##             main     #385      +/-   ##
==========================================
- Coverage   96.52%   95.91%   -0.62%     
==========================================
  Files           6        6              
  Lines        2389     2421      +32     
==========================================
+ Hits         2306     2322      +16     
- Misses         83       99      +16     
<table> <thead> <tr> <th>Impacted Files</th> <th>Coverage Δ</th> <th></th> </tr> </thead> <tbody> <tr> <td>sqlite_utils/cli.py</td> <td>95.69% <ø> (+0.15%)</td> <td>:arrow_up:</td> </tr> <tr> <td>sqlite_utils/db.py</td> <td>96.29% <15.00%> (-1.40%)</td> <td>:arrow_down:</td> </tr> <tr> <td>sqlite_utils/utils.py</td> <td>94.59% <80.00%> (ø)</td> <td></td> </tr> </tbody> </table>

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 74586d3...af86b17. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Add new spatialite helper methods 1102899312  
1028419517 https://github.com/simonw/datasette/pull/1617#issuecomment-1028419517 https://api.github.com/repos/simonw/datasette/issues/1617 IC_kwDOBm6k_c49TG-9 codecov[bot] 22429695 2022-02-02T22:30:26Z 2022-02-03T01:36:07Z NONE

Codecov Report

Merging #1617 (af293c9) into main (2aa686c) will increase coverage by 0.06%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##             main    #1617      +/-   ##
==========================================
+ Coverage   92.09%   92.16%   +0.06%     
==========================================
  Files          34       34              
  Lines        4518     4531      +13     
==========================================
+ Hits         4161     4176      +15     
+ Misses        357      355       -2     
<table> <thead> <tr> <th>Impacted Files</th> <th>Coverage Δ</th> <th></th> </tr> </thead> <tbody> <tr> <td>datasette/app.py</td> <td>95.37% <100.00%> (ø)</td> <td></td> </tr> <tr> <td>datasette/views/table.py</td> <td>96.19% <0.00%> (ø)</td> <td></td> </tr> <tr> <td>datasette/utils/__init__.py</td> <td>94.79% <0.00%> (+<0.01%)</td> <td>:arrow_up:</td> </tr> <tr> <td>datasette/views/base.py</td> <td>95.49% <0.00%> (+0.07%)</td> <td>:arrow_up:</td> </tr> <tr> <td>datasette/views/special.py</td> <td>95.09% <0.00%> (+2.38%)</td> <td>:arrow_up:</td> </tr> </tbody> </table>

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 2aa686c...af293c9. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Ensure template_path always uses "/" to match jinja 1120990806  
1028423514 https://github.com/simonw/datasette/pull/1626#issuecomment-1028423514 https://api.github.com/repos/simonw/datasette/issues/1626 IC_kwDOBm6k_c49TH9a codecov[bot] 22429695 2022-02-02T22:36:37Z 2022-02-02T22:39:52Z NONE

Codecov Report

Merging #1626 (4b4d0e1) into main (b5e6b1a) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1626   +/-   ##
=======================================
  Coverage   92.16%   92.16%           
=======================================
  Files          34       34           
  Lines        4531     4531           
=======================================
  Hits         4176     4176           
  Misses        355      355           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update b5e6b1a...4b4d0e1. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Try test suite against macOS and Windows 1122451096  
1028387529 https://github.com/simonw/datasette/pull/1622#issuecomment-1028387529 https://api.github.com/repos/simonw/datasette/issues/1622 IC_kwDOBm6k_c49S_LJ codecov[bot] 22429695 2022-02-02T21:45:21Z 2022-02-02T21:45:21Z NONE

Codecov Report

Merging #1622 (fbaf317) into main (8d5779a) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1622   +/-   ##
=======================================
  Coverage   92.11%   92.11%           
=======================================
  Files          34       34           
  Lines        4525     4525           
=======================================
  Hits         4168     4168           
  Misses        357      357           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 8d5779a...fbaf317. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Test against Python 3.11-dev 1122414274  
1028294089 https://github.com/simonw/datasette/issues/1618#issuecomment-1028294089 https://api.github.com/repos/simonw/datasette/issues/1618 IC_kwDOBm6k_c49SoXJ strada 770231 2022-02-02T19:42:03Z 2022-02-02T19:42:03Z NONE

Thanks for looking into this. It might have been nice if `explain` surfaced these function calls. Looks like `explain query plan` does, but only for basic queries.

sqlite-utils fixtures.db 'explain query plan select * from pragma_function_list(), pragma_database_list(), pragma_module_list()' -t
  id    parent    notused  detail
----  --------  ---------  ------------------------------------------------
   4         0          0  SCAN pragma_function_list VIRTUAL TABLE INDEX 0:
   8         0          0  SCAN pragma_database_list VIRTUAL TABLE INDEX 0:
  12         0          0  SCAN pragma_module_list VIRTUAL TABLE INDEX 0:
sqlite-utils fixtures.db 'explain query plan select * from pragma_function_list() as fl, pragma_database_list() as dl, pragma_module_list() as ml' -t
  id    parent    notused  detail
----  --------  ---------  ------------------------------
   4         0          0  SCAN fl VIRTUAL TABLE INDEX 0:
   8         0          0  SCAN dl VIRTUAL TABLE INDEX 0:
  12         0          0  SCAN ml VIRTUAL TABLE INDEX 0:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Reconsider policy on blocking queries containing the string "pragma" 1121121305  
1025732071 https://github.com/simonw/datasette/pull/1616#issuecomment-1025732071 https://api.github.com/repos/simonw/datasette/issues/1616 IC_kwDOBm6k_c49I23n codecov[bot] 22429695 2022-01-31T13:20:18Z 2022-01-31T13:20:18Z NONE

Codecov Report

Merging #1616 (4ebe94b) into main (2aa686c) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1616   +/-   ##
=======================================
  Coverage   92.09%   92.09%           
=======================================
  Files          34       34           
  Lines        4518     4518           
=======================================
  Hits         4161     4161           
  Misses        357      357           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 2aa686c...4ebe94b. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Bump black from 21.12b0 to 22.1.0 1119413338  
1023997327 https://github.com/simonw/datasette/issues/1615#issuecomment-1023997327 https://api.github.com/repos/simonw/datasette/issues/1615 IC_kwDOBm6k_c49CPWP aidansteele 369053 2022-01-28T08:37:36Z 2022-01-28T08:37:36Z NONE

Oops, it feels like this should perhaps be migrated to GitHub Discussions - sorry! I don't think I have the ability to do that 😅

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Potential simplified publishing mechanism 1117132741  
1021264135 https://github.com/dogsheep/dogsheep.github.io/pull/6#issuecomment-1021264135 https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/6 IC_kwDODMzF1s4830EH ligurio 1151557 2022-01-25T14:52:40Z 2022-01-25T14:52:40Z NONE

@simonw, could you review?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Add testres-db tool 842765105  
1017993482 https://github.com/simonw/datasette/issues/1608#issuecomment-1017993482 https://api.github.com/repos/simonw/datasette/issues/1608 IC_kwDOBm6k_c48rVkK astrojuanlu 316517 2022-01-20T22:46:16Z 2022-01-20T22:46:16Z NONE

Or you can use https://sphinx-version-warning.readthedocs.io/! 😄

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Documentation should clarify /stable/ vs /latest/ 1109808154  
1012128696 https://github.com/simonw/datasette/pull/1593#issuecomment-1012128696 https://api.github.com/repos/simonw/datasette/issues/1593 IC_kwDOBm6k_c48U9u4 codecov[bot] 22429695 2022-01-13T13:18:35Z 2022-01-13T13:18:35Z NONE

Codecov Report

Merging #1593 (df73ebb) into main (8c401ee) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main    #1593   +/-   ##
=======================================
  Coverage   92.09%   92.09%           
=======================================
  Files          34       34           
  Lines        4516     4516           
=======================================
  Hits         4159     4159           
  Misses        357      357           

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 8c401ee...df73ebb. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.18 1101705012  
1010559681 https://github.com/simonw/datasette/issues/1590#issuecomment-1010559681 https://api.github.com/repos/simonw/datasette/issues/1590 IC_kwDOBm6k_c48O-rB eelkevdbos 1001306 2022-01-12T02:10:20Z 2022-01-12T02:10:20Z NONE

In my example, path matching happens at the application layer (being the Django channels URLRouter). That might be a somewhat exotic solution that would normally be solved by a proxy like Apache or Nginx. However, in my specific use case, this is a "feature" enabling me to do simple management of databases and metadata from within a Django admin app instance mapped in that same router.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Table+query JSON and CSV links broken when using `base_url` setting 1099723916  
1010556333 https://github.com/simonw/datasette/issues/1590#issuecomment-1010556333 https://api.github.com/repos/simonw/datasette/issues/1590 IC_kwDOBm6k_c48O92t eelkevdbos 1001306 2022-01-12T02:03:59Z 2022-01-12T02:03:59Z NONE

Thank you for the quick reply! Just a quick observation, I am running this locally without a proxy, whereas your fly example seems to be running behind an apache proxy (if the name is accurate). Can it be that the apache proxy strips the prefix before it passes on the request to the daphne backend?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Table+query JSON and CSV links broken when using `base_url` setting 1099723916  
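The two comments above turn on whether the proxy strips the `base_url` prefix before forwarding the request. A minimal reverse-proxy sketch, assuming nginx with a hypothetical `/datasette-prefix/` path and port 8001 (consult Datasette's deployment documentation for the authoritative configuration), that passes the prefix through instead of stripping it:

```nginx
# Hypothetical setup: Datasette started with
#   datasette fixtures.db --setting base_url /datasette-prefix/ -p 8001
location /datasette-prefix/ {
    # Forward the full path, prefix included, so links generated with
    # base_url resolve correctly; stripping the prefix here would break them.
    proxy_pass http://127.0.0.1:8001/datasette-prefix/;
    proxy_set_header Host $host;
}
```

Running Datasette locally without a proxy, as the second commenter does, means nothing strips the prefix, which is why the two setups can behave differently for the same `base_url` value.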
1009548580 https://github.com/simonw/sqlite-utils/issues/365#issuecomment-1009548580 https://api.github.com/repos/simonw/sqlite-utils/issues/365 IC_kwDOCGYnMM48LH0k fgregg 536941 2022-01-11T02:43:34Z 2022-01-11T02:43:34Z NONE

thanks so much! always a pleasure to see how you work through these things

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
create-index should run analyze after creating index 1096558279  

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
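The page's own query (rows where `author_association` is `"NONE"`, sorted by `updated_at` descending) can be sketched against this schema with Python's sqlite3. The rows inserted below are illustrative stand-ins, and the foreign-key references to `users` and `issues` are omitted for brevity:

```python
import sqlite3

# Mirror the issue_comments schema shown above (FK targets omitted).
conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE issue_comments (
        html_url TEXT, issue_url TEXT, id INTEGER PRIMARY KEY, node_id TEXT,
        user INTEGER, created_at TEXT, updated_at TEXT,
        author_association TEXT, body TEXT, reactions TEXT, issue INTEGER,
        performed_via_github_app TEXT
    )
    """
)
conn.execute("CREATE INDEX idx_issue_comments_issue ON issue_comments (issue)")

conn.executemany(
    "INSERT INTO issue_comments (id, author_association, updated_at, body) "
    "VALUES (?, ?, ?, ?)",
    [
        (1, "NONE", "2022-01-31T13:20:18Z", "Codecov Report"),
        (2, "OWNER", "2022-01-31T14:00:00Z", "Thanks!"),
        (3, "NONE", "2022-01-11T02:43:34Z", "thanks so much!"),
    ],
)

# The facet query behind this page: NONE comments, newest first.
# ISO 8601 timestamps sort correctly as plain strings.
rows = conn.execute(
    "SELECT id, updated_at FROM issue_comments "
    "WHERE author_association = 'NONE' ORDER BY updated_at DESC"
).fetchall()
print(rows)  # → [(1, '2022-01-31T13:20:18Z'), (3, '2022-01-11T02:43:34Z')]
```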
Powered by Datasette · Queries took 288.656ms · About: github-to-sqlite