id,node_id,number,title,user,user_label,state,locked,assignee,assignee_label,milestone,milestone_label,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,repo_label,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 267513424,MDU6SXNzdWUyNjc1MTM0MjQ=,1,Addressable pages for every row in a table,9599,simonw,closed,0,,,2857392,Ship first public release,6,2017-10-23T00:44:16Z,2017-10-24T14:11:04Z,2017-10-24T14:11:03Z,OWNER,," /database-name-7sha256/table-name/compound-pk /database-name-7sha256/table-name/compound-pk.json Tricky part will be figuring out what the primary key is - especially since it could be a compound primary key and it might involve different data types.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267513523,MDU6SXNzdWUyNjc1MTM1MjM=,2,Initial proof-of-concept,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-23T00:45:37Z,2017-10-23T01:26:39Z,2017-10-23T00:45:53Z,OWNER,,Implemented in https://github.com/simonw/stateless-datasets/commit/de04d7a854d71003ffcf98028eab976a936c2dba,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267515678,MDU6SXNzdWUyNjc1MTU2Nzg=,3,"Make individual column values addressable, with smart content types",9599,simonw,open,0,,,,,1,2017-10-23T01:11:32Z,2017-12-10T03:11:58Z,,OWNER,,"Some SQLite databases embed images in columns. It would be cool if these had URLs. 
/database-name-7sha256/table-name/compound-pk/column /database-name-7sha256/table-name/compound-pk/column.json /database-name-7sha256/table-name/compound-pk/column.png /database-name-7sha256/table-name/compound-pk/column.gif /database-name-7sha256/table-name/compound-pk/column.txt The one without an explicit file extension auto-detects the correct extension.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267515836,MDU6SXNzdWUyNjc1MTU4MzY=,4,Make URLs immutable,9599,simonw,closed,0,,,2857392,Ship first public release,8,2017-10-23T01:13:30Z,2017-10-24T02:38:24Z,2017-10-24T02:38:24Z,OWNER,,"Absolutely everything should have a far-future expires header Part of the URL will be the truncated sha1 hash of the database file itself, calculated at build time",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267516066,MDU6SXNzdWUyNjc1MTYwNjY=,5,Implement sensible query pagination,9599,simonw,closed,0,,,2857392,Ship first public release,3,2017-10-23T01:16:00Z,2017-11-10T20:41:39Z,2017-11-10T20:41:39Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267516329,MDU6SXNzdWUyNjc1MTYzMjk=,6,Better JSON response options,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-23T01:18:47Z,2017-10-24T15:07:58Z,2017-10-24T15:07:58Z,OWNER,,"Default returns this: { “Columns”: [“id”, “name”, “age”], “Rows”: [ [45, “Simon”, 36] ] } .jsono instead returns a list of objects each duplicating the headers in its keys. They both probably share the same pagination mechanism so it might not be a jsono flat list.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267516650,MDU6SXNzdWUyNjc1MTY2NTA=,7,Framework where by every page is JSON plus a template,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T01:22:03Z,2017-10-24T02:27:25Z,2017-10-24T02:27:25Z,OWNER,,"Every single page of my interface should be implemented as a function that returns JSON. 
I can then build my jinja templates on top of the exact data that would be returned by the API version.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267517314,MDU6SXNzdWUyNjc1MTczMTQ=,8,Attempting an INSERT or UPDATE should return a sane error message,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T01:28:25Z,2017-10-23T15:28:12Z,2017-10-23T15:28:08Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267517348,MDU6SXNzdWUyNjc1MTczNDg=,9,Initial test suite,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-10-23T01:28:46Z,2017-10-24T05:55:33Z,2017-10-24T05:55:33Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267517381,MDU6SXNzdWUyNjc1MTczODE=,10,Set up Travis,9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-23T01:29:07Z,2017-11-04T23:48:57Z,2017-11-04T23:48:57Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267522549,MDU6SXNzdWUyNjc1MjI1NDk=,11,Code that generates compile-time properties about the database ,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T02:18:24Z,2017-10-23T16:04:23Z,2017-10-23T16:04:23Z,OWNER,,"At a minimum this will include: * sha hash of each database file * list of tables with row counts for each database file",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267523511,MDU6SXNzdWUyNjc1MjM1MTE=,12,Make it so you can override templates,9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-10-23T02:25:35Z,2017-11-30T16:42:46Z,2017-11-30T16:38:34Z,OWNER,,"The app will ship with default templates but, just like with the Django admin, you will be able to override them using either explicit configuration settings or just by dropping in templates with certain file names. 
Template inheritance should work here, both allowing you to override just the base template and allowing you to customize tiny bits of others.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267542338,MDU6SXNzdWUyNjc1NDIzMzg=,13,Add a syntax highlighting SQL editor,9599,simonw,closed,0,,,,,1,2017-10-23T05:03:33Z,2017-11-15T02:04:51Z,2017-11-15T02:04:51Z,OWNER,,https://ace.c9.io/#nav=embedding looks like a good option,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267707940,MDU6SXNzdWUyNjc3MDc5NDA=,14,Datasette Plugins,9599,simonw,closed,0,,,,,22,2017-10-23T15:15:28Z,2019-05-13T18:58:20Z,2019-05-13T18:58:19Z,OWNER,,"It would be neat if additional functionality could be opted-in to the system in the form of easy-to-add plugins, hosted as separate packages. First example: a Google Analytics plugin, which adds GA tracking code with your tracking ID to the web interface for your dataset. This may be an opportunity to experiment with entry points: http://amir.rachum.com/blog/2017/07/28/python-entry-points/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267713226,MDU6SXNzdWUyNjc3MTMyMjY=,15,Support multiple databases,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-23T15:29:51Z,2017-10-24T02:01:38Z,2017-10-24T02:01:38Z,OWNER,,"I'm going to loop through every database file in the app root directory and bundle all of them. Each one will be accessible at /databasename Note this is without the file extension, and we will disallow multiple files with the same name but different extensions. Supported extensions to start with will be `.db` and `.sqlite` and `.sqlite3`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267726219,MDU6SXNzdWUyNjc3MjYyMTk=,16,Default HTML/CSS needs to look reasonable and be responsive,9599,simonw,closed,0,,,2857392,Ship first public release,6,2017-10-23T16:05:22Z,2017-11-11T20:19:07Z,2017-11-11T20:19:07Z,OWNER,,"Version one should have the following characteristics: - Looks OK - Works great on mobile - Loads extremely fast - No JavaScript! 
At least not in v1.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267732005,MDU6SXNzdWUyNjc3MzIwMDU=,17,"In development mode, should still pick up new .db files",9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T16:22:40Z,2017-10-24T02:26:48Z,2017-10-24T02:26:47Z,OWNER,,Follow on from #11 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267739593,MDU6SXNzdWUyNjc3Mzk1OTM=,18,See if I can get a websockets interface working,9599,simonw,closed,0,,,,,1,2017-10-23T16:46:41Z,2021-01-04T20:05:52Z,2021-01-04T20:05:48Z,OWNER,,"Since I am already running on Sanic, how hard would it be to add a websocket endpoint that lets you talk to sqlite interactively? Could this be used to efficiently support streaming in answers to giant queries?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267741262,MDU6SXNzdWUyNjc3NDEyNjI=,19,Efficient url for downloading the raw database file,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T16:52:17Z,2017-10-25T15:21:16Z,2017-10-25T15:19:37Z,OWNER,,Use Sanic support for streaming large files http://sanic.readthedocs.io/en/latest/sanic/response.html#file-streaming,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267759136,MDU6SXNzdWUyNjc3NTkxMzY=,20,Config file with support for defining canned queries,9599,simonw,closed,0,9599,simonw,2949431,Custom templates edition,9,2017-10-23T17:53:06Z,2017-12-05T19:05:35Z,2017-12-05T17:44:09Z,OWNER,,"Probably using YAML because then we get support for multiline strings: bats: db: bats.sqlite3 name: ""Bat sightings"" queries: specific_row: | select * from Bats where a = 1; ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267769034,MDU6SXNzdWUyNjc3NjkwMzQ=,21,Use Sanic configuration mechanism ,9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-23T18:25:14Z,2017-11-10T20:45:42Z,2017-11-10T20:45:42Z,OWNER,,http://sanic.readthedocs.io/en/latest/sanic/config.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267769431,MDU6SXNzdWUyNjc3Njk0MzE=,22,Refactor to use class based views ,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-23T18:26:22Z,2019-05-27T20:05:56Z,2017-10-24T02:25:53Z,OWNER,,http://sanic.readthedocs.io/en/latest/sanic/class_based_views.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/22/reactions"", 
""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267788884,MDU6SXNzdWUyNjc3ODg4ODQ=,23,Support Django-style filters in querystring arguments,9599,simonw,closed,0,,,2857392,Ship first public release,6,2017-10-23T19:29:42Z,2017-10-25T04:23:03Z,2017-10-25T04:23:02Z,OWNER,,"e.g /database/table?name__contains=Simon&age__gte=4 Same format as Django: double underscore as the split. If you need to match against a column that happens to contain a double underscore in its official name, do this: /database/table?weird__column__exact=Simon __exact is the default operation if none is supplied.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267828746,MDU6SXNzdWUyNjc4Mjg3NDY=,24,Implement full URL design,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-10-23T21:49:05Z,2017-10-24T14:12:00Z,2017-10-24T14:12:00Z,OWNER,,"Full URL design: /database-name /database-name.json /database-name-7sha256 /database-name-7sha256.json /database-name/table-name /database-name/table-name.json /database-name-7sha256/table-name /database-name-7sha256/table-name.json /database-name-7sha256/table-name/compound-pk /database-name-7sha256/table-name/compound-pk.json ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267857622,MDU6SXNzdWUyNjc4NTc2MjI=,25,Endpoint that returns SQL ready to be piped into DB,9599,simonw,closed,0,,,,,2,2017-10-24T00:19:26Z,2017-11-15T05:11:12Z,2017-11-15T05:11:11Z,OWNER,,It would be cool if I could figure out a way to generate both the create table statements and the inserts for an individual table or the entire database and then stream them down to the client.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267861210,MDU6SXNzdWUyNjc4NjEyMTA=,26,Command line tool for uploading one or more DBs to Now,9599,simonw,closed,0,,,2857392,Ship first public release,3,2017-10-24T00:43:10Z,2017-11-11T07:25:30Z,2017-11-11T07:25:30Z,OWNER,,"Uploading files appears to be undocumented, but I found it in their code here: https://github.com/zeit/now-cli/blob/0ca7d1fe44ebdf460b64fdc38ba543b8e295ac40/src/providers/sh/util/index.js#L291",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267886330,MDU6SXNzdWUyNjc4ODYzMzA=,27,Ability to plot a simple graph,9599,simonw,closed,0,,,,,3,2017-10-24T03:34:59Z,2018-07-10T17:52:41Z,2018-07-10T17:52:41Z,OWNER,,"Might be as simple as: pick he type of chart (bar, line) and then pick the column for the X axis and the column for the Y axis. Maybe also allow a pie chart. 
It’s up to the user to come up with SQL that gets the right values.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267886865,MDU6SXNzdWUyNjc4ODY4NjU=,28,/database?sql= should redirect correctly,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-24T03:38:44Z,2017-10-24T23:54:30Z,2017-10-24T23:54:30Z,OWNER,,Needs to redirect to the location with the hash while retaining the query string. This should also work with the .json extension.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268050821,MDU6SXNzdWUyNjgwNTA4MjE=,29,Handle bytestring records encoding to JSON,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-24T14:18:45Z,2017-10-24T14:59:00Z,2017-10-24T14:58:47Z,OWNER,,"http://localhost:8006/northwind-40d049b/Categories.json 500s right now The string representation of one of the values looks like this: b""\x15\x1c/\x00\x02\x00 This is a bytestring from the database which cannot be naively converted to a unicode string.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268078453,MDU6SXNzdWUyNjgwNzg0NTM=,30,Do something neat with foreign keys,9599,simonw,closed,0,,,,,1,2017-10-24T15:29:29Z,2017-11-14T18:29:08Z,2017-11-14T18:29:01Z,OWNER,,"https://www.sqlite.org/pragma.html#pragma_foreign_key_list SQLite has robust support for introspecting foreign keys. 
I could use that to automatically link to the corresponding record from my tables.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268087542,MDU6SXNzdWUyNjgwODc1NDI=,31,Idea: colour scheme based on sha256 of db,9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-24T15:52:38Z,2018-05-28T18:10:45Z,2017-11-09T14:14:59Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268106803,MDU6SXNzdWUyNjgxMDY4MDM=,32,Try running SQLite queries in a separate thread,9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-24T16:48:42Z,2017-11-09T14:05:56Z,2017-11-09T14:05:56Z,OWNER,,"https://pymotw.com/3/asyncio/executors.html Would be good to have some actual benchmarks so I can evaluate if this is worth it or not.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268110769,MDU6SXNzdWUyNjgxMTA3Njk=,33,Use locust for benchmarking and load tests,9599,simonw,open,0,,,,,0,2017-10-24T17:00:09Z,2017-12-10T03:12:16Z,,OWNER,,"https://github.com/locustio/locust Needed for #32 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 268176505,MDU6SXNzdWUyNjgxNzY1MDU=,34,Support CSV export with a .csv extension,9599,simonw,closed,0,,,,,1,2017-10-24T20:34:43Z,2021-06-17T18:14:48Z,2018-05-28T20:45:34Z,OWNER,,"Maybe do this using streaming with multiple pagination SQL queries so we can support arbritrarily large exports. How would this work against a view which doesn’t have an obvious efficient pagination mechanism? Maybe limit views to up to 1000 exported records? 
Relates to #5 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/34/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268262480,MDU6SXNzdWUyNjgyNjI0ODA=,36,"date, year, month and day querystring lookups",9599,simonw,closed,0,,,,,3,2017-10-25T04:23:45Z,2018-05-28T17:30:53Z,2018-05-28T17:30:53Z,OWNER,,"- [ ] `?timestamp___date=2017-07-17` - return every item where the timestamp falls on that date - [ ] `?timestamp___year=2017` - return every item where the timestamp falls within 2017 - [ ] `?timestamp___month=1` - return every item where the month component is January - [ ] `?timestamp___day=10` - return every item where the day-of-the-month component is 10 Follow on from #23 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268453968,MDU6SXNzdWUyNjg0NTM5Njg=,37,Ability to serialize massive JSON without blocking event loop,9599,simonw,closed,0,,,,,2,2017-10-25T15:58:03Z,2020-05-30T17:29:20Z,2020-05-30T17:29:20Z,OWNER,,"We run the risk of someone attempting a select statement that returns thousands of rows and hence takes several seconds just to JSON encode the response, effectively blocking the event loop and pausing all other traffic. The Twisted community have a solution for this, can we adapt that in some way? http://as.ynchrono.us/2010/06/asynchronous-json_18.html?m=1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268462768,MDU6SXNzdWUyNjg0NjI3Njg=,38,Experiment with patterns for concurrent long running queries,9599,simonw,closed,0,,,,,5,2017-10-25T16:23:42Z,2018-05-28T20:47:31Z,2018-05-28T20:47:31Z,OWNER,,I want to understand how the system could perform under load with many concurrent long-running queries. Can we serve these without blocking the event loop?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268469569,MDU6SXNzdWUyNjg0Njk1Njk=,39,Protect against malicious SQL that causes damage even though our DB is immutable,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-10-25T16:44:27Z,2021-08-17T23:52:07Z,2017-11-05T02:53:47Z,OWNER,,"I’m currently operating under the assumption that it’s safe to allow arbitrary SQL statements because we are dealing with an immutable database. But this might not be the case - there are some pretty weird SQLite language extensions (ATTACH, PRAGMA etc) and I’m not certain they cannot be used to break things in a way that would affect future requests to the API. Solution: provide a “safe mode” option which disables the ?sql= mechanism. This still leaves the URL filter lookups, so I need to make sure that those are “safe”. 
In the future I may also implement a whitelist option where datasets can be configured to only allow specific filters against specific columns.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268470572,MDU6SXNzdWUyNjg0NzA1NzI=,40,Implement command-line tool interface,9599,simonw,closed,0,,,2857392,Ship first public release,11,2017-10-25T16:47:15Z,2017-11-11T07:27:33Z,2017-11-11T07:27:33Z,OWNER,,"The first version needs to take one or more file names or URLs, then generate and deploy an app to Now. It will assume you already have the now command installed and configured.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268590777,MDU6SXNzdWUyNjg1OTA3Nzc=,41,Homepage should show summary of databases,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-26T00:18:11Z,2017-10-27T04:05:35Z,2017-10-27T04:05:35Z,OWNER,,"Each database should have a name, optional description, download link and a summary of the tables Flights.db Flights and suchlike blah. URL? License? 577373 rows across 14 tables airports, routes, airlines... Title of the homepage is derived from the databases or can be manually overridden e.g. “Datasets of Flights, NHS, Blah...” - or if only one database just the title of that.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/41/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268591332,MDU6SXNzdWUyNjg1OTEzMzI=,42,Homepage UI for editing metadata file,9599,simonw,closed,0,,,,,4,2017-10-26T00:22:03Z,2017-12-10T03:02:14Z,2017-12-10T03:02:14Z,OWNER,,"Since we are going to have a metadata file which sets the title/description/etc for each database, why not allow you to run the app in —dev mode which makes the homepage into a WYSIWYG editor that can save to that file format.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268592894,MDU6SXNzdWUyNjg1OTI4OTQ=,43,"While running, server should spot new db files added to its directory ",9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-26T00:32:37Z,2017-11-14T08:25:53Z,2017-11-14T08:25:37Z,OWNER,,"Maybe in each request it checks the time and if 5s has elapsed since it last scanned the directory it scans it again This would allow people with dedicated hosting to run the app there and just upload new datasets whenever they want. 
It would also be very convenient for development.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 269731374,MDU6SXNzdWUyNjk3MzEzNzQ=,44,?_group_count=country - return counts by specific column(s),9599,simonw,closed,0,,,,,7,2017-10-30T19:50:32Z,2018-04-26T15:09:58Z,2018-04-26T15:09:58Z,OWNER,,"Imagine if this: https://stateless-datasets-jykibytogk.now.sh/flights-07d1283/airports.jsono?country__contains=gu&_group_count=country Turned into this: https://stateless-datasets-jykibytogk.now.sh/flights-07d1283?sql=select%20country,%20count(*)%20as%20group_count_country%20from%20airports%20where%20country%20like%20%27%gu%%27%20group%20by%20country%20order%20by%20group_count_country%20desc This would involve introducing a new precedent of query string arguments that start with an _ having special meanings. While we're at it, could try adding _fields=x,y,z Tasks: - [x] Get initial version working - [ ] Refactor code to not just ""pretend to be a view"" - [ ] Get foreign key relationships expanded",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 271242824,MDU6SXNzdWUyNzEyNDI4MjQ=,45,Run SQLite operations in a thread pool,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-05T02:27:12Z,2017-11-05T02:27:34Z,2017-11-05T02:27:33Z,OWNER,,"Let's run SQLite operations in threads, so we don't end up blocking our core event loop. These articles are helpful: * https://pymotw.com/3/asyncio/executors.html * https://marlinux.wordpress.com/2017/05/19/python-3-6-asyncio-sqlalchemy/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/45/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 271301468,MDU6SXNzdWUyNzEzMDE0Njg=,46,Dockerfile should build more recent SQLite with FTS5 and spatialite support,9599,simonw,closed,0,,,,,13,2017-11-05T18:16:22Z,2017-11-17T14:32:12Z,2017-11-17T14:32:12Z,OWNER,,"The SQLite bundled with Python 3 doesn't support the FTS5 search extension. It would be nice if the SQLite built by our Dockerfile could support as many modern SQLite features as possible. https://web.archive.org/web/20170212034155/http://charlesleifer.com/blog/using-the-sqlite-json1-and-fts5-extensions-with-python/ has instructions on building a more recent SQLite and the pysqlite package. 
Our Dockerfile could carry out an updated version of this process.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/46/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 271831408,MDU6SXNzdWUyNzE4MzE0MDg=,47,Create neat example database,9599,simonw,closed,0,,,,,5,2017-11-07T13:29:38Z,2017-11-14T03:08:13Z,2017-11-14T03:08:13Z,OWNER,,How about data from open elections eg https://github.com/openelections/openelections-data-ca?files=1,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/47/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 272391665,MDU6SXNzdWUyNzIzOTE2NjU=,48,Switch to ujson,9599,simonw,closed,0,,,,,4,2017-11-08T23:50:29Z,2019-06-24T06:57:54Z,2019-06-24T06:57:43Z,OWNER,,"ujson is already a dependency of Sanic, and should be quite a bit faster.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 272661336,MDU6SXNzdWUyNzI2NjEzMzY=,49,Pick a name,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-09T17:56:17Z,2017-11-10T18:33:22Z,2017-11-10T18:33:22Z,OWNER,,"Options so far: * immutabase * datasite * sqlstatic * dbserve * sqlserve Terms to play with: * immutable * sqlite * dataset * json * static * serve",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/49/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 272694136,MDU6SXNzdWUyNzI2OTQxMzY=,50,Unit tests against application itself,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-09T19:31:49Z,2017-11-11T22:23:22Z,2017-11-11T22:23:22Z,OWNER,,"Use Sanic’s testing mechanism. Test should create a temporary SQLite database file on disk by executing sql that is stored in the test themselves. 
For the moment we can just test the JSON API more thoroughly and just sanity check that the HTML output doesn’t throw any errors.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/50/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 272735257,MDU6SXNzdWUyNzI3MzUyNTc=,51,Make a proper README,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-09T21:46:07Z,2017-11-13T18:44:23Z,2017-11-13T18:44:23Z,OWNER,,Include instructions on building a local Docker container - currently detailed here: https://gist.github.com/simonw/0ea5c960608c2d876e4637a5e48aa95d (those instructions don't work now that we have removed the Dockerfile in favour of a template generated by `datasette publish`),107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/51/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273026602,MDU6SXNzdWUyNzMwMjY2MDI=,52,Solution for temporarily uploading DB so it can be built by docker,9599,simonw,closed,0,,,,,2,2017-11-10T18:55:25Z,2017-12-10T03:02:57Z,2017-12-10T03:02:57Z,OWNER,,For the `datasette publish` command I ideally need a way of uploading the specified DB to somewhere temporary on the internet so that when the Dockerfile is built by the final hosting location it can download that database as part of the build process.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/52/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273054652,MDU6SXNzdWUyNzMwNTQ2NTI=,53,Implement a better database index page,9599,simonw,closed,0,,,2857392,Ship first public release,3,2017-11-10T20:47:36Z,2017-11-12T21:19:33Z,2017-11-12T01:50:27Z,OWNER,,"This view isn't great. I should do a better job of separating out tables from views and indexes, showing the count of rows in each table, and maybe move the SQL to the individual table pages. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/53/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273121803,MDU6SXNzdWUyNzMxMjE4MDM=,54,Views should not attempt to link to records / use rowids,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-11T05:44:54Z,2017-11-12T21:29:42Z,2017-11-12T21:29:33Z,OWNER,,"http://localhost:8001/parlgov-development-25f9855/view_variable ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/54/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273127117,MDU6SXNzdWUyNzMxMjcxMTc=,55,Ship first version to PyPI,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-11T07:38:48Z,2017-11-13T21:19:43Z,2017-11-13T21:19:43Z,OWNER,,"Just before doing this, update the Dockerfile template to `pip install datasette` https://github.com/simonw/datasette/blob/65e350ca2a4845c25752a62c16ba58cfe2c14b9b/datasette/utils.py#L125",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273127443,MDU6SXNzdWUyNzMxMjc0NDM=,56,Easy way to block search engine crawling in robots.txt,9599,simonw,closed,0,,,,,1,2017-11-11T07:46:07Z,2018-05-28T20:50:25Z,2018-05-28T20:50:24Z,OWNER,,For people who don't want their datasets to be crawled by search engines.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/56/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273127694,MDU6SXNzdWUyNzMxMjc2OTQ=,57,Ship a Docker image of the whole thing,9599,simonw,closed,0,,,,,7,2017-11-11T07:51:28Z,2018-06-28T04:01:51Z,2018-06-28T04:01:38Z,OWNER,,"The generated Docker images can then just inherit from that. This will speed up deploys as no need to `pip install` anything. - [x] Ship that image to Docker Hub - [ ] Update the generated Dockerfile to use it",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/57/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273128608,MDU6SXNzdWUyNzMxMjg2MDg=,58,"publish command should detect if ""now"" is installed",9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-11T08:10:17Z,2017-11-11T16:00:07Z,2017-11-11T16:00:07Z,OWNER,,"If now is not installed, it should tell you where to get it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273157085,MDU6SXNzdWUyNzMxNTcwODU=,59,datasette publish hyper,9599,simonw,closed,0,,,,,4,2017-11-11T16:27:26Z,2019-05-13T19:01:00Z,2019-05-13T19:00:44Z,OWNER,,"This is a bit tricky, because unlike Now there doesn't seem to be a way to tell Hyper to ""build this Dockerfile and deploy the resulting image"". They expect you to build a container and publish it to a registry instead. 
https://docs.hyper.sh/Reference/CLI/load.html allows you to publish an image directly from a tarball, but that still leaves the challenge of creating that image. The nice thing about the Now integration is that you don't need to have Docker installed on your local machine.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/59/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273163905,MDU6SXNzdWUyNzMxNjM5MDU=,60,Rethink how metadata is generated and stored,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-11T18:01:28Z,2017-11-11T20:12:17Z,2017-11-11T20:12:16Z,OWNER,,"I broke the existing mechanism in 407795b61217205625f2d4e084afbf69f1db781b In order to get unit tests for the sanic app working. I think i should ditch the build-metadata.json cache file entirely and calculate the SHA hashes on startup. Not sure what to do about the table row counts.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/60/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273173116,MDU6SXNzdWUyNzMxNzMxMTY=,61,Common header and footer,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-11T20:20:08Z,2017-11-11T20:37:19Z,2017-11-11T20:37:19Z,OWNER,,"Split from #16 - [x] A link to the homepage from some kind of navigation bar in the header - [x] link to github.com/simonw/datasette in the footer - [x] Slightly better titles (maybe ditch the visited link colours for titles only? should keep those for primary key links)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273174397,MDU6SXNzdWUyNzMxNzQzOTc=,62,Link to .json and .jsono versions on various pages,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-11T20:37:47Z,2017-11-11T22:41:06Z,2017-11-11T22:41:06Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273174447,MDU6SXNzdWUyNzMxNzQ0NDc=,63,Review design of JSON output,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-11T20:38:33Z,2017-11-11T22:20:17Z,2017-11-11T22:20:17Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/63/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273181020,MDU6SXNzdWUyNzMxODEwMjA=,64,Support for ?field__isnull=1 or similar,9599,simonw,closed,0,,,,,1,2017-11-11T22:26:52Z,2017-11-17T14:38:21Z,2017-11-17T14:38:21Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/64/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273191608,MDU6SXNzdWUyNzMxOTE2MDg=,65,Re-implement ?sql= mode,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-12T01:47:17Z,2017-11-12T02:36:37Z,2017-11-12T02:35:42Z,OWNER,,"Here's the code I 
removed: async def data(self, request, name, hash): sql = 'select * from sqlite_master' custom_sql = False params = {} if request.args.get('sql'): params = request.raw_args sql = params.pop('sql') validate_sql_select(sql) custom_sql = True rows = await self.execute(name, sql, params) columns = [r[0] for r in rows.description] return { 'database': name, 'rows': rows, 'columns': columns, 'query': { 'sql': sql, 'params': params, } }, { 'database_hash': hash, 'custom_sql': custom_sql, } ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/65/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273191806,MDU6SXNzdWUyNzMxOTE4MDY=,66,Show table SQL on table page,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-12T01:51:23Z,2017-11-12T21:17:29Z,2017-11-12T21:17:29Z,OWNER,,"Let's do the SQL for the table you are looking at, plus SQL for any indexes that mention that table. The page for a view should show the SQL for that view.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/66/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273192789,MDU6SXNzdWUyNzMxOTI3ODk=,67,Command that builds a local docker container,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-12T02:13:29Z,2017-11-13T16:17:52Z,2017-11-13T16:17:52Z,OWNER,,Be nice to indicate that this isn't just for Now. Shouldn't be too hard either.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/67/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273247186,MDU6SXNzdWUyNzMyNDcxODY=,68,Support for title/source/license metadata,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-12T17:04:21Z,2017-12-04T04:55:43Z,2017-11-13T15:26:11Z,OWNER,,"I've decided this is important for launch: I want to set a precedent for people citing, licensing and documenting their datasets. Not sure how best to go about supporting this. I'd like to allow for the following data to be optionally attached to any given database: - Title - Description, potentially in markdown? - Original source URL - License I'd also like the ability to attach descriptions to individual tables - and maybe even to table columns? The question then becomes: how should this information be stored. A few options: - In the SQLite database itself, in a specially named table. Problem here is that this means having to modify SQLite databases before publishing them. - In a separate SQLite database that can be published alongside the databases we are publishing. - In a JSON file. This is neat, but JSON files are not a great editing experience once you start including multiple lines (e.g. a markdown description). - In a YAML file. This is a better format for multi-line descriptions, but still isn't a great editing experience. 
Whatever the format, it can be made much more usable by offering a web-based editing UI for populating it (a special mode the server can be run in).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/68/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273248366,MDU6SXNzdWUyNzMyNDgzNjY=,69,Enforce pagination (or at least limits) for arbitrary custom SQL,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-12T17:21:33Z,2017-11-13T20:32:47Z,2017-11-13T19:35:47Z,OWNER,,"It's way too easy to accidentally trigger a page that returns 100,000 rows at the moment. I need to use the LIMIT clause on views and custom SQL - I can support pagination ""next"" links using offset as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273267081,MDU6SXNzdWUyNzMyNjcwODE=,70,Paginate views using OFFSET/LIMIT,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-12T21:30:29Z,2017-11-13T21:11:01Z,2017-11-13T21:11:01Z,OWNER,,"As with #69 these should obey a maximum offset setting, which can be over-ridden.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/70/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273278840,MDU6SXNzdWUyNzMyNzg4NDA=,71,Set up some example datasets on a Cloudflare-backed domain,9599,simonw,closed,0,,,2857392,Ship first public release,10,2017-11-13T00:06:30Z,2017-11-13T02:09:34Z,2017-11-13T02:09:34Z,OWNER,,"To better demonstrate the caching and HTTP/2 features, I'd like to go live with some demos that are hosted behind Cloudflare. - [x] Redirect https://datasettes.com/ and https://www.datasettes.com/ to https://github.com/simonw/datasette - [x] Have `now domain add -e datasettes.com` run without errors (hopefully just a matter of waiting for the DNS to update) - [x] Alias an example dataset hosted on Now on a datasettes.com subdomain - [x] Confirm that HTTP caching and HTTP/2 redirect pushing works as expected - this may require another page rule",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273283166,MDU6SXNzdWUyNzMyODMxNjY=,72,publish command should take an optional --name argument,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-13T00:59:35Z,2017-11-13T02:12:27Z,2017-11-13T02:12:27Z,OWNER,,"To set the directory name so that now will inherit it as the name of the app. 
Defaults to datasette ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/72/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273296178,MDU6SXNzdWUyNzMyOTYxNzg=,73,_nocache=1 query string option for use with sort-by-random,9599,simonw,closed,0,,,,,2,2017-11-13T02:57:10Z,2018-05-28T17:25:15Z,2018-05-28T17:25:15Z,OWNER,,The one place where we wouldn’t want caching is if we have something which uses sort by random to return random items. We can offer a _nocache=1 querystring argument to support this.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/73/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273296684,MDU6SXNzdWUyNzMyOTY2ODQ=,74,Send a 302 redirect to the new hash for hits to old hashes,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-13T03:00:59Z,2017-11-13T18:49:59Z,2017-11-13T18:49:59Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/74/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273509159,MDU6SXNzdWUyNzM1MDkxNTk=,75,Add --cors argument to serve,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-13T17:16:19Z,2017-11-13T18:17:52Z,2017-11-13T18:17:52Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/75/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273510781,MDU6SXNzdWUyNzM1MTA3ODE=,76,publish should have required argument specifying publisher,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-13T17:21:26Z,2017-11-13T18:41:01Z,2017-11-13T18:41:01Z,OWNER,,Initially the only argument will be “now” - but “hyper” can be added in the future,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/76/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273537940,MDU6SXNzdWUyNzM1Mzc5NDA=,77,Add Travis CI badge to README,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-13T18:52:25Z,2017-11-13T21:24:15Z,2017-11-13T21:24:15Z,OWNER,,"Also fix this newline issue: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/77/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273554949,MDU6SXNzdWUyNzM1NTQ5NDk=,78,Rename after to next and provide a next_url,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-13T19:48:31Z,2017-11-13T20:35:03Z,2017-11-13T20:35:03Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/78/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273569068,MDU6SXNzdWUyNzM1NjkwNjg=,79,Add more detailed API documentation to the README,9599,simonw,closed,0,,,,,3,2017-11-13T20:36:21Z,2018-05-28T17:24:48Z,2018-05-28T17:24:48Z,OWNER,,"Need to document: - [ ] The 
?column__gt=4 style filter arguments for tables - [ ] The ?sql= API, and how named parameters work - [ ] How API pagination works - [ ] How redirects and cache headers work",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273569477,MDU6SXNzdWUyNzM1Njk0Nzc=,80,Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination),9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-13T20:37:46Z,2017-11-13T22:09:46Z,2017-11-13T22:09:46Z,OWNER,,Final versions should be deployed using the first released version of datasette.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/80/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273595473,MDExOlB1bGxSZXF1ZXN0MTUyMzYwNzQw,81,:fire: Removes DS_Store,50527,jefftriplett,closed,0,,,,,2,2017-11-13T22:07:52Z,2017-11-14T02:24:54Z,2017-11-13T22:16:55Z,CONTRIBUTOR,simonw/datasette/pulls/81,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/81/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273596159,MDU6SXNzdWUyNzM1OTYxNTk=,82,Post a blog entry announcing it to the world,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-13T22:10:35Z,2017-11-14T01:46:10Z,2017-11-14T01:46:10Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/82/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273626815,MDU6SXNzdWUyNzM2MjY4MTU=,83,Individual row view is broken,9599,simonw,closed,0,,,,,0,2017-11-14T00:29:11Z,2017-11-14T00:45:34Z,2017-11-14T00:45:34Z,OWNER,,"https://parlgov.datasettes.com/parlgov-25f9855/viewcalc_parliament_composition/18 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/83/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273660425,MDU6SXNzdWUyNzM2NjA0MjU=,84,datasette package --metadata does not work with a relative path,9599,simonw,closed,0,,,,,0,2017-11-14T04:00:50Z,2017-11-15T05:18:35Z,2017-11-15T05:18:35Z,OWNER,," $ datasette package ~/parlgov-db/parlgov.db --metadata=~/parlgov-db/parlgov.json Usage: datasette package [OPTIONS] FILES... 
Error: Invalid value for ""-m"" / ""--metadata"": Could not open file: ~/parlgov-db/parlgov.json: No such file or directory simonw-07542:~ simonw$ cd ~/parlgov-db/ simonw-07542:parlgov-db simonw$ datasette package ~/parlgov-db/parlgov.db --metadata=parlgov.json Sending build context to Docker daemon 4.46MB Step 1/7 : FROM python:3 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/84/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273678673,MDU6SXNzdWUyNzM2Nzg2NzM=,85,Detect foreign keys and use them to link HTML pages together,9599,simonw,closed,0,,,2919870,Foreign key edition,6,2017-11-14T06:12:05Z,2017-11-19T06:08:19Z,2017-11-19T06:08:19Z,OWNER,,"https://stackoverflow.com/a/44430157/6083 documents the PRAGMA needed to extract foreign key references for a table. At a minimum we can link column values known to be foreign keys to the corresponding row page. We could try to summarize the linked row in some way too - somehow extracting a sensible link title, maybe based on additional configuration in the metadata.json file. Still todo: - [x] Fix it to csvs-to-sqlite refactoring command correctly creates primary key on generated tables - [x] Ship new csvs-to-sqlite with refactoring command - [x] Refactor column logic to be more predictable in our templates (the rowid special case) - [x] Mechanism by which table metadata can specify the ""label"" column for a table - [x] Automatically set the label column as the first column that isn't a primary key (falling back on primary key) - [x] Code which runs a ""select id, label from table where id in (...)"" query as part of the tableview and populates a lookup dictionary - [x] Modify templates to use values from that lookup dictionary",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/85/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273703829,MDU6SXNzdWUyNzM3MDM4Mjk=,86,Filter UI on table page,9599,simonw,closed,0,,,2919870,Foreign key edition,10,2017-11-14T08:22:43Z,2017-11-23T20:34:32Z,2017-11-23T20:34:32Z,OWNER,,A UI for building up simple table queries by adding additional filter rules that get executed as query parameters in the URL.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/86/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273709194,MDU6SXNzdWUyNzM3MDkxOTQ=,87,Configure Travis to release new tags to PyPI,9599,simonw,closed,0,,,,,1,2017-11-14T08:44:08Z,2018-07-10T17:49:13Z,2018-07-10T17:49:12Z,OWNER,,https://docs.travis-ci.com/user/deployment/pypi/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/87/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273775212,MDU6SXNzdWUyNzM3NzUyMTI=,88,Add NHS England Hospitals example to wiki,15543,tomdyson,closed,0,,,,,4,2017-11-14T12:29:10Z,2021-03-22T23:46:36Z,2017-11-14T22:54:06Z,CONTRIBUTOR,,"https://nhs-england-hospitals.now.sh and an associated map visualisation: http://run.plnkr.co/preview/cj9zlf1qc0003414y90ajkwpk/ Datasette is wonderful! 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/88/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273816720,MDExOlB1bGxSZXF1ZXN0MTUyNTIyNzYy,89,SQL syntax highlighting with CodeMirror,15543,tomdyson,closed,0,,,,,1,2017-11-14T14:43:33Z,2017-11-15T02:03:01Z,2017-11-15T02:03:01Z,CONTRIBUTOR,simonw/datasette/pulls/89,"Addresses #13 Future enhancements could include autocompletion of table and column names, e.g. with ```javascript extraKeys: {""Ctrl-Space"": ""autocomplete""}, hintOptions: {tables: { users: [""name"", ""score"", ""birthDate""], countries: [""name"", ""population"", ""size""] }} ``` (see https://codemirror.net/doc/manual.html#addon_sql-hint and source at http://codemirror.net/mode/sql/)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/89/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273846123,MDU6SXNzdWUyNzM4NDYxMjM=,90,datasette publish heroku,9599,simonw,closed,0,,,,,8,2017-11-14T16:01:39Z,2017-12-10T03:06:34Z,2017-12-10T03:05:48Z,OWNER,,"Heroku has Docker container support so this should not be too hard: https://devcenter.heroku.com/articles/container-registry-and-runtime See also #59 This should work exactly like the existing “datasette publish now....” command except it would be “datasette publish heroku...”",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/90/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273878873,MDU6SXNzdWUyNzM4Nzg4NzM=,91,"Option to serve databases from a different prefix, serve regular content elsewhere",9599,simonw,closed,0,,,,,1,2017-11-14T17:32:46Z,2017-12-10T03:07:58Z,2017-12-10T03:07:53Z,OWNER,,"It would be useful if the databases themselves could be served from a prefix e.g. datasette serve mydb.db --path-prefix=db Now my database is at `http://localhost:8001/db/mydb-23423` This would free up the rest of the URL namespace for other things. Maybe we could have an option to serve static content from a known folder e.g. datasette serve mydb.db --path-prefix=db --root-content=~/my-project/static Now a hit to `http://localhost:8001/news/` serves content from `~/my-project/static/news/index.html` This would make it trivial to package up entire HTML/CSS/JS apps with one or more underlying SQLite databases. Running without `--cors` would be fine here because any JS apps would be hosted on the same origin.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/91/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273895344,MDU6SXNzdWUyNzM4OTUzNDQ=,92,Add --license --license_url --source --source_url --title arguments to datasette publish,9599,simonw,closed,0,,,,,0,2017-11-14T18:27:07Z,2017-11-15T05:04:41Z,2017-11-15T05:04:41Z,OWNER,,"I keep on using the `echo '{""source"": ""...""}' | datasette publish now --metadata=-` pattern, which suggests it makes sense for us to support these as optional arguments. 
https://gist.github.com/simonw/9f8bf23b37a42d7628c4dcc4bba10253",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/92/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273944952,MDU6SXNzdWUyNzM5NDQ5NTI=,93,Package as standalone binary,67420,atomotic,closed,0,,,,,18,2017-11-14T21:14:07Z,2021-11-21T07:00:23Z,2021-11-21T07:00:23Z,NONE,,"hint: more than the docker image a standalone and multiplatform binary (containing the app and the database) could be simpler to distribute. i would like to investigate the possibility to package everything with [pyinstaller](http://www.pyinstaller.org/) adding the database as a [data file](https://pythonhosted.org/PyInstaller/spec-files.html#adding-data-files)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/93/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273961179,MDExOlB1bGxSZXF1ZXN0MTUyNjMxNTcw,94,Initial add simple prod ready Dockerfile refs #57,247192,macropin,closed,0,,,,,1,2017-11-14T22:09:09Z,2017-11-15T03:08:04Z,2017-11-15T03:08:04Z,CONTRIBUTOR,simonw/datasette/pulls/94,"Multi-stage build based off official python:3.6-slim Example usage: ``` docker run --rm -t -i -p 9000:8001 -v $(pwd)/db:/db datasette datasette serve /db/chinook.db ```",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/94/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273998513,MDU6SXNzdWUyNzM5OTg1MTM=,95,Allow shorter time limits to be set using a ?_sql_time_limit_ms =20 query string limit,9599,simonw,closed,0,,,,,1,2017-11-15T01:02:16Z,2017-11-15T02:56:13Z,2017-11-15T02:56:13Z,OWNER,,This cannot be greater than the configured time limit.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/95/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274001453,MDU6SXNzdWUyNzQwMDE0NTM=,96,UI for editing named parameters,9599,simonw,closed,0,,,,,3,2017-11-15T01:19:21Z,2017-11-16T01:45:51Z,2017-11-16T01:33:38Z,OWNER,,"On any page displaying a custom query that includes named parameters, we should show HTML form fields for editing those parameters. 
Eg the breed parameter on https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=pug",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/96/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274022950,MDU6SXNzdWUyNzQwMjI5NTA=,97,Link to JSON for the list of tables ,9599,simonw,closed,0,,,,,3,2017-11-15T03:29:05Z,2018-05-29T18:51:35Z,2018-05-28T20:57:21Z,OWNER,,https://twitter.com/yschimke/status/930606210855854080,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/97/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274023417,MDU6SXNzdWUyNzQwMjM0MTc=,98,Default to 127.0.0.1 not 0.0.0.0,9599,simonw,closed,0,,,,,0,2017-11-15T03:31:55Z,2017-11-15T05:08:54Z,2017-11-15T05:08:54Z,OWNER,,https://twitter.com/yschimke/status/930606210855854080,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/98/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274023625,MDU6SXNzdWUyNzQwMjM2MjU=,99,Start a change log,9599,simonw,closed,0,,,,,0,2017-11-15T03:33:21Z,2017-11-16T15:12:46Z,2017-11-16T15:12:45Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/99/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274160723,MDU6SXNzdWUyNzQxNjA3MjM=,100,TemplateAssertionError: no filter named 'tojson',13304454,coisnepe,closed,0,,,,,2,2017-11-15T13:43:41Z,2017-11-16T09:25:10Z,2017-11-16T00:14:13Z,NONE,,"A 500 error is raised upon clicking on the name of a table on the homepage, say _http://0.0.0.0:8001/_ to _http://0.0.0.0:8001/test_check-c1f4771/users_ The API part seems to function as intended, though... 
``` 2017-11-15 14:33:57 - (sanic)[ERROR]: Traceback (most recent call last): File ""/usr/local/lib/python3.5/dist-packages/sanic/app.py"", line 503, in handle_request response = await response File ""/usr/local/lib/python3.5/dist-packages/datasette/app.py"", line 155, in get return await self.view_get(request, name, hash, **kwargs) File ""/usr/local/lib/python3.5/dist-packages/datasette/app.py"", line 219, in view_get **context, File ""/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py"", line 84, in render return html(self.render_string(template, request, **context)) File ""/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py"", line 81, in render_string return self.env.get_template(template).render(**context) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 812, in get_template return self._load_template(name, self.make_globals(globals)) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 786, in _load_template template = self.loader.load(self, name, globals) File ""/usr/lib/python3/dist-packages/jinja2/loaders.py"", line 125, in load code = environment.compile(source, name, filename) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 565, in compile self.handle_exception(exc_info, source_hint=source_hint) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 754, in handle_exception reraise(exc_type, exc_value, tb) File ""/usr/lib/python3/dist-packages/jinja2/_compat.py"", line 37, in reraise raise value.with_traceback(tb) File ""/usr/local/lib/python3.5/dist-packages/datasette/templates/table.html"", line 29, in template
params = {{ query.params|tojson(4) }}File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 515, in _generate return generate(source, self, name, filename, defer_init=defer_init) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 62, in generate generator.visit(node) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 849, in visit_Template self.blockvisit(block.body, block_frame) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1172, in visit_If self.blockvisit(node.body, if_frame) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1353, in visit_Output self.visit(argument, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1565, in visit_Filter self.fail('no filter named %r' % node.name, node.lineno) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 427, in fail raise TemplateAssertionError(msg, lineno, self.name, self.filename) jinja2.exceptions.TemplateAssertionError: no filter named 'tojson' 2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/test_check-c1f4771/users 500 144 2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/favicon.ico 200 0 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274161964,MDU6SXNzdWUyNzQxNjE5NjQ=,101,TemplateAssertionError: no filter named 'tojson',450244,eaubin,closed,0,,,,,1,2017-11-15T13:47:32Z,2017-11-15T13:48:55Z,2017-11-15T13:48:55Z,NONE,,"I get an exception clicking on the table link: ``` 2017-11-15 08:40:10 - (sanic)[ERROR]: Traceback (most recent call last): File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic/app.py"", line 503, in handle_request response = await response File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/app.py"", line 155, in get return await self.view_get(request, name, hash, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/app.py"", line 219, in view_get **context, File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic_jinja2/__init__.py"", line 84, in render return html(self.render_string(template, request, **context)) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic_jinja2/__init__.py"", line 81, in render_string return self.env.get_template(template).render(**context) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 812, in get_template return self._load_template(name, self.make_globals(globals)) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 786, in _load_template template = self.loader.load(self, name, globals) File 
""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/loaders.py"", line 125, in load code = environment.compile(source, name, filename) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 565, in compile self.handle_exception(exc_info, source_hint=source_hint) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 754, in handle_exception reraise(exc_type, exc_value, tb) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/_compat.py"", line 37, in reraise raise value.with_traceback(tb) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/templates/table.html"", line 29, in template
params = {{ query.params|tojson(4) }}File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 515, in _generate return generate(source, self, name, filename, defer_init=defer_init) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 62, in generate generator.visit(node) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 849, in visit_Template self.blockvisit(block.body, block_frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1172, in visit_If self.blockvisit(node.body, if_frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1353, in visit_Output self.visit(argument, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1565, in visit_Filter self.fail('no filter named %r' % node.name, node.lineno) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 427, in fail raise TemplateAssertionError(msg, lineno, self.name, self.filename) jinja2.exceptions.TemplateAssertionError: no filter named 'tojson' ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/101/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274264175,MDU6SXNzdWUyNzQyNjQxNzU=,102,datasette publish elasticbeanstalk,9599,simonw,closed,0,,,,,1,2017-11-15T18:48:31Z,2021-01-04T20:13:20Z,2021-01-04T20:13:19Z,OWNER,,"It looks like Elastic Beanstalk is the most convenient way to deploy a docker container to AWS without first deploying a cluster. https://aws.amazon.com/blogs/devops/dockerizing-a-python-web-app/ looks helpful. 
We would need to automate the deployment with Boto: http://boto3.readthedocs.io/en/latest/reference/services/elasticbeanstalk.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/102/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274265878,MDU6SXNzdWUyNzQyNjU4Nzg=,103,datasette publish appengine,9599,simonw,closed,0,,,,,1,2017-11-15T18:54:18Z,2021-01-04T20:05:14Z,2021-01-04T20:05:14Z,OWNER,,"Similar approach to Heroku, discussed in #90 Looks like this could be pretty easy: https://cloud.google.com/appengine/docs/flexible/python/quickstart",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274284246,MDExOlB1bGxSZXF1ZXN0MTUyODcwMDMw,104,[WIP] Add publish to heroku support,21148,jacobian,closed,0,,,,,6,2017-11-15T19:56:22Z,2017-11-21T20:55:05Z,2017-11-21T20:55:05Z,CONTRIBUTOR,simonw/datasette/pulls/104," Refs #90 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274314940,MDU6SXNzdWUyNzQzMTQ5NDA=,105,Consider data-package as a format for metadata,9599,simonw,closed,0,,,,,4,2017-11-15T21:43:34Z,2017-11-20T19:50:53Z,2017-11-20T19:50:53Z,OWNER,,http://frictionlessdata.io/specs/data-package/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274315193,MDU6SXNzdWUyNzQzMTUxOTM=,106,Document how pagination works,9599,simonw,closed,0,,,,,1,2017-11-15T21:44:32Z,2019-06-24T06:42:33Z,2019-06-24T06:42:33Z,OWNER,,I made a start at that in this comment: https://news.ycombinator.com/item?id=15691926,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274343647,MDExOlB1bGxSZXF1ZXN0MTUyOTE0NDgw,107,add support for ?field__isnull=1,3433657,raynae,closed,0,,,,,4,2017-11-15T23:36:36Z,2017-11-17T15:12:29Z,2017-11-17T13:29:22Z,CONTRIBUTOR,simonw/datasette/pulls/107,Is this what you had in mind for [this issue](https://github.com/simonw/datasette/issues/64)?,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/107/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274374317,MDU6SXNzdWUyNzQzNzQzMTc=,108,"Include version in python code, output in template",9599,simonw,closed,0,,,,,0,2017-11-16T02:32:40Z,2017-11-16T15:30:04Z,2017-11-16T15:30:04Z,OWNER,,It would be useful if I could tell which version of datasette was running on a site. 
Embed version number and include it in maybe a tooltip on the “powered by datasette” link,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274378301,MDU6SXNzdWUyNzQzNzgzMDE=,109,Set up readthedocs,9599,simonw,closed,0,,,,,1,2017-11-16T02:58:01Z,2017-11-16T16:53:26Z,2017-11-16T16:13:56Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/109/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274578142,MDU6SXNzdWUyNzQ1NzgxNDI=,110,Add --load-extension option to datasette for loading extra SQLite extensions,9599,simonw,closed,0,,,,,2,2017-11-16T16:26:19Z,2017-11-16T18:38:30Z,2017-11-16T16:58:50Z,OWNER,,"This would allow users with extra SQLite extensions installed (like spatialite) to load them at runtime. Inspired by this comment: https://github.com/simonw/datasette/issues/46#issuecomment-344810525",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274615452,MDU6SXNzdWUyNzQ2MTU0NTI=,111,Add “updated” to metadata,9599,simonw,open,0,,,,,12,2017-11-16T18:22:20Z,2021-09-21T22:48:27Z,,OWNER,,"To give an indication as to when the data was last updated. This should be a field in the metadata that is then shown on the index page and in the footer, if it is set. Also support setting it using an option to “datasette publish” and “datasette package” - which can either be a string or can be the magic string “today” to set it to today’s date: datasette publish file.db --updated=today",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/111/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 274617240,MDU6SXNzdWUyNzQ2MTcyNDA=,112,Allow --load-extension to be set via environment variables,9599,simonw,closed,0,,,,,1,2017-11-16T18:28:31Z,2017-11-17T14:19:23Z,2017-11-17T14:17:27Z,OWNER,,"This will make it easier to package up datasette in a Docker container with a bunch of pre-compiled extensions without the user having to remember to include all of the options every time. 
Click has a mechanism for this: http://click.pocoo.org/5/options/#multiple-values-from-environment-values",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274662378,MDU6SXNzdWUyNzQ2NjIzNzg=,113,Fix the bug on the database custom SQL query view,9599,simonw,closed,0,,,2919870,Foreign key edition,0,2017-11-16T21:01:26Z,2017-11-17T15:40:52Z,2017-11-17T15:40:52Z,OWNER,,"https://sf-film-locations.now.sh/sf-film-locations-57704b7?sql=select+*+from+Film_Locations_in_San_Francisco This is the bug I fixed in 01e0c3fa18cd0dd7970e208790ffd683a420c924 - but I only fixed it in one place.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/113/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274733145,MDExOlB1bGxSZXF1ZXN0MTUzMjAxOTQ1,114,"Add spatialite, switch to debian and local build",54999,ingenieroariel,closed,0,,,,,1,2017-11-17T02:37:09Z,2017-11-17T03:50:52Z,2017-11-17T03:50:52Z,CONTRIBUTOR,simonw/datasette/pulls/114,"Improves the Dockerfile to support spatial datasets, work with the local datasette code (Friendly with git tags and Dockerhub) and moves to slim debian, a small image easy to extend via apt packages for sqlite.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274877366,MDExOlB1bGxSZXF1ZXN0MTUzMzA2ODgy,115,Add keyboard shortcut to execute SQL query,198537,rgieseke,closed,0,,,,,1,2017-11-17T14:13:33Z,2017-11-17T15:16:34Z,2017-11-17T14:22:56Z,CONTRIBUTOR,simonw/datasette/pulls/115,"Very cool tool, thanks a lot! This PR adds a `Shift-Enter` short cut to execute the SQL query. 
I used CodeMirrors keyboard handling.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/115/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274884209,MDU6SXNzdWUyNzQ4ODQyMDk=,116,Add documentation section about SQLite extensions,9599,simonw,closed,0,,,,,1,2017-11-17T14:36:30Z,2018-05-28T17:23:42Z,2018-05-28T17:23:41Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274900388,MDExOlB1bGxSZXF1ZXN0MTUzMzI0MzAx,117,Don't prevent tabbing to `Run SQL` button,198537,rgieseke,closed,0,,,,,1,2017-11-17T15:27:50Z,2017-11-19T20:30:24Z,2017-11-18T00:53:43Z,CONTRIBUTOR,simonw/datasette/pulls/117,"Mentioned in #115 Here you go!",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/117/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 275048699,MDExOlB1bGxSZXF1ZXN0MTUzNDMyMDQ1,118,Foreign key information on row and table pages,9599,simonw,closed,0,,,,,0,2017-11-18T03:13:27Z,2017-11-18T03:15:57Z,2017-11-18T03:15:50Z,OWNER,simonw/datasette/pulls/118,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/118/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 275082158,MDU6SXNzdWUyNzUwODIxNTg=,119,"Build an ""export this data to google sheets"" plugin",9599,simonw,closed,0,,,,,1,2017-11-18T14:14:51Z,2020-06-04T18:46:40Z,2020-06-04T18:46:39Z,OWNER,,"Inspired by https://github.com/kren1/tosheets It should be a plug-in because I'd like to keep all interactions with proprietary / non-open-source software encapsulated in plugins rather than shipped as part of core.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/119/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275087397,MDU6SXNzdWUyNzUwODczOTc=,120,Plugin that adds an authentication layer of some sort,9599,simonw,closed,0,,,,,4,2017-11-18T15:39:13Z,2020-03-16T18:48:06Z,2020-03-16T18:48:06Z,OWNER,,"Would allow people who want to host private data to do so. .sh ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/120/reactions"", ""total_count"": 7, ""+1"": 5, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",,completed 275089535,MDU6SXNzdWUyNzUwODk1MzU=,121,?_json=foo&_json=bar query string argument ,9599,simonw,closed,0,,,,,4,2017-11-18T16:09:55Z,2018-05-31T13:48:12Z,2018-05-28T18:11:51Z,OWNER,,"Causes the specified columns in the output to be treated as JSON, and returned deserialized in the .json or .jsono response. 
This will be particularly powerful when combined with https://sqlite.org/json1.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/121/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275092453,MDU6SXNzdWUyNzUwOTI0NTM=,122,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead",9599,simonw,closed,0,,,,,5,2017-11-18T16:52:28Z,2018-04-08T14:54:09Z,2018-04-08T14:54:09Z,OWNER,,"I want to support three variants for the rows output: * a list of lists, with a columns key saying what they are * a list of dictionaries * a single dictionary where the keys are the primary keys of the rows and the values are the row dictionaries themselves I also want to make the various bits of metadata opt-in - so you don't get the SQL statement unless you ask for it. These output options should be controlled by query string arguments. I will set the .jsono URL to redirect to .json with the corresponding options. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/122/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275125561,MDU6SXNzdWUyNzUxMjU1NjE=,123,Datasette serve should accept paths/URLs to CSVs and other file formats,9599,simonw,open,0,,,,,9,2017-11-19T02:05:48Z,2021-07-19T00:04:32Z,,OWNER,,"This would remove the csvs-to-sqlite step which I end up using for almost everything. I'm hesitant to introduce pandas as a required dependency though since it requires compiling numpy. Could build it so this option is only available if you have pandas installed.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/123/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",, 275125805,MDU6SXNzdWUyNzUxMjU4MDU=,124,Option to open readonly but not immutable,9599,simonw,closed,0,,,,,5,2017-11-19T02:11:03Z,2019-06-24T06:43:46Z,2019-06-24T06:43:46Z,OWNER,,Immutable assumes no other process can modify the file. An option to open readonly instead would enable other processes to update the file in place.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/124/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275135393,MDU6SXNzdWUyNzUxMzUzOTM=,125,Plot rows on a map with Leaflet and Leaflet.markercluster,9599,simonw,closed,0,,,,,2,2017-11-19T06:05:05Z,2018-04-26T15:14:31Z,2018-04-26T15:14:31Z,OWNER,,"https://github.com/Leaflet/Leaflet.markercluster would allow us to paginate-load in an enormous set of rows with latitude/longitude points, e.g. 
https://australian-dunnies.now.sh/ Here's a demo of it loading 50,000 markers: https://leaflet.github.io/Leaflet.markercluster/example/marker-clustering-realworld.50000.html - and it looks like it's easy to support progress bars for if we were iteratively loading 1,000 markers at a time using datasette pagination.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/125/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275135535,MDU6SXNzdWUyNzUxMzU1MzU=,126,Blog entry announcing foreign key support,9599,simonw,closed,0,,,2919870,Foreign key edition,1,2017-11-19T06:09:06Z,2017-11-30T16:49:24Z,2017-11-30T16:49:24Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/126/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275135719,MDU6SXNzdWUyNzUxMzU3MTk=,127,"Filtered tables should show count of all matching rows, if fast enough",9599,simonw,closed,0,,,2919870,Foreign key edition,2,2017-11-19T06:13:29Z,2017-11-24T22:02:01Z,2017-11-24T22:02:01Z,OWNER,,"Relates to #86. If you are viewing a filtered page e.g. https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9/bob-ross%2Felements-by-episode?CLOUDS=1 we should show the count of matching rows. Since this could be an expensive operation, we will run it with a strict time limit (maybe 50ms). If the time limit is exceeded we will display ""many"" instead, perhaps? Maybe even link to a count(*) query that would get the full 1000ms time limit which the user can click on if they like (that could even Ajax-in the result).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275159710,MDU6SXNzdWUyNzUxNTk3MTA=,128,"Every visualization should have an ""embed"" button",9599,simonw,open,0,,,,,0,2017-11-19T13:38:13Z,2019-05-13T18:33:51Z,,OWNER,,"At least for the first round of visualizations, any time you construct one using the UI the result should include an ""embed this"" button that returns source code to copy and paste These examples should use unpkg.com (or similarl) urls with SRI hashes, eg https://www.srihash.org - and should load data from the datasette JSON API.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/128/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 275164558,MDU6SXNzdWUyNzUxNjQ1NTg=,129,Hide FTS-created tables by default on the database index page,9599,simonw,closed,0,,,,,2,2017-11-19T14:50:42Z,2017-11-22T20:22:02Z,2017-11-22T20:19:04Z,OWNER,,"SQLite databases that use FTS include a number of automatically generated tables, e.g.: https://sf-trees-search.now.sh/sf-trees-search-a899b92 Of these, only the `Street_Tree_List` table is actually relevant to the user. 
We can detect which tables are FTS tables by first finding the virtual tables: sqlite> .headers on sqlite> select * from sqlite_master where rootpage = 0; type|name|tbl_name|rootpage|sql table|Search|Search|0|CREATE VIRTUAL TABLE ""Street_Tree_List_fts"" USING FTS4 (""qAddress"", ""qCaretaker"", ""qSpecies"") Then parsing the above to figure out which ones are USING FTS? - then assume that any table which starts with that `Street_Tree_List_fts` prefix was created to support search: sqlite> select * from sqlite_master where type='table' and tbl_name like 'Street_Tree_List_fts%'; type|name|tbl_name|rootpage|sql table|Search_content|Search_content|10355|CREATE TABLE 'Street_Tree_List_fts_content'(docid INTEGER PRIMARY KEY, 'c0qAddress', 'c1qCaretaker', 'c2qSpecies') table|Search_segments|Search_segments|10356|CREATE TABLE 'Street_Tree_List_fts_segments'(blockid INTEGER PRIMARY KEY, block BLOB) table|Search_segdir|Search_segdir|10357|CREATE TABLE 'Street_Tree_List_fts_segdir'(level INTEGER,idx INTEGER,start_block INTEGER,leaves_end_block INTEGER,end_block INTEGER,root BLOB,PRIMARY KEY(level, idx)) table|Search_docsize|Search_docsize|10359|CREATE TABLE 'Street_Tree_List_fts_docsize'(docid INTEGER PRIMARY KEY, size BLOB) table|Search_stat|Search_stat|10360|CREATE TABLE 'Street_Tree_List_fts_stat'(id INTEGER PRIMARY KEY, value BLOB) We won't hide these completely - instead, we'll default the database index view to not showing them with a message that says ""5 hidden tables"" and support ?_hidden=1 to display them.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/129/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275166078,MDU6SXNzdWUyNzUxNjYwNzg=,130,"Rename ""datasette build"" to ""datasette inspect""",9599,simonw,closed,0,,,,,0,2017-11-19T15:08:02Z,2017-12-07T16:57:58Z,2017-12-07T16:57:58Z,OWNER,,"This command introspects the databases and writes out a JSON summary. 
I think I'd like to use `datasette build` for something more interesting, potentially duplicating functionality from https://github.com/simonw/csvs-to-sqlite Since the internal method that does this is called `ds.inspect()` that seems like a reasonable replacement name for the command.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/130/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275166669,MDU6SXNzdWUyNzUxNjY2Njk=,131,UI support for running FTS searches,9599,simonw,closed,0,,,,,3,2017-11-19T15:16:20Z,2017-11-19T17:18:05Z,2017-11-19T17:00:12Z,OWNER,,"Here's an example query that searches all FTS indexed columns in a table: https://sf-trees-search.now.sh/sf-trees-search-a899b92?sql=select+*+from+Street_Tree_List+where+rowid+in+%28select+rowid+from+Street_Tree_List_fts+where+Street_Tree_List_fts+match+%27grove+london+dpw%27%29%0D%0A And here's a query that searches a specific column: https://sf-trees-search.now.sh/sf-trees-search-a899b92?sql=select+*+from+Street_Tree_List+where+rowid+in+%28select+rowid+from+Street_Tree_List_fts+where+qSpecies+match+%27london%27%29%0D%0A If we detect that a table has FTS enabled (which we can do by looking for it as a content table reference in another FTS table's create definition) we should add a search box to the table page which constructs this query - maybe using `?_search=XXX` in the query string?
` element is an easy fix: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/505/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 453846217,MDU6SXNzdWU0NTM4NDYyMTc=,506,Option to display binary data,9599,simonw,closed,0,,,,,10,2019-06-08T23:44:12Z,2019-06-11T15:48:27Z,2019-06-09T16:07:39Z,OWNER,,"In #442 we suppressed rendering of binary data: It turns out there is one use-case where displaying binary data is useful: when you're poking around looking at random SQLite databases you find in `~/Library` trying to figure out what they are for. So, a mechanism for opting in to ugly display of binary data again would be useful.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 455486286,MDU6SXNzdWU0NTU0ODYyODY=,26,Mechanism for turning nested JSON into foreign keys / many-to-many,9599,simonw,open,0,,,,,14,2019-06-13T00:52:06Z,2022-06-29T23:35:29Z,,OWNER,,"The GitHub JSON APIs have a really interesting convention with respect to related objects. Consider https://api.github.com/repos/simonw/sqlite-utils/issues - here's a truncated subset: ```json { ""id"": 449818897, ""node_id"": ""MDU6SXNzdWU0NDk4MTg4OTc="", ""number"": 24, ""title"": ""Additional Column Constraints?"", ""user"": { ""login"": ""IgnoredAmbience"", ""id"": 98555, ""node_id"": ""MDQ6VXNlcjk4NTU1"", ""avatar_url"": ""https://avatars0.githubusercontent.com/u/98555?v=4"", ""gravatar_id"": """" }, ""labels"": [ { ""id"": 993377884, ""node_id"": ""MDU6TGFiZWw5OTMzNzc4ODQ="", ""url"": ""https://api.github.com/repos/simonw/sqlite-utils/labels/enhancement"", ""name"": ""enhancement"", ""color"": ""a2eeef"", ""default"": true } ], ""state"": ""open"" } ``` The `user` column lists a complete user. The `labels` column has a list of labels. Since both user and label have populated `id` field this is actually enough information for us to create records for them AND set up the corresponding foreign key (for user) and m2m relationships (for labels). It would be really neat if `sqlite-utils` had some kind of mechanism for correctly processing these kind of patterns. Thanks to `jq` there's not much need for extra customization of the shape here - if we support a narrowly defined structure users can use `jq` to reshape arbitrary JSON to match.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/26/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 455496504,MDU6SXNzdWU0NTU0OTY1MDQ=,27,sqlite-utils create-table command,9599,simonw,closed,0,,,,,8,2019-06-13T01:43:30Z,2020-05-03T15:26:15Z,2020-05-03T15:26:15Z,OWNER,,"Spun off from #24 - it would be useful if CLI users could create new tables (with explicit column types, not null rules and defaults) without having to insert an example record. 
- [x] Get it working - [x] Support `--pk` - [x] Support `--not-null` - [x] Support `--default` - [x] Support `--fk colname othertable othercol` - [x] Support `--replace` and `--ignore` - [x] Documentation",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 455852801,MDU6SXNzdWU0NTU4NTI4MDE=,507,Every datasette plugin on the ecosystem page should have a screenshot,9599,simonw,open,0,,,,,4,2019-06-13T17:02:51Z,2020-09-17T02:47:35Z,,OWNER,,https://github.com/simonw/datasette/blob/master/docs/ecosystem.rst,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/507/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 455965174,MDU6SXNzdWU0NTU5NjUxNzQ=,508,Ability to set default sort order for a table or view in metadata.json,9599,simonw,closed,0,9599,simonw,,,1,2019-06-13T21:40:51Z,2020-05-28T18:53:03Z,2020-05-28T18:53:02Z,OWNER,,"It can go here in the documentation: https://datasette.readthedocs.io/en/stable/metadata.html#setting-which-columns-can-be-used-for-sorting Also need to fix this sentence which is no longer true: > By default, database views in Datasette do not support sorting",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/508/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 455996809,MDU6SXNzdWU0NTU5OTY4MDk=,28,"Rearrange the docs by area, not CLI vs Python",9599,simonw,closed,0,,,,,1,2019-06-13T23:33:35Z,2019-07-15T02:37:20Z,2019-07-15T02:37:20Z,OWNER,,"The docs for eg inserting data should live on the same page, rather than being split across the API and CLI pages.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 456568880,MDU6SXNzdWU0NTY1Njg4ODA=,509,Support opening multiple databases with the same stem,9599,simonw,closed,0,9599,simonw,3268330,Datasette 1.0,4,2019-06-15T19:32:00Z,2020-12-22T20:04:35Z,2020-12-22T20:04:35Z,OWNER,,"e.g. I should be able to do this: datasette App/data.db Other_App/data.db This currently errors because you can't have two databases taking the `/data` URL path. Instead, how about in this particular case assigning the second database `/data-1`?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/509/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 456569067,MDU6SXNzdWU0NTY1NjkwNjc=,510,Ability to facet by delimiter (e.g. comma separated fields),9599,simonw,open,0,9599,simonw,,,1,2019-06-15T19:34:41Z,2019-07-08T15:44:51Z,,OWNER,,"E.g. 
if a field contains ""Tags,With,Commas"" be able to facet them in the same way as `_facet_array=` lets you facet `[""Tags"", ""With"", ""Commas""]`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 456578474,MDU6SXNzdWU0NTY1Nzg0NzQ=,511,Get Datasette tests passing on Windows in GitHub Actions,9599,simonw,open,0,,,,,13,2019-06-15T21:41:58Z,2021-07-11T17:23:05Z,,OWNER,,"This should almost happen as a side-effect or moving from Sanic to Uvicorn during the port to ASGI: #272 Additional steps: - test it manually - update documentation - set up some form of Windows CI ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/511/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 457147936,MDU6SXNzdWU0NTcxNDc5MzY=,512,"""about"" parameter in metadata does not appear when alone",7936571,chrismp,open,0,,,,,3,2019-06-17T21:04:20Z,2019-10-11T15:49:13Z,,NONE,,"Here's an example of metadata I have for one database on datasette. ``` ""Records-requests"": { ""tables"": { ""Some table"": { ""about"": ""This table has data."" } } } ``` The text in `about` does not show up when I publish the data. But it shows up after I add a `""source""` parameter in the metadata. Is this intended?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/512/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 457201907,MDU6SXNzdWU0NTcyMDE5MDc=,513,Is it possible to publish to Heroku despite slug size being too large?,7936571,chrismp,closed,0,,,,,2,2019-06-18T00:12:02Z,2019-06-21T22:35:54Z,2019-06-21T22:35:54Z,NONE,,"I'm trying to push more than 1.5GB worth of SQLite databases -- 535MB compressed -- to Heroku but I get this error when I run the `datasette publish heroku` command. Compiled slug size: 535.5M is too large (max is 500M). Can I publish the databases and make datasette work on Heroku despite the large slug size?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 458941203,MDU6SXNzdWU0NTg5NDEyMDM=,29,Prevent accidental add-foreign-key with invalid column,9599,simonw,closed,0,,,,,0,2019-06-20T23:57:24Z,2019-06-20T23:58:26Z,2019-06-20T23:58:26Z,OWNER,,"You can corrupt your database by running: $ sqlite-utils add-foreign-key my.db table non_existent_column other_table other_column ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459397625,MDU6SXNzdWU0NTkzOTc2MjU=,514,Documentation with recommendations on running Datasette in production without using Docker,7936571,chrismp,closed,0,,,5971510,Datasette 0.50,27,2019-06-21T22:48:12Z,2020-10-08T23:55:53Z,2020-10-08T23:33:05Z,NONE,,"I've got some SQLite databases too big to push to Heroku or the other services with built-in support in datasette. 
So instead I moved my datasette code and databases to a remote server on Kimsufi. In the folder containing the SQLite databases I run the following code. `nohup datasette serve -h 0.0.0.0 *.db --cors --port 8000 --metadata metadata.json > output.log 2>&1 &`. When I go to `http://my-remote-server.com:8000`, the site loads. But I know this is not a good long-term solution to running datasette on this server. What is the ""correct"" way to have this site run, preferably on server port 80?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/514/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459469278,MDU6SXNzdWU0NTk0NjkyNzg=,515,Try shrinking official image with docker-slim,9599,simonw,open,0,,,,,0,2019-06-22T12:25:37Z,2019-06-22T12:25:37Z,,OWNER,,"This looks really promising: https://github.com/docker-slim/docker-slim If it can shave substantial size from our official container reliably we could add it to the automated build process.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/515/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 459509126,MDU6SXNzdWU0NTk1MDkxMjY=,516,Enforce import sort order with isort,9599,simonw,open,0,,,,,8,2019-06-22T20:35:50Z,2023-08-23T02:15:36Z,,OWNER,,"I want to use isort to order imports. A few steps here: - [x] Add a .isort.cfg file (see below) - [x] Use `isort -rc` to reformat existing code - [ ] Commit this change - [x] Add a unit test that ensures future changes remain isort compatible",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/516/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 459537047,MDU6SXNzdWU0NTk1MzcwNDc=,517,"Add unit test for ""static"" mechanism in plugins",9599,simonw,closed,0,,,,,1,2019-06-23T05:03:31Z,2021-01-04T20:15:19Z,2021-01-04T20:15:19Z,OWNER,,"Split out from #272 - this is actually quite tricky. 
Here's the relevant code: https://github.com/simonw/datasette/blob/35429f90894321eda7f2db31b9ea7976f31f73ac/datasette/utils.py#L602-L614",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/517/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459587155,MDExOlB1bGxSZXF1ZXN0MjkwODk3MTA0,518,Port Datasette from Sanic to ASGI + Uvicorn,9599,simonw,closed,0,9599,simonw,3268330,Datasette 1.0,12,2019-06-23T15:18:42Z,2019-06-24T13:42:50Z,2019-06-24T03:13:09Z,OWNER,simonw/datasette/pulls/518,"Most of the code here was fleshed out in comments on #272 (Port Datasette to ASGI) - this pull request will track the final pieces: - [x] Update test harness to more correctly simulate the `raw_path` issue - [x] Use `raw_path` so table names containing `/` can work correctly - [x] Bug: JSON not served with correct content-type - [x] Get ?_trace=1 working again - [x] Replacement for `@app.listener(""before_server_start"")` - [x] Bug: `/fixtures/table%2Fwith%2Fslashes.csv?_format=json` downloads as CSV - [x] Replace Sanic request and response objects with my own classes, so I can remove Sanic dependency - [x] Final code tidy-up before merging to master",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/518/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 459590021,MDU6SXNzdWU0NTk1OTAwMjE=,519,Decide what goes into Datasette 1.0,9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2019-06-23T15:47:41Z,2021-11-15T23:26:11Z,2021-11-15T23:26:11Z,OWNER,,Datasette ASGI #272 is a big part of it... but 1.0 will generally be an indicator that Datasette is a stable platform for developers to write plugins and custom templates against. So lots to think about.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/519/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459598080,MDU6SXNzdWU0NTk1OTgwODA=,520,asgi_wrapper plugin hook,9599,simonw,closed,0,9599,simonw,,,3,2019-06-23T17:16:45Z,2019-07-03T04:40:34Z,2019-07-03T04:06:28Z,OWNER,,"After #272 we can finally add this hook. It will allow plugins to wrap their own ASGI middleware around Datasette. Potential use-cases include: * adding authentication * custom CORS headers (see #454) * maybe gzip support? * possibly defining entirely new routes, though that may be better handled by a separate hook",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459621683,MDU6SXNzdWU0NTk2MjE2ODM=,521,Easier way of creating custom row templates,9599,simonw,closed,0,,,,,6,2019-06-23T21:49:27Z,2019-07-03T03:23:56Z,2019-07-03T03:23:56Z,OWNER,,"I was messing around with a custom `_rows_and_columns.html` template and ended up with this: ```html {% for row in display_rows %}{% endfor %} ``` This is nasty. I'd like to be able to do something like this instead: ``` {% for row in display_rows %}
{% for cell in row %} {% if cell.column == ""First_Name"" %}{{ cell.value }} {% elif cell.column == ""Last_Name"" %} {{ cell.value }} {% elif cell.column == ""Short_Description"" %}
{{ cell.column }}: {{ cell.value }}
{% else %} {{ cell.column }}: {{ cell.value }} {% endif %} {% endfor %}
{{ row[""First_Name""] }} {{ row[""Last_Name""] }}
... ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/521/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459622390,MDU6SXNzdWU0NTk2MjIzOTA=,522,Handle case-insensitive headers in a nicer way,9599,simonw,open,0,,,,,1,2019-06-23T21:56:34Z,2019-06-26T18:48:53Z,,OWNER,,Spun out from https://github.com/simonw/datasette/pull/518#discussion_r296486289,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/522/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 459627549,MDU6SXNzdWU0NTk2Mjc1NDk=,523,Show total/unfiltered row count when filtering,2657547,rixx,closed,0,,,,,2,2019-06-23T22:56:48Z,2019-06-24T01:38:14Z,2019-06-24T01:38:14Z,CONTRIBUTOR,,"When I'm seeing a filtered view of a table, I'd like to be able to see something like '2 rows where status != ""closed"" (of 1000 total)' to have a context for the data I'm seeing – e.g. currently my database is being filled by an importer, so this information would be super helpful. Since this information would be a performance hit, maybe something like '12 rows where status != ""closed"" (of ??? total)' with lazy-loading on-click(?) could be applied (Or via a ""How many total?"" tooltip, or …)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/523/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459689615,MDExOlB1bGxSZXF1ZXN0MjkwOTcxMjk1,524,"Sort commits using isort, refs #516",9599,simonw,closed,0,,,,,1,2019-06-24T05:04:48Z,2023-08-23T01:31:08Z,2023-08-23T01:31:08Z,OWNER,simonw/datasette/pulls/524,Also added a lint unit test to ensure they stay sorted. #516,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/524/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 459714943,MDU6SXNzdWU0NTk3MTQ5NDM=,525,Add section on sqite-utils enable-fts to the search documentation,9599,simonw,closed,0,9599,simonw,,,2,2019-06-24T06:39:16Z,2019-06-24T16:36:35Z,2019-06-24T16:29:43Z,OWNER,,"https://datasette.readthedocs.io/en/stable/full_text_search.html already has a section about csvs-to-sqlite, sqlite-utils is even more relevant.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459882902,MDU6SXNzdWU0NTk4ODI5MDI=,526,Stream all results for arbitrary SQL and canned queries,50578294,matej-fr,open,0,,,,,23,2019-06-24T13:09:45Z,2022-09-28T04:01:25Z,,NONE,,"I think that there is a difficulty with canned queries. When I want to stream all results of a canned query TwoDays I get only first 1.000 records. Example: `http://myserver/history_sample/two_days.csv?_stream=on` returns only first 1.000 records. If I do the same with the whole database i.e. `http://myserver/history_sample/database.csv?_stream=on` I get correctly all records. 
Any ideas?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/526/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 459936585,MDU6SXNzdWU0NTk5MzY1ODU=,527,Unable to use rank when fts-table generated with csvs-to-sqlite,2181410,clausjuhl,closed,0,,,,,3,2019-06-24T14:49:48Z,2019-06-24T15:21:18Z,2019-06-24T15:09:10Z,NONE,,"Hi Simon. If i generate a fts-table with the csvs-to-sqlite f-option, I'm unable to use (in datasette's GUI) the internal ranking of the table for sorting or viewing, but if I generate the fts-table with the enable-fts argument from sqlite-utils, everyrthing works ok. Eg.: datasette, version 0.28 sqlite-utils, version 1.2.1 csvs-to-sqlite, version 0.9 No column named rank with these commands: $ csvs-to-sqlite minutes.csv minutes.db -f text_data $ datasette -i minutes.db select rank, * from minutes_fts where minutes_fts match 'dog' Everything ok with these commands: $ csvs-to-sqlite minutes.csv minutes.db $ sqlite-utils enable-fts minutes.db text_data $ datasette -i minutes.db select rank, * from minutes_fts where minutes_fts match 'dog' Am I doing something wrong? Thank you for a great application!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/527/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 460095928,MDU6SXNzdWU0NjAwOTU5Mjg=,528,Establish a pattern for Datasette plugins built on top of Pandas,9599,simonw,open,0,,,,,0,2019-06-24T21:05:52Z,2019-06-24T21:05:52Z,,OWNER,,"The Pandas ecosystem is huge, varied and full of tools that are really good at doing interesting analysis on top of tabular data. Pandas should not be a dependency of Datasette core, but I think there is a lot of potential in having plugins which use Pandas to apply interesting analysis to data sucked out of Datasette's SQLite tables. One example ([thanks, Tony](https://twitter.com/psychemedia/status/1143259809715752962)): https://github.com/ResidentMario/missingno could form the basis of a fantastic plugin for getting a high-level overview of how complete each column in a table is. Some thought is needed here about what shape these kind of plugins might take, and what plugin hooks they would use.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/528/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 460396952,MDExOlB1bGxSZXF1ZXN0MjkxNTM0NTk2,529,Use keyed rows - fixes #521,1383872,nathancahill,closed,0,,,,,1,2019-06-25T12:33:48Z,2019-06-25T12:35:07Z,2019-06-25T12:35:07Z,NONE,simonw/datasette/pulls/529,"Supports template syntax like this: ``` {% for row in display_rows %}{{ row[""First_Name""] }} {{ row[""Last_Name""] }}
... ```",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/529/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 460540321,MDU6SXNzdWU0NjA1NDAzMjE=,530,Extract codemirror SQL editor out into a plugin,9599,simonw,closed,0,,,,,1,2019-06-25T17:07:51Z,2020-10-01T00:42:08Z,2020-10-01T00:42:08Z,OWNER,,"Right now codemirror (used for the SQL editor on https://latest.datasette.io/fixtures?sql=select+*+from+%5B123_starts_with_digits%5D ) is the only JavaScript in Datasette. It's also the only vendored dependency. I'd like to move it out to a plugin. But... ideally I would like that plugin to be part of the default ""pip install datasette"" experience. I don't know what the best pattern for optional dependencies is. I don't want to have to tell people to run `pip install datasette[full]`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/530/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 461215118,MDU6SXNzdWU0NjEyMTUxMTg=,30,Option to open database in read-only mode,9599,simonw,closed,0,,,,,1,2019-06-26T22:50:38Z,2020-05-11T19:17:17Z,2020-05-11T19:17:17Z,OWNER,,Would this make it 100% safe to run reads against a database file that is being written to by another process?,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 461237618,MDU6SXNzdWU0NjEyMzc2MTg=,31,Mechanism for adding multiple foreign key constraints at once,9599,simonw,closed,0,,,,,0,2019-06-27T00:04:30Z,2019-06-29T06:27:40Z,2019-06-29T06:27:40Z,OWNER,,"Needed by [db-to-sqlite](https://github.com/simonw/db-to-sqlite). It currently works by collecting all of the foreign key relationships it can find and then applying them at the end of the process. The problem is, the `add_foreign_key()` method looks like this: https://github.com/simonw/sqlite-utils/blob/86bd2bba689e25f09551d611ccfbee1e069e5b66/sqlite_utils/db.py#L498-L516 That means it's doing a full `VACUUM` for every single relationship it sets up - and if you have hundreds of foreign key relationships in your database this can take hours. I think the right solution is to have a `.add_foreign_keys(list_of_args)` method which does the bulk operation and then a single `VACUUM`. `.add_foreign_key(...)` can then call the bulk action with a single list item.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 462094937,MDExOlB1bGxSZXF1ZXN0MjkyODc5MjA0,32,db.add_foreign_keys() method,9599,simonw,closed,0,,,,,1,2019-06-28T15:40:33Z,2019-06-29T06:27:39Z,2019-06-29T06:27:39Z,OWNER,simonw/sqlite-utils/pulls/32,"Refs #31. 
Still TODO: - [x] Unit tests - [x] Documentation",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 462117311,MDU6SXNzdWU0NjIxMTczMTE=,531,/database/-/inspect,9599,simonw,open,0,,,,,1,2019-06-28T16:33:41Z,2019-07-08T15:43:57Z,,OWNER,,"Build `/database/-/inspect` which shows tables, columns, column types and foreign keys It won't show table counts. Or maybe it will include them optionally but only for `-i` databases, in a special area of the JSON reserved for immutable-only inspect details. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/465#issuecomment-506797086_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/531/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 462423839,MDU6SXNzdWU0NjI0MjM4Mzk=,33,index_foreign_keys / index-foreign-keys utilities,9599,simonw,closed,0,,,,,2,2019-06-30T16:42:03Z,2019-06-30T23:54:11Z,2019-06-30T23:50:55Z,OWNER,,"Sometimes it's good to have indices on all columns that are foreign keys, to allow for efficient reverse lookups. This would be a useful utility: $ sqlite-utils index-foreign-keys database.db ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 462423972,MDExOlB1bGxSZXF1ZXN0MjkzMTE3MTgz,34,sqlite-utils index-foreign-keys / db.index_foreign_keys(),9599,simonw,closed,0,,,,,0,2019-06-30T16:43:40Z,2019-06-30T23:50:55Z,2019-06-30T23:50:55Z,OWNER,simonw/sqlite-utils/pulls/34,"Refs #33 - [x] `sqlite-utils index-foreign-keys` command - [x] `db.index_foreign_keys()` method - [x] unit tests - [x] documentation",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 462430920,MDU6SXNzdWU0NjI0MzA5MjA=,35,table.update(...) method,9599,simonw,closed,0,,,,,2,2019-06-30T18:06:15Z,2019-07-28T15:43:52Z,2019-07-28T15:43:52Z,OWNER,,"Spun off from #23 - this method will allow a user to update a specific row. Currently the only way to do that it is to call `.upsert({full record})` with the primary key field matching an existing record - but this does not support partial updates. ```python db[""events""].update(3, {""name"": ""Renamed""}) ``` This method only works on an existing table, so there's no need for a `pk=""id""` specifier - it can detect the primary key by looking at the table. If the primary key is compound the first argument can be a tuple: ```python db[""events_venues""].update((3, 2), {""custom_label"": ""Label""}) ``` The method can be called without the second dictionary argument. 
Doing this selects the row specified by the primary key (throwing an error if it does not exist) and remembers it so that chained operations can be carried out - see proposal in https://github.com/simonw/sqlite-utils/issues/23#issuecomment-507055345 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 462817589,MDU6SXNzdWU0NjI4MTc1ODk=,36,Support compound primary keys,9599,simonw,closed,0,,,,,0,2019-07-01T17:00:07Z,2019-07-15T04:28:52Z,2019-07-15T04:28:52Z,OWNER,,"This should work: ```python table = db[""dog_breeds""].insert({ ""dog_id"": 1, ""breed_id"": 2 }, pk=(""dog_id"", ""breed_id"")) ``` Needed for m2m work in #23",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 462928038,MDU6SXNzdWU0NjI5MjgwMzg=,532,Switch setup.py to using ~= for dependencies,9599,simonw,closed,0,,,,,0,2019-07-01T21:53:48Z,2019-07-03T04:32:58Z,2019-07-03T04:32:58Z,OWNER,,"`~=` means ""compatible release"" https://www.python.org/dev/peps/pep-0440/#compatible-release See also https://stackoverflow.com/questions/39590187/in-requirements-txt-what-does-tilde-equals-mean",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/532/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 463492395,MDExOlB1bGxSZXF1ZXN0MjkzOTYyNDA1,533,"Support cleaner custom templates for rows and tables, closes #521",9599,simonw,closed,0,,,,,1,2019-07-03T00:40:18Z,2019-07-03T03:23:06Z,2019-07-03T03:23:06Z,OWNER,simonw/datasette/pulls/533,"- [x] Rename `_rows_and_columns.html` to `_table.html` - [x] Unit test - [x] Documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/533/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 463492815,MDU6SXNzdWU0NjM0OTI4MTU=,534,500 error on m2m facet detection,9599,simonw,open,0,,,,,1,2019-07-03T00:42:42Z,2020-12-17T05:08:22Z,,OWNER,,"This may help debug: ``` diff --git a/datasette/facets.py b/datasette/facets.py index 76d73e5..07a4034 100644 --- a/datasette/facets.py +++ b/datasette/facets.py @@ -499,11 +499,14 @@ class ManyToManyFacet(Facet): ""outgoing"" ] if len(other_table_outgoing_foreign_keys) == 2: - destination_table = [ - t - for t in other_table_outgoing_foreign_keys - if t[""other_table""] != self.table - ][0][""other_table""] + try: + destination_table = [ + t + for t in other_table_outgoing_foreign_keys + if t[""other_table""] != self.table + ][0][""other_table""] + except IndexError: + import pdb; pdb.pm() # Only suggest if it's not selected already if (""_facet_m2m"", destination_table) in args: continue ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/534/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 463531894,MDExOlB1bGxSZXF1ZXN0MjkzOTkyMzgy,535,"Added asgi_wrapper plugin hook, closes 
#520",9599,simonw,closed,0,,,,,0,2019-07-03T03:58:00Z,2019-07-03T04:06:26Z,2019-07-03T04:06:26Z,OWNER,simonw/datasette/pulls/535,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/535/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 463534974,MDExOlB1bGxSZXF1ZXN0MjkzOTk0NDQz,536,"Switch to ~= dependencies, closes #532",9599,simonw,closed,0,,,,,0,2019-07-03T04:12:16Z,2019-07-03T04:32:55Z,2019-07-03T04:32:55Z,OWNER,simonw/datasette/pulls/536,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/536/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 463544206,MDU6SXNzdWU0NjM1NDQyMDY=,537,"Populate ""endpoint"" key in ASGI scope",9599,simonw,open,0,,,,,12,2019-07-03T04:54:47Z,2019-07-22T06:03:18Z,,OWNER,,"This is a trick used by Starlette so that other layers of ASGI middleware can see which route was selected. They added it here: https://github.com/encode/starlette/commit/34d0097feb6f057bd050d5057df5a2f96b97384e If Datasette supports it as well we can benefit from it if we integrate this sentry_asgi middleware (probably as a `datasette-sentry` plugin): https://github.com/encode/sentry-asgi/blob/c6a42d44d31f85885b79e4ee898683ecf8104971/sentry_asgi/middleware.py#L34-L35",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/537/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 463915863,MDU6SXNzdWU0NjM5MTU4NjM=,538,Mechanism for secrets in plugin configuration,9599,simonw,closed,0,,,,,3,2019-07-03T19:23:34Z,2019-07-04T05:47:54Z,2019-07-04T05:47:54Z,OWNER,,"See https://github.com/simonw/datasette-auth-github/issues/1 We need a mechanism where by plugins can tap into ""secret"" config options without exposing them in the visible metadata.json (where plugin configs currently live, see https://datasette.readthedocs.io/en/stable/plugins.html#plugin-configuration )",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/538/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464040911,MDExOlB1bGxSZXF1ZXN0Mjk0NDAwNDQ2,539,Secret plugin configuration options,9599,simonw,closed,0,,,,,2,2019-07-04T03:21:20Z,2019-07-04T05:36:45Z,2019-07-04T05:36:45Z,OWNER,simonw/datasette/pulls/539,Refs #538 ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/539/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 464449570,MDU6SXNzdWU0NjQ0NDk1NzA=,540,Add a universal navigation bar which can be modified by plugins,9599,simonw,closed,0,,,,,8,2019-07-05T03:50:33Z,2019-07-06T23:13:29Z,2019-07-06T23:11:35Z,OWNER,,"Needed by https://github.com/simonw/datasette-auth-github/issues/5 We already have a navigation breadcrumbs header on some pages, I can extend that to be present on every page and make it easy to modify with custom templates. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/540/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464779810,MDU6SXNzdWU0NjQ3Nzk4MTA=,541,Plugin hook for adding extra template context variables,9599,simonw,closed,0,,,,,2,2019-07-05T21:37:05Z,2019-07-06T00:05:59Z,2019-07-06T00:05:59Z,OWNER,,"It turns out I need this for https://github.com/simonw/datasette-auth-github/issues/5 It can be modelled on the `extra_body_script` hook: https://datasette.readthedocs.io/en/stable/plugins.html#extra-body-script-template-database-table-view-name-datasette",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/541/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464786717,MDExOlB1bGxSZXF1ZXN0Mjk0OTkyNTc4,542,extra_template_vars plugin hook,9599,simonw,closed,0,,,,,5,2019-07-05T22:19:17Z,2019-07-06T00:05:57Z,2019-07-06T00:05:56Z,OWNER,simonw/datasette/pulls/542,Refs #541,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/542/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 464868844,MDU6SXNzdWU0NjQ4Njg4NDQ=,543,datasette publish option for setting plugin configuration secrets,9599,simonw,closed,0,,,4471010,Datasette 0.29,3,2019-07-06T16:21:23Z,2019-07-08T02:06:34Z,2019-07-08T02:06:34Z,OWNER,,Follow-on from #538 - the `datasette publish` command needs a way of passing secrets which will be made available to plugin configuration but will not be exposed in `/-/metadata.json`.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/543/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464894812,MDExOlB1bGxSZXF1ZXN0Mjk1MDY1Nzk2,544,--plugin-secret option,9599,simonw,closed,0,,,4471010,Datasette 0.29,1,2019-07-06T22:18:20Z,2019-07-08T02:06:31Z,2019-07-08T02:06:31Z,OWNER,simonw/datasette/pulls/544,"Refs #543 - [x] Zeit Now v1 support - [x] Solve escaping of ENV in Dockerfile - [x] Heroku support - [x] Unit tests - [x] Cloud Run support - [x] Documentation ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/544/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 464905894,MDU6SXNzdWU0NjQ5MDU4OTQ=,545,Fix header on 404 page,9599,simonw,closed,0,,,4471010,Datasette 0.29,1,2019-07-07T01:47:40Z,2019-07-07T20:26:55Z,2019-07-07T20:26:55Z,OWNER,," ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/545/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464987783,MDExOlB1bGxSZXF1ZXN0Mjk1MTI3MjEz,546,Facet by delimiter,9599,simonw,open,0,,,,,2,2019-07-07T20:06:05Z,2019-11-18T23:46:01Z,,OWNER,simonw/datasette/pulls/546,Refs #510,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/546/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 
464990184,MDU6SXNzdWU0NjQ5OTAxODQ=,547,Release notes for 0.29,9599,simonw,closed,0,,,4471010,Datasette 0.29,2,2019-07-07T20:30:28Z,2019-07-08T03:31:59Z,2019-07-08T03:31:59Z,OWNER,,There's a lot of stuff... https://github.com/simonw/datasette/compare/0.28...master,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/547/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464994105,MDU6SXNzdWU0NjQ5OTQxMDU=,548,Add datasette-cors and datasette-auth-github plugins to Ecosystem page,9599,simonw,closed,0,,,4471010,Datasette 0.29,0,2019-07-07T21:14:14Z,2019-07-08T02:02:36Z,2019-07-08T02:02:36Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/548/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465001185,MDU6SXNzdWU0NjUwMDExODU=,549,Send pull request to the repo that the _table.html template will break,9599,simonw,closed,0,,,4471010,Datasette 0.29,1,2019-07-07T22:45:17Z,2019-07-08T03:36:46Z,2019-07-08T03:36:45Z,OWNER,,"Bump this to 0.29 https://github.com/simonw/salaries-datasette/blob/master/requirements/base.txt And rename https://github.com/simonw/salaries-datasette/blob/master/templates/_rows_and_columns.html to _table.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/549/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465002978,MDU6SXNzdWU0NjUwMDI5Nzg=,550,Pull m2m faceting out of master so we can ship a release without it,9599,simonw,closed,0,,,4471010,Datasette 0.29,1,2019-07-07T23:10:48Z,2019-07-07T23:21:22Z,2019-07-07T23:21:22Z,OWNER,,After spending some time with #495 I believe I need to make some pretty major changes to how m2m faceting works. I don't want it to block the release of ASGI Datasette so I'm going to revert it back out of master for the moment and merge it back in after the release has gone out.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/550/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465003070,MDU6SXNzdWU0NjUwMDMwNzA=,551,Ship many-to-many faceting support (and facet-by-delimiter),9599,simonw,open,0,,,,,2,2019-07-07T23:11:45Z,2019-07-08T15:45:23Z,,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 465019882,MDU6SXNzdWU0NjUwMTk4ODI=,552,"Add --plugin-secret support to ""datasette package""",9599,simonw,open,0,,,,,1,2019-07-08T01:46:47Z,2019-07-08T01:47:30Z,,OWNER,,"Split out from #544. 
I think I should combine this with #347 (renaming `datasette package` to `datasette publish docker`).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/552/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 465327844,MDU6SXNzdWU0NjUzMjc4NDQ=,553,Potential improvements to facet-by-date,9599,simonw,open,0,,,,,3,2019-07-08T15:37:53Z,2019-07-08T15:41:55Z,,OWNER,,"In addition to #483 Tobias had some useful suggestions on Twitter: https://twitter.com/rixxtr/status/1148253926476701696 > I think for date facets, it might be more meaningful to order them by date, rather than by size? Or offer both? I'm *definitely* often interested in size-over-time, so https://data.rixx.de/django_tickets/tickets?_facet_date=created#facet-created … isn't all that helpful! Screenshot of that link: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/553/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 465728430,MDExOlB1bGxSZXF1ZXN0Mjk1NzExNTA0,554,Fix static mounts using relative paths and prevent traversal exploits,3243482,abdusco,closed,0,,,,,4,2019-07-09T11:32:02Z,2019-07-11T16:29:26Z,2019-07-11T16:13:19Z,CONTRIBUTOR,simonw/datasette/pulls/554,"While debugging why my static mounts using a relative path (`--static mystatic:rel/path/to/dir`) not working, I noticed that the requests fail no matter what, returning 404 errors. The reason is that datasette tries to prevent traversal exploits by checking if the path is relative to its registered directory. This check fails when the mount is a relative directory, because `/abs/dir/file` obviously not under `dir/file`. https://github.com/simonw/datasette/blob/81fa8b6cdc5457b42a224779e5291952314e8d20/datasette/utils/asgi.py#L303-L306 This also has the consequence of returning any requested file, because when `/abs/dir/../../evil.file` resolves `aiofiles` happily returns it to the client after it resolves the path itself. The solution is to make sure we're checking relativity of paths after they're fully resolved. I've implemented the mentioned changes and also updated the tests.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/554/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 465731062,MDU6SXNzdWU0NjU3MzEwNjI=,555,Static mounts with relative paths not working,3243482,abdusco,closed,0,,,,,0,2019-07-09T11:38:35Z,2019-07-11T16:13:22Z,2019-07-11T16:13:22Z,CONTRIBUTOR,,"Datasette fails to serve files from static mounts that are created using relative paths `datasette --static mystatic:rel/path/to/static/dir`. 
I've explained the problem and the solution in the pull request: https://github.com/simonw/datasette/pull/554",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/555/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465773546,MDExOlB1bGxSZXF1ZXN0Mjk1NzQ4MjY4,556,Add support for running datasette as a module,3243482,abdusco,closed,0,,,,,1,2019-07-09T13:13:30Z,2019-07-11T16:07:45Z,2019-07-11T16:07:44Z,CONTRIBUTOR,simonw/datasette/pulls/556,"This PR allows running datasette using `python -m datasette` command in addition to just running the executable. This function is quite useful when debugging a plugin in a project because IDEs like PyCharm can easily start a debug session when datasette is run as a module in contrast to trying to attach a debugger to a running process. ![image](https://user-images.githubusercontent.com/3243482/60890448-fc4ede80-a263-11e9-8b42-d2a3db8d1a59.png) ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/556/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 465815372,MDU6SXNzdWU0NjU4MTUzNzI=,37,Experiment with type hints,9599,simonw,closed,0,,,,,6,2019-07-09T14:30:34Z,2021-08-18T21:48:57Z,2021-08-18T21:48:57Z,OWNER,,"Since it's designed to be used in Jupyter or for rapid prototyping in an IDE (and it's still pretty small) `sqlite-utils` feels like a great candidate for me to finally try out Python type hints. https://veekaybee.github.io/2019/07/08/python-type-hints/ is good. It suggests the mypy docs for getting started: https://mypy.readthedocs.io/en/latest/existing_code.html plus this tutorial: https://pymbook.readthedocs.io/en/latest/typehinting.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/37/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 466996584,MDExOlB1bGxSZXF1ZXN0Mjk2NzM1MzIw,557,Get tests running on Windows using Travis CI,9599,simonw,closed,0,,,,,4,2019-07-11T16:36:57Z,2021-07-10T23:39:48Z,2021-07-10T23:39:48Z,OWNER,simonw/datasette/pulls/557,Refs #511,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 467218270,MDU6SXNzdWU0NjcyMTgyNzA=,558,Support unicode in url,380586,0x1997,closed,0,,,,,4,2019-07-12T04:43:24Z,2019-07-15T01:29:30Z,2019-07-14T02:49:33Z,NONE,,"Hi, I defined some custom queries in my `metadata.json`. There are Chinese characters in the names of the queries. So the urls are like `http://127.0.0.1:8001/mydb/测试查询`. When opening such urls, datasette will throw an exception. 
``` Traceback (most recent call last): File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 100, in __call__ return await view(new_scope, receive, send) File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 172, in view request, **scope[""url_route""][""kwargs""] File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/views/base.py"", line 267, in get request, database, hash, correct_hash_provided, **kwargs File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/views/base.py"", line 471, in view_get for key in self.ds.renderers.keys() File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/views/base.py"", line 471, infor key in self.ds.renderers.keys() File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/__init__.py"", line 655, in path_with_format path = request.path File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 49, in path self.scope.get(""raw_path"", self.scope[""path""].encode(""latin-1"")) UnicodeEncodeError: 'latin-1' codec can't encode characters in position 9-11: ordinal not in range(256) ``` This used to work when datasette was based on sanic. Btw, thanks for the great work!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/558/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 467623820,MDExOlB1bGxSZXF1ZXN0Mjk3MjQzMDcz,559,Bump to uvicorn 0.8.4,9599,simonw,closed,0,,,,,0,2019-07-12T22:30:29Z,2019-07-13T22:34:58Z,2019-07-13T22:34:58Z,OWNER,simonw/datasette/pulls/559,"https://github.com/encode/uvicorn/commits/0.8.4 Query strings will now be included in log files: https://github.com/encode/uvicorn/pull/384",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/559/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 467790646,MDU6SXNzdWU0Njc3OTA2NDY=,560,CodeMirror fails to load on database page,9599,simonw,closed,0,,,,,3,2019-07-14T03:31:00Z,2019-09-03T01:03:02Z,2019-07-14T03:38:59Z,OWNER,,"It's not loading on https://latest.datasette.io/fixtures But it does load on https://latest.datasette.io/fixtures?sql=select+*+from+facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/560/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 467862459,MDExOlB1bGxSZXF1ZXN0Mjk3NDEyNDY0,38,table.update() method,9599,simonw,closed,0,,,,,2,2019-07-14T17:03:49Z,2019-07-28T15:43:51Z,2019-07-28T15:43:51Z,OWNER,simonw/sqlite-utils/pulls/38,"Refs #35 Still to do: - [x] Unit tests - [x] Switch to using `.get()` - [x] Better exceptions, plus unit tests for what happens if pk does not exist - [x] Documentation - [x] Ensure compound primary keys work properly - [x] `alter=True` support",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 467864071,MDU6SXNzdWU0Njc4NjQwNzE=,39,table.get(...) 
method,9599,simonw,closed,0,,,,,0,2019-07-14T17:20:51Z,2019-07-15T04:28:53Z,2019-07-15T04:28:53Z,OWNER,,"Utility method for fetching a record by its primary key. Accepts a single value (for primary key / rowid tables) or a list/tuple of values (for compound primary keys, refs #36). Raises a `NotFoundError` if the record cannot be found.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 467928674,MDExOlB1bGxSZXF1ZXN0Mjk3NDU5Nzk3,40,.get() method plus support for compound primary keys,9599,simonw,closed,0,,,,,1,2019-07-15T03:43:13Z,2019-07-15T04:28:57Z,2019-07-15T04:28:52Z,OWNER,simonw/sqlite-utils/pulls/40,"- [x] Tests for the `NotFoundError` exception - [x] Documentation for `.get()` method - [x] Support `--pk` multiple times to define CLI compound primary keys - [x] Documentation for compound primary keys",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 469828961,MDExOlB1bGxSZXF1ZXN0Mjk4OTYyNTUx,561,Fix typos,15278512,minho42,closed,0,,,,,0,2019-07-18T15:13:35Z,2019-07-26T10:25:45Z,2019-07-26T10:25:45Z,CONTRIBUTOR,simonw/datasette/pulls/561,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 470131537,MDU6SXNzdWU0NzAxMzE1Mzc=,41,sqlite-utils insert --tsv option,9599,simonw,closed,0,,,,,0,2019-07-19T04:27:21Z,2019-07-19T04:50:47Z,2019-07-19T04:50:47Z,OWNER,,"Right now we only support ingesting CSV, but sometimes interesting data is released as TSV. https://www.washingtonpost.com/national/2019/07/18/how-download-use-dea-pain-pills-database/ for example.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/41/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470345929,MDU6SXNzdWU0NzAzNDU5Mjk=,42,"table.extract(...) method and ""sqlite-utils extract"" command",9599,simonw,closed,0,,,5897911,2.20,21,2019-07-19T14:09:36Z,2020-09-22T23:39:31Z,2020-09-22T23:37:49Z,OWNER,,"One of my favourite features of [csvs-to-sqlite](https://github.com/simonw/csvs-to-sqlite) is that it can ""extract"" columns into a separate lookup table - for example: csvs-to-sqlite big_csv_file.csv -c country output.db This will turn the `country` column in the resulting table into a integer foreign key against a new `country` table. 
You can see an example of what that looks like here: https://san-francisco.datasettes.com/registered-business-locations-3d50679/Business+Corridor was extracted from https://san-francisco.datasettes.com/registered-business-locations-3d50679/Registered_Business_Locations_-_San_Francisco?Business%20Corridor=1 I'd like to have the same capability in `sqlite-utils` - but with the ability to run it against an existing SQLite table rather than just against a CSV.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470542938,MDU6SXNzdWU0NzA1NDI5Mzg=,562,Facet by array shouldn't suggest for arrays that are not arrays-of-strings,9599,simonw,closed,0,,,,,2,2019-07-19T20:51:29Z,2019-11-01T19:42:10Z,2019-11-01T19:37:55Z,OWNER,,"It's triggering for arrays that look like this at the moment: ```json [ { ""type"": ""HKWorkoutEventTypeSegment"", ""date"": ""2019-05-21 09:43:50 -0700"", ""duration"": ""12.2780519704024"", ""durationUnit"": ""min"" }, { ""type"": ""HKWorkoutEventTypeSegment"", ""date"": ""2019-05-21 09:43:50 -0700"", ""duration"": ""19.467273102204"", ""durationUnit"": ""min"" } ] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/562/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470637068,MDU6SXNzdWU0NzA2MzcwNjg=,1,Use XML Analyser to figure out the structure of the export XML,9599,simonw,closed,0,,,,,1,2019-07-20T05:19:02Z,2019-07-20T05:20:09Z,2019-07-20T05:20:09Z,MEMBER,,https://github.com/simonw/xml_analyser,197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470637152,MDU6SXNzdWU0NzA2MzcxNTI=,2,Import workouts,9599,simonw,closed,0,,,,,1,2019-07-20T05:20:21Z,2019-07-20T06:21:41Z,2019-07-20T06:21:41Z,MEMBER,,From #1,197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470637206,MDU6SXNzdWU0NzA2MzcyMDY=,3,Import ActivitySummary,9599,simonw,closed,0,,,,,0,2019-07-20T05:21:00Z,2019-07-20T05:58:07Z,2019-07-20T05:58:07Z,MEMBER,,"From #1 ```python 'ActivitySummary': {'attr_counts': {'activeEnergyBurned': 980, 'activeEnergyBurnedGoal': 980, 'activeEnergyBurnedUnit': 980, 'appleExerciseTime': 980, 'appleExerciseTimeGoal': 980, 'appleStandHours': 980, 'appleStandHoursGoal': 980, 'dateComponents': 980}, 'child_counts': {}, 'count': 980, 'parent_counts': {'HealthData': 980}}, ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470640505,MDU6SXNzdWU0NzA2NDA1MDU=,4,Import Records,9599,simonw,closed,0,,,,,1,2019-07-20T06:11:20Z,2019-07-20T06:21:41Z,2019-07-20T06:21:41Z,MEMBER,,"From #1: ```python 'Record': {'attr_counts': {'creationDate': 2672233, 'device': 2665111, 'endDate': 2672233, 
'sourceName': 2672233, 'sourceVersion': 2671779, 'startDate': 2672233, 'type': 2672233, 'unit': 2650012, 'value': 2672232}, 'child_counts': {'HeartRateVariabilityMetadataList': 2318, 'MetadataEntry': 287974}, 'count': 2672233, 'parent_counts': {'Correlation': 2, 'HealthData': 2672231}}, ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470691622,MDU6SXNzdWU0NzA2OTE2MjI=,5,Add progress bar,9599,simonw,closed,0,,,,,2,2019-07-20T16:29:07Z,2019-07-22T03:30:13Z,2019-07-22T02:49:22Z,MEMBER,,"Showing a progress bar would be nice, using Click. The easiest way to do this would probably be be to hook it up to the length of the compressed content, and update it as this code pushes more XML bytes through the parser: https://github.com/dogsheep/healthkit-to-sqlite/blob/d64299765064501f4efdd9a0b21dbdba9ec4287f/healthkit_to_sqlite/utils.py#L6-L10",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470691999,MDU6SXNzdWU0NzA2OTE5OTk=,43,.add_column() doesn't match indentation of initial creation,9599,simonw,closed,0,,,,,3,2019-07-20T16:33:10Z,2019-07-23T13:09:11Z,2019-07-23T13:09:05Z,OWNER,,"I spotted a table which was created once and then had columns added to it and the formatted SQL looks like this: ```sql CREATE TABLE [records] ( [type] TEXT, [sourceName] TEXT, [sourceVersion] TEXT, [unit] TEXT, [creationDate] TEXT, [startDate] TEXT, [endDate] TEXT, [value] TEXT, [metadata_Health Mate App Version] TEXT, [metadata_Withings User Identifier] TEXT, [metadata_Modified Date] TEXT, [metadata_Withings Link] TEXT, [metadata_HKWasUserEntered] TEXT , [device] TEXT, [metadata_HKMetadataKeyHeartRateMotionContext] TEXT, [metadata_HKDeviceManufacturerName] TEXT, [metadata_HKMetadataKeySyncVersion] TEXT, [metadata_HKMetadataKeySyncIdentifier] TEXT, [metadata_HKSwimmingStrokeStyle] TEXT, [metadata_HKVO2MaxTestType] TEXT, [metadata_HKTimeZone] TEXT, [metadata_Average HR] TEXT, [metadata_Recharge] TEXT, [metadata_Lights] TEXT, [metadata_Asleep] TEXT, [metadata_Rating] TEXT, [metadata_Energy Threshold] TEXT, [metadata_Deep Sleep] TEXT, [metadata_Nap] TEXT, [metadata_Edit Slots] TEXT, [metadata_Tags] TEXT, [metadata_Daytime HR] TEXT) ``` It would be nice if the columns that were added later matched the indentation of the initial columns.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470856782,MDU6SXNzdWU0NzA4NTY3ODI=,6,Break up records into different tables for each type,9599,simonw,closed,0,,,,,1,2019-07-22T01:54:59Z,2019-07-22T03:28:55Z,2019-07-22T03:28:50Z,MEMBER,,"I don't think there's much benefit to having all of the different record types stored in the same enormous table. 
Here's what I get when I use `_facet=type`: I'm going to try splitting these up into separate tables - so `HKQuantityTypeIdentifierBodyMassIndex` becomes a table called `rBodyMassIndex` - and see if that's nicer to work with.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471292050,MDU6SXNzdWU0NzEyOTIwNTA=,563,incorrect json url for row-level data?,10352819,rprimet,closed,0,,,,,0,2019-07-22T19:59:38Z,2019-10-21T02:03:09Z,2019-10-21T02:03:09Z,CONTRIBUTOR,,"While visiting [this example page](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/people/uk.org.publicwhip%2Fperson%2F10001) (linked from Datasette documentation), manually clicking on [the link](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/people/uk.org.publicwhip%2Fperson%2F10001?_format=json) (""This data as .json"") to the json data results in an error 500 `data() got an unexpected keyword argument 'as_format'` The [JSON page linked to from the documentation](https://register-of-members-interests.datasettes.com/regmem-d22c12c/people/uk.org.publicwhip%2Fperson%2F10001.json) however is correct (the page address ends in `.json` rather than using a query string `?format=json`) This particular datasette demo page is now a few versions behind, but I was able to reproduce the issue using v0.29.2 and a downloaded copy of the demo database (and also with the current HEAD). Here is a stack trace: ``` Traceback (most recent call last): File ""/home/romain/miniconda3/envs/dsbug/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 101, in __call__ return await view(new_scope, receive, send) File ""/home/romain/miniconda3/envs/dsbug/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 173, in view request, **scope[""url_route""][""kwargs""] File ""/home/romain/miniconda3/envs/dsbug/lib/python3.7/site-packages/datasette/views/base.py"", line 267, in get request, database, hash, correct_hash_provided, **kwargs File ""/home/romain/miniconda3/envs/dsbug/lib/python3.7/site-packages/datasette/views/base.py"", line 399, in view_get request, database, hash, **kwargs TypeError: data() got an unexpected keyword argument 'as_format' ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/563/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471628483,MDU6SXNzdWU0NzE2Mjg0ODM=,44,Utilities for building lookup tables,9599,simonw,closed,0,,,,,2,2019-07-23T10:59:58Z,2019-07-23T13:07:01Z,2019-07-23T13:07:01Z,OWNER,,"While building https://github.com/dogsheep/healthkit-to-sqlite I found a need for a neat mechanism for easily building lookup tables - tables where each unique value in a column is replaced by a foreign key to a separate table. csvs-to-sqlite currently creates those with its ""extract"" mechanism - but that's written as custom code against Pandas. I'd like to eventually replace Pandas with sqlite-utils there. 
See also #42 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471684708,MDExOlB1bGxSZXF1ZXN0MzAwMjg2NTM1,45,"Implemented table.lookup(...), closes #44",9599,simonw,closed,0,,,,,0,2019-07-23T13:03:30Z,2019-07-23T13:07:00Z,2019-07-23T13:07:00Z,OWNER,simonw/sqlite-utils/pulls/45,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/45/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 471780443,MDU6SXNzdWU0NzE3ODA0NDM=,46,extracts= option for insert/update/etc,9599,simonw,closed,0,,,,,3,2019-07-23T15:55:46Z,2020-03-01T16:53:40Z,2019-07-23T17:00:44Z,OWNER,,"Relates to #42 and #44. I want the ability to extract values out into lookup tables during bulk insert/upsert operations. `db.insert_all(rows, extracts=[""species""])` - creates species table for values in the species column `db.insert_all(rows, extracts={""species"": ""Species""})` - as above but the new table is called `Species`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/46/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471797101,MDExOlB1bGxSZXF1ZXN0MzAwMzc3NTk5,47,extracts= table parameter,9599,simonw,closed,0,,,,,0,2019-07-23T16:30:29Z,2019-07-23T17:00:43Z,2019-07-23T17:00:43Z,OWNER,simonw/sqlite-utils/pulls/47,Still needs docs. Refs #46,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/47/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 471818939,MDU6SXNzdWU0NzE4MTg5Mzk=,48,"Jupyter notebook demo of the library, launchable on Binder",9599,simonw,closed,0,,,,,2,2019-07-23T17:05:05Z,2022-01-26T02:08:46Z,2022-01-26T02:08:39Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 472097220,MDU6SXNzdWU0NzIwOTcyMjA=,7,Script uses a lot of RAM,9599,simonw,closed,0,,,,,3,2019-07-24T06:11:11Z,2019-07-24T06:35:52Z,2019-07-24T06:35:52Z,MEMBER,,"I'm using an XML pull parser which should avoid the need to slurp the whole XML file into memory, but it's not working - the script still uses over 1GB of RAM when it runs according to Activity Monitor. I think this is because I'm still causing the full root element to be incrementally loaded into memory just in case I try and access it later. http://effbot.org/elementtree/iterparse.htm says I should use `elem.clear()` as I go. It also says: > The above pattern has one drawback; it does not clear the root element, so you will end up with a single element with lots of empty child elements. If your files are huge, rather than just large, this might be a problem. To work around this, you need to get your hands on the root element. 
So I will try that recipe and see if it helps.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 472104705,MDExOlB1bGxSZXF1ZXN0MzAwNTgwMjIx,8,Use less RAM,9599,simonw,closed,0,,,,,0,2019-07-24T06:35:01Z,2019-07-24T06:35:52Z,2019-07-24T06:35:52Z,MEMBER,dogsheep/healthkit-to-sqlite/pulls/8,Closes #7,197882382,healthkit-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 472115381,MDU6SXNzdWU0NzIxMTUzODE=,49,extracts= should support multiple-column extracts,9599,simonw,open,0,,,,,10,2019-07-24T07:06:41Z,2020-10-16T19:18:19Z,,OWNER,,"Lookup tables can be constructed on compound columns, but the `extracts=` option doesn't currently support that. Right now extracts can be defined in two ways: ```python # Extract these columns into tables with the same name: dogs = db.table(""dogs"", extracts=[""breed"", ""most_recent_trophy""]) # Same as above but with custom table names: dogs = db.table(""dogs"", extracts={""breed"": ""Breeds"", ""most_recent_trophy"": ""Trophies""}) ``` Need some kind of syntax for much more complicated extractions, like when two columns (say ""source"" and ""source_version"") are extracted into a single table.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/49/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 472429048,MDU6SXNzdWU0NzI0MjkwNDg=,9,Too many SQL variables,166463,tholo,closed,0,,,,,4,2019-07-24T18:24:17Z,2019-07-26T10:01:05Z,2019-07-26T10:01:05Z,NONE,,"Decided to try importing my data, and ran into this: ``` Traceback (most recent call last): File ""/Users/tholo/Source/health/bin/healthkit-to-sqlite"", line 10, in sys.exit(cli()) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/cli.py"", line 50, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 41, in convert_xml_to_sqlite write_records(records, db) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 80, in write_records column_order=[""startDate"", ""endDate"", ""value"", ""unit""], File ""/Users/tholo/Source/health/lib/python3.7/site-packages/sqlite_utils/db.py"", line 911, in insert_all result = self.db.conn.execute(sql, values) sqlite3.OperationalError: too many SQL variables ``` Added some debug output in sqlite_utils/db.py, which resulted in: ``` INSERT INTO [rBodyMassIndex] ([creationDate], [endDate], [metadata_HKWasUserEntered], 
[metadata_Health Mate App Version], [metadata_Modified Date], [metadata_Withings Link], [metadata_Withings User Identifier], [sourceName], [sourceVersion], [startDate], [unit], [value]) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) 
, (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) ; ``` with the attached data: ``` ['2019-06-27 22:55:10 -0700', '2011-06-22 21:05:53 -0700', '0', '4.4.2', '2011-06-23 04:05:53 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308801953&type=1', '301293', 'Health Mate', '4040200', '2011-06-22 21:05:53 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-06-23 09:36:27 -0700', '0', '4.4.2', '2011-06-23 16:36:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308846987&type=1', '301293', 'Health Mate', '4040200', '2011-06-23 09:36:27 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-06-23 23:54:07 -0700', '0', '4.4.2', '2011-06-24 06:55:19 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308898447&type=1', '301293', 'Health Mate', '4040200', '2011-06-23 23:54:07 -0700', 'count', '30.679', '2019-06-27 22:55:10 -0700', '2011-06-24 09:13:40 -0700', '0', '4.4.2', '2011-06-24 16:14:35 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308932020&type=1', '301293', 'Health Mate', '4040200', '2011-06-24 09:13:40 -0700', 'count', '30.3549', '2019-06-27 22:55:10 -0700', '2011-06-25 08:30:08 -0700', '0', '4.4.2', '2011-06-25 15:30:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309015808&type=1', '301293', 'Health Mate', '4040200', '2011-06-25 08:30:08 -0700', 'count', '30.3395', '2019-06-27 22:55:10 -0700', '2011-06-26 07:47:51 -0700', '0', '4.4.2', '2011-06-26 14:48:27 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309099671&type=1', '301293', 'Health Mate', '4040200', '2011-06-26 07:47:51 -0700', 'count', '30.2315', '2019-06-27 22:55:10 -0700', '2011-06-28 08:48:26 -0700', '0', '4.4.2', '2011-06-28 15:49:13 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309276106&type=1', '301293', 'Health Mate', '4040200', '2011-06-28 08:48:26 -0700', 'count', '30.0617', '2019-06-27 22:55:10 -0700', '2011-06-29 09:21:16 -0700', '0', '4.4.2', '2011-06-29 16:21:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309364476&type=1', '301293', 'Health Mate', '4040200', '2011-06-29 09:21:16 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-06-30 08:41:46 -0700', '0', '4.4.2', '2011-06-30 15:42:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309448506&type=1', '301293', 'Health Mate', '4040200', '2011-06-30 08:41:46 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2011-07-01 09:05:28 -0700', '0', '4.4.2', '2011-07-01 16:06:24 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309536328&type=1', '301293', 'Health Mate', '4040200', '2011-07-01 09:05:28 -0700', 'count', '29.8611', '2019-06-27 22:55:10 -0700', '2011-07-02 08:58:50 -0700', '0', '4.4.2', '2011-07-02 15:59:40 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309622330&type=1', '301293', 'Health Mate', '4040200', '2011-07-02 08:58:50 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2011-07-04 09:33:43 -0700', '0', '4.4.2', '2011-07-04 16:34:19 +0000', 
'withings-bd2://timeline/measure?userid=301293&date=1309797223&type=1', '301293', 'Health Mate', '4040200', '2011-07-04 09:33:43 -0700', 'count', '30.0309', '2019-06-27 22:55:10 -0700', '2011-07-06 09:40:23 -0700', '0', '4.4.2', '2011-07-06 16:41:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309970423&type=1', '301293', 'Health Mate', '4040200', '2011-07-06 09:40:23 -0700', 'count', '30.1852', '2019-06-27 22:55:10 -0700', '2011-07-08 08:08:48 -0700', '0', '4.4.2', '2011-07-08 15:09:51 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310137728&type=1', '301293', 'Health Mate', '4040200', '2011-07-08 08:08:48 -0700', 'count', '30.0309', '2019-06-27 22:55:10 -0700', '2011-07-09 08:31:05 -0700', '0', '4.4.2', '2011-07-09 15:31:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310225465&type=1', '301293', 'Health Mate', '4040200', '2011-07-09 08:31:05 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-07-10 08:14:36 -0700', '0', '4.4.2', '2011-07-10 15:15:12 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310310876&type=1', '301293', 'Health Mate', '4040200', '2011-07-10 08:14:36 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-07-12 07:55:21 -0700', '0', '4.4.2', '2011-07-12 14:55:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310482521&type=1', '301293', 'Health Mate', '4040200', '2011-07-12 07:55:21 -0700', 'count', '30.108', '2019-06-27 22:55:10 -0700', '2011-07-13 08:48:05 -0700', '0', '4.4.2', '2011-07-13 15:48:42 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310572085&type=1', '301293', 'Health Mate', '4040200', '2011-07-13 08:48:05 -0700', 'count', '30', '2019-06-27 22:55:10 -0700', '2011-07-14 09:05:16 -0700', '0', '4.4.2', '2011-07-14 16:05:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310659516&type=1', '301293', 'Health Mate', '4040200', '2011-07-14 09:05:16 -0700', 'count', '29.9074', '2019-06-27 22:55:10 -0700', '2011-07-15 07:09:56 -0700', '0', '4.4.2', '2011-07-15 14:10:35 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310738996&type=1', '301293', 'Health Mate', '4040200', '2011-07-15 07:09:56 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-07-16 09:26:04 -0700', '0', '4.4.2', '2011-07-16 16:26:44 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310833564&type=1', '301293', 'Health Mate', '4040200', '2011-07-16 09:26:04 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-07-17 09:52:59 -0700', '0', '4.4.2', '2011-07-17 16:53:38 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310921579&type=1', '301293', 'Health Mate', '4040200', '2011-07-17 09:52:59 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2011-07-19 08:56:16 -0700', '0', '4.4.2', '2011-07-19 15:57:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311090976&type=1', '301293', 'Health Mate', '4040200', '2011-07-19 08:56:16 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-07-21 08:21:20 -0700', '0', '4.4.2', '2011-07-21 15:22:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311261680&type=1', '301293', 'Health Mate', '4040200', '2011-07-21 08:21:20 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-07-23 08:49:56 -0700', '0', '4.4.2', '2011-07-23 15:50:40 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311436196&type=1', '301293', 'Health Mate', '4040200', '2011-07-23 08:49:56 -0700', 'count', '29.7222', '2019-06-27 22:55:10 -0700', '2011-07-24 09:17:35 
-0700', '0', '4.4.2', '2011-07-24 16:18:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311524255&type=1', '301293', 'Health Mate', '4040200', '2011-07-24 09:17:35 -0700', 'count', '29.5833', '2019-06-27 22:55:10 -0700', '2011-07-25 07:51:55 -0700', '0', '4.4.2', '2011-07-25 14:52:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311605515&type=1', '301293', 'Health Mate', '4040200', '2011-07-25 07:51:55 -0700', 'count', '29.5525', '2019-06-27 22:55:10 -0700', '2011-08-06 10:04:05 -0700', '0', '4.4.2', '2011-08-06 17:04:47 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312650245&type=1', '301293', 'Health Mate', '4040200', '2011-08-06 10:04:05 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-08-08 07:52:22 -0700', '0', '4.4.2', '2011-08-08 14:53:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312815142&type=1', '301293', 'Health Mate', '4040200', '2011-08-08 07:52:22 -0700', 'count', '29.6605', '2019-06-27 22:55:10 -0700', '2011-08-10 07:57:30 -0700', '0', '4.4.2', '2011-08-10 14:58:12 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312988250&type=1', '301293', 'Health Mate', '4040200', '2011-08-10 07:57:30 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-08-12 07:51:14 -0700', '0', '4.4.2', '2011-08-12 14:51:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313160674&type=1', '301293', 'Health Mate', '4040200', '2011-08-12 07:51:14 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2011-08-13 07:45:28 -0700', '0', '4.4.2', '2011-08-13 14:46:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313246728&type=1', '301293', 'Health Mate', '4040200', '2011-08-13 07:45:28 -0700', 'count', '29.5833', '2019-06-27 22:55:10 -0700', '2011-08-17 09:06:20 -0700', '0', '4.4.2', '2011-08-17 16:07:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313597180&type=1', '301293', 'Health Mate', '4040200', '2011-08-17 09:06:20 -0700', 'count', '29.5679', '2019-06-27 22:55:10 -0700', '2011-08-22 08:28:08 -0700', '0', '4.4.2', '2011-08-22 15:28:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314026888&type=1', '301293', 'Health Mate', '4040200', '2011-08-22 08:28:08 -0700', 'count', '29.9846', '2019-06-27 22:55:10 -0700', '2011-08-25 08:59:30 -0700', '0', '4.4.2', '2011-08-25 16:00:15 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314287970&type=1', '301293', 'Health Mate', '4040200', '2011-08-25 08:59:30 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-08-30 08:13:59 -0700', '0', '4.4.2', '2011-08-30 15:46:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314717239&type=1', '301293', 'Health Mate', '4040200', '2011-08-30 08:13:59 -0700', 'count', '29.784', '2019-06-27 22:55:10 -0700', '2011-09-12 08:47:51 -0700', '0', '4.4.2', '2011-09-12 15:48:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1315842471&type=1', '301293', 'Health Mate', '4040200', '2011-09-12 08:47:51 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-09-13 09:17:27 -0700', '0', '4.4.2', '2011-09-13 16:48:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1315930647&type=1', '301293', 'Health Mate', '4040200', '2011-09-13 09:17:27 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-10-01 09:12:20 -0700', '0', '4.4.2', '2011-10-01 16:13:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1317485540&type=1', '301293', 'Health Mate', '4040200', '2011-10-01 09:12:20 -0700', 'count', 
'29.8148', '2019-06-27 22:55:10 -0700', '2011-10-11 11:14:11 -0700', '0', '4.4.2', '2011-10-11 18:15:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1318356851&type=1', '301293', 'Health Mate', '4040200', '2011-10-11 11:14:11 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-10-16 09:29:47 -0700', '0', '4.4.2', '2011-10-16 16:30:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1318782587&type=1', '301293', 'Health Mate', '4040200', '2011-10-16 09:29:47 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2011-10-19 09:21:44 -0700', '0', '4.4.2', '2011-10-19 16:22:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1319041304&type=1', '301293', 'Health Mate', '4040200', '2011-10-19 09:21:44 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-10-24 07:04:22 -0700', '0', '4.4.2', '2011-10-24 14:05:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1319465062&type=1', '301293', 'Health Mate', '4040200', '2011-10-24 07:04:22 -0700', 'count', '29.5988', '2019-06-27 22:55:10 -0700', '2011-11-07 09:33:17 -0700', '0', '4.4.2', '2011-11-07 16:33:58 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1320683597&type=1', '301293', 'Health Mate', '4040200', '2011-11-07 09:33:17 -0700', 'count', '29.8611', '2019-06-27 22:55:10 -0700', '2011-11-10 07:59:03 -0700', '0', '4.4.2', '2011-11-10 14:59:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1320937143&type=1', '301293', 'Health Mate', '4040200', '2011-11-10 07:59:03 -0700', 'count', '29.9383', '2019-06-27 22:55:10 -0700', '2011-11-13 09:28:31 -0700', '0', '4.4.2', '2011-11-13 16:29:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1321201711&type=1', '301293', 'Health Mate', '4040200', '2011-11-13 09:28:31 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-11-21 08:45:06 -0700', '0', '4.4.2', '2011-11-21 15:46:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1321890306&type=1', '301293', 'Health Mate', '4040200', '2011-11-21 08:45:06 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-11-23 09:55:44 -0700', '0', '4.4.2', '2011-11-23 16:56:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322067344&type=1', '301293', 'Health Mate', '4040200', '2011-11-23 09:55:44 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2011-11-29 09:50:44 -0700', '0', '4.4.2', '2011-11-29 16:51:31 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322585444&type=1', '301293', 'Health Mate', '4040200', '2011-11-29 09:50:44 -0700', 'count', '30.1698', '2019-06-27 22:55:10 -0700', '2011-11-30 11:13:21 -0700', '0', '4.4.2', '2011-11-30 18:14:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322676801&type=1', '301293', 'Health Mate', '4040200', '2011-11-30 11:13:21 -0700', 'count', '30.0617', '2019-06-27 22:55:10 -0700', '2011-12-04 10:24:36 -0700', '0', '4.4.2', '2011-12-04 17:25:24 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1323019476&type=1', '301293', 'Health Mate', '4040200', '2011-12-04 10:24:36 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-12-10 09:22:18 -0700', '0', '4.4.2', '2011-12-10 16:23:07 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1323534138&type=1', '301293', 'Health Mate', '4040200', '2011-12-10 09:22:18 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-12-26 10:36:42 -0700', '0', '4.4.2', '2011-12-26 17:37:31 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1324921002&type=1', '301293', 
'Health Mate', '4040200', '2011-12-26 10:36:42 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2012-01-11 11:24:13 -0700', '0', '4.4.2', '2012-01-11 18:25:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326306253&type=1', '301293', 'Health Mate', '4040200', '2012-01-11 11:24:13 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2012-01-15 10:17:09 -0700', '0', '4.4.2', '2012-01-15 17:17:51 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326647829&type=1', '301293', 'Health Mate', '4040200', '2012-01-15 10:17:09 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2012-01-19 09:24:32 -0700', '0', '4.4.2', '2012-01-19 16:25:21 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326990272&type=1', '301293', 'Health Mate', '4040200', '2012-01-19 09:24:32 -0700', 'count', '29.7994', '2019-06-27 22:55:10 -0700', '2012-01-29 10:26:13 -0700', '0', '4.4.2', '2012-01-29 17:26:52 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1327857973&type=1', '301293', 'Health Mate', '4040200', '2012-01-29 10:26:13 -0700', 'count', '30.0154', '2019-06-27 22:55:10 -0700', '2012-02-03 10:13:28 -0700', '0', '4.4.2', '2012-02-03 17:15:01 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1328289208&type=1', '301293', 'Health Mate', '4040200', '2012-02-03 10:13:28 -0700', 'count', '29.8457', '2019-06-27 22:55:10 -0700', '2012-02-12 09:23:01 -0700', '0', '4.4.2', '2012-02-12 16:23:53 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1329063781&type=1', '301293', 'Health Mate', '4040200', '2012-02-12 09:23:01 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2012-03-03 09:26:06 -0700', '0', '4.4.2', '2012-03-03 16:26:54 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1330791966&type=1', '301293', 'Health Mate', '4040200', '2012-03-03 09:26:06 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2012-03-11 11:23:15 -0700', '0', '4.4.2', '2012-03-11 18:24:16 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1331490195&type=1', '301293', 'Health Mate', '4040200', '2012-03-11 11:23:15 -0700', 'count', '30.2161', '2019-06-27 22:55:10 -0700', '2012-03-16 09:39:36 -0700', '0', '4.4.2', '2012-03-16 16:40:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1331915976&type=1', '301293', 'Health Mate', '4040200', '2012-03-16 09:39:36 -0700', 'count', '30.2778', '2019-06-27 22:55:10 -0700', '2012-03-21 08:33:07 -0700', '0', '4.4.2', '2012-03-21 15:34:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1332343987&type=1', '301293', 'Health Mate', '4040200', '2012-03-21 08:33:07 -0700', 'count', '30.1389', '2019-06-27 22:55:10 -0700', '2012-04-11 08:49:34 -0700', '0', '4.4.2', '2012-04-11 15:50:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334159374&type=1', '301293', 'Health Mate', '4040200', '2012-04-11 08:49:34 -0700', 'count', '30.0154', '2019-06-27 22:55:10 -0700', '2012-04-13 08:32:06 -0700', '0', '4.4.2', '2012-04-13 15:32:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334331126&type=1', '301293', 'Health Mate', '4040200', '2012-04-13 08:32:06 -0700', 'count', '29.9383', '2019-06-27 22:55:10 -0700', '2012-04-20 08:21:38 -0700', '0', '4.4.2', '2012-04-20 15:52:45 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334935298&type=1', '301293', 'Health Mate', '4040200', '2012-04-20 08:21:38 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-04-25 09:00:01 -0700', '0', '4.4.2', '2012-04-25 16:00:42 +0000', 
'withings-bd2://timeline/measure?userid=301293&date=1335369601&type=1', '301293', 'Health Mate', '4040200', '2012-04-25 09:00:01 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-05-04 11:10:18 -0700', '0', '4.4.2', '2012-05-04 18:10:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1336155018&type=1', '301293', 'Health Mate', '4040200', '2012-05-04 11:10:18 -0700', 'count', '30.4321', '2019-06-27 22:55:10 -0700', '2012-05-12 09:35:00 -0700', '0', '4.4.2', '2012-05-12 16:35:43 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1336840500&type=1', '301293', 'Health Mate', '4040200', '2012-05-12 09:35:00 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2012-05-22 09:27:53 -0700', '0', '4.4.2', '2012-05-22 16:28:37 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1337704073&type=1', '301293', 'Health Mate', '4040200', '2012-05-22 09:27:53 -0700', 'count', '30.4167', '2019-06-27 22:55:10 -0700', '2012-05-31 09:23:16 -0700', '0', '4.4.2', '2012-05-31 16:24:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1338481396&type=1', '301293', 'Health Mate', '4040200', '2012-05-31 09:23:16 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-06-08 09:29:07 -0700', '0', '4.4.2', '2012-06-08 16:29:52 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1339172947&type=1', '301293', 'Health Mate', '4040200', '2012-06-08 09:29:07 -0700', 'count', '30.5247', '2019-06-27 22:55:10 -0700', '2012-06-21 08:07:33 -0700', '0', '4.4.2', '2012-06-21 15:08:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1340291253&type=1', '301293', 'Health Mate', '4040200', '2012-06-21 08:07:33 -0700', 'count', '30.5864', '2019-06-27 22:55:10 -0700', '2012-08-08 10:02:22 -0700', '0', '4.4.2', '2012-08-08 17:03:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1344445342&type=1', '301293', 'Health Mate', '4040200', '2012-08-08 10:02:22 -0700', 'count', '30.6636', '2019-06-27 22:55:10 -0700', '2012-08-17 09:11:32 -0700', '0', '4.4.2', '2012-08-17 16:42:05 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1345219892&type=1', '301293', 'Health Mate', '4040200', '2012-08-17 09:11:32 -0700', 'count', '30.8796', '2019-06-27 22:55:10 -0700', '2012-09-10 08:27:21 -0700', '0', '4.4.2', '2012-09-10 15:28:07 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1347290841&type=1', '301293', 'Health Mate', '4040200', '2012-09-10 08:27:21 -0700', 'count', '31.034', '2019-06-27 22:55:10 -0700', '2012-09-17 08:35:33 -0700', '0', '4.4.2', '2012-09-17 15:35:33 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1347896133&type=1', '301293', 'Health Mate', '4040200', '2012-09-17 08:35:33 -0700', 'count', '30.7099', '2019-06-27 22:55:10 -0700', '2012-09-26 08:59:46 -0700', '0', '4.4.2', '2012-09-26 16:13:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1348675186&type=1', '301293', 'Health Mate', '4040200', '2012-09-26 08:59:46 -0700', 'count', '30.679', '2019-06-27 22:55:10 -0700', '2012-10-18 08:51:16 -0700', '0', '4.4.2', '2012-10-18 15:51:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1350575476&type=1', '301293', 'Health Mate', '4040200', '2012-10-18 08:51:16 -0700', 'count', '30.7716', '2019-06-27 22:55:10 -0700', '2012-11-15 08:54:57 -0700', '0', '4.4.2', '2012-11-15 15:55:58 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1352994897&type=1', '301293', 'Health Mate', '4040200', '2012-11-15 08:54:57 -0700', 'count', '31.0802', '2019-06-27 22:55:10 -0700', '2012-12-17 
09:13:40 -0700', '0', '4.4.2', '2012-12-17 16:20:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1355760820&type=1', '301293', 'Health Mate', '4040200', '2012-12-17 09:13:40 -0700', 'count', '29.784', '2019-06-27 22:55:10 -0700', '2012-12-19 11:09:55 -0700', '0', '4.4.2', '2012-12-19 18:10:37 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1355940595&type=1', '301293', 'Health Mate', '4040200', '2012-12-19 11:09:55 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2012-12-25 10:37:41 -0700', '0', '4.4.2', '2012-12-25 17:38:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1356457061&type=1', '301293', 'Health Mate', '4040200', '2012-12-25 10:37:41 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2013-01-01 10:44:02 -0700', '0', '4.4.2', '2013-01-01 17:44:46 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1357062242&type=1', '301293', 'Health Mate', '4040200', '2013-01-01 10:44:02 -0700', 'count', '30.0772', '2019-06-27 22:55:10 -0700', '2013-01-15 09:10:46 -0700', '0', '4.4.2', '2013-01-15 16:11:28 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1358266246&type=1', '301293', 'Health Mate', '4040200', '2013-01-15 09:10:46 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2013-01-20 11:03:39 -0700', '0', '4.4.2', '2013-01-20 18:04:22 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1358705019&type=1', '301293', 'Health Mate', '4040200', '2013-01-20 11:03:39 -0700', 'count', '30.108', '2019-06-27 22:55:10 -0700', '2013-01-30 08:56:30 -0700', '0', '4.4.2', '2013-01-30 15:57:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1359561390&type=1', '301293', 'Health Mate', '4040200', '2013-01-30 08:56:30 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2013-02-04 11:02:35 -0700', '0', '4.4.2', '2013-02-04 18:03:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1360000955&type=1', '301293', 'Health Mate', '4040200', '2013-02-04 11:02:35 -0700', 'count', '29.8148', '2019-06-27 22:55:10 -0700', '2013-02-07 09:07:06 -0700', '0', '4.4.2', '2013-02-07 16:07:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1360253226&type=1', '301293', 'Health Mate', '4040200', '2013-02-07 09:07:06 -0700', 'count', '30.1389', '2019-06-27 22:55:10 -0700', '2013-02-19 08:49:57 -0700', '0', '4.4.2', '2013-02-19 15:50:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1361288997&type=1', '301293', 'Health Mate', '4040200', '2013-02-19 08:49:57 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2013-03-02 11:20:54 -0700', '0', '4.4.2', '2013-03-02 18:21:38 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1362248454&type=1', '301293', 'Health Mate', '4040200', '2013-03-02 11:20:54 -0700', 'count', '30', '2019-06-27 22:55:10 -0700', '2013-04-23 08:05:30 -0700', '0', '4.4.2', '2013-04-23 15:06:59 +0000', 'withings-bd2://timeline/measure?user """""" id=301293&date=1366729530&type=1', '301293', 'Health Mate', '4040200', '2013-04-23 08:05:30 -0700', 'count', '30.5247', '2019-06-27 22:55:10 -0700', '2013-05-09 09:49:18 -0700', '0', '4.4.2', '2013-05-09 16:50:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1368118158&type=1', '301293', 'Health Mate', '4040200', '2013-05-09 09:49:18 -0700', 'count', '30.4167', '2019-06-27 22:55:10 -0700', '2013-06-09 09:28:47 -0700', '0', '4.4.2', '2013-06-09 16:29:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1370795327&type=1', '301293', 'Health Mate', '4040200', '2013-06-09 09:28:47 -0700', 
'count', '30.8333', '2019-06-27 22:55:10 -0700', '2013-07-09 08:00:17 -0700', '0', '4.4.2', '2013-07-09 15:01:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1373382017&type=1', '301293', 'Health Mate', '4040200', '2013-07-09 08:00:17 -0700', 'count', '30.8179', '2019-06-27 22:55:10 -0700', '2013-07-28 09:16:55 -0700', '0', '4.4.2', '2013-07-28 16:17:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1375028215&type=1', '301293', 'Health Mate', '4040200', '2013-07-28 09:16:55 -0700', 'count', '30.5556', '2019-06-27 22:55:10 -0700', '2013-09-13 09:22:19 -0700', '0', '4.4.2', '2013-09-13 16:23:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1379089339&type=1', '301293', 'Health Mate', '4040200', '2013-09-13 09:22:19 -0700', 'count', '30.9568', '2019-06-27 22:55:10 -0700', '2013-09-24 08:08:23 -0700', '0', '4.4.2', '2013-09-24 15:09:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1380035303&type=1', '301293', 'Health Mate', '4040200', '2013-09-24 08:08:23 -0700', 'count', '31.4352', '2019-06-27 22:55:10 -0700', '2013-10-01 08:15:13 -0700', '0', '4.4.2', '2013-10-01 15:15:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1380640513&type=1', '301293', 'Health Mate', '4040200', '2013-10-01 08:15:13 -0700', 'count', '31.2037', '2019-06-27 22:55:10 -0700', '2013-10-23 09:31:25 -0700', '0', '4.4.2', '2013-10-23 16:32:13 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1382545885&type=1', '301293', 'Health Mate', '4040200', '2013-10-23 09:31:25 -0700', 'count', '31.8056'] ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 473083260,MDU6SXNzdWU0NzMwODMyNjA=,50,"""Too many SQL variables"" on large inserts",9599,simonw,closed,0,,,,,4,2019-07-25T21:43:31Z,2022-11-04T14:38:36Z,2019-07-28T11:59:33Z,OWNER,,"Reported here: https://github.com/dogsheep/healthkit-to-sqlite/issues/9 It looks like there's a default limit of 999 variables - we need to be smart about that, maybe dynamically lower the batch size based on the number of columns.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/50/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 473288428,MDExOlB1bGxSZXF1ZXN0MzAxNDgzNjEz,564,First proof-of-concept of Datasette Library,9599,simonw,open,0,,,,,1,2019-07-26T10:22:26Z,2023-02-07T15:14:11Z,,OWNER,simonw/datasette/pulls/564,"Refs #417. Run it like this: datasette -d ~/Library Uses a new plugin hook - available_databases() ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/564/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 473307794,MDU6SXNzdWU0NzMzMDc3OTQ=,565,Conflict between datasette and uvicorn click versions,440503,jonheslop,closed,0,,,,,1,2019-07-26T11:13:40Z,2020-10-02T00:09:55Z,2020-10-02T00:09:55Z,NONE,,"Hello Datasette is awesome thanks so much! I not very familiar with Python but I think there is a problem with datasette docker builds I keep getting this error ``` ERROR: uvicorn 0.8.4 has requirement click==7.*, but you'll have click 6.0 which is incompatible. 
ERROR: datasette 0.29.2 has requirement click~=7.0, but you'll have click 6.0 which is incompatible. ``` The full log from the docker build is here - https://gist.github.com/jonheslop/e01cd322e761cfaf34f0cb83f86411b0 Just in case it’s helpful this is my setup - https://github.com/dotwatcher/dotwatcher-data",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 473733752,MDExOlB1bGxSZXF1ZXN0MzAxODI0MDk3,51,"Fix for too many SQL variables, closes #50",9599,simonw,closed,0,,,,,1,2019-07-28T11:30:30Z,2019-07-28T11:59:32Z,2019-07-28T11:59:32Z,OWNER,simonw/sqlite-utils/pulls/51,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/51/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 476413293,MDU6SXNzdWU0NzY0MTMyOTM=,52,Throws error if .insert_all() / .upsert_all() called with empty list,9599,simonw,closed,0,,,,,1,2019-08-03T04:09:00Z,2019-11-07T04:32:39Z,2019-11-07T04:32:39Z,OWNER,,See also https://github.com/simonw/db-to-sqlite/issues/18,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/52/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 476436920,MDExOlB1bGxSZXF1ZXN0MzAzOTkwNjgz,53,Work in progress: m2m() method for creating many-to-many records,9599,simonw,closed,0,,,,,0,2019-08-03T10:03:56Z,2019-08-04T03:38:10Z,2019-08-04T03:37:33Z,OWNER,simonw/sqlite-utils/pulls/53,"- [x] `table.insert({""name"": ""Barry""}).m2m(""tags"", lookup={""tag"": ""Coworker""})` - [x] Explicit table name `.m2m(""humans"", ..., m2m_table=""relationships"")` - [x] Automatically use an existing m2m table if a single obvious candidate exists (a table with two foreign keys in the correct directions) - [x] Require the explicit `m2m_table=` argument if multiple candidates for the m2m table exist - [x] Documentation Refs #23",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/53/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 476437213,MDU6SXNzdWU0NzY0MzcyMTM=,566,Unexpected keyword argument 'hidden',8330931,dvot197007,closed,0,,,,,1,2019-08-03T10:07:57Z,2019-08-03T16:13:36Z,2019-08-03T16:13:36Z,NONE,,"I couldn't get a test example running. I am running python 3.6.8 and tried both windows and windows subsystem for linux, getting the same error. My test.db was created by converting a five line csv file with csvs-to-sqlite. 
The csv file is: col1, col2, col3 1,2,3 4,5,6 7,8,9 10,11,12 Here is the error message: (myvenv) davido@DESKTOP-L29G79U:~/dot/datasette-eg$ datasette test.db Traceback (most recent call last): File ""/home/davido/dot/datasette-eg/myvenv/bin/datasette"", line 7, in from datasette.cli import cli File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/datasette/cli.py"", line 2, in import uvicorn File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/uvicorn/__init__.py"", line 2, in from uvicorn.main import Server, main, run File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/uvicorn/main.py"", line 224, in headers: typing.List[str], File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/click/decorators.py"", line 170, in decorator _param_memo(f, OptionClass(param_decls, **attrs)) File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/click/core.py"", line 1430, in __init__ Parameter.__init__(self, param_decls, type=type, **attrs) TypeError: __init__() got an unexpected keyword argument 'hidden' Thanks.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/566/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 476573875,MDU6SXNzdWU0NzY1NzM4NzU=,567,Datasette Edit,9599,simonw,closed,0,,,,,3,2019-08-04T17:09:28Z,2020-02-25T03:40:50Z,2020-02-25T03:40:50Z,OWNER,,"Datasette started out immutable. Then it gained the ability to run against read-only databases that were being modified by other processes. It's time for the next logical progression: the option to allow Datasette (or more likely individual plugins) to write to the database! This is going to require some careful rethinking of how connection management works.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/567/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 476852861,MDU6SXNzdWU0NzY4NTI4NjE=,568,Add database_color as a configurable option,50906992,LBHELewis,open,0,,,,,1,2019-08-05T13:14:45Z,2023-08-11T05:19:42Z,,NONE,,This would be really useful as it would allow us to tie in with colour schemes.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 480961330,MDU6SXNzdWU0ODA5NjEzMzA=,54,"Ability to list views, and to access db[""view_name""].rows / rows_where / etc",20264,ftrain,closed,0,,,,,5,2019-08-15T02:00:28Z,2019-08-23T12:41:09Z,2019-08-23T12:20:15Z,NONE,,"The docs show me how to create a view via `db.create_view()` but I can't seem to get back to that view post-creation; if I query it as a table it returns `None`, and it doesn't appear in the table listing, even though querying the view works fine from inside the sqlite3 command-line. 
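Roughly the API I am hoping for, sketched against a hypothetical `entries` table (this is the desired behaviour, not what currently happens):
```
import sqlite_utils

db = sqlite_utils.Database('data.db')
db['entries'].insert_all(
    [{'id': 1, 'title': 'First'}, {'id': 2, 'title': 'Second'}], pk='id'
)
db.create_view('recent_entries', 'select id, title from entries order by id desc')

# Desired: introspect and read the view just like a (read-only) table
for row in db['recent_entries'].rows:
    print(row)
```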
It'd be great to have the view as a pseudo-table, or if the python/sqlite3 module makes that hard to pull off (I couldn't figure it out), to have that edge-case documented next to the `db.create_view()` docs.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/54/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 481885279,MDU6SXNzdWU0ODE4ODUyNzk=,569,More advanced connection pooling,9599,simonw,open,0,,,,,4,2019-08-17T13:20:41Z,2019-10-02T22:44:37Z,,OWNER,,"We need a much smarter way of handling database connections. Today, connections are simple: Datasette runs a number of threads (defaults to 3) and each thread gets a threadlocal read-only (or immutable) connection to each attached database - opened on demand. For Datasette Library (#417) I want to support potentially hundreds of attached databases. Datasette Edit (#567) is going to introduce a need for writable connections too. I'd also like to be able to run joins across multiple databases (#283) which further complicates things. Supporting thousands of open SQLite connections at once feels like it won't provide good enough performance (though I should benchmark that to be sure). Some kind of connection pooling is likely to be necessary.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 481887482,MDExOlB1bGxSZXF1ZXN0MzA4MjkyNDQ3,55,Ability to introspect and run queries against views,9599,simonw,closed,0,,,,,1,2019-08-17T13:40:56Z,2019-08-23T12:19:42Z,2019-08-23T12:19:42Z,OWNER,simonw/sqlite-utils/pulls/55,See #54 ,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 487598042,MDU6SXNzdWU0ODc1OTgwNDI=,1,Implement code to pull checkins from the Foursquare API,9599,simonw,closed,0,,,,,0,2019-08-30T17:40:02Z,2019-08-30T18:23:24Z,2019-08-30T18:23:24Z,MEMBER,,"The tool currently only works with a pre-prepared JSON file of checkins. When called without options, it should prompt the user to paste in a Foursquare OAuth token. The `--token=` option should work too, and should be backed up by an optional environment variable.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487598468,MDU6SXNzdWU0ODc1OTg0Njg=,2,--save option to dump checkins to a JSON file on disk,9599,simonw,closed,0,,,,,1,2019-08-30T17:41:06Z,2019-08-31T02:40:21Z,2019-08-31T02:40:21Z,MEMBER,,"This is a complement to the `--load` option - mainly useful for development purposes. 
(I'll rename `--file` to `--load` as part of this issue).",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487600595,MDU6SXNzdWU0ODc2MDA1OTU=,3,Option to fetch only checkins more recent than the current max checkin,9599,simonw,closed,0,,,,,4,2019-08-30T17:46:45Z,2019-10-16T20:41:23Z,2019-10-16T20:39:59Z,MEMBER,,"The Foursquare checkins API supports ""return every checkin occurring after this point"" - I can pass it the maximum createdAt date currently stored in the database. This will allow for quick incremental fetches via a cron.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487601121,MDU6SXNzdWU0ODc2MDExMjE=,4,Online tool for getting a Foursquare OAuth token,9599,simonw,closed,0,,,,,1,2019-08-30T17:48:14Z,2019-08-31T18:07:26Z,2019-08-31T18:07:26Z,MEMBER,,"I will link to this from the documentation. See also this conversation on Twitter: https://twitter.com/simonw/status/1166822603023011840 I've decided to go with ""copy and paste in a token"" rather than hooking up a local web server that can have tokens passed to it.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487721884,MDU6SXNzdWU0ODc3MjE4ODQ=,5,Treat Foursquare timestamps as UTC,9599,simonw,closed,0,,,,,0,2019-08-31T02:44:47Z,2019-08-31T02:50:41Z,2019-08-31T02:50:41Z,MEMBER,,"Current test failure is due to timezone differences between my laptop and Circle CI: https://circleci.com/gh/dogsheep/swarm-to-sqlite/3 ``` E Full diff: E - [{'created': '2018-07-01T04:48:19', E ? ^ E + [{'created': '2018-07-01T02:48:19', E ? ^ E 'createdAt': 1530413299, ``` The timestamps I store in `created` should always be UTC.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487847945,MDExOlB1bGxSZXF1ZXN0MzEzMDA3NDgz,56,Escape the table name in populate_fts and search.,49260,amjith,closed,0,,,,,2,2019-09-01T06:29:05Z,2019-09-02T17:23:21Z,2019-09-02T17:23:21Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/56,"The table names weren't escaped using double quotes in the populate_fts method. Reproducible case: ``` >>> import sqlite_utils >>> db = sqlite_utils.Database(""abc.db"") >>> db[""http://example.com""].insert_all([ ... {""id"": 1, ""age"": 4, ""name"": ""Cleo""}, ... {""id"": 2, ""age"": 2, ""name"": ""Pancakes""} ... ], pk=""id"") >>> db[""http://example.com""].enable_fts([""name""]) Traceback (most recent call last): File """", line 1, in
... (truncated)db[""http://example.com""].enable_fts([""name""]) File ""/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/sqlite_utils/db.py"", l ine 705, in enable_fts self.populate_fts(columns) File ""/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/sqlite_utils/db.py"", l ine 715, in populate_fts self.db.conn.executescript(sql) sqlite3.OperationalError: unrecognized token: "":"" >>> ```",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/56/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 487987958,MDExOlB1bGxSZXF1ZXN0MzEzMTA1NjM0,57,Add triggers while enabling FTS,49260,amjith,closed,0,,,,,4,2019-09-02T04:23:40Z,2019-09-03T01:03:59Z,2019-09-02T23:42:29Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/57,"This adds the option for a user to set up triggers in the database to keep their FTS table in sync with the parent table. Ref: https://sqlite.org/fts5.html#external_content_and_contentless_tables I would prefer to make the creation of triggers the default behavior, but that will break existing usage where people have been calling `populate_fts` after inserting new rows. I am happy to make changes to the PR as you see fit. ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/57/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 488293926,MDU6SXNzdWU0ODgyOTM5MjY=,58,Support enabling FTS on views,49260,amjith,closed,0,,,,,1,2019-09-02T18:56:36Z,2020-10-16T18:39:36Z,2020-10-16T18:39:31Z,CONTRIBUTOR,,"Right now enable_fts() is only implemented for Table(). Technically sqlite supports enabling fts on views. But it requires deeper thought since views don't have `rowid` and the current implementation of enable_fts() relies on the presence of `rowid` column. It is possible to provide an alternative rowid using the `content_rowid` option to the FTS5() function. Ref: https://sqlite.org/fts5.html#fts5_table_creation_and_initialization > The ""content_rowid"" option, used to set the rowid field of an external content table. This will further complicate `enable_fts()` function by adding an extra argument. I'm wondering if that is outside the scope of this tool or should I work on that feature and send a PR? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488338516,MDU6SXNzdWU0ODgzMzg1MTY=,570,detect_fts should handle alternative table escaping,9599,simonw,closed,0,,,,,0,2019-09-02T23:43:29Z,2019-09-03T00:32:28Z,2019-09-03T00:32:28Z,OWNER,,"sqlite-utils now uses a better way of escaping table names, which has highlighted a bug in Datasette. Datasette has its own version of the `detect_fts` function - at https://github.com/simonw/datasette/blob/d224ee2c98ac39c2c6e21a0ac0c62e5c3e1ccd11/datasette/utils/__init__.py#L466-L479 - which fails to pick up FTS tables created using the new escaping pattern. 
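A fix probably needs to match every escaping style that might appear in the stored `CREATE VIRTUAL TABLE` statement. Rough sketch, not the current implementation:
```
def detect_fts(conn, table):
    # Look for an FTS virtual table whose content= option references
    # `table`, whichever quoting style was used when it was created
    candidates = [
        'content={}'.format(table),
        'content=""{}""'.format(table),
        'content=[{}]'.format(table),
    ]
    rows = conn.execute(
        'select name, sql from sqlite_master where rootpage = 0'
    ).fetchall()
    for name, sql in rows:
        if sql and 'VIRTUAL TABLE' in sql and 'USING FTS' in sql:
            if any(candidate in sql for candidate in candidates):
                return name
    return None
```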
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/pull/57#issuecomment-527258212_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/570/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488338965,MDU6SXNzdWU0ODgzMzg5NjU=,59,Ability to introspect triggers,9599,simonw,closed,0,,,,,0,2019-09-02T23:47:16Z,2019-09-03T01:52:36Z,2019-09-03T00:09:42Z,OWNER,,"Now that we're creating triggers (thanks to @amjith in #57) it would be neat if we could introspect them too. I'm thinking: `db.triggers` - lists all triggers for the database `db[""tablename""].triggers` - lists triggers for that table The underlying query for this is `select * from sqlite_master where type = 'trigger'` I'll return the trigger information in a new namedtuple, similar to how Indexes and ForeignKeys work.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/59/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488341021,MDExOlB1bGxSZXF1ZXN0MzEzMzgzMzE3,60,db.triggers and table.triggers introspection,9599,simonw,closed,0,,,,,0,2019-09-03T00:04:32Z,2019-09-03T00:09:42Z,2019-09-03T00:09:42Z,OWNER,simonw/sqlite-utils/pulls/60,Closes #59,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/60/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 488343304,MDExOlB1bGxSZXF1ZXN0MzEzMzg0OTI2,571,detect_fts now works with alternative table escaping,9599,simonw,closed,0,,,,,0,2019-09-03T00:23:39Z,2019-09-03T00:32:28Z,2019-09-03T00:32:28Z,OWNER,simonw/datasette/pulls/571,Fixes #570,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/571/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 488833136,MDU6SXNzdWU0ODg4MzMxMzY=,1,"Imported followers should go in ""users"", relationships in ""following""",9599,simonw,closed,0,,,,,0,2019-09-03T21:27:37Z,2019-09-04T20:23:04Z,2019-09-04T20:23:04Z,MEMBER,,"Right now `twitter-to-sqlite followers` dumps everything in a `followers` table, and doesn't actually record which account they are following! It should instead save them all in a global `users` table and then set up m2m relationships in a `following` table. 
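Sketch of the shape I have in mind, using sqlite-utils (column names not final):
```
import sqlite_utils

def save_followers(db, account, followers):
    followers = list(followers)
    # Every profile goes into a shared users table...
    db['users'].upsert(account, pk='id')
    db['users'].upsert_all(followers, pk='id')
    # ...and each relationship gets its own row in following
    db['following'].upsert_all(
        [
            {'followed_id': account['id'], 'follower_id': follower['id']}
            for follower in followers
        ],
        pk=('followed_id', 'follower_id'),
    )

# usage: save_followers(sqlite_utils.Database('twitter.db'), profile, follower_profiles)
```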
This also means it should create a record for the specified user in order to record both sides of each relationship.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488833698,MDU6SXNzdWU0ODg4MzM2OTg=,2,"""twitter-to-sqlite user-timeline"" command for pulling tweets by a specific user",9599,simonw,closed,0,,,,,3,2019-09-03T21:29:12Z,2019-09-04T20:02:11Z,2019-09-04T20:02:11Z,MEMBER,,"Twitter only allows up to 3,200 tweets to be retrieved from https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html I'm going to do: $ twitter-to-sqlite tweets simonw ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488833975,MDU6SXNzdWU0ODg4MzM5NzU=,3,Command for running a search and saving tweets for that search,9599,simonw,closed,0,,,,,6,2019-09-03T21:29:56Z,2019-11-04T05:31:56Z,2019-11-04T05:31:16Z,MEMBER,, $ twitter-to-sqlite search dogsheep,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488835586,MDU6SXNzdWU0ODg4MzU1ODY=,4,Command for importing data from a Twitter Export file,9599,simonw,closed,0,,,,,2,2019-09-03T21:34:13Z,2019-10-11T06:45:02Z,2019-10-11T06:45:02Z,MEMBER,,"Twitter lets you export all of your data as an archive file: https://twitter.com/settings/your_twitter_data A command for importing this data into SQLite would be extremely useful. 
$ twitter-to-sqlite import twitter.db path-to-archive.zip ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488874815,MDU6SXNzdWU0ODg4NzQ4MTU=,5,Write tests that simulate the Twitter API,9599,simonw,open,0,,,,,1,2019-09-03T23:55:35Z,2019-09-03T23:56:28Z,,MEMBER,,I can use betamax for this: https://pypi.org/project/betamax/,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 489419782,MDU6SXNzdWU0ODk0MTk3ODI=,6,Extract extended_entities into a media table,9599,simonw,closed,0,,,,,0,2019-09-04T21:59:10Z,2019-09-04T22:08:01Z,2019-09-04T22:08:01Z,MEMBER,," ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 489429284,MDU6SXNzdWU0ODk0MjkyODQ=,572,Error running datasette publish with just --source_url,9599,simonw,closed,0,,,,,1,2019-09-04T22:19:22Z,2019-11-13T04:28:44Z,2019-11-13T04:28:44Z,OWNER,,"``` datasette publish now cleo.db \ --source_url=""https://twitter.com/cleopaws"" \ ``` Gave me this error: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/572/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 490798130,MDU6SXNzdWU0OTA3OTgxMzA=,7,users-lookup command for fetching users,9599,simonw,closed,0,,,,,0,2019-09-08T19:47:59Z,2019-09-08T20:32:13Z,2019-09-08T20:32:13Z,MEMBER,,"https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-users-lookup ``` https://api.twitter.com/1.1/users/lookup.json?user_id=783214,6253282 https://api.twitter.com/1.1/users/lookup.json?screen_name=simonw,cleopaws ``` CLI design: ``` $ twitter-to-sqlite users-lookup simonw cleopaws $ twitter-to-sqlite users-lookup 783214 6253282 --ids ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 490803176,MDU6SXNzdWU0OTA4MDMxNzY=,8,--sql and --attach options for feeding commands from SQL queries,9599,simonw,closed,0,,,,,4,2019-09-08T20:35:49Z,2020-03-20T23:13:01Z,2020-03-20T23:13:01Z,MEMBER,,"Say you want to fetch Twitter profiles for a list of accounts that are stored in another database: $ twitter-to-sqlite users-lookup users.db --attach attending.db \ --sql ""select Twitter from attending.attendes where Twitter is not null"" The SQL query you feed in is expected to return a list of screen names suitable for processing further by the command. Should be supported by all three of: - [x] `twitter-to-sqlite users-lookup` - [x] `twitter-to-sqlite user-timeline` - [x] `twitter-to-sqlite followers` and `friends` The `--attach` option allows other SQLite databases to be attached to the connection. 
Without it the SQL query will have to read from the single attached database.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 491219910,MDU6SXNzdWU0OTEyMTk5MTA=,61,importing CSV to SQLite as library,17739,witeshadow,closed,0,,,,,2,2019-09-09T17:12:40Z,2019-11-04T16:25:01Z,2019-11-04T16:25:01Z,NONE,,"CSV can be imported to SQLite when used CLI, but I don't see documentation for when using as library. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 491791152,MDU6SXNzdWU0OTE3OTExNTI=,9,followers-ids and friends-ids subcommands,9599,simonw,closed,0,,,,,1,2019-09-10T16:58:15Z,2019-09-10T17:36:55Z,2019-09-10T17:36:55Z,MEMBER,,"These will import follower and friendship IDs into the following tables, using these APIs: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-followers-ids https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-ids",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 492153532,MDU6SXNzdWU0OTIxNTM1MzI=,573,Exposing Datasette via Jupyter-server-proxy,82988,psychemedia,closed,0,,,,,3,2019-09-11T10:32:36Z,2020-03-26T09:41:30Z,2020-03-26T09:41:30Z,CONTRIBUTOR,,"It is possible to expose a running `datasette` service in a Jupyter environment such as a MyBinder environment using the [`jupyter-server-proxy`](https://github.com/jupyterhub/jupyter-server-proxy). For example, using [this demo Binder](https://mybinder.org/v2/gh/binder-examples/r/master?filepath=index.ipynb) which has the server proxy installed, we can then upload a simple test database from the notebook homepage, from a Jupyter termianl install datasette and set it running against the test db on eg port 8001 and then view it via the path `proxy/8001`. Clicking links results in 404s though because the `datasette` links aren't relative to the current path? ![image](https://user-images.githubusercontent.com/82988/64689964-44b69280-d487-11e9-8f9f-3681422bcc9f.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/573/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 492297930,MDU6SXNzdWU0OTIyOTc5MzA=,10,Rethink progress bars for various commands,9599,simonw,closed,0,,,,,5,2019-09-11T15:06:47Z,2020-04-01T03:45:48Z,2020-04-01T03:45:48Z,MEMBER,,"Progress bars and the `--silent` option are implemented inconsistently across commands at the moment. This is made more challenging by the fact that for many operations the total length is not known. 
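One consistent pattern might be to funnel everything through a single helper, roughly like this (a sketch: `db` is assumed to be a sqlite-utils `Database` and the tweet-saving line stands in for the real logic):
```
import click

def import_items(db, items, total=None, silent=False):
    # Pass total= when the API reports how many items to expect;
    # --silent skips the progress bar entirely
    def save(item):
        db['tweets'].upsert(item, pk='id')
    if silent:
        for item in items:
            save(item)
        return
    with click.progressbar(items, length=total, label='Importing') as bar:
        for item in bar:
            save(item)
```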
https://click.palletsprojects.com/en/7.x/api/#click.progressbar",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493599818,MDU6SXNzdWU0OTM1OTk4MTg=,1,Command for fetching starred repos,9599,simonw,closed,0,,,,,0,2019-09-14T08:36:29Z,2019-09-14T21:30:48Z,2019-09-14T21:30:48Z,MEMBER,,,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493668862,MDU6SXNzdWU0OTM2Njg4NjI=,2,Extract licenses from repos into a separate table,9599,simonw,closed,0,,,,,0,2019-09-14T21:33:41Z,2019-09-14T21:46:58Z,2019-09-14T21:46:58Z,MEMBER,," ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493670426,MDU6SXNzdWU0OTM2NzA0MjY=,3,Command to fetch all repos belonging to a user or organization,9599,simonw,closed,0,,,,,2,2019-09-14T21:54:21Z,2019-09-17T00:17:53Z,2019-09-17T00:17:53Z,MEMBER,,"How about this: $ github-to-sqlite repos simonw",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493670730,MDU6SXNzdWU0OTM2NzA3MzA=,4,Command to fetch stargazers for one or more repos,9599,simonw,closed,0,,,,,8,2019-09-14T21:58:22Z,2020-05-02T21:30:27Z,2020-05-02T21:30:27Z,MEMBER,,"Maybe this: $ github-to-sqlite stargazers github.db simonw/datasette It could accept more than one repos. Maybe have options similar to `--sql` in [twitter-to-sqlite](https://github.com/dogsheep/twitter-to-sqlite) so you can e.g. fetch all stargazers for all of the repos you have fetched into the database already (or all of the repos belonging to owner X)",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493671014,MDU6SXNzdWU0OTM2NzEwMTQ=,5,"Add ""incomplete"" boolean to users table for incomplete profiles",9599,simonw,closed,0,,,,,2,2019-09-14T22:01:50Z,2020-03-23T19:23:31Z,2020-03-23T19:23:30Z,MEMBER,,"User profiles that are fetched from e.g. stargazers (#4) are incomplete - they have a login but they don't have name, company etc. Add a `incomplete` boolean flag to the `users` table to record this. 
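Something along these lines when saving profiles that came from an endpoint that only returns partial user objects (sketch, exact column handling to be decided):
```
def save_user(db, user, incomplete=False):
    # Flag profiles fetched from sources like the stargazers API,
    # which omit name, company and the other detailed fields
    record = dict(user)
    record['incomplete'] = 1 if incomplete else 0
    db['users'].upsert(record, pk='id')
```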
Then later I can add a `backfill-users` command which loops through and fetches missing data for those incomplete profiles.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 494685791,MDU6SXNzdWU0OTQ2ODU3OTE=,574,Improve usage description of --host option,132978,terrycojones,closed,0,,,,,2,2019-09-17T15:12:12Z,2019-11-01T21:58:17Z,2019-11-01T21:57:54Z,NONE,,"It would be nice if the `--host` option had a clearer description. I tried to get datasette running on an AWS instance and it took a while to realize it was only listening on localhost. So I wanted to make it listen on an non-localhost interface and tried giving a couple of values to `--host` (a host name, then an interface name), but none of them did. In the end I read the source to see that the option is passed to `uvicorn` and looked at the uvicorn docs, which also didn't help. Then I searched the web for ""example running datasette on a host"" which led me to https://github.com/simonw/datasette/issues/514 where I saw someone using `-h 0.0.0.0`. I tried that and it works. That usage could be mentioned somewhere, and might save someone else some time.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/574/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 496415321,MDU6SXNzdWU0OTY0MTUzMjE=,1,Figure out some interesting example SQL queries,9599,simonw,open,0,,,,,9,2019-09-20T15:28:07Z,2021-05-03T03:46:23Z,,MEMBER,,My knowledge of genetics has left me short here. 
I'd love to be able to provide some interesting example SELECT queries - maybe one that spots if you are [likely to have red hair?](https://www.snpedia.com/index.php/Rs1805007),209590345,genome-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 497162288,MDU6SXNzdWU0OTcxNjIyODg=,575,Plugin documentation should cover how to bundle static/templates in setup.py,9599,simonw,closed,0,,,6026070,0.51,1,2019-09-23T15:15:18Z,2020-10-24T20:06:17Z,2020-10-24T20:03:53Z,OWNER,,"These sections here should cover it: https://datasette.readthedocs.io/en/latest/plugins.html#static-assets Example: https://github.com/simonw/datasette-auth-github/blob/bf01f8f01b87a6cb09c47380ba0a86e0546ebb38/setup.py#L30 ``` package_data={""datasette_auth_github"": [""templates/*.html""]}, ``` Also from https://github.com/simonw/datasette-plugin-demos/blob/0ccf9e6189e923046047acd7878d1d19a2cccbb1/setup.py#L18-L22 package_data={ 'datasette_plugin_demos': [ 'static/plugin.js', ], }, ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/575/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 497170355,MDU6SXNzdWU0OTcxNzAzNTU=,576,Documented internals API for use in plugins,9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2019-09-23T15:28:50Z,2021-01-05T23:12:51Z,2021-01-05T23:12:37Z,OWNER,,"Quite a few of the plugin hooks make a `datasette”`instance of the Datasette class available to the plugins, so that they can look up configuration settings and execute database queries. This means it should provide a documented, stable API so that plugin authors can rely on it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/576/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 497171390,MDU6SXNzdWU0OTcxNzEzOTA=,577,Utility mechanism for plugins to render templates,9599,simonw,closed,0,,,3268330,Datasette 1.0,7,2019-09-23T15:30:36Z,2020-02-04T20:26:20Z,2020-02-04T20:26:19Z,OWNER,,"Sometimes a plugin will need to render a template for some custom UI. We need a documented API for doing this, which ensures that everything will work correctly if you extend base.html etc. See also #576. This could be a `.render()` method on the Datasette class, but that feels a bit weird - should that class also take responsibility for rendering?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/577/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 499954048,MDExOlB1bGxSZXF1ZXN0MzIyNTI5Mzgx,578,Added support for multi arch builds,887095,heussd,closed,0,,,,,3,2019-09-29T18:43:03Z,2019-11-13T19:13:15Z,2019-11-13T19:13:15Z,NONE,simonw/datasette/pulls/578,Minor changes in Dockerfile and new Makefile to support Docker multi architecture builds. `make`will build one image per architecture and push them as one Docker manifest to Docker Hub. 
Feel free to change `IMAGE_NAME` to `datasetteproject/datasette` to update your official Docker Hub image(s).,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 500783373,MDU6SXNzdWU1MDA3ODMzNzM=,62,[enhancement] Method to delete a row in python,4454869,Sergeileduc,closed,0,,,,,5,2019-10-01T09:45:47Z,2019-11-04T16:30:34Z,2019-11-04T16:18:18Z,NONE,,"Hi! Thanks for the lib! Obviously, not every possible SQL query will have a dedicated method. But I was thinking: a method to delete a row (I'm terrible with names, maybe `delete_where()` or something) would be useful. I have a Database, with a primary key. For the moment, I use: ```Python3 db.conn.execute(f""DELETE FROM table WHERE key = {key_id}"") db.conn.commit() ``` to delete a row I don't need anymore, given its primary key. Works like a charm. Just an idea: ```Python3 table.delete_where_pkey({'key': key_id}) ``` or something (I know, I'm terrible at naming methods...). Pros: well, no need to write the SQL query. Cons: WHERE normally allows you to do many more things (operators =, <>, >, <, BETWEEN), not to mention AND, OR, etc... The method is maybe too specific, and/or a pain to make more flexible. Again, just a thought. Writing your own SQL works too, so... Thanks again. See yah.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 501773982,MDExOlB1bGxSZXF1ZXN0MzIzOTgzNzMy,579,New connection pooling,9599,simonw,open,0,,,,,1,2019-10-02T23:22:19Z,2019-11-15T22:57:21Z,,OWNER,simonw/datasette/pulls/579,See #569,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/579/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 502355384,MDU6SXNzdWU1MDIzNTUzODQ=,580,Testing utilities should be available to plugins,9599,simonw,closed,0,,,,,5,2019-10-03T23:58:26Z,2020-02-28T07:58:46Z,2020-02-28T07:58:46Z,OWNER,,"I'm trying to write a plugin at the moment ([datasette-atom](https://github.com/simonw/datasette-atom)) which needs to run unit tests against a full in-memory Datasette instance, in the same way that the Datasette test suite itself works. I got it working by creating copies of the [TestClient and TestResponse classes](https://github.com/simonw/datasette/blob/a314b761866d250c16f1ff6dd682010cf4181eb4/tests/fixtures.py#L22-L96) within the plugin itself: https://github.com/simonw/datasette-atom/commit/c0e3bd9556d7b31f253a8bf666d42205cd24f4fc#diff-33337525d2d877f7cc7f33737bfd2d7b I had to do this because those classes are in the `tests/` directory within Datasette, so they don't get included in the package that ships to PyPI. 
It would be better if these classes were included in the main package in a way that made it easy for plugins to reuse them to write their own tests.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/580/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 502993509,MDU6SXNzdWU1MDI5OTM1MDk=,581,Redesign register_output_renderer callback,9599,simonw,closed,0,,,5471110,Datasette 0.43,24,2019-10-05T17:43:23Z,2020-05-28T02:24:14Z,2020-05-28T02:21:50Z,OWNER,,"In building https://github.com/simonw/datasette-atom it became clear that the callback function (which currently accepts just args, data and view_name) would also benefit from access to a mechanism to render templates and a `datasette` instance so it can execute SQL. To maintain backwards compatibility with existing plugins, we can introspect the callback function to see if it wants those new arguments or not. At a minimum I want to make `datasette` and ASGI `scope` available.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/581/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503045221,MDU6SXNzdWU1MDMwNDUyMjE=,11,Commands for recording real-time tweets from the streaming API,9599,simonw,closed,0,,,,,1,2019-10-06T03:09:30Z,2019-10-06T04:54:17Z,2019-10-06T04:48:31Z,MEMBER,,"https://developer.twitter.com/en/docs/tweets/filter-realtime/api-reference/post-statuses-filter We can support tracking keywords and following specific users.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503053243,MDU6SXNzdWU1MDMwNTMyNDM=,582,Datasette should not completely crash if one SQLite database is malformed,9599,simonw,open,0,,,,,0,2019-10-06T05:11:43Z,2019-10-06T05:11:43Z,,OWNER,,"If you run Datasette against a number of database files and one of them is malformed, you get this 500 error on the index page: It would be better if Datasette still worked and listed the databases that were NOT malformed, then showed an inline error message just for the one that could not be accessed.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/582/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 503053800,MDU6SXNzdWU1MDMwNTM4MDA=,12,"Extract ""source"" into a separate lookup table",9599,simonw,closed,0,,,,,3,2019-10-06T05:17:23Z,2019-10-17T15:49:24Z,2019-10-17T15:49:24Z,MEMBER,,"It's pretty bulky and ugly at the moment: ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503085013,MDU6SXNzdWU1MDMwODUwMTM=,13,statuses-lookup command,9599,simonw,closed,0,,,,,1,2019-10-06T11:00:20Z,2019-10-07T00:33:49Z,2019-10-07T00:31:44Z,MEMBER,,"For bulk retrieving tweets by their ID. 
https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-lookup Rate limit is 900/15 minutes (1 call per second) but each call can pull up to 100 IDs, so we can pull 6,000 per minute. Should support `--SQL` and `--attach` #8 ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503128914,MDU6SXNzdWU1MDMxMjg5MTQ=,583,"Enable ""explain"" and ""explain query plan"" for CTEs",9599,simonw,closed,0,,,,,1,2019-10-06T17:00:10Z,2019-10-06T17:24:07Z,2019-10-06T17:24:07Z,OWNER,,"This currently throws an error: https://latest.datasette.io/fixtures?sql=explain+WITH+RECURSIVE%0D%0A++xaxis%28x%29+AS+%28VALUES%28-2.0%29+UNION+ALL+SELECT+x%2B0.05+FROM+xaxis+WHERE+x%3C1.2%29%2C%0D%0A++yaxis%28y%29+AS+%28VALUES%28-1.0%29+UNION+ALL+SELECT+y%2B0.1+FROM+yaxis+WHERE+y%3C1.0%29%2C%0D%0A++m%28iter%2C+cx%2C+cy%2C+x%2C+y%29+AS+%28%0D%0A++++SELECT+0%2C+x%2C+y%2C+0.0%2C+0.0+FROM+xaxis%2C+yaxis%0D%0A++++UNION+ALL%0D%0A++++SELECT+iter%2B1%2C+cx%2C+cy%2C+x*x-y*y+%2B+cx%2C+2.0*x*y+%2B+cy+FROM+m+%0D%0A+++++WHERE+%28x*x+%2B+y*y%29+%3C+4.0+AND+iter%3C28%0D%0A++%29%2C%0D%0A++m2%28iter%2C+cx%2C+cy%29+AS+%28%0D%0A++++SELECT+max%28iter%29%2C+cx%2C+cy+FROM+m+GROUP+BY+cx%2C+cy%0D%0A++%29%2C%0D%0A++a%28t%29+AS+%28%0D%0A++++SELECT+group_concat%28+substr%28%27+.%2B*%23%27%2C+1%2Bmin%28iter%2F7%2C4%29%2C+1%29%2C+%27%27%29+%0D%0A++++FROM+m2+GROUP+BY+cy%0D%0A++%29%0D%0ASELECT+group_concat%28rtrim%28t%29%2Cx%270a%27%29+FROM+a%3B",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/583/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503190241,MDU6SXNzdWU1MDMxOTAyNDE=,584,Codec error in some CSV exports,9599,simonw,closed,0,,,,,2,2019-10-07T01:15:34Z,2021-06-17T18:13:20Z,2019-10-18T05:23:16Z,OWNER,,"Got this exploring my Swarm checkins: ![448DBFC4-71F8-4846-83C0-BEA511B2157A](https://user-images.githubusercontent.com/9599/66279259-3af53480-e865-11e9-9651-04fd2d895392.jpeg) `/swarm/stickers.csv?stickerType=messageOnly&_size=max`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/584/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503217375,MDU6SXNzdWU1MDMyMTczNzU=,585,"Databases on index page should display in order they were passed to ""datasette serve""?",9599,simonw,closed,0,,,,,1,2019-10-07T03:42:39Z,2019-10-14T03:52:34Z,2019-10-14T03:52:34Z,OWNER,,"If you run this: datasette serve -h 127.0.0.1 -p 8000 -m phone-locations.db healthkit.db locations.db genome.db Then the index page for that Datasette instance should show the databases in the order they were specified on the command-line. 
Mind you when we add pagination to that page in #468 we may want to do something different here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/585/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503218205,MDU6SXNzdWU1MDMyMTgyMDU=,586,Enable browser caching for plugin statics with datasette-auth,9599,simonw,closed,0,,,,,2,2019-10-07T03:47:14Z,2019-10-07T15:46:04Z,2019-10-07T15:46:03Z,OWNER,,"An authenticated Datasette I run is seeing delays on every page load. On looking at the network inspector it turns out it's because datasette-vega is nearly 1MB and a `cache-control: private` is preventing it from being cached! This may well turn out to be a bug in `datasette-auth-github` but it's still worth tracking here because caching of static assets from plugins is very important. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503233021,MDU6SXNzdWU1MDMyMzMwMjE=,1,Use better pagination (and implement progress bar),9599,simonw,closed,0,,,,,4,2019-10-07T04:58:11Z,2020-03-27T22:13:57Z,2020-03-27T22:13:57Z,MEMBER,,"Right now we attempt to load everything at once - which caps out at 5,000 items and is really slow. We can do better by implementing pagination using count and offset.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503234169,MDU6SXNzdWU1MDMyMzQxNjk=,2,Track and use the 'since' value,9599,simonw,closed,0,,,,,3,2019-10-07T05:02:59Z,2020-03-27T22:22:30Z,2020-03-27T22:22:30Z,MEMBER,,"Pocket says: > Whenever possible, you should use the since parameter, or count and and offset parameters when retrieving a user's list. After retrieving the list, you should store the current time (which is provided along with the list response) and pass that in the next request for the list. This way the server only needs to return a small set (changes since that time) instead of the user's entire list every time. At the bottom of https://getpocket.com/developer/docs/v3/retrieve",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503243784,MDU6SXNzdWU1MDMyNDM3ODQ=,3,Extract images into separate tables,9599,simonw,open,0,,,,,1,2019-10-07T05:43:01Z,2020-09-01T06:17:45Z,,MEMBER,,"As already done with authors. Slightly harder because images do not have a universally unique ID. Also need to figure out what to do about there being columns for both `image` and `images`. 
",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 503244410,MDU6SXNzdWU1MDMyNDQ0MTA=,14,"When importing favorites, record which user favorited them",9599,simonw,closed,0,,,,,0,2019-10-07T05:45:11Z,2019-10-14T03:30:25Z,2019-10-14T03:30:25Z,MEMBER,,"This code currently just dumps them into the `tweets` table without recording who it was who had favorited them. https://github.com/dogsheep/twitter-to-sqlite/blob/436a170d74ec70903d1b4ca430c2c6b6435cdfcc/twitter_to_sqlite/cli.py#L152-L157",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 504238461,MDU6SXNzdWU1MDQyMzg0NjE=,6,sqlite3.OperationalError: table users has no column named bio,1055831,dazzag24,closed,0,,,,,2,2019-10-08T19:39:52Z,2019-10-13T05:31:28Z,2019-10-13T05:30:19Z,NONE,,"``` $ github-to-sqlite repos github.db $ github-to-sqlite starred github.db dazzag24 Traceback (most recent call last): File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/bin/github-to-sqlite"", line 10, in sys.exit(cli()) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/cli.py"", line 106, in starred utils.save_stars(db, user, stars) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 177, in save_stars user_id = save_user(db, user) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 61, in save_user return db[""users""].upsert(to_save, pk=""id"").last_pk File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1067, in upsert extracts=extracts, File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 916, in insert extracts=extracts, File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1024, in insert_all result = self.db.conn.execute(sql, values) sqlite3.OperationalError: table users has no column named bio ``` ``` $ pipenv graph github-to-sqlite==0.4 - requests [required: Any, installed: 2.22.0] - certifi [required: >=2017.4.17, installed: 2019.9.11] - chardet [required: >=3.0.2,<3.1.0, installed: 3.0.4] - idna [required: >=2.5,<2.9, installed: 2.8] - urllib3 [required: >=1.21.1,<1.26,!=1.25.1,!=1.25.0, installed: 1.25.6] - sqlite-utils [required: ~=1.11, 
installed: 1.11] - click [required: Any, installed: 7.0] - click-default-group [required: Any, installed: 1.2.2] - click [required: Any, installed: 7.0] - tabulate [required: Any, installed: 0.8.5] Python 3.6.8 ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 504720731,MDU6SXNzdWU1MDQ3MjA3MzE=,1,Add more details on how to request data from google takeout correctly.,1055831,dazzag24,open,0,,,,,0,2019-10-09T15:17:34Z,2019-10-09T15:17:34Z,,NONE,,"The default is to download everything. This can result in an enormous amount of data when you only really need 2 types of data for now: - My Activity - Location History In addition unless you specify that ""My Activity"" is downloaded in JSON format the default is HTML. This then causes the `google-takeout-to-sqlite my-activity takeout.db takeout.zip` command to fail as it only contains html files not json files. Thanks",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 504805857,MDU6SXNzdWU1MDQ4MDU4NTc=,587,Use --platform=managed for publish cloudrun,9599,simonw,closed,0,,,,,0,2019-10-09T18:02:16Z,2019-10-17T21:51:57Z,2019-10-17T21:51:57Z,OWNER,,"Running `datasette publish cloudrun` now shows this message: > Please choose a target platform: > [1] Cloud Run (fully managed) > [2] Cloud Run on GKE > [3] a Kubernetes cluster > [4] cancel >Please enter your numeric choice: 1 > > To specify the platform yourself, pass `--platform managed`. Or, to make this the default target platform, run `gcloud config set run/platform managed`. May as well set that as a default.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505512251,MDU6SXNzdWU1MDU1MTIyNTE=,588,Queries per DB table in metadata.json,12617395,bsilverm,closed,0,,,,,3,2019-10-10T21:08:19Z,2019-10-21T12:58:22Z,2019-10-21T01:48:42Z,NONE,,"It doesn't appear possible to have separate queries defined per database table. 
When I do something like below, my table descriptions show up but not the queries: ` ""databases"": { ""MYDB"": { ""tables"": { ""MYFIRSTTABLE"": { ""source"": ""Test"", ""source_url"": ""https://www.google.com"", ""queries"": { ""Query 1"": { ""sql"": ""select * from MYFIRSTTABLE"", ""title"": ""Query 1"", ""description"": ""This is the first query"" }, } }, ""MYSECONDTABLE"": { ""source"":""Test2"", ""source_url"":""https://www.google.com"", ""queries"": { ""Query 2"" : { ""sql"":""select * from MYSECONDTABLE;"", ""title"": ""Query 2"", ""description"":""This is the second query"" } } } }`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505666744,MDExOlB1bGxSZXF1ZXN0MzI3MDUxNjcz,15,"twitter-to-sqlite import command, refs #4",9599,simonw,closed,0,,,,,0,2019-10-11T06:37:14Z,2019-10-11T06:45:01Z,2019-10-11T06:45:01Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/15,,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505673645,MDU6SXNzdWU1MDU2NzM2NDU=,16,Do a better job with archived direct message threads,9599,simonw,open,0,,,,,0,2019-10-11T06:55:21Z,2019-10-11T06:55:27Z,,MEMBER,,https://github.com/dogsheep/twitter-to-sqlite/blob/fb2698086d766e0333a55bb73435e7283feeb438/twitter_to_sqlite/archive.py#L98-L99,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 505674949,MDU6SXNzdWU1MDU2NzQ5NDk=,17,import command should empty all archive-* tables first,9599,simonw,closed,0,,,,,2,2019-10-11T06:58:43Z,2019-10-11T15:40:08Z,2019-10-11T15:40:08Z,MEMBER,,Can have a CLI option for NOT doing that.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505814865,MDExOlB1bGxSZXF1ZXN0MzI3MTY5NzQ4,589,Display metadata footer on custom SQL queries,2657547,rixx,closed,0,,,,,0,2019-10-11T12:10:28Z,2019-10-14T08:58:23Z,2019-10-14T03:53:22Z,CONTRIBUTOR,simonw/datasette/pulls/589,Closes #408,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/589/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505818256,MDExOlB1bGxSZXF1ZXN0MzI3MTcyNTQ1,590,Handle spaces in DB names,2657547,rixx,closed,0,,,,,3,2019-10-11T12:18:22Z,2019-11-04T23:16:31Z,2019-11-04T23:16:30Z,CONTRIBUTOR,simonw/datasette/pulls/590,Closes #503,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/590/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505837199,MDExOlB1bGxSZXF1ZXN0MzI3MTg4MDg3,591,Sort databases on homepage by argument 
order,2657547,rixx,closed,0,,,,,1,2019-10-11T12:57:38Z,2019-10-14T08:57:50Z,2019-10-14T03:52:34Z,CONTRIBUTOR,simonw/datasette/pulls/591,Closes #585,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/591/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505928530,MDU6SXNzdWU1MDU5Mjg1MzA=,18,Command to import home-timeline,9599,simonw,closed,0,,,,,4,2019-10-11T15:47:54Z,2019-10-11T16:51:33Z,2019-10-11T16:51:12Z,MEMBER,,"Feature request: https://twitter.com/johankj/status/1182563563136868352 > Would it be possible to save all tweets in my timeline from the last X days? I would love to see how big a percentage some users are of my daily timeline as a metric on whether I should unfollow them/move them to a list.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505950145,MDExOlB1bGxSZXF1ZXN0MzI3Mjc5ODE4,592,Offer SQL formatting,2657547,rixx,closed,0,,,,,1,2019-10-11T16:35:49Z,2019-10-14T08:57:12Z,2019-10-14T03:46:13Z,CONTRIBUTOR,simonw/datasette/pulls/592,"SQL code will be formatted on page load, and can additionally be formatted by clicking the ""Format SQL"" button. Closes #136",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/592/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 506087267,MDU6SXNzdWU1MDYwODcyNjc=,19,since_id support for home-timeline,9599,simonw,closed,0,,,,,3,2019-10-11T22:48:24Z,2019-10-16T19:13:06Z,2019-10-16T19:12:46Z,MEMBER,,Currently every time you run `home-timeline` we pull all 800 available tweets. We should offer to support `since_id` (which can be provided or can be pulled directly from the database) in order to work more efficiently if this command is executed e.g. on a cron.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506183241,MDU6SXNzdWU1MDYxODMyNDE=,593,make uvicorn optional dependency (because not ok on windows python yet),4312421,stonebig,closed,0,,,,,3,2019-10-12T12:51:07Z,2019-10-13T06:22:08Z,2019-10-13T06:22:07Z,NONE,,"would it be possible to: - remove uvicorn as a mandatory dependency? - eventually fall back to hypercorn? reason: - uvloop is not yet supported on Windows/Python-3.8 and below; it may happen with Python-3.9 only. - it seems like a 6-line effort (but I'm not an expert)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/593/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506268945,MDU6SXNzdWU1MDYyNjg5NDU=,20,--since support for various commands for refresh-by-cron,9599,simonw,closed,0,,,,,3,2019-10-13T03:40:46Z,2019-10-21T03:32:04Z,2019-10-16T19:26:11Z,MEMBER,,"I want to run a cron that updates my Twitter database every X minutes. 
It should be able to retrieve the following without needing to paginate through everything: - [x] Tweets I have tweeted - [x] My home timeline (see #19) - [x] Tweets I have favourited It would be nice if this could be standardized across all commands as a `--since` option.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506276893,MDU6SXNzdWU1MDYyNzY4OTM=,7,issue-comments command for importing issue comments,9599,simonw,closed,0,,,,,1,2019-10-13T05:23:58Z,2019-10-14T14:44:12Z,2019-10-13T05:24:30Z,MEMBER,,Using this API: https://developer.github.com/v3/issues/comments/,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506297048,MDU6SXNzdWU1MDYyOTcwNDg=,594,upgrade to uvicorn-0.9 to be Python-3.8 friendly,4312421,stonebig,closed,0,,,,,3,2019-10-13T09:23:43Z,2019-11-12T04:47:04Z,2019-11-12T04:47:04Z,NONE,,uvicorn-0.8 relies on websockets-0.7 which lacks python-3.8 compatibility,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506300941,MDExOlB1bGxSZXF1ZXN0MzI3NTQxMDQ2,595,bump uvicorn to 0.9.0 to be Python-3.8 friendly,4312421,stonebig,closed,0,,,,,9,2019-10-13T10:00:04Z,2019-11-12T04:46:48Z,2019-11-12T04:46:48Z,NONE,simonw/datasette/pulls/595,"as uvicorn-0.9 is needed to get websockets-8.0.2, which is needed to have Python-3.8 compatibility",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/595/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 506432572,MDU6SXNzdWU1MDY0MzI1NzI=,21,Fix & escapes in tweet text,9599,simonw,closed,0,,,,,1,2019-10-14T03:37:28Z,2019-10-15T18:48:16Z,2019-10-15T18:48:16Z,MEMBER,," Shouldn't be storing `&` here.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 507454958,MDU6SXNzdWU1MDc0NTQ5NTg=,596,Handle really wide tables better,9599,simonw,open,0,,,,,9,2019-10-15T20:05:46Z,2022-09-07T00:58:41Z,,OWNER,,"If a table has hundreds of columns the Datasette UI starts getting unwieldy. Addressing this would be neat. 
One option would be to only select the first 30 columns by default and provide a UI for selecting more.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/596/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 508024032,MDU6SXNzdWU1MDgwMjQwMzI=,22,Ability to import from uncompressed archive or from specific files,9599,simonw,closed,0,,,,,0,2019-10-16T18:31:57Z,2019-10-16T18:53:36Z,2019-10-16T18:53:36Z,MEMBER,,"Currently you can only import like this: $ twitter-to-sqlite import path-to-twitter.zip It would be useful if you could import from a folder that was decompressed from that zip: $ twitter-to-sqlite import path-to-twitter/ AND from individual files within that folder - since that would allow you to e.g. selectively import certain files: $ twitter-to-sqlite import path-to-twitter/favorites.js path-to-twitter/tweets.js",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508070977,MDU6SXNzdWU1MDgwNzA5Nzc=,597,If you have databases called foo.db and foo-bar.db you cannot visit /foo-bar,9599,simonw,closed,0,,,,,5,2019-10-16T20:07:41Z,2019-10-18T22:51:08Z,2019-10-18T22:51:08Z,OWNER,,"Weird bug I just came across. It appears that if you have one database called `foo.db` and another called `foo-bar.db` any attempts to visit `/foo-bar` will redirect to `/foo`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/597/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508100844,MDU6SXNzdWU1MDgxMDA4NDQ=,598,Character encoding bug with CSV export,46313,JoeGermuska,closed,0,,,,,1,2019-10-16T21:09:30Z,2021-06-17T18:13:20Z,2019-10-18T22:52:21Z,NONE,,"I was just poking around, and at [this URL](https://sql-murder-mystery.datasette.io/sql-murder-mystery/crime_scene_report.csv?_stream=on&type=arson&_size=max), I encountered this error: ``` 'latin-1' codec can't encode character '\u2019' in position 27: ordinal not in range(256) ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/598/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508190730,MDU6SXNzdWU1MDgxOTA3MzA=,23,Extremely simple migration system,9599,simonw,closed,0,,,,,2,2019-10-17T02:13:57Z,2019-10-17T16:57:17Z,2019-10-17T16:57:17Z,MEMBER,,"Needed for #12. This is going to be an incredibly simple version of the Django migration system. * A `migrations` table, keeping track of which migrations were applied (and when) * A `migrate()` function which applies any pending migrations * A `MIGRATIONS` constant which is a list of functions to be applied The function names will be detected and used as the names of the migrations. Every time you run the CLI tool it will call the `migrate()` function before doing anything else. Needs to take into account that there might be no tables at all. 
As such, migration functions should sanity check that the tables they are going to work on actually exist.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508553387,MDExOlB1bGxSZXF1ZXN0MzI5MzI0MzY4,24,Tweet source extraction and new migration system,9599,simonw,closed,0,,,,,0,2019-10-17T15:24:56Z,2019-10-17T15:49:29Z,2019-10-17T15:49:24Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/24,Closes #12 and #23,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 508578780,MDU6SXNzdWU1MDg1Nzg3ODA=,25,Ensure migrations don't accidentally create foreign key twice,9599,simonw,closed,0,,,,,2,2019-10-17T16:08:50Z,2019-10-17T16:56:47Z,2019-10-17T16:56:47Z,MEMBER,,"Is it possible for these lines to run against a database table that already has these foreign keys? https://github.com/dogsheep/twitter-to-sqlite/blob/c9295233f219c446fa2085cace987067488a31b9/twitter_to_sqlite/migrations.py#L21-L22",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 509267608,MDExOlB1bGxSZXF1ZXN0MzI5ODkwMzIw,599,Fix for /foo v.s. /foo-bar issue in #597,9599,simonw,closed,0,,,,,0,2019-10-18T19:22:55Z,2019-10-18T22:51:07Z,2019-10-18T22:51:07Z,OWNER,simonw/datasette/pulls/599,Refs #597,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/599/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 509339999,MDU6SXNzdWU1MDkzMzk5OTk=,600,Don't auto-format SQL on first page load,9599,simonw,closed,0,,,,,0,2019-10-18T22:36:10Z,2019-10-18T23:56:46Z,2019-10-18T23:56:46Z,OWNER,,"I've gone back and forth on this a bit, but I've decided I'm not keen on the way Datasette now automatically formats SQL when a query (or canned query) page first loads. 
I like having an optional ""Format SQL"" button, but applying formatting automatically means that if the user has carefully formatted their SQL to a specific style their formatting will be automatically over-ridden.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/600/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 509340359,MDExOlB1bGxSZXF1ZXN0MzI5OTQ3MTgw,601,Don't auto-format SQL on page load,9599,simonw,closed,0,,,,,5,2019-10-18T22:37:39Z,2019-10-20T02:29:49Z,2019-10-18T23:56:45Z,OWNER,simonw/datasette/pulls/601,Refs #600,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/601/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 509535510,MDExOlB1bGxSZXF1ZXN0MzMwMDc2MjYz,602,Offer to format readonly SQL,2657547,rixx,closed,0,,,,,3,2019-10-20T02:29:32Z,2019-11-04T07:29:33Z,2019-11-04T02:39:56Z,CONTRIBUTOR,simonw/datasette/pulls/602,"Following discussion in #601, this PR adds a ""Format SQL"" button to read-only SQL (if the SQL actually differs from the formatting result). It also removes a console error on readonly SQL queries.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/602/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 509612217,MDExOlB1bGxSZXF1ZXN0MzMwMTI5MzU4,603,always pop as_format off args dict,6025893,chris48s,closed,0,,,,,2,2019-10-20T15:44:22Z,2019-10-30T19:12:22Z,2019-10-21T02:03:09Z,CONTRIBUTOR,simonw/datasette/pulls/603,closes #563,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/603/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 509693773,MDU6SXNzdWU1MDk2OTM3NzM=,604,_where= parameter is not persisted in hidden form fields,9599,simonw,closed,0,,,,,3,2019-10-21T02:14:10Z,2019-10-30T19:12:38Z,2019-10-30T18:49:44Z,OWNER,,"e.g. on this page: https://v0-30.datasette.io/fixtures/roadside_attractions?_where=name%20like%20%27%museum%%27 Click the ""Apply"" button and the `_where=` parameter will be dropped.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/604/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 510076368,MDU6SXNzdWU1MTAwNzYzNjg=,605,Support queries at the table level,12617395,bsilverm,open,0,,,,,2,2019-10-21T15:58:30Z,2019-10-30T18:55:37Z,,NONE,,"Per the issue described in [issue #588](https://github.com/simonw/datasette/issues/588), it was determined queries are not supported at the table level. 
Per my last comment in the issue, I'd like to request support for this as it would help eliminate errors in the event certain tables are not present in the database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 512218858,MDU6SXNzdWU1MTIyMTg4NTg=,606,/-/plugins shows incorrect name for plugins,9599,simonw,closed,0,,,,,3,2019-10-24T22:53:25Z,2019-11-01T05:41:04Z,2019-11-01T05:40:07Z,OWNER,,"https://fivethirtyeight.datasettes.com/-/plugins ```json [ { ""name"": ""datasette_jellyfish"", ""static"": false, ""templates"": false, ""version"": ""0.3"" }, { ""name"": ""datasette_vega"", ""static"": true, ""templates"": false, ""version"": ""0.6.2"" } ] ``` These should be shown as `datasette-jellyfish` and `datasette-vega` since those are the names on PyPI.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 512996469,MDU6SXNzdWU1MTI5OTY0Njk=,607,Ways to improve fuzzy search speed on larger data sets?,8431341,zeluspudding,closed,0,,,,,6,2019-10-27T17:31:37Z,2019-11-07T03:38:10Z,2019-11-07T03:38:10Z,NONE,,"I have an sqlite table with 16 million rows in it. Having read @simonw's article ""[Fast Autocomplete Search for Your Website](https://24ways.org/2018/fast-autocomplete-search-for-your-website/)"" I was curious to try datasette to see what kind of query performance I could get out of it. In truth I don't need to do full text search since all I would like to do is give my users a way to search for the names of investors such as ""Warren Buffet"", or ""Tim Cook"" (whose names are in a single column). On the first search, Datasette takes over 20 seconds to return all records associated with `elon musk`: > ![image](https://user-images.githubusercontent.com/8431341/67638889-a86e1100-f8b7-11e9-9f7e-a9d13a42e988.png) > ![image](https://user-images.githubusercontent.com/8431341/67638825-ed457800-f8b6-11e9-94d1-b44f1a40ee8c.png) If I rerun the same search, it then takes almost 9 seconds: > ![image](https://user-images.githubusercontent.com/8431341/67638908-e4a17180-f8b7-11e9-9d00-748c80ef1f21.png) That's far too slow to implement an autocomplete feature. I could reduce the latency by making a special table of only unique investor names, thereby reducing the search space to less than a million rows (then I'd need to implement a way to add only new investor names to the table as I receive new data... about 4,000 rows a day). If I did that, I'm still concerned the new table wouldn't be lean enough to look up investor names quickly. Plus, even if I can implement the autocomplete feature, I would still have to look up records for that investor, which would take between 8 and 20 seconds. Are there any tricks for speeding this up? 
Here's my hardware: > ![image](https://user-images.githubusercontent.com/8431341/67638861-55945980-f8b7-11e9-96a8-ca76c7c68c5d.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/607/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 513008936,MDU6SXNzdWU1MTMwMDg5MzY=,608,"Improve UI of ""datasette publish cloudrun"" to reduce chances of accidentally over-writing a service",9599,simonw,closed,0,,,,,6,2019-10-27T19:21:28Z,2019-11-08T02:51:36Z,2019-11-08T02:48:46Z,OWNER,,"The concept of a ""service"" in Cloud Run is crucial: if you deploy to the same service, you will over-write what you deployed there last! As such, I'd like to make service a required positional argument for `publish cloudrun`: datasette publish cloudrun my-service one.db two.db three.db ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/608/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 513074501,MDU6SXNzdWU1MTMwNzQ1MDE=,26,Command for importing mentions timeline,9599,simonw,closed,0,,,,,1,2019-10-28T03:14:27Z,2019-10-30T02:36:13Z,2019-10-30T02:20:47Z,MEMBER,,"https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-mentions_timeline Almost identical to home-timeline #18 but it uses `https://api.twitter.com/1.1/statuses/mentions_timeline.json` instead.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 514459062,MDU6SXNzdWU1MTQ0NTkwNjI=,27,retweets-of-me command,9599,simonw,closed,0,,,,,4,2019-10-30T07:43:01Z,2019-11-03T01:12:58Z,2019-11-03T01:12:58Z,MEMBER,,https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-retweets_of_me,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 514899195,MDExOlB1bGxSZXF1ZXN0MzM0NDQ4MjU4,609,Update to latest black,9599,simonw,closed,0,,,,,0,2019-10-30T18:42:35Z,2019-10-30T18:49:01Z,2019-10-30T18:49:01Z,OWNER,simonw/datasette/pulls/609,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/609/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 515658861,MDU6SXNzdWU1MTU2NTg4NjE=,28,Add indexes to followers table,9599,simonw,closed,0,,,,,1,2019-10-31T18:40:22Z,2019-11-09T20:15:42Z,2019-11-09T20:11:48Z,MEMBER,,`select follower_id from following where followed_id = 12497` takes over a second for me at the moment.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516310670,MDU6SXNzdWU1MTYzMTA2NzA=,610,Don't suggest array facet if column is only [] empty arrays,9599,simonw,closed,0,,,,,0,2019-11-01T19:42:02Z,2019-11-01T21:46:08Z,2019-11-01T21:46:08Z,OWNER,,Follow 
on from #562,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/610/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516370822,MDU6SXNzdWU1MTYzNzA4MjI=,611,Static assets no longer loading for installed plugins,9599,simonw,closed,0,,,,,3,2019-11-01T22:07:00Z,2019-11-01T22:15:55Z,2019-11-01T22:15:55Z,OWNER,,"Caused by fix I made in #606 e.g. `/-/static-plugins/datasette_leaflet_geojson/datasette-leaflet-geojson.js` is a 404, but view-`/-/static-plugins/datasette-leaflet-geojson/datasette-leaflet-geojson.js` works correctly.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/611/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516748849,MDU6SXNzdWU1MTY3NDg4NDk=,612,CSV export is broken for tables with null foreign keys,9599,simonw,closed,0,,,,,2,2019-11-02T22:52:47Z,2021-06-17T18:13:20Z,2019-11-02T23:12:53Z,OWNER,,"Following on from #406 - this CSV export appears to be broken: https://14da705.datasette.io/fixtures/foreign_key_references.csv?_labels=on&_size=max ```csv pk,foreign_key_with_label,foreign_key_with_label_label,foreign_key_with_no_label,foreign_key_with_no_label_label 1,1,hello,1,1 2,, ``` That second row should have 5 values, but it only has 4.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/612/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516763727,MDExOlB1bGxSZXF1ZXN0MzM1OTgwMjQ2,8,"stargazers command, refs #4",9599,simonw,closed,0,,,,,5,2019-11-03T00:37:36Z,2020-05-02T20:00:27Z,2020-05-02T20:00:26Z,MEMBER,dogsheep/github-to-sqlite/pulls/8,Needs tests. Refs #4.,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 516769276,MDU6SXNzdWU1MTY3NjkyNzY=,9,Commands do not work without an auth.json file,9599,simonw,closed,0,,,,,0,2019-11-03T01:54:28Z,2019-11-11T05:30:48Z,2019-11-11T05:30:48Z,MEMBER,,"`auth.json` is meant to be optional. If it's not provided, the tool should make heavily rate-limited unauthenticated requests. ``` $ github-to-sqlite repos .data/repos.db simonw Usage: github-to-sqlite repos [OPTIONS] DB_PATH [USERNAME] Try ""github-to-sqlite repos --help"" for help. Error: Invalid value for ""-a"" / ""--auth"": File ""auth.json"" does not exist. ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516874735,MDU6SXNzdWU1MTY4NzQ3MzU=,613,Basic join support for table view,9599,simonw,open,0,,,,,1,2019-11-03T19:12:53Z,2019-11-03T19:14:01Z,,OWNER,,"I think it would be possible to support basic foreign key joins on the table page. The user could specify columns that should result in a join (from a set of suggestions similar to how facets work right now) and they could then be passed as `?_join=city_id` arguments. 
This feature will make a lot of sense when combined with the ability to show / hide / customize columns, see #292",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/613/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 516950748,MDU6SXNzdWU1MTY5NTA3NDg=,614,"Add ""not in"" filter - ?pk__notin=x,y,z",9599,simonw,closed,0,,,,,1,2019-11-04T04:07:17Z,2019-11-04T04:31:58Z,2019-11-04T04:12:00Z,OWNER,,"We have a `__in` filter at the moment: https://latest.datasette.io/fixtures/facetable?pk__in=1,2,3 Today I found myself needing the inverse, a `?pk__notin=` filter, which isn't currently supported.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/614/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516967682,MDU6SXNzdWU1MTY5Njc2ODI=,10,Add this repos_starred view,9599,simonw,closed,0,,,,,3,2019-11-04T05:44:38Z,2020-05-02T16:37:36Z,2020-05-02T16:37:36Z,MEMBER,,"```sql create view repos_starred as select stars.starred_at, users.login, repos.* from repos join stars on repos.id = stars.repo join users on repos.owner = users.id order by starred_at desc; ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 517241040,MDU6SXNzdWU1MTcyNDEwNDA=,63,ensure_index() method,9599,simonw,closed,0,,,,,1,2019-11-04T15:51:22Z,2019-11-04T16:20:36Z,2019-11-04T16:20:35Z,OWNER,,"```python db[""table""].ensure_index([""col1"", ""col2""]) ``` This will do the following: - if the specified table or column does not exist, do nothing - if they exist and already have an index, do nothing - otherwise, create the index I want this for tools like [twitter-to-sqlite search](https://github.com/dogsheep/twitter-to-sqlite/blob/801c0c2daf17d8abce9dcb5d8d610410e7e25dbe/README.md#running-searches) where the `search_runs` table may or not have been created yet but, if it IS created, I want to put an index on the `hash` column.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/63/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 517451234,MDU6SXNzdWU1MTc0NTEyMzQ=,615,?_col= and ?_nocol= support for toggling columns on table view,9599,simonw,closed,0,,,,,16,2019-11-04T22:55:41Z,2021-05-27T04:26:10Z,2021-05-27T04:17:44Z,OWNER,,Split off from #292 (I guess this is a re-opening of #312).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/615/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 518506242,MDU6SXNzdWU1MTg1MDYyNDI=,616,Datasette FTS detection bug,49656826,null92,closed,0,,,,,2,2019-11-06T14:25:47Z,2019-11-08T15:31:33Z,2019-11-08T02:06:56Z,NONE,,"I'm having a trouble with datasette. I deployed EXACTLY the same project on two different apps on Heroku. Both have databases (not all) with FTS activated but only one detects and works fine. 
You can take a look here: With search: http://teste-templates.herokuapp.com/amazonia_protege/car Without search: http://bases.vortex.media/amazonia_protege/car ![teste](https://user-images.githubusercontent.com/49656826/68306310-11a80e00-0088-11ea-8d1c-db3bd3375518.jpg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/616/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 518725064,MDU6SXNzdWU1MTg3MjUwNjQ=,29,`import` command fails on empty files,21148,jacobian,closed,0,,,,,4,2019-11-06T20:34:26Z,2019-11-09T20:33:38Z,2019-11-09T19:36:36Z,CONTRIBUTOR,,"If a file in the export is empty (in my case it was `account-suspensions.js`), `twitter-to-sqlite import` fails: ``` $ twitter-to-sqlite import twitter.db ~/Downloads/twitter-2019-11-06-926f4f3be4b3b1fcb1aa387c40cd14f7c8aaf9bbcdb2d78ac14d9989add501bb.zip Traceback (most recent call last): File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite"", line 10, in sys.exit(cli()) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 627, in import_ archive.import_from_file(db, filename, content) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/archive.py"", line 224, in import_from_file db[table_name].upsert_all(rows, hash_id=""pk"") File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1113, in upsert_all extracts=extracts, File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/sqlite_utils/db.py"", line 980, in insert_all first_record = next(records) StopIteration ``` This appears to be because `db.upsert_all` is called with no rows -- I think? I hacked around this by modifying `import_from_file` to have an `if rows:` clause: ``` for table, rows in to_insert.items(): if rows: table_name = ""archive_{}"".format(table.replace(""-"", ""_"")) ... 
``` I'm happy to work up a real PR if that's the right approach, but I'm not sure it is.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 518739697,MDU6SXNzdWU1MTg3Mzk2OTc=,30,`followers` fails because `transform_user` is called twice,21148,jacobian,closed,0,,,,,2,2019-11-06T20:44:52Z,2019-11-09T20:15:28Z,2019-11-09T19:55:52Z,CONTRIBUTOR,,"Trying to run `twitter-to-sqlite followers` errors out: ``` Traceback (most recent call last): File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite"", line 10, in sys.exit(cli()) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 130, in followers go(bar.update) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 116, in go utils.save_users(db, [profile]) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 302, in save_users transform_user(user) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 181, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 1374, in parse return DEFAULTPARSER.parse(timestr, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 646, in parse res, skipped_tokens = self._parse(timestr, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 725, in _parse l = _timelex.split(timestr) # Splits the timestr into tokens File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 207, in split return list(cls(s)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 76, in __init__ 
'{itype}'.format(itype=instream.__class__.__name__)) TypeError: Parser must be a string or character stream, not datetime ``` This appears to be because https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L111 calls `transform_user`, and then https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116 calls `transform_user` again, which fails because the user is already transformed. I was able to work around this by commenting out https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116. Shall I work up a patch for that, or is there a better approach?",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 519032008,MDExOlB1bGxSZXF1ZXN0MzM3ODQ3NTcz,64,test_insert_upsert_all_empty_list,9599,simonw,closed,0,,,,,0,2019-11-07T04:24:45Z,2019-11-07T04:32:38Z,2019-11-07T04:32:38Z,OWNER,simonw/sqlite-utils/pulls/64,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/64/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 519038979,MDU6SXNzdWU1MTkwMzg5Nzk=,10,Failed to import workout points,9599,simonw,closed,0,,,,,4,2019-11-07T04:50:22Z,2019-11-08T01:18:37Z,2019-11-08T01:18:37Z,MEMBER,,"I just ran the script and it failed to import any `workout_points`, though it did import `workouts`.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 519039316,MDExOlB1bGxSZXF1ZXN0MzM3ODUzMzk0,65,Release 1.12.1,9599,simonw,closed,0,,,,,0,2019-11-07T04:51:29Z,2019-11-07T04:58:48Z,2019-11-07T04:58:47Z,OWNER,simonw/sqlite-utils/pulls/65,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/65/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 519613116,MDU6SXNzdWU1MTk2MTMxMTY=,617,Refactor TableView.data() method,9599,simonw,closed,0,,,,,9,2019-11-08T01:55:41Z,2021-12-18T01:41:47Z,2021-12-11T19:17:11Z,OWNER,,"This is by far the most complex piece of Datasette - the `TableView.data()` method is over 500 lines long and is increasingly getting in the way of cleanly implementing new features (e.g. #615 and #613). 
Need to break it up into smaller, cleaner pieces.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/617/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 519979091,MDExOlB1bGxSZXF1ZXN0MzM4NjQ3Mzc4,1,Add parkrun-to-sqlite,1101318,mrw34,closed,0,,,,,0,2019-11-08T12:05:32Z,2020-10-12T00:35:16Z,2020-10-12T00:35:16Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/1,,214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 520507306,MDU6SXNzdWU1MjA1MDczMDY=,618,Mechanism for seeing indexes on a specific table,9599,simonw,closed,0,,,,,2,2019-11-09T20:10:41Z,2019-11-10T01:40:05Z,2019-11-10T01:30:25Z,OWNER,,"The only way to see the indexes that apply to a specific table at the moment is to run the following SQL manually: ```sql select * from sqlite_master where type = 'index' and tbl_name=? ``` For example: It would be good if this list of indexes was displayed in a neater way on the table page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/618/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520508502,MDU6SXNzdWU1MjA1MDg1MDI=,31,"""friends"" command (similar to ""followers"")",9599,simonw,closed,0,,,,,2,2019-11-09T20:20:20Z,2022-09-20T05:05:03Z,2020-02-07T07:03:28Z,MEMBER,,"Current list of commands: ``` followers Save followers for specified user (defaults to... 
followers-ids Populate followers table with IDs of account followers friends-ids Populate followers table with IDs of account friends ``` Obvious omission here is `friends`, which would be powered by `https://api.twitter.com/1.1/friends/list.json`: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-list",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520521843,MDU6SXNzdWU1MjA1MjE4NDM=,11,Command to fetch releases,9599,simonw,closed,0,,,,,0,2019-11-09T22:23:30Z,2019-11-09T22:57:00Z,2019-11-09T22:57:00Z,MEMBER,,"https://developer.github.com/v3/repos/releases/#list-releases-for-a-repository `GET /repos/:owner/:repo/releases`",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520655983,MDU6SXNzdWU1MjA2NTU5ODM=,619,"""Invalid SQL"" page should let you edit the SQL",9599,simonw,closed,0,,,,,14,2019-11-10T20:54:12Z,2022-01-13T22:21:42Z,2021-06-02T04:15:54Z,OWNER,,"https://latest.datasette.io/fixtures?sql=select%0D%0A++*%0D%0Afrom%0D%0A++%5Bfoo%5D Would be useful if this page showed you the invalid SQL you entered so you can edit it and try again.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/619/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520667773,MDU6SXNzdWU1MjA2Njc3NzM=,620,Mechanism for indicating foreign key relationships in the table and query page URLs,9599,simonw,open,0,,,,,6,2019-11-10T22:26:27Z,2021-04-05T03:57:22Z,,OWNER,,"Datasette currently only inflates foreign keys (into names hyperlinks) if it detects them as foreign key constraints in the underlying database. It would be useful if you could specify additional ""foreign keys"" using both `metadata.json` and the querystring - similar time how you can pass `?_fts_table=x` https://datasette.readthedocs.io/en/stable/full_text_search.html#configuring-full-text-search-for-a-table-or-view",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/620/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",, 520681725,MDU6SXNzdWU1MjA2ODE3MjU=,621,Syntax for ?_through= that works as a form field,9599,simonw,open,0,,,,,7,2019-11-11T00:19:03Z,2021-12-18T01:42:33Z,,OWNER,,"The current syntax for `?_through=` uses JSON to avoid any risk of confusion with table or column names that contain special characters. This means you can't target a form field at it. 
We should be able to support both - `?x.y.z=value` for tables and columns with ""regular"" names, falling back to the current JSON syntax for columns or tables that won't work with the key/value syntax.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/621/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 520715188,MDU6SXNzdWU1MjA3MTUxODg=,622,Datasette should work with Python 3.8 (and drop compatibility with Python 3.5),9599,simonw,closed,0,,,,,4,2019-11-11T03:12:36Z,2019-11-12T05:52:49Z,2019-11-12T05:09:13Z,OWNER,,"See #595, #594, #404. The big thing holding me back from ditching Python 3.5 was glitch.com - but they now offer Python 3.7: https://support.glitch.com/t/can-you-upgrade-python-to-latest-version/7980/25?u=simonw",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/622/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520718056,MDExOlB1bGxSZXF1ZXN0MzM5MjM2NjQ3,623,Test against Python 3.8 in Travis,9599,simonw,closed,0,,,,,2,2019-11-11T03:24:54Z,2019-11-11T03:45:35Z,2019-11-11T03:45:35Z,OWNER,simonw/datasette/pulls/623,Needed for #622,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/623/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 520728483,MDExOlB1bGxSZXF1ZXN0MzM5MjQ0ODg4,624,Bump pint to 0.9,9599,simonw,closed,0,,,,,0,2019-11-11T04:07:07Z,2019-11-11T04:19:02Z,2019-11-11T04:19:02Z,OWNER,simonw/datasette/pulls/624,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/624/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 520740741,MDU6SXNzdWU1MjA3NDA3NDE=,625,If you apply ?_facet_array=tags then &_facet=tags does nothing,9599,simonw,closed,0,,,7571612,Datasette 0.60,13,2019-11-11T04:59:29Z,2022-01-13T22:26:58Z,2021-12-16T20:12:22Z,OWNER,,"Start here: https://v0-30-2.datasette.io/fixtures/facetable?_facet_array=tags Note that `tags` is offered as a suggested facet. But if you click that you get this: https://v0-30-2.datasette.io/fixtures/facetable?_facet_array=tags&_facet=tags The `_facet=tags` is added to the URL and it's removed from the list of suggested tags... 
but the facet itself is not displayed: The `_facet=tags` facet should look like this: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/625/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520756546,MDU6SXNzdWU1MjA3NTY1NDY=,12,Add this view for seeing new releases,9599,simonw,closed,0,,,,,5,2019-11-11T06:00:12Z,2020-05-02T18:58:18Z,2020-05-02T18:58:17Z,MEMBER,,"```sql CREATE VIEW recent_releases AS select json_object(""label"", repos.full_name, ""href"", repos.html_url) as repo, json_object( ""href"", releases.html_url, ""label"", releases.name ) as release, substr(releases.published_at, 0, 11) as date, releases.body as body_markdown, releases.published_at from releases join repos on repos.id = releases.repo order by releases.published_at desc ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521275281,MDU6SXNzdWU1MjEyNzUyODE=,13,Set up a live demo Datasette instance,9599,simonw,closed,0,,,5225818,1.0,9,2019-11-12T01:27:02Z,2020-03-24T00:03:26Z,2020-03-24T00:03:25Z,MEMBER,,"I deployed https://github-to-sqlite-releases-j7hipcg4aq-uc.a.run.app/ by running this: ``` #!/bin/bash # Fetch repos for simonw and dogsheep github-to-sqlite repos github.db simonw dogsheep -a auth.json # Fetch releases for the repos tagged 'datasette-io' sqlite-utils github.db "" select full_name from repos where rowid in ( select repos.rowid from repos, json_each(repos.topics) j where j.value = 'datasette-io' )"" --csv --no-headers | while read repo; do github-to-sqlite releases \ github.db $(echo $repo | tr -d '\r') \ -a auth.json; sleep 2; done; ``` And then deploying using this: ``` $ datasette publish cloudrun github.db \ --title ""github-to-sqlite releases demo"" \ --about_url=""https://github.com/simonw/github-to-sqlite"" \ --about='github-to-sqlite' \ --install=datasette-render-markdown \ --install=datasette-json-html \ --service=github-to-sqlite-releases ``` This should happen automatically for every release. I can run it once a day in Circle CI to keep the demo database up-to-date.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521282013,MDU6SXNzdWU1MjEyODIwMTM=,626,Unit tests should fail under Python 3.8,9599,simonw,closed,0,,,,,1,2019-11-12T01:54:11Z,2019-11-12T04:31:26Z,2019-11-12T04:31:13Z,OWNER,,"The unit tests currently pass under Python 3.8. But... when you actually attempt to run Datasette you get an error: ``` ~/Dropbox/Development/datasette $ venv-py3.8.0/bin/datasette --memory -p 8855 Serve! 
files=() (immutables=()) on port 8855 Traceback (most recent call last): File ""venv-py3.8.0/bin/datasette"", line 11, in load_entry_point('datasette', 'console_scripts', 'datasette')() File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/simonw/Dropbox/Development/datasette/datasette/cli.py"", line 365, in serve uvicorn.run(ds.app(), host=host, port=port, log_level=""info"") File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/uvicorn/main.py"", line 279, in run server.run() File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/uvicorn/main.py"", line 305, in run self.config.setup_event_loop() File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/uvicorn/config.py"", line 218, in setup_event_loop loop_setup() File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/uvicorn/loops/auto.py"", line 3, in auto_loop_setup import uvloop File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/uvloop/__init__.py"", line 7, in from .loop import Loop as __BaseLoop # NOQA File ""uvloop/includes/stdlib.pxi"", line 114, in init uvloop.loop AttributeError: module 'sys' has no attribute 'set_coroutine_wrapper' ~/Dropbox/Development/datasette $ ``` If Datasette doesn't work under Python 3.8 the tests should fail.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/626/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521323012,MDExOlB1bGxSZXF1ZXN0MzM5NzIyNzkw,627,"Support Python 3.8, stop supporting Python 3.5",9599,simonw,closed,0,,,,,2,2019-11-12T04:36:33Z,2020-04-05T10:23:58Z,2019-11-12T05:09:12Z,OWNER,simonw/datasette/pulls/627,Refs #622,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/627/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 521329771,MDU6SXNzdWU1MjEzMjk3NzE=,628,Render jinja2 templates in async mode,9599,simonw,closed,0,,,,,2,2019-11-12T05:01:55Z,2019-11-14T23:28:09Z,2019-11-14T23:14:24Z,OWNER,,"I started playing with this in #404 and got good results but it didn't work in Python 3.5. As of #627 I don't support 3.5 any more so this can go ahead. Rendering templates in async mode will mean that template plugins can include async code... 
which opens the door to custom template functions that execute SQL queries!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/628/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521335335,MDU6SXNzdWU1MjEzMzUzMzU=,629,"""datasette publish"" commands should deploy with Python 3.8",9599,simonw,closed,0,,,,,1,2019-11-12T05:22:31Z,2019-11-12T06:03:10Z,2019-11-12T06:03:10Z,OWNER,,Now that we support 3.8 (#627) `datasette publish` should always deploy using Python 3.8.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/629/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521346800,MDExOlB1bGxSZXF1ZXN0MzM5NzQyNDMy,630,Use python:3.8 base Docker image,9599,simonw,closed,0,,,,,0,2019-11-12T06:02:37Z,2019-11-12T06:03:10Z,2019-11-12T06:03:10Z,OWNER,simonw/datasette/pulls/630,Closes #629,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/630/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 521868864,MDU6SXNzdWU1MjE4Njg4NjQ=,66,"The "".upsert()"" method is misnamed",9599,simonw,closed,0,,,,,15,2019-11-12T23:48:28Z,2019-12-31T01:30:21Z,2019-12-31T01:30:20Z,OWNER,,"This thread here is illuminating: https://stackoverflow.com/questions/3634984/insert-if-not-exists-else-update The term `UPSERT` in SQLite has a specific meaning as-of 3.24.0 (2018-06-04): https://www.sqlite.org/lang_UPSERT.html It means ""behave as an UPDATE or a no-op if the INSERT would violate a uniqueness constraint"". The syntax in 3.24.0+ looks like this (confusingly it does not use the term ""upsert""): ```sql INSERT INTO phonebook(name,phonenumber) VALUES('Alice','704-555-1212') ON CONFLICT(name) DO UPDATE SET phonenumber=excluded.phonenumber ``` Here's the problem: the `sqlite-utils` `.upsert()` and `.upsert_all()` methods don't do this. They use the following SQL: ```sql INSERT OR REPLACE INTO [{table}] ({columns}) VALUES {rows}; ``` If the record already exists, it will be entirely replaced by a new record - as opposed to updating any specified fields but leaving existing fields as they are (the behaviour of ""upsert"" in SQLite itself).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/66/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521923131,MDExOlB1bGxSZXF1ZXN0MzQwMjExMTQ5,631,bugfix issue 572,3683993,qwo,closed,0,,,,,1,2019-11-13T02:46:50Z,2019-11-13T04:28:43Z,2019-11-13T04:28:42Z,CONTRIBUTOR,simonw/datasette/pulls/631,closes bugfix issue #572 ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/631/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 521995039,MDU6SXNzdWU1MjE5OTUwMzk=,632,Upgrade datasette publish Heroku runtime,9599,simonw,closed,0,,,,,2,2019-11-13T06:46:19Z,2019-11-13T16:44:07Z,2019-11-13T16:43:23Z,OWNER,,"``` Python has released a security update! 
Please consider upgrading to python-3.6.9 ``` https://devcenter.heroku.com/articles/python-support#supported-runtimes shows 3.8.0 is now supported.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/632/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 522334771,MDU6SXNzdWU1MjIzMzQ3NzE=,633,"Publish to Heroku is broken: ""WARNING: You must pass the application as an import string to enable 'reload' or 'workers""",9599,simonw,closed,0,,,,,3,2019-11-13T16:32:11Z,2020-04-28T20:37:50Z,2019-11-13T16:43:23Z,OWNER,,"``` 2019-11-13T16:27:59.821483+00:00 heroku[web.1]: Starting process with command `datasette serve --host 0.0.0.0 -i fixtures.db --cors --port 36817 --inspect-file inspect-data.json` 2019-11-13T16:28:01.856471+00:00 heroku[web.1]: State changed from starting to crashed 2019-11-13T16:28:01.750253+00:00 app[web.1]: Serve! files=() (immutables=('fixtures.db',)) on port 36817 2019-11-13T16:28:01.771524+00:00 app[web.1]: WARNING: You must pass the application as an import string to enable 'reload' or 'workers'. 2019-11-13T16:28:01.837839+00:00 heroku[web.1]: Process exited with status 1 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/633/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 522352520,MDU6SXNzdWU1MjIzNTI1MjA=,634,Don't run tests twice when releasing a tag,9599,simonw,closed,0,,,,,2,2019-11-13T17:02:42Z,2020-09-15T20:37:58Z,2020-09-15T20:37:58Z,OWNER,,"Shipping a release currently runs the tests twice: https://travis-ci.org/simonw/datasette/builds/611463728 It does a regular test run on Python 3.6/7/8 - then the ""Release tagged version"" step runs the tests again before publishing to PyPI! This second run is not necessary.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/634/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 522566332,MDExOlB1bGxSZXF1ZXN0MzQwNzQzMjIw,635,Use Jinja async mode,9599,simonw,closed,0,,,,,0,2019-11-14T01:20:57Z,2019-11-14T23:14:23Z,2019-11-14T23:14:23Z,OWNER,simonw/datasette/pulls/635,Refs #628. Still needs documentation.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/635/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 525254973,MDU6SXNzdWU1MjUyNTQ5NzM=,636,rowid is not included in dropdown filter menus,9599,simonw,closed,0,,,,,3,2019-11-19T20:43:04Z,2019-11-19T23:01:17Z,2019-11-19T23:01:17Z,OWNER,,"For `rowid` tables the `rowid` column isn't shown in the list of filter options: This also means if you link to e.g. 
`?rowid__gt=1060124` the resulting filter interface will be slightly broken: clicking the ""apply"" button again will lose your filter for example.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/636/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 525993034,MDU6SXNzdWU1MjU5OTMwMzQ=,637,"Custom queries with 0 results should say ""0 results""",9599,simonw,closed,0,,,,,3,2019-11-20T18:28:14Z,2019-11-23T06:17:23Z,2019-11-23T06:07:08Z,OWNER,,"Consider https://latest.datasette.io/fixtures/neighborhood_search?text=foop It's currently not obvious that the query executed and returned 0 results.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/637/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 526913133,MDU6SXNzdWU1MjY5MTMxMzM=,638,Don't suggest column for faceting if all values are 1,9599,simonw,closed,0,,,,,3,2019-11-22T00:14:22Z,2019-11-22T01:14:59Z,2019-11-22T00:57:49Z,OWNER,,"https://www.niche-museums.com/museums/museums?_facet=wikipedia_url Challenge is how to do this efficiently, since suggested facet queries need to be lightning fast.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/638/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 527670799,MDU6SXNzdWU1Mjc2NzA3OTk=,639,updating metadata.json without recreating the app,172847,pkoppstein,open,0,,,,,6,2019-11-24T09:19:53Z,2019-11-30T06:08:50Z,,NONE,,"I've sucessfully ""uploaded"" an SQLite database (with a metadata.json file) to heroku using: $ datasette publish heroku so-sales.db -m metadata.json -n so-sales The question is: how can I modify the (small) metadata.json file without having to upload the (large) SQLite database. The directions on heroku indicate I should run: heroku git:clone -a so-sales But this just results in an empty directory with a warning: warning: You appear to have cloned an empty repository. I've been able to ""clone"" the heroku ""app"" using the command: $ heroku slugs:download -a so-sales but this is not a git repository.... Ideally, it seems to me, there'd be an option of the `datasette` CLI to allow a file to be updated, or there'd be some way to create a local git ""clone"" of the app so that the heroku instructions for ""Deploying with git"" would apply. (p.s. I ran `datasette publish heroku -m metadata.json -n so-sales` in the hope that that would not cause the .db file to be wiped, but of course it was.) (p.p.s. Thanks for Datasette!)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/639/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 527710055,MDU6SXNzdWU1Mjc3MTAwNTU=,640,Nicer error message for heroku publish name clash,82988,psychemedia,open,0,,,,,1,2019-11-24T14:57:07Z,2019-12-06T07:19:34Z,,CONTRIBUTOR,,"If you try to publish to Heroku using no set name (i.e. the default `datasette` name) and a project already exists under that name, you get a meaningful error report on the first line followed by Py error messages that drown it out: ``` Creating datasette... ! 
▸ Name datasette is already taken Traceback (most recent call last): File ""/usr/local/bin/datasette"", line 10, in sys.exit(cli()) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/NNNNN/Library/Python/3.7/lib/python/site-packages/datasette/publish/heroku.py"", line 124, in heroku create_output = check_output(cmd).decode(""utf8"") File ""/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/lib/python3.7/subprocess.py"", line 411, in check_output **kwargs).stdout File ""/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/lib/python3.7/subprocess.py"", line 512, in run output=stdout, stderr=stderr) subprocess.CalledProcessError: Command '['heroku', 'apps:create', 'datasette', '--json']' returned non-zero exit status 1. ``` It would be neater if: - the Py error message was caught; - the report suggested setting a project name using `-n` etc. It may also be useful to provide a command to list the current names that are being used, which I assume is available via a Heroku call?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/640/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 528442126,MDU6SXNzdWU1Mjg0NDIxMjY=,641,Better documentation for --static option,9599,simonw,closed,0,,,,,1,2019-11-26T02:07:57Z,2019-11-26T03:30:02Z,2019-11-26T02:31:53Z,OWNER,,"This is misleading: https://github.com/simonw/datasette/blob/aca41618f8761f99c47c8ae8e81b07a6d4af4d7a/docs/datasette-serve-help.txt#L23 The correct format is e.g. 
`static:static/` Also it's not mentioned in the regular documentation at all.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/641/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 529376481,MDExOlB1bGxSZXF1ZXN0MzQ2MjY0OTI2,67,Run tests against 3.5 too,9599,simonw,closed,0,,,,,2,2019-11-27T14:20:35Z,2019-12-31T01:29:44Z,2019-12-31T01:29:43Z,OWNER,simonw/sqlite-utils/pulls/67,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/67/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 529429214,MDU6SXNzdWU1Mjk0MjkyMTQ=,642,Provide a cookiecutter template for creating new plugins,9599,simonw,closed,0,,,3268330,Datasette 1.0,6,2019-11-27T15:46:36Z,2020-06-20T03:20:33Z,2020-06-20T03:20:25Z,OWNER,,See this conversation: https://twitter.com/psychemedia/status/1199707352540368896,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/642/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 530468212,MDU6SXNzdWU1MzA0NjgyMTI=,643,Set up some basic benchmarks as part of the unit tests,9599,simonw,open,0,,,,,0,2019-11-29T19:24:19Z,2019-11-29T19:24:19Z,,OWNER,,"https://pypi.org/project/pytest-benchmark/ looks great for this. Here's how to run it as a github action: https://github.com/rhysd/github-action-benchmark/blob/master/examples/pytest/README.md",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/643/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 530491074,MDU6SXNzdWU1MzA0OTEwNzQ=,14,Command for importing events,9599,simonw,open,0,,,,,3,2019-11-29T21:28:58Z,2020-04-14T19:38:34Z,,MEMBER,,"Eg from https://api.github.com/users/simonw/events Docs here: https://developer.github.com/v3/activity/events/#list-events-performed-by-a-user",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 530513784,MDExOlB1bGxSZXF1ZXN0MzQ3MTc5MDgx,644,Validate metadata json on startup,6025893,chris48s,closed,0,,,,,1,2019-11-30T00:32:15Z,2021-07-28T17:58:45Z,2021-07-28T17:58:45Z,CONTRIBUTOR,simonw/datasette/pulls/644,"This PR adds a sanity check which builds up a marshmallow schema on-the-fly based on the structure of the database(s) on startup and then validates the metadata json against it. In case of invalid data, this will raise with a descriptive error e.g: ``` marshmallow.exceptions.ValidationError: {'databases': {'fixtures': {'tables': {'not_a_table': ['Unknown field.']}}}} ``` Closes #260 --- This was intended to be fairly self-contained, but then while I was working on it, I hit some problems getting the tests to pass in the context of the test suite as a whole. My tests passed in isolation, but then failed while doing a full test suite run. 
That's when the worms started coming out of the can :bug: After some sleuthing, it turned out this was essentially the result of several issues intersecting: * There are certain events in the application lifecycle where the metadata schema can be modified after it is loaded e.g: https://github.com/simonw/datasette/blob/a562f2965552fb2dbbbd74df245c9965ee23d886/datasette/app.py#L299-L320 This means that sometimes what goes in isn't always exactly what comes out when you call `/-/metadata`. * Because the test fixtures use session scope for performance reasons if one unit test performs an action which mutates the metadata, that can impact on other unit tests which run after it using the same fixture. * Because the `self._metadata` property was being set with a simple assignment `self._metadata = metadata`, that created an object reference to the test fixture data, so operating on `self._metadata` was actually modifying the test fixture `METADATA` meaning that depending on when it was loaded in the test suite lifecycle, `METADATA` had different content, which was somewhat unexpected. As such, I've added some band-aids in 3552024 and 6859fd8: * Switching the metadata object to a `deepcopy` of the input prevents us directly mutating the input fixture. * I've switched some of the tests to use a fixture with function scope instead of session scope so we're working on a clean copy that hasn't been mutated by other tests where necessary but keeping session scope in most cases for performance. * I haven't really addressed the fact that sometimes the metadata object gets mutated in place, so the object that is served from `/-/metadata` isn't necessarily always exactly the same as the file you fed into it on init. I'm not sure how much of a problem that is. The way the tests were written makes me think it was unexpected, but getting into it feels like too much scope creep for this PR so its probably best addressed as another issue.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/644/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 530653633,MDU6SXNzdWU1MzA2NTM2MzM=,645,Mechanism for register_output_renderer to suggest extension or not,9599,simonw,closed,0,,,,,4,2019-12-01T01:26:27Z,2020-05-28T02:22:18Z,2020-05-28T02:22:12Z,OWNER,,"[datasette-atom](https://github.com/simonw/datasette-atom) only works if the user constructs a SQL query with specific output columns (`atom_id` ,`atom_updated` etc). It would be good if the `.atom` link wasn't shown on the query/table page unless those columns were present. Right now you get a link which results in a 400 error: See also #581.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/645/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 531502365,MDU6SXNzdWU1MzE1MDIzNjU=,646,Make database level information from metadata.json available in the index.html template,18017473,lagolucas,open,0,,,3268330,Datasette 1.0,3,2019-12-02T19:55:10Z,2022-03-15T20:50:34Z,,NONE,,"Did a search on the issues here and didn't find anything related to what I want. I want to have information that is on the database level of the JSON like title, source and source_url, and use it on the index page. I tried some small tweaks on the python and html files, but failed to get that result. Is there a way? 
Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/646/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 531583658,MDU6SXNzdWU1MzE1ODM2NTg=,68,Add support for porter stemming in FTS,9599,simonw,closed,0,,,,,1,2019-12-02T22:35:52Z,2020-09-20T04:25:53Z,2020-09-20T04:25:47Z,OWNER,,FTS5 can have porter stemming enabled.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/68/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 531755959,MDU6SXNzdWU1MzE3NTU5NTk=,647,Move hashed URL mode out to a plugin,9599,simonw,closed,0,,,3268330,Datasette 1.0,9,2019-12-03T06:29:03Z,2022-03-19T11:56:05Z,2022-03-15T23:13:06Z,OWNER,,"They used to be the default until #418. Since making them optional I haven't felt the need to use them even once. That suggests to me that they should be removed. I think their effect could be entirely handled by an ASGI wrapping plugin. https://datasette.readthedocs.io/en/0.32/performance.html#hashed-url-mode",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/647/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 534492501,MDU6SXNzdWU1MzQ0OTI1MDE=,648,Mechanism for adding arbitrary pages like /about,9599,simonw,closed,0,,,,,13,2019-12-08T04:55:19Z,2020-05-07T15:21:19Z,2020-04-26T18:46:45Z,OWNER,,"For www.niche-museums.com I solved this by creating an empty `about.db` database file - see https://simonwillison.net/2019/Nov/25/niche-museums/ I want a neater mechanism for this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/648/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 534507142,MDU6SXNzdWU1MzQ1MDcxNDI=,69,Feature request: enable extensions loading,30607,aborruso,closed,0,,,,,3,2019-12-08T08:06:25Z,2022-02-05T00:04:25Z,2020-10-16T18:42:49Z,NONE,,"Hi, it would be great to add a parameter that enables the load of a sqlite extension you need. Something like ""-ext modspatialite"". In this way your great tool would be even more comfortable and powerful. Thank you very much",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 534530973,MDU6SXNzdWU1MzQ1MzA5NzM=,649,Reduce table counts on index page with many databases,9599,simonw,closed,0,,,,,2,2019-12-08T11:56:37Z,2020-02-29T01:08:29Z,2020-02-29T01:08:29Z,OWNER,,"Since #467 the index page has attempted to optimistically count times. 
My personal Dogsheep has enough connected databases and tables that the page can still take way too long to load - sometimes more than twenty seconds.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/649/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 534629631,MDU6SXNzdWU1MzQ2Mjk2MzE=,650,Add a glossary to the documentation,9599,simonw,open,0,,,,,3,2019-12-09T00:23:45Z,2022-01-13T22:04:56Z,,OWNER,,"Call it `glossary.rst` - it can use a definition list something like this: ```rst .. _glossary: Glossary ======== Term A definition of the term. Another term Another definition. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/650/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 539204432,MDU6SXNzdWU1MzkyMDQ0MzI=,70,Implement ON DELETE and ON UPDATE actions for foreign keys,26292069,LucasElArruda,open,0,,,,,2,2019-12-17T17:19:10Z,2020-02-27T04:18:53Z,,NONE,,"Hi! I did not find any mention on the library about ON DELETE and ON UPDATE actions for foreign keys. Are those expected to be implemented? If not, it would be a nice thing to include!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/70/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 539590148,MDU6SXNzdWU1Mzk1OTAxNDg=,651,fts5 syntax error when using punctuation,2181410,clausjuhl,closed,0,,,,,3,2019-12-18T10:25:35Z,2021-07-14T19:26:06Z,2019-12-30T06:42:55Z,NONE,,"Hi Simon I get a syntax error when using punctuation or special characters in a fulltext search (using fts5). I created the virtual table using sqlite-utils' ""enable-fts""-command. The same error appears on Niche Museums [https://www.niche-museums.com/browse/search?q=park.](https://www.niche-museums.com/browse/search?q=park.), but works fine in most of your other datasette-examples, e.g. register-of-members-interests [https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins.](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins.) What am I doing wrong? Many thanks! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/651/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 539985017,MDExOlB1bGxSZXF1ZXN0MzU0ODY5Mzkx,652,Quick (and uninformed and perhaps misguided) attempt to add a url for hosting datasette at a particular host/URI,132978,terrycojones,closed,0,,,,,1,2019-12-18T23:37:16Z,2020-03-24T22:14:50Z,2020-03-24T22:14:50Z,NONE,simonw/datasette/pulls/652,"As usual, I don't really know what I'm doing... so this is just a suggested approach. I've not written tests, I've not run the tests, I don't know if I've missed some absolute URLs that would need to have the leading slash dropped. BUT, I tested it with `--config base_url:http://127.0.0.1:8001/` on the command line and from what little I know about datasette it's at least working in some obvious cases. My changes are based on what I saw in https://github.com/simonw/datasette/commit/8da2db4b71096b19e7a9ef1929369b8483d448bf (thanks!) 
I'm happy to be more thorough on this if you think it's worth pursuing. Fixes #394 (he said, optimistically).",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/652/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 541274681,MDU6SXNzdWU1NDEyNzQ2ODE=,2,Add linkedin-to-sqlite,881925,mnp,open,0,,,,,0,2019-12-21T03:13:40Z,2019-12-21T03:13:40Z,,NONE,,"There is an API available. https://developer.linkedin.com/docs/rest-api# At the minimum, I would think contact list and messages would be of interest.",214746582,dogsheep.github.io,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 541331755,MDExOlB1bGxSZXF1ZXN0MzU2MDA0MjQy,653,allow leading comments in SQL input field,418191,jaywgraves,closed,0,,,,,8,2019-12-21T14:19:52Z,2020-02-05T02:35:41Z,2020-02-05T02:13:25Z,CONTRIBUTOR,simonw/datasette/pulls/653,"this changes the SQL validation to allow for lines that are commented out my main use case for this is that I like to write a succession of queries when trying to solve a problem. In most native SQL clients there is a key binding that will run just the current highlighted query or the program is smart enough to run just the query that the cursor is in if it's properly delimited with a ';'. Typically my workflow will start with a single simple query and I'll copy/paste it to a new query below when I want to make big changes while debugging. This makes it easy to go back to a working version above when the query doesn't work. Since datasette sends the whole query to the DB I have to comment out the older queries by prefixing each line with `--`. This gets caught by the validators when I use my typical strategy of copy/pasting each successive query below the last one. so this is just a simple fix to allow for a query to be sent to the DB with leading comments. 
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/653/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 541467590,MDU6SXNzdWU1NDE0Njc1OTA=,654,Template debug mode that outputs template context,9599,simonw,closed,0,,,,,3,2019-12-22T15:51:25Z,2019-12-22T16:13:11Z,2019-12-22T16:04:51Z,OWNER,,It would make writing templates (including custom templates) easier if there was an option to dump out the full template context - maybe `?_context=1`,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/654/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 542553350,MDU6SXNzdWU1NDI1NTMzNTA=,655,Copy and paste doesn't work reliably on iPhone for SQL editor,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2019-12-26T13:15:10Z,2020-09-30T20:36:12Z,2020-08-30T17:51:40Z,OWNER,,I'm having a lot of trouble copying and pasting from the codemirror editor on my iPhone.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/655/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 542814756,MDU6SXNzdWU1NDI4MTQ3NTY=,71,Tests are failing due to missing FTS5,9599,simonw,closed,0,,,,,3,2019-12-27T09:41:16Z,2019-12-27T09:49:37Z,2019-12-27T09:49:37Z,OWNER,,"https://travis-ci.com/simonw/sqlite-utils/jobs/268436167 This is a recent change: 2 months ago they worked fine. I'm not sure what changed here. Maybe something to do with https://launchpad.net/~jonathonf/+archive/ubuntu/backports ?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 543355051,MDExOlB1bGxSZXF1ZXN0MzU3NjQwMTg2,6,don't break if source is missing,78035,mfa,closed,0,,,,,1,2019-12-29T10:46:47Z,2020-03-28T02:28:11Z,2020-03-28T02:28:11Z,CONTRIBUTOR,dogsheep/swarm-to-sqlite/pulls/6,broke for me. very old checkins in 2010 had no source set.,205429375,swarm-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 543717994,MDExOlB1bGxSZXF1ZXN0MzU3OTc0MzI2,3,Add todoist-to-sqlite,706257,bcongdon,closed,0,,,,,0,2019-12-30T04:02:59Z,2020-10-12T00:35:58Z,2020-10-12T00:35:57Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/3,"Really enjoying getting into the dogsheep/datasette ecosystem. 
I made a downloader for Todoist, and I think/hope others might find this useful",214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 543738004,MDExOlB1bGxSZXF1ZXN0MzU3OTkyNTg4,72,Fixed implementation of upsert,9599,simonw,closed,0,,,,,0,2019-12-30T05:08:05Z,2019-12-30T05:29:24Z,2019-12-30T05:29:24Z,OWNER,simonw/sqlite-utils/pulls/72,Refs #66,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/72/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 544571092,MDU6SXNzdWU1NDQ1NzEwOTI=,15,Assets table with downloads,2029,garethr,closed,0,,,5225818,1.0,4,2020-01-02T13:05:28Z,2020-03-28T12:17:01Z,2020-03-23T19:17:32Z,NONE,,"The `releases` command extracts the releases table, but data about the individual assets are locked up in the JSON document in the `assets` field. My main interest is in individual and aggregate download counts. I was wondering if creating a new table with a record per asset may be useful? If so I'm happy to send a PR when I get a moment. Do you have opinions about that simply being part of the `releases` command or would you prefer a separate command as well?",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 545407916,MDU6SXNzdWU1NDU0MDc5MTY=,73,upsert_all() throws issue when upserting to empty table,82988,psychemedia,closed,0,,,,,6,2020-01-05T11:58:57Z,2020-01-31T14:21:09Z,2020-01-05T17:20:18Z,NONE,,"If I try to add a list of `dict`s to an empty table using `upsert_all`, I get an error: ```python import sqlite3 from sqlite_utils import Database import pandas as pd conx = sqlite3.connect(':memory') cx = conx.cursor() cx.executescript('CREATE TABLE ""test"" (""Col1"" TEXT);') q=""SELECT * FROM test;"" pd.read_sql(q, conx) #shows empty table db = Database(conx) db['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}]) --------------------------------------------------------------------------- TypeError Traceback (most recent call last) in 1 db = Database(conx) ----> 2 db['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}]) /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in upsert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, extracts) 1157 alter=alter, 1158 extracts=extracts, -> 1159 upsert=True, 1160 ) 1161 /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, ignore, replace, extracts, upsert) 1040 sql = ""INSERT OR IGNORE INTO [{table}]({pks}) VALUES({pk_placeholders});"".format( 1041 table=self.name, -> 1042 pks="", "".join([""[{}]"".format(p) for p in pks]), 1043 pk_placeholders="", "".join([""?"" for p in pks]), 1044 ) TypeError: 'NoneType' object is not iterable ``` A hacky workaround in use is: ```python try: db['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}]) except: db['test'].insert_all([{'Col1':'a'},{'Col1':'b'}]) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/73/reactions"", 
""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 546051181,MDU6SXNzdWU1NDYwNTExODE=,16,Exception running first command: IndexError: list index out of range,15092,jayvdb,closed,0,,,,,4,2020-01-07T03:01:58Z,2020-04-14T18:37:21Z,2020-04-14T18:37:21Z,NONE,,"Exception running first command without an existing db or auth. ```py > mkdir ~/.github/coala > /usr/bin/github-to-sqlite repos ~/.github/coala coala Traceback (most recent call last): File ""/usr/bin/github-to-sqlite"", line 11, in load_entry_point('github-to-sqlite==0.6', 'console_scripts', 'github-to-sqlite')() File ""/usr/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/cli.py"", line 163, in repos utils.save_repo(db, repo) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py"", line 120, in save_repo to_save[""owner""] = save_user(db, to_save[""owner""]) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py"", line 61, in save_user return db[""users""].upsert(to_save, pk=""id"", alter=True).last_pk File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1135, in upsert extracts=extracts, File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1162, in upsert_all upsert=True, File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1105, in insert_all row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0] IndexError: list index out of range ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 546073980,MDU6SXNzdWU1NDYwNzM5ODA=,74,Test failures on openSUSE 15.1: AssertionError: Explicit other_table and other_column,15092,jayvdb,open,0,,,,,3,2020-01-07T04:35:50Z,2020-01-12T07:21:17Z,,CONTRIBUTOR,,"openSUSE 15.1 is using python 3.6.5 and click-7.0 , however it has test failures while openSUSE Tumbleweed on py37 passes. 
Most fail on the cli exit code like ```py [ 74s] =================================== FAILURES =================================== [ 74s] _________________________________ test_tables __________________________________ [ 74s] [ 74s] db_path = '/tmp/pytest-of-abuild/pytest-0/test_tables0/test.db' [ 74s] [ 74s] def test_tables(db_path): [ 74s] result = CliRunner().invoke(cli.cli, [""tables"", db_path]) [ 74s] > assert '[{""table"": ""Gosh""},\n {""table"": ""Gosh2""}]' == result.output.strip() [ 74s] E assert '[{""table"": ""...e"": ""Gosh2""}]' == '' [ 74s] E - [{""table"": ""Gosh""}, [ 74s] E - {""table"": ""Gosh2""}] [ 74s] [ 74s] tests/test_cli.py:28: AssertionError ``` packaging project at https://build.opensuse.org/package/show/home:jayvdb:py-new/python-sqlite-utils I'll keep digging into this after I have github-to-sqlite working on Tumbleweed, as I'll need openSUSE Leap 15.1 working before I can submit this into the main python repo.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/74/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 546078359,MDExOlB1bGxSZXF1ZXN0MzU5ODIyNzcz,75,Explicitly include tests and docs in sdist,15092,jayvdb,closed,0,,,,,1,2020-01-07T04:53:20Z,2020-01-31T00:21:27Z,2020-01-31T00:21:27Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/75,Also exclude 'tests' from runtime installation.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/75/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 546961357,MDU6SXNzdWU1NDY5NjEzNTc=,656,Display of the column definitions,6371750,JBPressac,closed,0,,,,,1,2020-01-08T16:16:53Z,2020-01-20T14:17:11Z,2020-01-20T14:14:33Z,CONTRIBUTOR,,"Hello, Is the nice display of headers and definitions at the top of https://fivethirtyeight.datasettes.com/fivethirtyeight-ac35616/antiquities-act%2Factions_under_antiquities_act is configured in the metadata.json file ? Thank you,",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/656/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 548591089,MDU6SXNzdWU1NDg1OTEwODk=,657,Allow creation of virtual tables at startup,1055831,dazzag24,open,0,,,,,4,2020-01-12T16:10:55Z,2021-01-15T20:24:35Z,,NONE,,"Hi, I've been experimenting with SQLite reading from huge datasets using this excellent Parquet extension from @cldellow. https://cldellow.com/2018/06/22/sqlite-parquet-vtable.html https://github.com/cldellow/sqlite-parquet-vtable This works really well, but I was keen to see if I could combine datasette with this. Having previously experimented with the spatialite extension I knew that datasette supports loading extensions in the underlying sqlite instance. However I hit a blocker as the current design only allows SELECT statements to be executed and so I am unable to execute the crucial CREATE VIRTUAL TABLE ......... command that is required to load the data from the parquet file into the table. It seems like this would be a simple-ish change, but I don't know enough about the architecture of datasette to start implementing this myself? Could this be done as a datasette plugin? or would this require more fundamental changes at initialisation time? 
My thoughts are that something at init time could detect that the user was loading a *.parquet file and then switch to a mode were it loads that via the ""CREATE VIRTUAL TABLE..."" rather than loading the *.db file in the default case?? I'm happy to contribute code and testing, I just need some pointers on the best approach. Thanks Darren",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/657/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 549287310,MDU6SXNzdWU1NDkyODczMTA=,76,order_by mechanism,10501166,metab0t,closed,0,,,,,4,2020-01-14T02:06:03Z,2020-04-16T06:23:29Z,2020-04-16T03:13:06Z,NONE,,"In some cases, I want to iterate rows in a table with `ORDER BY` clause. It would be nice to have a `rows_order_by` function similar to `rows_where`. In a more general case, `rows_filter` function might be added to allow more customized filtering to iterate rows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/76/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 550293770,MDU6SXNzdWU1NTAyOTM3NzA=,658,How do I use the app.css as style sheet?,49656826,null92,open,0,,,,,2,2020-01-15T16:27:57Z,2020-02-07T00:29:50Z,,NONE,,"Simon, I'm trying to use the app.css (in static folder) as style sheet but the datasette on Heroku simply ignore it! I read everything about customization here and on readthedocs but still can't. Is this possible? Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/658/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 551834842,MDU6SXNzdWU1NTE4MzQ4NDI=,659,README information is obscured by feature history,55480210,labstersteve,closed,0,,,,,1,2020-01-18T22:34:51Z,2020-12-10T23:28:51Z,2020-12-10T23:28:51Z,NONE,,"While it's sometimes valuable to know how a project has developed, there is usually little justification for including this information in the README, and certainly not immediately after other key information such as ""what does this package do, and who might want to use it?"" Might I recommend that the feature history is migrated to an Appendix in the documentation?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/659/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 552773632,MDExOlB1bGxSZXF1ZXN0MzY1MjE4Mzkx,660,"gcloud run is now GA, s/beta//",813732,glasnt,closed,0,,,,,1,2020-01-21T10:08:38Z,2020-01-22T03:41:09Z,2020-01-21T23:28:12Z,CONTRIBUTOR,simonw/datasette/pulls/660,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/660/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 555832585,MDU6SXNzdWU1NTU4MzI1ODU=,661,"--port option to expose a port other than 8001 in ""datasette package""",134771,dvhthomas,closed,0,,,,,3,2020-01-27T21:05:56Z,2020-01-30T04:17:52Z,2020-01-29T22:46:45Z,NONE,,"I see how to alter the port using `datasette serve -p XXX` per the docs. 
However, I'm packaging up to server the container on AppEngine flexible, which [requires](https://cloud.google.com/appengine/docs/flexible/custom-runtimes/build#listening_to_port_8080) that the container is serving traffic on port 8080. https://github.com/simonw/datasette/blob/7950105c278b140e6cb665c68b59df219870f9bc/Dockerfile#L41 Is there a way to inject a non-default port into the Dockerfile, or should I just do something like `sed` to replace 8001 with 8080 after `dataset package` has done it's thing? Thanks for the advice.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/661/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 556814876,MDU6SXNzdWU1NTY4MTQ4NzY=,662,Escape_fts5_query-hookimplementation does not work with queries to standard tables,2181410,clausjuhl,closed,0,,,,,5,2020-01-29T11:56:03Z,2020-01-30T00:30:20Z,2020-01-30T00:30:19Z,NONE,,"Hi Simon Thank you for adding the escape_function, but it does not work on my datasette-installation (0.33). I've added the following file to my datasette-dir: /plugins/sql_functions.py: `from datasette import hookimpl def escape_fts_query(query): bits = query.split() return ' '.join('""{}""'.format(bit.replace('""', '')) for bit in bits) @hookimpl def prepare_connection(conn): conn.create_function(""escape_fts_query"", 1, escape_fts_query)` It has no effect on the standard queries to the tables though, as they still produce errors when including any characters like '-', '/', '+' or '?' Does the function only work when using costum queries, where I can include the escape_fts-function explicitly in the sql-query? PS. I'm calling datasette with --plugins=plugins, and my other plugins work just fine. PPS. The fts5 virtual table is created with 'sqlite3' like so: `CREATE VIRTUAL TABLE ""cases_fts"" USING FTS5( title, subtitle, resume, suggestion, presentation, detail = full, content_rowid = 'id', content = 'cases', tokenize='unicode61', 'remove_diacritics 2', 'tokenchars ""-_""' );` Thanks! 
_Originally posted by @clausjuhl in https://github.com/simonw/datasette/issues/651#issuecomment-579675357_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/662/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 557077945,MDExOlB1bGxSZXF1ZXN0MzY4NzM0NTAw,663,"-p argument for datasette package, plus tests - refs #661",9599,simonw,closed,0,,,,,1,2020-01-29T19:47:50Z,2020-01-29T22:46:43Z,2020-01-29T22:46:43Z,OWNER,simonw/datasette/pulls/663,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/663/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 557825032,MDU6SXNzdWU1NTc4MjUwMzI=,77,Ability to insert data that is transformed by a SQL function,9599,simonw,closed,0,,,,,2,2020-01-30T23:45:55Z,2022-02-05T00:04:25Z,2020-01-31T00:24:32Z,OWNER,,"I want to be able to run the equivalent of this SQL insert: ```python # Convert to ""Well Known Text"" format wkt = shape(geojson['geometry']).wkt # Insert and commit the record conn.execute(""INSERT INTO places (id, name, geom) VALUES(null, ?, GeomFromText(?, 4326))"", ( ""Wales"", wkt )) conn.commit() ``` From the Datasette SpatiaLite docs: https://datasette.readthedocs.io/en/stable/spatialite.html To do this, I need a way of telling `sqlite-utils` that a specific column should be wrapped in `GeomFromText(?, 4326)`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/77/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 557830332,MDExOlB1bGxSZXF1ZXN0MzY5MzQ4MDg0,78,"New conversions= feature, refs #77",9599,simonw,closed,0,,,,,0,2020-01-31T00:02:33Z,2020-09-22T07:48:29Z,2020-01-31T00:24:31Z,OWNER,simonw/sqlite-utils/pulls/78,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/78/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 557842245,MDU6SXNzdWU1NTc4NDIyNDU=,79,Helper methods for working with SpatiaLite,9599,simonw,closed,0,,,,,8,2020-01-31T00:39:19Z,2022-02-05T00:04:25Z,2022-02-04T05:55:11Z,OWNER,,"As demonstrated by this piece of documentation, using SpatiaLite with sqlite-utils requires a fair bit of boilerplate: https://github.com/simonw/sqlite-utils/blob/f7289174e66ae4d91d57de94bbd9d09fabf7aff4/docs/python-api.rst#L880-L909",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 557892819,MDExOlB1bGxSZXF1ZXN0MzY5Mzk0MDQz,80,on_create mechanism for after table creation,9599,simonw,closed,0,,,,,5,2020-01-31T03:38:48Z,2020-01-31T05:08:04Z,2020-01-31T05:08:04Z,OWNER,simonw/sqlite-utils/pulls/80,"I need this for `geojson-to-sqlite`, in particular https://github.com/simonw/geojson-to-sqlite/issues/6",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/80/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 
558600274,MDU6SXNzdWU1NTg2MDAyNzQ=,81,"Remove .detect_column_types() from table, make it a documented API",9599,simonw,closed,0,,,,,4,2020-02-01T21:25:54Z,2020-02-01T21:55:35Z,2020-02-01T21:55:35Z,OWNER,,"I used it in `geojson-to-sqlite` here: https://github.com/simonw/geojson-to-sqlite/blob/f10e44264712dd59ae7dfa2e6fd5a904b682fb33/geojson_to_sqlite/utils.py#L45-L50 It would make more sense for this method to live on the Database rather than the Table - or even to exist as a separate utility method entirely. Then it should be documented.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/81/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 558715564,MDExOlB1bGxSZXF1ZXN0MzcwMDI0Njk3,4,Add beeminder-to-sqlite,706257,bcongdon,closed,0,,,,,0,2020-02-02T15:51:36Z,2020-10-12T00:36:16Z,2020-10-12T00:36:16Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/4,,214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 559197745,MDU6SXNzdWU1NTkxOTc3NDU=,82,Tutorial command no longer works,10350886,petey284,closed,0,,,,,3,2020-02-03T16:36:11Z,2020-02-27T04:16:43Z,2020-02-27T04:16:30Z,NONE,,"Issue with command on [tutorial](https://simonwillison.net/2019/Feb/25/sqlite-utils/) on Simon's site. The following command no longer works, and breaks with the previous too many variables error: #50 ``` cmd > curl ""https://data.nasa.gov/resource/y77d-th95.json"" | \ sqlite-utils insert meteorites.db meteorites - --pk=id ``` Output: ``` cmd Traceback (most recent call last): File ""continuum\miniconda3\envs\main\lib\runpy.py"", line 193, in _run_module_as_main ""__main__"", mod_spec) File ""continuum\miniconda3\envs\main\lib\runpy.py"", line 85, in _run_code exec(code, run_globals) File ""Continuum\miniconda3\envs\main\Scripts\sqlite-utils.exe\__main__.py"", line 9, in File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 717, in main rv = self.invoke(ctx) File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 555, in invoke return callback(*args, **kwargs) File ""continuum\miniconda3\envs\main\lib\site-packages\sqlite_utils\cli.py"", line 434, in insert default=default, File ""continuum\miniconda3\envs\main\lib\site-packages\sqlite_utils\cli.py"", line 384, in insert_upsert_implementation docs, pk=pk, batch_size=batch_size, alter=alter, **extra_kwargs File ""continuum\miniconda3\envs\main\lib\site-packages\sqlite_utils\db.py"", line 1081, in insert_all result = self.db.conn.execute(query, params) sqlite3.OperationalError: too many SQL variables ``` My thought is that maybe the dataset grew over the last few years and so didn't run into this issue before. No error when I reduce the count of entries to 83. Once the number of entries hits 84 the command fails. 
// This passes ``` cmd type meteorite_83.txt | sqlite-utils insert meteorites.db meteorites - --pk=id ``` // But this fails ``` cmd type meteorite_84.txt | sqlite-utils insert meteorites.db meteorites - --pk=id ``` A potential fix might be to chunk the incoming data? I can work on a PR if pointed in right direction. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/82/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 559374410,MDU6SXNzdWU1NTkzNzQ0MTA=,83,"Make db[""table""].exists a documented API",9599,simonw,closed,0,,,,,1,2020-02-03T22:31:44Z,2020-02-08T23:58:35Z,2020-02-08T23:56:23Z,OWNER,,Right now it's a static thing which might get out-of-sync with the database. It should probably be a live check. Maybe call it `.exists()` instead?,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/83/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 559522877,MDExOlB1bGxSZXF1ZXN0MzcwNjc1MDA3,664,Datasette.render_template() method,9599,simonw,closed,0,,,,,5,2020-02-04T06:53:59Z,2020-02-04T20:26:18Z,2020-02-04T20:26:18Z,OWNER,simonw/datasette/pulls/664,Refs #577,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/664/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 559964149,MDU6SXNzdWU1NTk5NjQxNDk=,665,Introduce a SQL statement parser in Python,9599,simonw,open,0,,,,,1,2020-02-04T20:36:05Z,2020-02-04T20:36:48Z,,OWNER,,#254 and #653 are both examples of problems that could be solved using a real SQL parser in Python.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/665/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 561454071,MDU6SXNzdWU1NjE0NTQwNzE=,32,"Documentation for "" favorites"" command",9599,simonw,closed,0,,,,,0,2020-02-07T06:50:11Z,2020-02-07T06:59:10Z,2020-02-07T06:59:10Z,MEMBER,,"It looks like I forgot to document this one in the README. 
https://github.com/dogsheep/twitter-to-sqlite/blob/6ebd482619bd94180e54bb7b56549c413077d329/twitter_to_sqlite/cli.py#L183-L194",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 561460274,MDU6SXNzdWU1NjE0NjAyNzQ=,84,.upsert() with hash_id throws error,9599,simonw,closed,0,,,,,0,2020-02-07T07:08:19Z,2020-02-07T07:17:11Z,2020-02-07T07:17:11Z,OWNER,,"```python db[table_name].upsert_all(rows, hash_id=""pk"") ``` This throws an error: `PrimaryKeyRequired('upsert() requires a pk')` The problem is, if you try this: ```python db[table_name].upsert_all(rows, hash_id=""pk"", pk=""pk"") ``` You get this error: `AssertionError('Use either pk= or hash_id=')` `hash_id=` should imply that `pk=` that column.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/84/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 561469252,MDExOlB1bGxSZXF1ZXN0MzcyMjczNjA4,33,Upgrade to sqlite-utils 2.2.1,9599,simonw,closed,0,,,,,1,2020-02-07T07:32:12Z,2020-03-20T19:21:42Z,2020-03-20T19:21:41Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/33,,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 562085508,MDExOlB1bGxSZXF1ZXN0MzcyNzYzOTA2,666,"Use inspect-file, if possible, for total row count",13896256,kevindkeogh,closed,0,,,,,3,2020-02-08T22:10:35Z,2020-03-09T02:47:15Z,2020-02-25T20:19:29Z,CONTRIBUTOR,simonw/datasette/pulls/666,"For large tables, counting the number of rows in the table can take a signficant amount of time. Instead, where an inspect-file is provided for an immutable database, look up the row-count for a plain count(*).",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/666/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 562787785,MDU6SXNzdWU1NjI3ODc3ODU=,667,Allow injecting configuration data from plugins,870184,xrotwang,closed,0,,,,,2,2020-02-10T19:50:15Z,2020-02-12T16:18:22Z,2020-02-12T09:21:22Z,NONE,,"I'm trying to customize datasette as explorer for [CLDF](https://cldf.clld.org) datasets. Such datasets can be converted automatically to SQLite, which then can be fed to datasette, (e.g. https://github.com/cldf/cookbook/blob/master/recipes/datasette/README.md). Part of this customization would be support for the ""special"" data types described in the [CLDF ontology](https://cldf.clld.org/v1.0/terms.rdf). But while rendering of the values can be customized via the `render_cell` hook in a plugin, e.g. custom labels for foreign keys must be specified through the config file. 
It would be nice to be able to programmatically inject config data from plugins as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/667/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 562911863,MDU6SXNzdWU1NjI5MTE4NjM=,85,Create index doesn't work for columns containing spaces,9599,simonw,closed,0,,,,,1,2020-02-11T00:34:46Z,2020-02-11T05:13:20Z,2020-02-11T05:13:20Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/85/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 563347679,MDU6SXNzdWU1NjMzNDc2Nzk=,668,Make it easier to load SpatiaLite,9599,simonw,closed,0,,,,,2,2020-02-11T17:03:43Z,2022-01-20T21:29:41Z,2021-01-04T20:18:39Z,OWNER,,"``` $ datasette spatial.db Serve! files=('spatial.db',) (immutables=()) on port 8001 ERROR: conn= , sql = 'PRAGMA table_info(SpatialIndex);', params = None: no such module: VirtualSpatialIndex Usage: datasette serve [OPTIONS] [FILES]... Error: It looks like you're trying to load a SpatiaLite database without first loading the SpatiaLite module. Read more: https://datasette.readthedocs.io/en/latest/spatialite.html ``` This error message could sniff around in the common locations for the SpatiaLite module and output the CLI command you should use to enable it: ``` datasette spatial.db --load-extension=/usr/local/lib/mod_spatialite.dylib ``` Even better: if Datasette had a `--spatialite` option which automatically loads the extension from common locations, if it can find it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/668/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 563348959,MDExOlB1bGxSZXF1ZXN0MzczNzc1Nzg4,669,fix db-to-sqlite command in ecosystem doc page,883348,adipasquale,closed,0,,,,,1,2020-02-11T17:05:41Z,2020-02-22T02:32:18Z,2020-02-22T02:32:17Z,CONTRIBUTOR,simonw/datasette/pulls/669,the `--connection` parameter has become positional,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/669/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 564579430,MDU6SXNzdWU1NjQ1Nzk0MzA=,86,Problem with square bracket in CSV column name,8149512,foscoj,closed,0,,,,,7,2020-02-13T10:19:57Z,2020-02-27T04:16:08Z,2020-02-27T04:16:07Z,NONE,,"testing some data from european power information (entsoe.eu), the title of the csv contains square brackets. as I am playing with glitch, sqlite-utils are used for creating the db. 
Traceback (most recent call last): File ""/app/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 434, in insert default=default, File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 384, in insert_upsert_implementation docs, pk=pk, batch_size=batch_size, alter=alter, **extra_kwargs File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/db.py"", line 997, in insert_all extracts=extracts, File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/db.py"", line 618, in create extracts=extracts, File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/db.py"", line 310, in create_table self.conn.execute(sql) sqlite3.OperationalError: unrecognized token: ""]"" entsoe_2016.csv renamed to txt for uploading compatibility [entsoe_2016.txt](https://github.com/simonw/sqlite-utils/files/4197688/entsoe_2016.txt) code is remixed directly from your https://glitch.com/edit/#!/datasette-csvs repo ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/86/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 564833696,MDU6SXNzdWU1NjQ4MzM2OTY=,670,Prototoype for Datasette on PostgreSQL,9599,simonw,open,0,,,,,15,2020-02-13T17:17:55Z,2023-11-17T15:32:21Z,,OWNER,,"I thought this would never happen, but now that I'm deep in the weeds of running SQLite in production for Datasette Cloud I'm starting to reconsider my policy of only supporting SQLite. Some of the factors making me think PostgreSQL support could be worth the effort: - Serverless. I'm getting increasingly excited about writable-database use-cases for Datasette. If it could talk to PostgreSQL then users could easily deploy it on Heroku or other serverless providers that can talk to a managed RDS-style PostgreSQL. - Existing databases. Plenty of organizations have PostgreSQL databases. They can export to SQLite using [db-to-sqlite](https://github.com/simonw/db-to-sqlite) but that's a pretty big barrier to getting started - being able to run `datasette postgresql://connection-string` and start trying it out would be a massively better experience. - Data size. I keep running into use-cases where I want to run Datasette against many GBs of data. SQLite can do this but PostgreSQL is much more optimized for large data, especially given the existence of tools like Citus. - Marketing. Convincing people to trust their data to SQLite is potentially a big barrier to adoption. Even if I've convinced myself it's trustworthy I still have to convince everyone else. - It might not be that hard? 
If this required a ground-up rewrite it wouldn't be worth the effort, but I have a hunch that it may not be too hard - most of the SQL in Datasette should work on both databases since it's almost all portable SELECT statements. If Datasette did DML this would be a lot harder, but it doesn't. - Plugins! This feels like a natural surface for a plugin - at which point people could add MySQL support and suchlike in the future. The above reasons feel strong enough to justify a prototype.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/670/reactions"", ""total_count"": 19, ""+1"": 14, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 5, ""rocket"": 0, ""eyes"": 0}",, 565041624,MDU6SXNzdWU1NjUwNDE2MjQ=,671,"datasette.add_database(name, db) and datasette.remove_database(name) methods",9599,simonw,closed,0,,,,,1,2020-02-14T01:05:48Z,2020-02-14T01:30:35Z,2020-02-14T01:30:30Z,OWNER,,"- `datasette.add_database(name, db)` - adds a new named database to the list of connected databases. `db` will be a `Database()` object, which may prove useful in the future for things like #670 and could also allow some plugins to provide in-memory SQLite databases. - `datasette.remove_database(name)` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/417#issuecomment-586047995_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/671/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 565064079,MDExOlB1bGxSZXF1ZXN0Mzc1MTgwODMy,672,--dirs option for scanning directories for SQLite databases,9599,simonw,open,0,,,,,15,2020-02-14T02:25:52Z,2020-03-27T01:03:53Z,,OWNER,simonw/datasette/pulls/672,Refs #417.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/672/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 565518772,MDU6SXNzdWU1NjU1MTg3NzI=,673,Mechanism for checking if a SQLite database file is safe to open,9599,simonw,closed,0,,,,,11,2020-02-14T19:36:04Z,2020-02-14T20:13:59Z,2020-02-14T20:13:59Z,OWNER,,"Opening a SpatiaLite database file without SpatiaLite will result in errors later on. Same for database files which use custom extensions, like the Apple Photos database. I've figured out how to tell if a database is safe to open or not: ```sql select sql from sqlite_master where sql like 'CREATE VIRTUAL TABLE%'; ``` This returns the SQL definitions for virtual tables. The bit after `using` tells you what they need. 
Run this against a SpatiaLite database and you get the following: ```sql CREATE VIRTUAL TABLE SpatialIndex USING VirtualSpatialIndex() CREATE VIRTUAL TABLE ElementaryGeometries USING VirtualElementary() ``` Run it against an Apple Photos `photos.db` file (found with `find ~/Library | grep photos.db`) and you get this (partial list): ```sql CREATE VIRTUAL TABLE RidList_VirtualReader using RidList_VirtualReaderModule CREATE VIRTUAL TABLE Array_VirtualReader using Array_VirtualReaderModule CREATE VIRTUAL TABLE LiGlobals_VirtualBufferReader using VirtualBufferReaderModule CREATE VIRTUAL TABLE RKPlace_RTree using rtree (modelId,minLongitude,maxLongitude,minLatitude,maxLatitude) ``` For a database with FTS4 you get: ```sql CREATE VIRTUAL TABLE ""docs_fts"" USING FTS4 ( [title], [content], content=""docs"" ) ``` FTS5: ```sql CREATE VIRTUAL TABLE [FARA_All_Registrants_fts] USING FTS5 ( [Name], [Address_1], [Address_2], content=[FARA_All_Registrants] ) ``` So I can use this to figure out all of the `using` pieces and then compare them to a list of known support ones. _Originally posted by @simonw in https://github.com/simonw/datasette/pull/672#issuecomment-586441484_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/673/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 565552217,MDU6SXNzdWU1NjU1NTIyMTc=,674,Rethink how sanity checks work,9599,simonw,closed,0,,,,,5,2020-02-14T20:57:02Z,2020-03-26T17:19:23Z,2020-02-15T17:57:46Z,OWNER,,"If you specify a file to open using `files` or `-i` then Datasette should show a useful error message and fail to start. Files found by scanning a directory #672 should just be skipped. _Split off from comment by @simonw in https://github.com/simonw/datasette/issues/673#issuecomment-586455321_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/674/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 565837965,MDU6SXNzdWU1NjU4Mzc5NjU=,87,Should detect collections.OrderedDict as a regular dictionary,9599,simonw,closed,0,,,,,2,2020-02-16T02:06:34Z,2020-02-16T02:20:59Z,2020-02-16T02:20:59Z,OWNER,,"``` File ""...python3.7/site-packages/sqlite_utils/db.py"", line 292, in create_table column_type=COLUMN_TYPE_MAPPING[column_type], KeyError: ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/87/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 567902704,MDU6SXNzdWU1Njc5MDI3MDQ=,675,--cp option for datasette publish and datasette package for shipping additional files and directories,141844,aviflax,open,0,,,,,12,2020-02-19T22:55:56Z,2020-12-28T18:49:21Z,,NONE,,"I’m working on integrating Datasette into a documentation-oriented publishing workflow internally in my company, and in order to deploy the Docker image created by `datasette package` I need to add an additional file to the image — in my case, it’s a sort of a deployment directive. I’ve worked out a way to do this after the image has been created, but it’s convoluted and brittle. So it’d be excellent if there was an additional option for this command, something like, like, `--copy`. 
I’d envision it looking something like: ```shell $ datasette package --copy /the/source/path:/the/target/path data.db ``` I’d be happy to help design, specify, implement, and test this feature, if you’d be interested. Thanks for the fantastic tools!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/675/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 568091133,MDU6SXNzdWU1NjgwOTExMzM=,676,?_searchmode=raw option for running FTS searches without escaping characters,58088336,tunguyenatwork,closed,0,,,,,9,2020-02-20T06:56:57Z,2020-02-25T05:57:24Z,2020-02-25T05:56:04Z,NONE,,"After the version 0.34. I am not able to use the wildchar in the _search option( or the full text search). It will not return any result unless I specify the whole word for text search. If I use 'match :search || ""*"" ' in the sql statement then it will work as expected.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/676/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 569237568,MDU6SXNzdWU1NjkyMzc1Njg=,677,The first time you click sort by ID it should show you results in reverse order,9599,simonw,closed,0,,,,,1,2020-02-21T23:38:50Z,2020-03-21T23:57:46Z,2020-03-21T23:57:46Z,OWNER,,"e.g. on https://latest.datasette.io/fixtures/roadside_attractions Clicking the ""pk"" column header doesn't actually do anything - it sorts by pk asc but since the page was already sorted like that nothing useful changes. The first click on a primary key column that the page is already implicitly sorted by should instead enable sort descending on that column.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/677/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 569253072,MDU6SXNzdWU1NjkyNTMwNzI=,678,prepare_connection() plugin hook should accept optional datasette argument,9599,simonw,closed,0,,,,,3,2020-02-22T00:50:26Z,2020-02-22T03:53:19Z,2020-02-22T02:28:51Z,OWNER,,"I want to build a plugin that allows users to configure certain database columns to be ""masked"" - so the `password` column on a users table is never revealed, for example. To do this, I need to use the `conn.set_authorizer()` SQLite mechanism. So the plugin needs to build off the `prepare_connection(conn)` hook. 
But that hook doesn't currently get passed `datasette` so it doesn't have a way of looking up its plugin configuration!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/678/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 569268612,MDU6SXNzdWU1NjkyNjg2MTI=,679,Release 0.36,9599,simonw,closed,0,,,,,2,2020-02-22T02:41:01Z,2020-02-22T03:52:13Z,2020-02-22T03:52:13Z,OWNER,,"I think we have enough changes to warrant a release - and I want to take advantage of the changes to the `prepare_connection()` plugin hook in #678 Changes since 0.35 so far: https://github.com/simonw/datasette/compare/0.35...be2265b0e811d0ac2875c2f748125c17b0f9289e - [x] Update ecosystem page - [x] Write release notes - [x] Ship the release",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/679/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 569275763,MDU6SXNzdWU1NjkyNzU3NjM=,680,Release automation: automate the bit that posts the GitHub release,9599,simonw,closed,0,,,,,5,2020-02-22T03:50:40Z,2020-09-12T18:18:50Z,2020-09-12T18:18:50Z,OWNER,,"The most manual part of [the release process](https://datasette.readthedocs.io/en/stable/contributing.html#release-process) right now is having to post a GitHub release that matches the updated changelog. This is particularly annoying because the changelog is in `.rst` while the GitHub release needs markdown - so I currently manually translate between the two. Having the release script automatically post a GitHub release at the end would be much more convenient.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/680/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 569317377,MDU6SXNzdWU1NjkzMTczNzc=,681,Cashe-header missing in http-response,2181410,clausjuhl,closed,0,,,,,4,2020-02-22T10:50:45Z,2020-02-24T20:53:57Z,2020-02-24T20:53:56Z,NONE,,"Hi Simon. I need some help with both understanding and adding http-headers. If I call datasette on localhost with --config default_cache_ttl:120 and --cors, I only get the following response-headers: access-control-allow-origin: * content-type: text/html; charset=utf-8 date: Sat, 22 Feb 2020 10:32:15 GMT referrer-policy: no-referrer server: uvicorn transfer-encoding: chunked Cors works, but no caching-header is set? Same thing happens if I use the command in a Dockerfile and run datasette with docker. Second, how can one add headers to uvicorn? I've tried to add uvicorn commands to the Dockerfile, before the final datasette command, but it doesn't work. Is there any way to add headers to the uvicorn.run() command i datasette? 
I particular, I would like to add some of the missing security-headers: Thank you for a great product!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/681/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 569613563,MDU6SXNzdWU1Njk2MTM1NjM=,682,Mechanism for writing to database via a queue,9599,simonw,closed,0,,,,,10,2020-02-24T03:10:07Z,2020-02-25T04:45:10Z,2020-02-25T04:45:10Z,OWNER,,"I've been mulling this over for a long time, and I have a new approach that I think is worth exploring. The catch with writing to SQLite is that it should only accept one write at a time. I'm now thinking that an easy way to manage that would be with a write queue for each database which is then read by a single dedicated write thread which manages its own writable connection.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/682/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 570101428,MDExOlB1bGxSZXF1ZXN0Mzc5MTkyMjU4,683,.execute_write() and .execute_write_fn() methods on Database,9599,simonw,closed,0,,,3268330,Datasette 1.0,14,2020-02-24T19:51:58Z,2020-05-30T18:40:20Z,2020-02-25T04:45:08Z,OWNER,simonw/datasette/pulls/683,"See #682 - [x] Come up with design for `.execute_write()` and `.execute_write_fn()` - [x] Build some quick demo plugins to exercise the design - [x] Write some unit tests - [x] Write the documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/683/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 570301333,MDU6SXNzdWU1NzAzMDEzMzM=,684,Add documentation on Database introspection methods to internals.rst,9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2020-02-25T04:20:24Z,2020-06-04T18:56:15Z,2020-05-30T18:40:39Z,OWNER,,`internals.rst` will be landing as part of #683,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/684/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 570309546,MDU6SXNzdWU1NzAzMDk1NDY=,685,Document (and reconsider design of) Database.execute() and Database.execute_against_connection_in_thread(),9599,simonw,closed,0,,,3268330,Datasette 1.0,15,2020-02-25T04:49:44Z,2020-05-30T13:20:50Z,2020-05-08T17:42:18Z,OWNER,,"In #683 I started a new section of internals documentation covering the `Database` class: https://datasette.readthedocs.io/en/latest/internals.html#database-class I decided not to document `.execute()` and `.execute_against_connection_in_thread()` yet because I'm not 100% happy with their API design yet.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/685/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 570327466,MDExOlB1bGxSZXF1ZXN0Mzc5Mzc4Nzgw,686,?_searchmode=raw option,9599,simonw,closed,0,,,,,0,2020-02-25T05:45:50Z,2020-02-25T05:56:09Z,2020-02-25T05:56:04Z,OWNER,simonw/datasette/pulls/686,Closes #676,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/686/reactions"", 
""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 571805300,MDU6SXNzdWU1NzE4MDUzMDA=,88,"table.disable_fts() method and ""sqlite-utils disable-fts ..."" command",9599,simonw,closed,0,,,,,5,2020-02-27T04:00:50Z,2020-02-27T04:40:44Z,2020-02-27T04:40:44Z,OWNER,,This would make it easier to iterate on the FTS configuration for a database without having to wipe and recreate the database each time.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/88/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 572896293,MDU6SXNzdWU1NzI4OTYyOTM=,687,Expand plugins documentation to multiple pages,9599,simonw,closed,0,,,5533512,Datasette 0.45,11,2020-02-28T17:26:21Z,2020-06-22T03:55:20Z,2020-06-22T03:53:54Z,OWNER,,"I think the plugins docs need to extend beyond a single page now. I want to add a whole section on writing tests for plugins, showing how `httpx` can be used as seen in https://github.com/simonw/datasette-atom/issues/3 and suchlike.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/687/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 573088799,MDExOlB1bGxSZXF1ZXN0MzgxNjY2Nzc3,688,Don't count rows on homepage for DBs > 100MB,9599,simonw,closed,0,,,,,0,2020-02-29T01:01:06Z,2020-02-29T01:08:30Z,2020-02-29T01:08:29Z,OWNER,simonw/datasette/pulls/688,Closes #649.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/688/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 573578548,MDU6SXNzdWU1NzM1Nzg1NDg=,89,Ability to customize columns used by extracts= feature,9599,simonw,open,0,,,,,3,2020-03-01T16:54:48Z,2020-10-16T19:17:50Z,,OWNER,,"@simonw any thoughts on allow extracts to specify the lookup column name? If I'm understanding the documentation right, `.lookup()` allows you to define the ""value"" column (the documentation uses name), but when you use `extracts` keyword as part of `.insert()`, `.upsert()` etc. the lookup must be done against a column named ""value"". I have an existing lookup table that I've populated with columns ""id"" and ""name"" as opposed to ""id"" and ""value"", and seems I can't use `extracts=`, unless I'm missing something... Initial thought on how to do this would be to allow the dictionary value to be a tuple of table name column pair... so: ``` table = db.table(""trees"", extracts={""species_id"": (""Species"", ""name""}) ``` I haven't dug too much into the existing code yet, but does this make sense? Worth doing? _Originally posted by @chrishas35 in https://github.com/simonw/sqlite-utils/issues/46#issuecomment-592999503_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/89/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 573583971,MDU6SXNzdWU1NzM1ODM5NzE=,689,"""Templates considered"" comment broken in >=0.35",35075,chrishas35,closed,0,,,,,6,2020-03-01T17:31:21Z,2020-04-05T19:39:44Z,2020-04-05T19:39:44Z,NONE,,"Noticed that the ""Templates Considered"" comment is missing in 0.37. 
Believe I traced it back to #664 as you can see it in https://v0-34.datasette.io/ but not https://v0-35.datasette.io/. Looking at the template context debug between the two you can see what is missing from 0.35 vs. 0.34: ```diff < ""datasette_version"": ""0.34"", < ""app_css_hash"": ""ffa51a"", < ""select_templates"": [ < ""*index.html"" < ], < ""zip"": "" "", < ""body_scripts"": [], < ""extra_css_urls"": "" "", < ""extra_js_urls"": "" "", < ""format_bytes"": "" "", < ""database_url"": "" >"", < ""database_color"": "" >"" --- > ""datasette_version"": ""0.35"", > ""database_url"": "" >"", > ""database_color"": "" >"" ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/689/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 573740712,MDU6SXNzdWU1NzM3NDA3MTI=,90,Cannot .enable_fts() for columns with spaces in their names,9599,simonw,closed,0,,,,,0,2020-03-02T06:06:03Z,2020-03-02T06:10:49Z,2020-03-02T06:10:49Z,OWNER,,"``` import sqlite_utils db = sqlite_utils.Database(memory=True) db[""test""].insert({""space in name"": ""hello""}) db[""test""].enable_fts([""space in name""]) --------------------------------------------------------------------------- OperationalError Traceback (most recent call last) in ----> 1 db['test'].enable_fts([""space in name""]) /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in enable_fts(self, columns, fts_version, create_triggers) 755 ) 756 self.db.conn.executescript(sql) --> 757 self.populate_fts(columns) 758 759 if create_triggers: /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in populate_fts(self, columns) 787 table=self.name, columns="", "".join(columns) 788 ) --> 789 self.db.conn.executescript(sql) 790 return self 791 OperationalError: near ""in"": syntax error ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/90/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 573755726,MDU6SXNzdWU1NzM3NTU3MjY=,690,Mechanism for plugins to add action menu items for various things,9599,simonw,closed,0,,,6026070,0.51,11,2020-03-02T06:48:36Z,2020-10-30T05:20:43Z,2020-10-30T05:20:42Z,OWNER,,"Now that we have support for plugins that can write I'm seeing all sorts of places where a plugin might need to add UI to the table page. 
Some examples: - `datasette-configure-fts` needs to add a ""configure search for this table"" link - a plugin that lets you render or delete tables needs to add a link or button somewhere - existing plugins like `datasette-vega` and `datasette-cluster-map` already do this with JavaScript The challenge here is that multiple plugins may want to do this, so simply overriding templates and populating names blocks doesn't entirely work as templates may override each other.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/690/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 574021194,MDU6SXNzdWU1NzQwMjExOTQ=,691,--reload sould reload server if code in --plugins-dir changes,9599,simonw,open,0,,,,,1,2020-03-02T14:42:21Z,2020-06-14T02:35:17Z,,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/691/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 574035432,MDU6SXNzdWU1NzQwMzU0MzI=,692,is_hidden_table context variable on table.html page,9599,simonw,open,0,,,,,1,2020-03-02T15:03:25Z,2020-03-02T15:03:48Z,,OWNER,,It's useful to know if a table is hidden when rendering that page. `datasette-configure-fts` for example may want to disallow enabling search on hidden tables.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/692/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 574043218,MDU6SXNzdWU1NzQwNDMyMTg=,693,Variables from extra_template_vars() not exposed in _context=1,9599,simonw,closed,0,,,,,3,2020-03-02T15:14:51Z,2020-04-05T19:12:48Z,2020-04-05T19:12:48Z,OWNER,,The `_context=1` debugging mode does not show variables that should have been added to the context by the `extra_template_vars()` plugin hook.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/693/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 576582604,MDU6SXNzdWU1NzY1ODI2MDQ=,694,datasette publish cloudrun --memory option,9599,simonw,closed,0,,,,,8,2020-03-05T22:59:57Z,2020-06-23T17:10:51Z,2020-03-05T23:49:41Z,OWNER,,"Got this error deploying large (603MB) database with Cloud Run ``` X Deploying... Cloud Run error: Container failed to start. Failed to start and then listen on the port defined by the PORT environment variable. Logs for this revi sion might contain more information. X Creating Revision... Cloud Run error: Container failed to start. Failed to start and then listen on the port defined by the PORT environment variable. Logs for this revision might contain more information. . Routing traffic... ✓ Setting IAM Policy... Deployment failed ERROR: (gcloud.run.deploy) Cloud Run error: Container failed to start. Failed to start and then listen on the port defined by the PORT environment variable. Logs for this revision might contain more information. 
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/694/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 576711589,MDU6SXNzdWU1NzY3MTE1ODk=,695,Update SQLite bundled with Docker container,9599,simonw,closed,0,,,,,7,2020-03-06T05:42:12Z,2020-03-08T23:33:23Z,2020-03-06T06:15:27Z,OWNER,,"It's 3.26.0 at the moment: https://github.com/simonw/datasette/blob/af9cd4ca64652fae262e6f7b5d201f6e0adc989b/Dockerfile#L9-L11 Most recent release is 3.31.1: https://www.sqlite.org/releaselog/3_31_1.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/695/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 576722115,MDU6SXNzdWU1NzY3MjIxMTU=,696,Single failing unit test when run inside the Docker image,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2020-03-06T06:16:36Z,2021-03-29T17:04:19Z,2021-03-07T07:41:18Z,OWNER,,"``` docker run -it -v `pwd`:/mnt datasetteproject/datasette:latest /bin/bash root@0e1928cfdf79:/# cd /mnt root@0e1928cfdf79:/mnt# pip install -e .[test] root@0e1928cfdf79:/mnt# pytest ``` I get one failure! It was for `test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]` ``` def test_searchable(app_client, path, expected_rows): response = app_client.get(path) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry c...sel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E + [] E - [[1, 'barry cat', 'terry dog', 'panther'], E - [2, 'terry dog', 'sara weasel', 'puma']] ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/695#issuecomment-595614469_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/696/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 577302229,MDU6SXNzdWU1NzczMDIyMjk=,91,Enable ordering FTS results by rank,416374,gfrmin,closed,0,,,6079500,3.0,1,2020-03-07T08:43:51Z,2020-11-06T23:53:26Z,2020-11-06T23:53:25Z,NONE,,According to https://www.sqlite.org/fts5.html (not sure about FTS4) results can be sorted by relevance. At the moment results are returned by default by `rowid`. Perhaps a flag can be added to the `search` method?,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/91/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 577578306,MDU6SXNzdWU1Nzc1NzgzMDY=,697,index.html is not reliably loaded from a plugin,9599,simonw,closed,0,,,,,7,2020-03-08T22:37:55Z,2020-03-08T23:33:28Z,2020-03-08T23:11:27Z,OWNER,,"Lots of detail in https://github.com/simonw/datasette-search-all/issues/2 - short version is that I have a plugin with its own `index.html` template and Datasette intermittently fails to load it and uses the default `index.html` that ships with Datasette instead. 
Related: * #689: ""Templates considered"" comment broken in >=0.35 * #693: Variables from extra_template_vars() not exposed in _context=1 (may as well fix this while I'm in there)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/697/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 578883725,MDU6SXNzdWU1Nzg4ODM3MjU=,17,Command for importing commits,9599,simonw,closed,0,,,,,2,2020-03-10T21:55:12Z,2020-03-11T02:47:37Z,2020-03-11T02:47:37Z,MEMBER,,Using this API: https://api.github.com/repos/dogsheep/github-to-sqlite/commits,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 581339961,MDU6SXNzdWU1ODEzMzk5NjE=,92,.columns_dict doesn't work for all possible column types,9599,simonw,closed,0,,,,,7,2020-03-14T19:30:35Z,2020-03-15T18:37:43Z,2020-03-14T20:04:14Z,OWNER,,"Got this error: ``` File "".../python3.7/site-packages/sqlite_utils/db.py"", line 462, in for column in self.columns KeyError: 'REAL' ``` `.columns_dict` uses `REVERSE_COLUMN_TYPE_MAPPING`: https://github.com/simonw/sqlite-utils/blob/43f1c6ab4e3a6b76531fb6f5447adb83d26f3971/sqlite_utils/db.py#L457-L463 `REVERSE_COLUMN_TYPE_MAPPING` defines `FLOAT` not `REAL`A https://github.com/simonw/sqlite-utils/blob/43f1c6ab4e3a6b76531fb6f5447adb83d26f3971/sqlite_utils/db.py#L68-L74",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/92/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 581795570,MDU6SXNzdWU1ODE3OTU1NzA=,93,Support more string values for types in .add_column(),9599,simonw,open,0,,,,,0,2020-03-15T19:32:49Z,2020-09-24T20:36:46Z,,OWNER,,"https://sqlite-utils.readthedocs.io/en/2.4.2/python-api.html#adding-columns says: > SQLite types you can specify are ""TEXT"", ""INTEGER"", ""FLOAT"" or ""BLOB"". As discovered in #92 this isn't the right list of values. I should expand this to match https://www.sqlite.org/datatype3.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/93/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 582517965,MDU6SXNzdWU1ODI1MTc5NjU=,698,Ability for a canned query to write to the database,9599,simonw,closed,0,,,5512395,Datasette 0.44,26,2020-03-16T18:31:59Z,2020-06-06T19:43:49Z,2020-06-06T19:43:48Z,OWNER,,"Canned queries are currently read-only: https://datasette.readthedocs.io/en/0.38/sql_queries.html#canned-queries Add a `""write"": true` option to their definition in `metadata.json` which turns them into queries that are submitted via POST and send their queries to the write queue. 
Then they can be used as a really quick way to define a writable interface and JSON API!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/698/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 582526961,MDU6SXNzdWU1ODI1MjY5NjE=,699,Authentication (and permissions) as a core concept,9599,simonw,closed,0,,,5512395,Datasette 0.44,40,2020-03-16T18:48:00Z,2020-06-06T19:42:11Z,2020-06-06T19:42:11Z,OWNER,,"Right now Datasette authentication is provided exclusively by plugins: * https://github.com/simonw/datasette-auth-github * https://github.com/simonw/datasette-auth-existing-cookies This is an all-or-nothing approach: either your Datasette instance requires authentication at the top level or it does not. But... as I build new plugins like https://github.com/simonw/datasette-configure-fts and https://github.com/simonw/datasette-edit-tables I increasingly have individual features which should be reserved for logged-in users while still wanting other parts of Datasette to be open to all. This is too much for plugins to own independently of Datasette core. Datasette needs to ship a single ""user is authenticated"" concept (independent of how users actually sign in) so that different plugins can integrate with it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/699/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 582713554,MDU6SXNzdWU1ODI3MTM1NTQ=,700,Request object utility for handling POST form data,9599,simonw,closed,0,,,,,1,2020-03-17T02:44:59Z,2020-03-17T02:47:50Z,2020-03-17T02:47:50Z,OWNER,,"> This is also going to need me to handle POST form submissions which means I need to be able to parse the form body. I guess that will go in [datasette/utils/asgi.py](https://github.com/simonw/datasette/blob/master/datasette/utils/asgi.py). 
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/698#issuecomment-599704264_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/700/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 583970196,MDU6SXNzdWU1ODM5NzAxOTY=,701,Search box CSS doesn't look great on OS X Safari,9599,simonw,closed,0,,,5234079,Datasette 0.39,3,2020-03-18T20:00:52Z,2020-03-24T22:57:18Z,2020-03-24T22:57:18Z,OWNER,," ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/701/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585266763,MDU6SXNzdWU1ODUyNjY3NjM=,34,IndexError running user-timeline command,9599,simonw,closed,0,,,,,2,2020-03-20T18:54:08Z,2020-03-20T19:20:52Z,2020-03-20T19:20:37Z,MEMBER,,"``` $ twitter-to-sqlite user-timeline data.db --screen_name Allen_Joines Traceback (most recent call last): File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite"", line 11, in load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 256, in user_timeline utils.save_tweets(db, chunk) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 289, in save_tweets db[""users""].upsert(user, pk=""id"", alter=True) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1128, in upsert conversions=conversions, File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1157, in upsert_all upsert=True, File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1096, in insert_all row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0] IndexError: list index out of range ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585282212,MDU6SXNzdWU1ODUyODIyMTI=,35,twitter-to-sqlite user-timeline [screen_names] --sql / --attach,9599,simonw,closed,0,,,,,5,2020-03-20T19:26:07Z,2020-03-20T20:17:00Z,2020-03-20T20:16:35Z,MEMBER,,Split from 
#8.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585306847,MDU6SXNzdWU1ODUzMDY4NDc=,36,twitter-to-sqlite followers/friends --sql / --attach,9599,simonw,closed,0,,,,,0,2020-03-20T20:20:33Z,2020-03-20T23:12:38Z,2020-03-20T23:12:38Z,MEMBER,,"Split from #8. The `friends` and `followers` commands don't yet support `--sql` and `--attach`. (`friends-ids` and `followers-ids` do though).",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585353598,MDU6SXNzdWU1ODUzNTM1OTg=,37,"Handle ""User not found"" error",9599,simonw,closed,0,,,,,3,2020-03-20T22:14:32Z,2020-04-17T23:43:46Z,2020-04-17T23:43:46Z,MEMBER,,"While running `user-timeline` I got this bug (because a screen name I asked for didn't exist): ``` File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 185, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) KeyError: 'created_at' >>> import pdb >>> pdb.pm() > /Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(185)transform_user() -> user[""created_at""] = parser.parse(user[""created_at""]) (Pdb) user {'errors': [{'code': 50, 'message': 'User not found.'}]} ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585359363,MDU6SXNzdWU1ODUzNTkzNjM=,38,Screen name display for user-timeline is uneven,9599,simonw,closed,0,,,,,1,2020-03-20T22:30:23Z,2020-03-20T22:37:17Z,2020-03-20T22:37:17Z,MEMBER,,"``` CDPHE [####################################] 67 CHFSKy [####################################] 3216 DHSWI [####################################] 41 DPHHSMT [####################################] 742 Delaware_DHSS [####################################] 3231 DhhsNevada [####################################] 639 ``` I could format them to match the length of the longest screen name instead.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585390482,MDU6SXNzdWU1ODUzOTA0ODI=,702,Option in metadata.json to set default sort order for a table,9599,simonw,closed,0,,,5234079,Datasette 0.39,5,2020-03-21T00:19:56Z,2020-03-25T04:19:36Z,2020-03-22T02:40:35Z,OWNER,,If you access the table page without any `?_sort` or `?_sort_desc` arguments it currently defaults to order by primary key - would be neat to be able to change that.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/702/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585411547,MDU6SXNzdWU1ODU0MTE1NDc=,18,Commits in GitHub API can have null 
author,9599,simonw,closed,0,,,5225818,1.0,8,2020-03-21T02:20:56Z,2020-03-23T20:44:49Z,2020-03-23T20:44:26Z,MEMBER,,"``` Traceback (most recent call last): File ""/home/ubuntu/datasette-venv/bin/github-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/cli.py"", line 235, in commits utils.save_commits(db, commits, repo_full[""id""]) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 290, in save_commits commit_to_insert[""author""] = save_user(db, commit[""author""]) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 54, in save_user for key, value in user.items() AttributeError: 'NoneType' object has no attribute 'items' ``` Got this running the `commits` command from cron.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585526292,MDU6SXNzdWU1ODU1MjYyOTI=,1,Set up full text search,9599,simonw,closed,0,,,,,1,2020-03-21T15:57:35Z,2020-03-21T19:47:46Z,2020-03-21T19:45:52Z,MEMBER,,"Should run against `title` and `text` in `items`, and `about` and `id` in `users`.",248903544,hacker-news-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585597133,MDExOlB1bGxSZXF1ZXN0MzkxOTI0NTA5,703,WIP implementation of writable canned queries,9599,simonw,closed,0,,,,,3,2020-03-21T22:23:51Z,2020-06-03T00:08:14Z,2020-06-02T23:57:35Z,OWNER,simonw/datasette/pulls/703,Refs #698.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/703/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 585597329,MDU6SXNzdWU1ODU1OTczMjk=,704,Add datasette-publish-fly to Datasette Publish documentation,9599,simonw,closed,0,,,5234079,Datasette 0.39,1,2020-03-21T22:25:10Z,2020-03-24T22:39:09Z,2020-03-24T22:39:09Z,OWNER,,It's a cool example of a plugin that provides a new publish provider - worth mentioning on https://datasette.readthedocs.io/en/stable/publish.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/704/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585626199,MDU6SXNzdWU1ODU2MjYxOTk=,705,latest.datasette.io is no longer updating,9599,simonw,closed,0,,,5234079,Datasette 
0.39,15,2020-03-22T01:59:30Z,2020-03-25T02:30:24Z,2020-03-25T02:30:24Z,OWNER,,https://latest.datasette.io/-/versions is stuck on 0.35.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/705/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585633142,MDU6SXNzdWU1ODU2MzMxNDI=,706,"Documentation for the ""request"" object",9599,simonw,closed,0,,,3268330,Datasette 1.0,6,2020-03-22T02:55:50Z,2020-05-30T13:20:00Z,2020-05-27T22:31:22Z,OWNER,,"Since that object is passed to the `extra_template_vars` hooks AND the classes registered by `register_facet_classes` it should be part of the documented interface on https://datasette.readthedocs.io/en/stable/internals.html I could also start passing it to the `register_output_renderer` callback.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/706/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585850715,MDU6SXNzdWU1ODU4NTA3MTU=,19,"Enable full-text search for more stuff (like commits, issues and issue_comments)",9599,simonw,closed,0,,,5225818,1.0,2,2020-03-23T00:19:56Z,2020-03-23T19:06:39Z,2020-03-23T19:06:39Z,MEMBER,,Currently FTS is only enabled for repos and releases.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586454513,MDU6SXNzdWU1ODY0NTQ1MTM=,20,Upgrade to sqlite-utils 2.x,9599,simonw,closed,0,,,5225818,1.0,0,2020-03-23T19:17:58Z,2020-03-23T19:22:52Z,2020-03-23T19:22:52Z,MEMBER,,,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586477757,MDU6SXNzdWU1ODY0Nzc3NTc=,94,"If column data is a mixture of integers and nulls, detected type should be INTEGER",9599,simonw,closed,0,,,,,0,2020-03-23T19:51:46Z,2020-03-23T19:57:10Z,2020-03-23T19:57:10Z,OWNER,,It looks like detected type for that case is TEXT at the moment.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/94/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586486367,MDU6SXNzdWU1ODY0ODYzNjc=,95,Columns with only null values are no longer created in the database,9599,simonw,closed,0,,,,,0,2020-03-23T20:07:42Z,2020-03-23T20:31:15Z,2020-03-23T20:31:15Z,OWNER,,"Bug introduced in #94, and released in `2.4.3`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/95/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586561727,MDU6SXNzdWU1ODY1NjE3Mjc=,21,Turn GitHub API errors into exceptions,9599,simonw,closed,0,,,5225818,1.0,2,2020-03-23T22:37:24Z,2020-03-23T23:48:23Z,2020-03-23T23:48:22Z,MEMBER,,"This would have really helped in debugging the mess in #13. 
Running with this `auth.json` is a useful demo: ```json {""github_personal_token"": """"} ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586567379,MDU6SXNzdWU1ODY1NjczNzk=,22,Handle empty git repositories,9599,simonw,closed,0,,,,,0,2020-03-23T22:49:48Z,2020-03-23T23:13:11Z,2020-03-23T23:13:11Z,MEMBER,,"Got this error: ``` github_to_sqlite.utils.GitHubError: {'message': 'Git Repository is empty.', 'documentation_url': 'https://developer.github.com/v3/repos/commits/#list-commits-on-a-repository'} ``` From https://api.github.com/repos/dogsheep/beta/commits",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586595839,MDU6SXNzdWU1ODY1OTU4Mzk=,23,Release 1.0,9599,simonw,closed,0,,,5225818,1.0,1,2020-03-24T00:03:55Z,2020-03-24T00:15:50Z,2020-03-24T00:15:50Z,MEMBER,,Need to compile release notes.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 587222354,MDU6SXNzdWU1ODcyMjIzNTQ=,707,"Consider configuring Jinja in Datasette() constructor, not .app()",9599,simonw,closed,0,,,,,0,2020-03-24T19:19:58Z,2020-03-27T01:12:57Z,2020-03-27T01:12:57Z,OWNER,,"Right now the following fails with an error: ```python ds = Datasette([], template_dir=""."") rendered = await ds.render_template(""index.html"") ``` The error is: ``` async def render_template( self, templates, context=None, request=None, view_name=None ): context = context or {} if isinstance(templates, Template): template = templates select_templates = [] else: if isinstance(templates, str): templates = [templates] > template = self.jinja_env.select_template(templates) E AttributeError: 'Datasette' object has no attribute 'jinja_env' ``` This is because `jinja_env` is configured in the `.app()` method, here: https://github.com/simonw/datasette/blob/a498d0fe6590f9bdbc4faf9e0dd5faeb3b06002c/datasette/app.py#L609-L633 This is a little surprising, especially now that `.render_template()` is part of the documented internals API: https://datasette.readthedocs.io/en/stable/internals.html#render-template-template-context-none-request-none Maybe this should happen in the Datasette class constructor instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/707/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 587302139,MDExOlB1bGxSZXF1ZXN0MzkzMjc0NDMz,708,"base_url configuration setting, refs #394",9599,simonw,closed,0,,,5234079,Datasette 0.39,2,2020-03-24T21:52:00Z,2020-03-25T00:18:44Z,2020-03-25T00:18:44Z,OWNER,simonw/datasette/pulls/708,Pull request implementing #394,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/708/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 587314002,MDU6SXNzdWU1ODczMTQwMDI=,709,Each plugin hook 
should link to example plugins built with it,9599,simonw,closed,0,,,5234079,Datasette 0.39,1,2020-03-24T22:18:48Z,2020-03-24T22:30:10Z,2020-03-24T22:29:43Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/709/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 587322443,MDU6SXNzdWU1ODczMjI0NDM=,710,Remove Zeit Now v1 support,9599,simonw,closed,0,,,,,2,2020-03-24T22:39:49Z,2020-04-04T23:05:12Z,2020-04-04T23:05:12Z,OWNER,,It will remain supported as a plugin but since no-one can sign up for Docker hosting any more (for over a year now) there's no point including it in Datasette core.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/710/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 587398703,MDU6SXNzdWU1ODczOTg3MDM=,711,Release notes for Datasette 0.39,9599,simonw,closed,0,,,5234079,Datasette 0.39,2,2020-03-25T02:31:13Z,2020-03-25T04:06:55Z,2020-03-25T04:06:55Z,OWNER,,Then I can ship it.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/711/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 588108428,MDU6SXNzdWU1ODgxMDg0Mjg=,712,base_url doesn't entirely work for running Datasette inside Binder,9599,simonw,closed,0,,,,,12,2020-03-26T02:25:55Z,2020-03-26T15:11:49Z,2020-03-26T14:35:43Z,OWNER,,"> Thanks! I'm trying to launch Datasette from *within* a notebook using the jupyter-server-proxy and the new `base_url` parameter. While the assets load ok, and the breadcrumb navigation works, the facet links don't seem to use the `base_url`. Or have I missed something? 
_Originally posted by @wragge in https://github.com/simonw/datasette/issues/394#issuecomment-604166918_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/712/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 589402939,MDU6SXNzdWU1ODk0MDI5Mzk=,4,"Store authentication information as ""pocket_access_token"" etc",9599,simonw,closed,0,,,,,0,2020-03-27T20:43:22Z,2020-03-27T20:43:59Z,2020-03-27T20:43:59Z,MEMBER,,The `pocket_` prefix will mean that the same `auth.json` file can be used for other Dogsheep tools without Pocket over-riding a value set by some other tool.,213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 589491711,MDU6SXNzdWU1ODk0OTE3MTE=,7,Upgrade to sqlite-utils 2.x,9599,simonw,closed,0,,,,,0,2020-03-28T02:24:51Z,2020-03-28T02:25:03Z,2020-03-28T02:25:03Z,MEMBER,,,205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 589801352,MDExOlB1bGxSZXF1ZXN0Mzk1MjU4Njg3,96,Add type conversion for Panda's Timestamp,32605365,b0b5h4rp13,closed,0,,,,,2,2020-03-29T14:13:09Z,2020-03-31T04:40:49Z,2020-03-31T04:40:48Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/96,"Add type conversion for Panda's Timestamp, if Panda library is present in system (thanks for this project, I was about to do the same thing from scratch)",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/96/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 590666760,MDU6SXNzdWU1OTA2NjY3NjA=,39,--since feature can be confused by retweets,9599,simonw,closed,0,,,,,11,2020-03-30T23:25:33Z,2020-04-01T03:45:16Z,2020-04-01T03:45:16Z,MEMBER,,"If you run `twitter-to-sqlite user-timeline ... --since` it's supposed to fetch Tweets those specific users tweeted since last time the command was run. It does this by seeking out the max ID of their previous tweets: https://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/cli.py#L305-L311 BUT... 
this has a nasty flaw: if another account had retweeted one of their recent tweets the retweeted-tweet will have been loaded into the database - so we may treat that as the most recent since ID and miss a bunch of their tweets!",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 590669793,MDU6SXNzdWU1OTA2Njk3OTM=,40,Feature: record history of follower counts,9599,simonw,closed,0,,,,,5,2020-03-30T23:32:28Z,2020-04-01T04:13:05Z,2020-04-01T04:13:05Z,MEMBER,,"We currently over-write the follower count every time we import a tweet (when we import that user profile again): https://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/utils.py#L293-L294 It would be neat if we noticed if that user's follower count (and maybe other counts?) had changed since we last saved them and recorded that change in a separate history table. This would be an inexpensive way of building up rough charts of follower count over time.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 591613579,MDU6SXNzdWU1OTE2MTM1Nzk=,41,"Bug: recorded a since_id for None, None",9599,simonw,closed,0,,,,,0,2020-04-01T04:29:43Z,2020-04-01T04:31:11Z,2020-04-01T04:31:11Z,MEMBER,,"This shouldn't happen in the `since_ids` table (relates to #39): ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/41/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 592829135,MDU6SXNzdWU1OTI4MjkxMzU=,713,Support YAML in metadata - metadata.yaml,9599,simonw,closed,0,,,,,6,2020-04-02T18:10:05Z,2020-04-02T19:36:17Z,2020-04-02T19:30:55Z,OWNER,,"I was originally going to do this with a plugin - see #357 - but the more I work with `metadata.json` the more I want it to just accept YAML as an optional alternative to JSON. The best example why is still this one: https://github.com/simonw/russian-ira-facebook-ads-datasette/blob/master/russian-ads-metadata.yaml YAML is just SO much better than JSON for multi-line strings - in particular HTML and SQL, both of which are common in `metadata.json` files.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/713/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 592844348,MDExOlB1bGxSZXF1ZXN0Mzk3NzQ5NjUz,714,--metadata accepts YAML as well as JSON,9599,simonw,closed,0,,,,,1,2020-04-02T18:36:02Z,2020-04-02T19:30:54Z,2020-04-02T19:30:54Z,OWNER,simonw/datasette/pulls/714,Refs #713. 
Still needs tests and documentation.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/714/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 593006814,MDU6SXNzdWU1OTMwMDY4MTQ=,715,Refactor duplicate cell display logic,9599,simonw,open,0,,,,,0,2020-04-03T00:58:11Z,2020-04-03T00:58:11Z,,OWNER,,"The logic for rendering cells in table view and in database (or canned query) view is currently very similar: https://github.com/simonw/datasette/blob/7656fd64d8b6a32ebc34d89c1b8711cc5ea240f7/datasette/views/base.py#L514-L539 Compared with: https://github.com/simonw/datasette/blob/7656fd64d8b6a32ebc34d89c1b8711cc5ea240f7/datasette/views/table.py#L104-L195 I'll be changing this a bit in #698 but I should still try to clean this up more further in the future.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/715/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 593751293,MDU6SXNzdWU1OTM3NTEyOTM=,97,"Adding a ""recreate"" flag to the `Database` constructor",1448859,betatim,closed,0,,,,,4,2020-04-04T05:41:10Z,2020-04-15T14:29:31Z,2020-04-13T03:52:29Z,NONE,,"I have a [script](https://github.com/betatim/binder-datasette/blob/master/create-db.ipynb) that imports data into a sqlite DB. When I re-run that script I'd like to remove the existing sqlite DB, instead of adding to it. The pragmatic answer is to add the check and file deletion to my script. However I thought it would be easy and useful for others to add a `recreate=True` flag to `db = sqlite_utils.Database(""binder-launches.db"")`. After taking a look at the code for it I am not so sure any more. This is because the connection string could be a URL (or ""connection string"") like `""file:///tmp/foo.db""`. I don't know what the equivalent of `os.path.exists()` is for a connection string or how to detect that something is a connection string and raise an error ""can't use recreate=True and conn_string at the same time"". 
Does anyone have an idea/suggestion where to start investigating?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/97/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 594168758,MDU6SXNzdWU1OTQxNjg3NTg=,716,extra_template_vars() sending wrong view_name for index,9599,simonw,closed,0,,,,,8,2020-04-04T23:57:09Z,2020-04-05T20:04:08Z,2020-04-05T18:28:48Z,OWNER,,"See https://github.com/simonw/museums/issues/20#issuecomment-609103663 - at some point between 286ed286b68793532c2a38436a08343b45cfbc91 and current master (e0e7a0facfc935a835cd73c720bc46661462f0b1 today) a bug was introduced where the `extra_template_vars(request, view_name)` plugin hook started being passed `None` instead of `index` for the `view_name` parameter on the site index page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/716/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 594189527,MDU6SXNzdWU1OTQxODk1Mjc=,717,See if I can get Datasette working on Zeit Now v2,9599,simonw,closed,0,,,,,10,2020-04-05T00:56:48Z,2020-04-06T22:47:22Z,2020-04-06T22:47:21Z,OWNER,,"I thought this was impossible because AWS Lambda doesn't ship the `sqlite3` standard library module... but apparenttly that's not the case on Now v2 any more! https://now-2-python-versions-ks69olzpi.now.sh/api ``` _________________________________________________________________________________________________________________________________________________________________ / Hello from Python from a ZEIT Now Serverless Function! Version is 3.6.10 (default, Mar 10 2020, 22:54:43) \ \ [GCC 4.8.3 20140911 (Red Hat 4.8.3-9)], sqlite3 module = , sqlite3 version = [('3.7.17',)] / ----------------------------------------------------------------------------------------------------------------------------------------------------------------- \ ^__^ \ (oo)\_______ (__)\ )\/\ ||----w | || || ``` That's from shipping this code as `api/index.py`: ```python from http.server import BaseHTTPRequestHandler from cowpy import cow import sys try: import sqlite3 except ImportError: sqlite3 = None class handler(BaseHTTPRequestHandler): def do_GET(self): self.send_response(200) self.send_header(""Content-type"", ""text/plain"") self.end_headers() message = cow.Cowacter().milk( ""Hello from Python from a ZEIT Now Serverless Function! 
Version is {}, sqlite3 module = {}, sqlite3 version = {}"".format( sys.version, sqlite3, sqlite3.connect("":memory:"").execute(""select sqlite_version()"").fetchall() ) ) self.wfile.write(message.encode()) return ``` Now v2 supports ASGI so this might be possible without too much work: https://zeit.co/docs/runtimes#advanced-usage/advanced-python-usage/asynchronous-server-gateway-interface",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/717/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 594237015,MDU6SXNzdWU1OTQyMzcwMTU=,718,Plugin idea: datasette-redirects,9599,simonw,open,0,,,,,0,2020-04-05T03:41:38Z,2023-08-30T22:17:31Z,,OWNER,,"I just had to write a one-off custom plugin to redirect niche-musems.com to www.niche-museums.com (https://github.com/simonw/museums/issues/21) - it would be great if this kind of thing could be handled by a configurable plugin. https://github.com/simonw/museums/blob/6b1faf00c463b2228860d4d62d104b11935e01b1/plugins/redirect_www.py",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/718/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,reopened 594553553,MDExOlB1bGxSZXF1ZXN0Mzk5MTY2NDMz,719,asgi: check raw_path is not None,193185,cldellow,closed,0,,,,,1,2020-04-05T16:53:58Z,2020-05-04T17:14:26Z,2020-05-04T17:14:26Z,CONTRIBUTOR,simonw/datasette/pulls/719,"The ASGI spec (https://asgi.readthedocs.io/en/latest/specs/www.html#http) seems to imply that `None` is a valid value, so we need to check the value itself, not just whether the key is present. In particular, the [mangum](https://github.com/erm/mangum) adapter passes `None` for this key's value. This change permits mangum to be used to front datasette in Amazon API Gateway + AWS Lambda deployments.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/719/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 596245802,MDExOlB1bGxSZXF1ZXN0NDAwNTc4OTc5,720,"Update beautifulsoup4 requirement from ~=4.8.1 to >=4.8.1,<4.10.0",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-08T01:24:38Z,2020-05-04T17:14:51Z,2020-05-04T17:14:46Z,CONTRIBUTOR,simonw/datasette/pulls/720,"Updates the requirements on [beautifulsoup4](http://www.crummy.com/software/BeautifulSoup/bs4/) to permit the latest version. Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- **Note:** This repo was added to Dependabot recently, so you'll receive a maximum of 5 PRs for your first few update runs. Once an update run creates fewer than 5 PRs we'll remove that limit. You can always request more updates by clicking `Bump now` in your [Dependabot dashboard](https://app.dependabot.com). 
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/720/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 596245923,MDExOlB1bGxSZXF1ZXN0NDAwNTc5MDc3,721,"Update pytest requirement from ~=5.2.2 to >=5.2.2,<5.5.0",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-08T01:25:04Z,2020-05-04T17:13:49Z,2020-05-04T17:13:41Z,CONTRIBUTOR,simonw/datasette/pulls/721,"Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)Release notes
Sourced from pytest's releases.

5.4.1: pytest 5.4.1 (2020-03-13)

Bug Fixes:
- #6909: Revert the change introduced by #6330, which required all arguments to `@pytest.mark.parametrize` to be explicitly defined in the function signature. The intention of the original change was to remove what was expected to be an unintended/surprising behavior, but it turns out many people relied on it, so the restriction has been reverted.
- #6910: Fix crash when plugins return an unknown stats while using the `--reportlog` option.

Changelog: sourced from pytest's changelog.

Commits:
- 3d0f3ba Preparing release version 5.4.1
- b9e2cd0 Merge pull request #6914 from nicoddemus/revert-6330
- a84fcbf Revert "[parametrize] enforce explicit argnames declaration (#6330)"
- 59c1bfa Merge pull request #6913 from nicoddemus/backport-6910
- 3267f64 Merge pull request #6910 from nicoddemus/resultlog-logreport
- c9fd1bd Preparing release version 5.4.0
- 93aa988 Merge pull request #6901 from RonnyPfannschmidt/regendoc-fix-simple
- 7996724 Merge pull request #6902 from RoyalTS/filterwarnings-docfix
- 90ee8a7 docfix
- 378a75d run and fix tox -e regen to prepare 5.4
- Additional commits viewable in compare view
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- **Note:** This repo was added to Dependabot recently, so you'll receive a maximum of 5 PRs for your first few update runs. Once an update run creates fewer than 5 PRs we'll remove that limit. You can always request more updates by clicking `Bump now` in your [Dependabot dashboard](https://app.dependabot.com).",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/721/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 596246006,MDExOlB1bGxSZXF1ZXN0NDAwNTc5MTM2,722,"Update jinja2 requirement from ~=2.10.3 to >=2.10.3,<2.12.0",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-08T01:25:24Z,2020-05-04T17:13:26Z,2020-05-04T17:13:16Z,CONTRIBUTOR,simonw/datasette/pulls/722,"Updates the requirements on [jinja2](https://github.com/pallets/jinja) to permit the latest version.Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)Release notes
Sourced from jinja2's releases.

2.11.1: This fixes an issue in async environment when indexing the result of an attribute lookup, like `{{ data.items[1:] }}`.

Changelog, sourced from jinja2's changelog.

Version 2.11.1, released 2020-01-30:
- Fix a bug that prevented looking up a key after an attribute (`{{ data.items[1:] }}`) in an async template. 1141

Version 2.11.0, released 2020-01-27:
- Drop support for Python 2.6, 3.3, and 3.4. This will be the last version to support Python 2.7 and 3.5.
- Added a new `ChainableUndefined` class to support getitem and getattr on an undefined object. 977
- Allow `{%+` syntax (with NOP behavior) when `lstrip_blocks` is disabled. 748
- Added a `default` parameter for the `map` filter. 557
- Exclude environment globals from meta.find_undeclared_variables. 931
- Float literals can be written with scientific notation, like 2.56e-3. 912, 922
- Int and float literals can be written with the '_' separator for legibility, like 12_345. 923
- Fix a bug causing deadlocks in `LRUCache.setdefault`. 1000
- The `trim` filter takes an optional string of characters to trim. 828
- A new `jinja2.ext.debug` extension adds a `{% debug %}` tag to quickly dump the current context and available filters and tests. 174, 798, 983
- Lexing templates with large amounts of whitespace is much faster. 857, 858
- Parentheses around comparisons are preserved, so `{{ 2 * (3 < 5) }}` outputs "2" instead of "False". 755, 938
- Add new `boolean`, `false`, `true`, `integer` and `float` tests. 824
- The environment's `finalize` function is only applied to the output of expressions (constant or not), not static template data. 63
- When providing multiple paths to `FileSystemLoader`, a template can have the same name as a directory. 821
- Always return Undefined when omitting the `else` clause in a `{{ 'foo' if bar }}` expression, regardless of the environment's `undefined` class. Omitting the `else` clause is a valid shortcut and should not raise an error when using StrictUndefined. 710, 1079
- Fix behavior of `loop` control variables such as `length` and `revindex0` when looping over a generator. 459, 751, 794, 993
- Async support is only loaded the first time an environment enables it, in order to avoid a slow initial import. 765
- In async environments, the `|map` filter will await the filter call if needed. 913
- In for loops that access `loop` attributes, the iterator is not advanced ahead of the current iteration unless `length`, `revindex`, `nextitem`, or `last` are accessed. This makes it less likely to break `groupby` results. 555, 1101
- In async environments, the `loop` attributes `length` and `revindex` work for async iterators. 1101
- In async environments, values from attribute/property access will be awaited if needed. 1101
- ~loader.PackageLoader doesn't depend on setuptools or pkg_resources. 970
- `PackageLoader` has limited support for 420 namespace packages. 1097
- Support os.PathLike objects in ~loader.FileSystemLoader and ~loader.ModuleLoader. 870
- ~nativetypes.NativeTemplate correctly handles quotes between expressions. `"'{{ a }}', '{{ b }}'"` renders as the tuple `('1', '2')` rather than the string `'1, 2'`. 1020
- Creating a ~nativetypes.NativeTemplate directly creates a ~nativetypes.NativeEnvironment instead of a default Environment. 1091
- After calling `LRUCache.copy()`, the copy's queue methods point to the correct queue. 843
- Compiling templates always writes UTF-8 instead of defaulting to the system encoding. 889
- `|wordwrap` filter treats existing newlines as separate paragraphs to be wrapped individually, rather than creating short intermediate lines. 175
- Add `break_on_hyphens` parameter to `|wordwrap` filter. 550
- Cython compiled functions decorated as context functions will be passed the context. 1108
- When chained comparisons of constants are evaluated at compile time, the result follows Python's behavior of returning `False` if any comparison returns `False`, rather than only the last one. 1102
- Tracebacks for exceptions in templates show the correct line numbers and source for Python >= 3.7. 1104
- Tracebacks for template syntax errors in Python 3 no longer show internal compiler frames. 763
- Add a `DerivedContextReference` node that can be used by extensions to get the current context and local variables such as `loop`. 860
- Constant folding during compilation is applied to some node types that were previously overlooked. 733
- `TemplateSyntaxError.source` is not empty when raised from an included template. 457

Commits:
- b85283e release version 2.11.1
- 3d5bfc6 Merge pull request #1143 from pallets/bugfix/attribute-access
- d61c1ea add changelog
- 15d7e61 Added regression test for slicing of attributes
- 05dee9b Fix attribute access in async code. Fixes #1141
- bbdafe3 release version 2.11.0
- 9ff27f6 add python 3.8 classifier, clean up changelog
- d312609 isolate bytecode cache tests
- 9849979 import Markup from markupsafe, fix flake8 import warnings
- c6d864c increment bytecode cache version
- Additional commits viewable in compare view
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- **Note:** This repo was added to Dependabot recently, so you'll receive a maximum of 5 PRs for your first few update runs. Once an update run creates fewer than 5 PRs we'll remove that limit. You can always request more updates by clicking `Bump now` in your [Dependabot dashboard](https://app.dependabot.com).",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/722/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 597671518,MDU6SXNzdWU1OTc2NzE1MTg=,98,"Only set .last_rowid and .last_pk for single update/inserts, not for .insert_all()/.upsert_all() with multiple records",9599,simonw,closed,0,,,,,7,2020-04-10T03:19:40Z,2021-09-28T04:38:44Z,2020-04-13T03:29:15Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/98/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 598013965,MDU6SXNzdWU1OTgwMTM5NjU=,724,--plugin-secret over-rides existing metadata.json plugin config,9599,simonw,closed,0,,,,,3,2020-04-10T17:56:30Z,2020-04-16T04:58:12Z,2020-04-10T18:34:21Z,OWNER,,"This means if you use `--plugin-secret` at all (with e.g. `publish cloudrun`) any existing plugin configuration in your `metadata.json` will be ignored. https://github.com/simonw/datasette/blob/af9cd4ca64652fae262e6f7b5d201f6e0adc989b/datasette/publish/cloudrun.py#L98-L109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/724/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 598640234,MDU6SXNzdWU1OTg2NDAyMzQ=,99,.upsert_all() should maybe error if dictionaries passed to it do not have the same keys,9599,simonw,closed,0,,,,,2,2020-04-13T03:02:25Z,2020-04-13T03:05:20Z,2020-04-13T03:05:04Z,OWNER,,"While investigating #98 I stumbled across this: ``` def test_upsert_compound_primary_key(fresh_db): table = fresh_db[""table""] table.upsert_all( [ {""species"": ""dog"", ""id"": 1, ""name"": ""Cleo"", ""age"": 4}, {""species"": ""cat"", ""id"": 1, ""name"": ""Catbag""}, ], pk=(""species"", ""id""), ) table.upsert_all( [ {""species"": ""dog"", ""id"": 1, ""age"": 5}, {""species"": ""dog"", ""id"": 2, ""name"": ""New Dog"", ""age"": 1}, ], pk=(""species"", ""id""), ) > assert [ {""species"": ""dog"", ""id"": 1, ""name"": ""Cleo"", ""age"": 5}, {""species"": ""cat"", ""id"": 1, ""name"": ""Catbag"", ""age"": None}, {""species"": ""dog"", ""id"": 2, ""name"": ""New Dog"", ""age"": 1}, ] == list(table.rows) E AssertionError: assert [{'age': 5, '...cies': 'dog'}] == [{'age': 5, '...cies': 'dog'}] E At index 0 diff: {'species': 'dog', 'id': 1, 'name': 'Cleo', 'age': 5} != {'species': 'dog', 'id': 1, 'name': None, 'age': 5} E Full diff: E - [{'age': 5, 'id': 1, 'name': 'Cleo', 'species': 'dog'}, E ? ^^^ -- E + [{'age': 5, 'id': 1, 'name': None, 'species': 'dog'}, E ? 
^^^ E {'age': None, 'id': 1, 'name': 'Catbag', 'species': 'cat'}, E {'age': 1, 'id': 2, 'name': 'New Dog', 'species': 'dog'}] ``` If you run `.upsert_all()` with multiple dictionaries it doesn't quite have the effect you might expect.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/99/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 598891570,MDExOlB1bGxSZXF1ZXN0NDAyNjQ1OTg0,725,"Update aiofiles requirement from ~=0.4.0 to >=0.4,<0.6",27856297,dependabot-preview[bot],closed,0,,,,,3,2020-04-13T13:32:47Z,2020-05-04T18:16:54Z,2020-05-04T16:17:49Z,CONTRIBUTOR,simonw/datasette/pulls/725,"Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)Commits
- 9a2141e 0.5.0
- 479b7ee Update README
- 6c247a2 Modernize tests
- eec75d3 Switch to async def wherever possible
- 786c3e9 Prepare for 3.8
- 1451075 Update README.rst
- 5db1e38 Add several async os functions
- a60f19b Add async remove function
- 9cf2ac8 Merge pull request #53 from graingert/patch-1
- b88912c all should be a List[str]
- Additional commits viewable in compare view
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/725/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 599776345,MDU6SXNzdWU1OTk3NzYzNDU=,24,Feature idea: github-to-sqlite everything ...,9599,simonw,open,0,,,,,0,2020-04-14T18:34:00Z,2020-04-14T18:34:00Z,,MEMBER,,"At the moment if you want to pull all your repos, issues, issues comments etc you have to do it with a sequence of separate commands. Consider adding a `everything` or `all` command which fetches everything that the tool knows how to fetch, and is designed to be run on a cron in a way that fetches just new stuff each time.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/24/reactions"", ""total_count"": 7, ""+1"": 7, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 600120439,MDU6SXNzdWU2MDAxMjA0Mzk=,726,Foreign key : case of a link to the associated row not displayed,6371750,JBPressac,closed,0,,,,,1,2020-04-15T08:31:27Z,2020-04-27T22:05:47Z,2020-04-27T22:05:46Z,CONTRIBUTOR,,"Hello, I use Datasette to publish tsv files linked together by foreign keys declared thanks to sqlite-utils. In one table, [prelib_personne](http://crbc-dataset.huma-num.fr/prelib/prelib_personne), the foreign keys are properly noticed by a link to the associated row (for instance ville_naissance_id is properly linked to prelib_ville). But every link to the foreign key prelib_oeuvre.id fails. For instance, [prelib_ecritoeuvre](http://crbc-dataset.huma-num.fr/prelib/prelib_ecritoeuvre) has links to prelib_personne but none to prelib_oeuvre. In despite of the schema: CREATE TABLE ""prelib_ecritoeuvre"" ( ""id"" INTEGER, ""fonction_id"" INTEGER, ""oeuvre_id"" INTEGER, ""personne_id"" INTEGER ,PRIMARY KEY ([id]), FOREIGN KEY(fonction_id) REFERENCES prelib_fonctionecritoeuvre(id), FOREIGN KEY(personne_id) REFERENCES prelib_personne(id), FOREIGN KEY(oeuvre_id) REFERENCES prelib_oeuvre(id) ); Would you have any clue to investigate the reason of this problem? Thanks,",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/726/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 600583271,MDU6SXNzdWU2MDA1ODMyNzE=,727,Custom CSS class on body for styling canned queries,9599,simonw,closed,0,,,,,5,2020-04-15T20:57:32Z,2020-04-15T21:14:58Z,2020-04-15T21:07:50Z,OWNER,,"https://latest.datasette.io/fixtures/neighborhood_search is a canned query page. One of the templates scanned is `query-fixtures-neighborhood_search.html` BUT... 
the body CSS class just looks like this: ```html ``` I would be useful if that included a class that can be used to style that specific canned query page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/727/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601265023,MDU6SXNzdWU2MDEyNjUwMjM=,25,Improvements to demo instance,9599,simonw,closed,0,,,,,1,2020-04-16T17:26:55Z,2020-04-16T18:07:12Z,2020-04-16T18:07:12Z,MEMBER,,- [x] Demo should pull issue-comments as well,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601271612,MDU6SXNzdWU2MDEyNzE2MTI=,26,Topics are missing from repositories,9599,simonw,closed,0,,,,,2,2020-04-16T17:36:32Z,2020-04-16T17:41:11Z,2020-04-16T17:41:11Z,MEMBER,,"I'm sure this used to work, but right now repositories are fetched without their topics. https://developer.github.com/v3/repos/ says you need to send a custom `Accept` header of `application/vnd.github.mercy-preview+json` to get topics.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601330277,MDU6SXNzdWU2MDEzMzAyNzc=,27,Repos have a big blob of JSON in the organization column,9599,simonw,closed,0,,,,,5,2020-04-16T18:43:14Z,2020-04-18T00:19:16Z,2020-04-18T00:18:52Z,MEMBER,,"e.g. https://github-to-sqlite.dogsheep.net/github/repos ![github__repos__11_rows_where_sorted_by_updated_at_descending](https://user-images.githubusercontent.com/9599/79494124-5640b980-7fd7-11ea-99a2-17ffbd82f9ce.png) This appears to be obsolete because the `owner` column already links to that record, albeit in the `users` table with `type` set to `Organization`: https://github-to-sqlite.dogsheep.net/github/users/53015001",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601333634,MDU6SXNzdWU2MDEzMzM2MzQ=,28,Pull repository contributors,9599,simonw,closed,0,,,,,3,2020-04-16T18:46:40Z,2020-04-18T15:05:10Z,2020-04-18T15:05:10Z,MEMBER,,"https://developer.github.com/v3/repos/#list-contributors `GET /repos/:owner/:repo/contributors` Not sure if this should be a separate command or should be part of the existing `repos` command. 
I'm leaning towards a new `contributors` command.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601358649,MDU6SXNzdWU2MDEzNTg2NDk=,100,"Mechanism for forcing column-type, over-riding auto-detection",9599,simonw,closed,0,,,,,3,2020-04-16T19:12:52Z,2020-04-17T23:53:32Z,2020-04-17T23:53:32Z,OWNER,,"As seen in https://github.com/dogsheep/github-to-sqlite/issues/27#issuecomment-614843406 - there's a problem where you insert a record with a `None` value for a column and that column is created as `TEXT` - but actually you intended it to be an `INT` (as later examples will demonstrate). Some kind of mechanism for over-riding the detected types of columns would be useful here.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601392318,MDU6SXNzdWU2MDEzOTIzMTg=,101,README should include an example of CLI data insertion,9599,simonw,closed,0,,,,,0,2020-04-16T19:45:37Z,2020-04-17T23:59:49Z,2020-04-17T23:59:49Z,OWNER,,Maybe using `curl` from the GitHub API.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/101/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602173589,MDU6SXNzdWU2MDIxNzM1ODk=,42,Error running user-timeline with --sql and --ids together,9599,simonw,closed,0,,,,,0,2020-04-17T19:02:06Z,2020-04-17T23:34:40Z,2020-04-17T23:34:40Z,MEMBER,,"``` $ twitter-to-sqlite user-timeline tweets.db --sql='select id from users' --ids Traceback (most recent call last): File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite"", line 11, inDependabot commands and options
load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 284, in user_timeline ""@{:"" + str(max(len(identifier) for identifier in identifiers)) + ""}"" File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 284, in ""@{:"" + str(max(len(identifier) for identifier in identifiers)) + ""}"" TypeError: object of type 'int' has no len() ``` But this DID work - casting to strings: ``` $ twitter-to-sqlite
user-timeline tweets.db --sql='select """" || id from users' --ids ... this worked ... ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602176870,MDU6SXNzdWU2MDIxNzY4NzA=,43,"""twitter-to-sqlite lists"" command for retrieving a user's owned lists",9599,simonw,closed,0,,,,,1,2020-04-17T19:08:59Z,2020-04-17T23:48:28Z,2020-04-17T23:30:39Z,MEMBER,,"https://developer.twitter.com/en/docs/accounts-and-users/create-manage-lists/api-reference/get-lists-ownerships `https://api.twitter.com/1.1/lists/ownerships.json `",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602181581,MDU6SXNzdWU2MDIxODE1ODE=,44,"tweet[""source""] can be an empty string",9599,simonw,closed,0,,,,,0,2020-04-17T19:18:26Z,2020-04-17T22:01:44Z,2020-04-17T22:01:44Z,MEMBER,,"Got this excepion: ``` File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 641, in extract_and_save_source details = m.groupdict() AttributeError: 'NoneType' object has no attribute 'groupdict' ``` I traced it back to this tweet: https://twitter.com/osder/status/578712651393576960 ``` (Pdb) source_re re.compile('.*?)"".*?>(?P .*?) ') (Pdb) locals()['source'] '' (Pdb) u > /Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(393)save_tweets() -> tweet[""source""] = extract_and_save_source(db, tweet[""source""]) (Pdb) tweet {'created_at': '2015-03-20T00:20:22+00:00', 'id': 578712651393576960, 'full_text': '@osder', 'truncated': False, 'display_text_range': [0, 6], 'source': '', 'in_reply_to_status_id': 578712521382715392, 'in_reply_to_user_id': 1545741, 'in_reply_to_screen_name': 'osder', 'geo': None, 'coordinates': None, 'place': None, 'contributors': None, 'is_quote_status': False, 'retweet_count': 0, 'favorite_count': 0, 'favorited': False, 'retweeted': False, 'lang': 'und', 'user': 1545741} ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602533300,MDU6SXNzdWU2MDI1MzMzMDA=,1,Import photo metadata from Apple Photos into SQLite,9599,simonw,open,0,,,5324096,Apple Photos online and securely browsable,8,2020-04-18T19:23:26Z,2020-05-04T02:41:40Z,,MEMBER,,"Faces, albums, locations, that kind of thing.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 602533352,MDU6SXNzdWU2MDI1MzMzNTI=,2,Ability to convert HEIC images to JPEG,9599,simonw,closed,0,,,5324096,Apple Photos online and securely browsable,1,2020-04-18T19:23:43Z,2020-04-28T16:47:21Z,2020-04-28T16:47:21Z,MEMBER,,,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 
602533481,MDU6SXNzdWU2MDI1MzM0ODE=,3,"Import EXIF data into SQLite - lens used, ISO, aperture etc",9599,simonw,open,0,,,5324096,Apple Photos online and securely browsable,2,2020-04-18T19:24:31Z,2021-10-05T12:38:24Z,,MEMBER,,,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 602533539,MDU6SXNzdWU2MDI1MzM1Mzk=,4,Upload all my photos to a secure S3 bucket,9599,simonw,closed,0,,,5324096,Apple Photos online and securely browsable,14,2020-04-18T19:24:50Z,2020-04-18T21:58:11Z,2020-04-18T21:57:13Z,MEMBER,,"- [x] Create a bucket with bucket credentials - [x] Programmatically upload some recent photos to it (from a notebook) - [x] Turn this into a script",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602551638,MDU6SXNzdWU2MDI1NTE2Mzg=,5,photos-to-sqlite s3-auth command,9599,simonw,closed,0,,,,,1,2020-04-18T21:05:25Z,2020-04-18T21:08:44Z,2020-04-18T21:08:44Z,MEMBER,,Modeled on `github-to-sqlite auth` - prompts the user for their S3 credentials and saves them to `auth.json`.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602569315,MDU6SXNzdWU2MDI1NjkzMTU=,102,Can't store an array or dictionary containing a bytes value,9599,simonw,closed,0,,,,,0,2020-04-18T22:49:21Z,2020-05-01T20:45:45Z,2020-05-01T20:45:45Z,OWNER,,"``` In [1]: import sqlite_utils In [2]: db = sqlite_utils.Database(memory=True) In [3]: db[""t""].insert({""id"": 1, ""data"": {""foo"": b""bytes""}}) --------------------------------------------------------------------------- TypeError Traceback (most recent call last)in ----> 1 db[""t""].insert({""id"": 1, ""data"": {""foo"": b""bytes""}}) ~/Dropbox/Development/sqlite-utils/sqlite_utils/db.py in insert(self, record, pk, foreign_keys, column_order, not_null, defaults, hash_id, alter, ignore, replace, extracts, conversions, columns) 950 extracts=extracts, 951 conversions=conversions, --> 952 columns=columns, 953 ) 954 ~/Dropbox/Development/sqlite-utils/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, ignore, replace, extracts, conversions, columns, upsert) 1052 for key in all_columns: 1053 value = jsonify_if_needed( -> 1054 record.get(key, None if key != hash_id else _hash(record)) 1055 ) 1056 if key in extracts: ~/Dropbox/Development/sqlite-utils/sqlite_utils/db.py in jsonify_if_needed(value) 1318 def jsonify_if_needed(value): 1319 if isinstance(value, (dict, list, tuple)): -> 1320 return json.dumps(value) 1321 elif isinstance(value, (datetime.time, datetime.date, datetime.datetime)): 1322 return value.isoformat() /usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/__init__.py in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw) 229 cls is None and indent is None and separators is None and 230 default is None and not sort_keys and not kw): --> 231 return 
_default_encoder.encode(obj) 232 if cls is None: 233 cls = JSONEncoder /usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/encoder.py in encode(self, o) 197 # exceptions aren't as detailed. The list call should be roughly 198 # equivalent to the PySequence_Fast that ''.join() would do. --> 199 chunks = self.iterencode(o, _one_shot=True) 200 if not isinstance(chunks, (list, tuple)): 201 chunks = list(chunks) /usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/encoder.py in iterencode(self, o, _one_shot) 255 self.key_separator, self.item_separator, self.sort_keys, 256 self.skipkeys, _one_shot) --> 257 return _iterencode(o, 0) 258 259 def _make_iterencode(markers, _default, _encoder, _indent, _floatstr, /usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/encoder.py in default(self, o) 177 178 """""" --> 179 raise TypeError(f'Object of type {o.__class__.__name__} ' 180 f'is not JSON serializable') 181 TypeError: Object of type bytes is not JSON serializable ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/102/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602575575,MDU6SXNzdWU2MDI1NzU1NzU=,6,Add progress bar to upload command,9599,simonw,closed,0,,,,,2,2020-04-18T23:32:41Z,2020-04-19T00:15:24Z,2020-04-19T00:15:24Z,MEMBER,,Upload was added in #4 ,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602585497,MDU6SXNzdWU2MDI1ODU0OTc=,7,Integrate image content hashing,9599,simonw,open,0,,,,,2,2020-04-19T00:36:58Z,2021-08-26T02:01:01Z,,MEMBER,,To spot duplicate images (where the file content differs such that the sha256 is no longer a match) it would be useful to calculate and store perceptual hashes of some sort.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/7/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",, 602619330,MDU6SXNzdWU2MDI2MTkzMzA=,45,Use raise_for_status() everywhere,9599,simonw,open,0,,,,,1,2020-04-19T04:38:28Z,2020-04-19T04:39:22Z,,MEMBER,,"I keep seeing errors which I think are caused by authentication or rate limit problems but which appear to be unexpected JSON responses - presumably because they are actually an error message. Recent example: https://github.com/simonw/jsk-fellows-on-twitter/runs/598892575 Using `response.raise_for_status()` everywhere will make these errors less confusing.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/45/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 603242257,MDExOlB1bGxSZXF1ZXN0NDA2MDY3MDE5,728,"Update mergedeep requirement from ~=1.1.1 to >=1.1.1,<1.4.0",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-20T13:33:23Z,2020-05-04T16:45:58Z,2020-05-04T16:45:49Z,CONTRIBUTOR,simonw/datasette/pulls/728,"Updates the requirements on [mergedeep](https://github.com/clarketm/mergedeep) to permit the latest version. Commits
- 3d6e7b4 v1.3.0 - support additive merging of Counter types
- 56a258a v1.2.1 - tidy docs and variable names
- 61ab213 v1.2.0 - support both TYPESAFE_REPLACE and TYPESAFE_ADDITIVE merge strategies...
- b331bb5 cleanup Makefile
- 6f577bf officially label support for python3.8
- 84faf37 use pipenv for managing dev dependencies
- 3a8761a Update README.md
- See full diff in compare view
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/728/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 603295970,MDU6SXNzdWU2MDMyOTU5NzA=,729,Visually distinguish integer and text columns,9599,simonw,closed,0,,,,,8,2020-04-20T14:47:26Z,2020-05-18T17:20:02Z,2020-05-15T18:16:56Z,OWNER,,It would be useful if I could tell from looking at the table page if a column was a integer or a text (or a float I guess?). This is particularly important for knowing if it safe to sort by that column.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/729/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 603617013,MDU6SXNzdWU2MDM2MTcwMTM=,29,Milestones should have foreign key to creator and repo,9599,simonw,closed,0,,,,,1,2020-04-21T00:20:44Z,2020-04-21T00:43:58Z,2020-04-21T00:43:58Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/milestones Creator is an integer but not a foreign key to users Repo is missing entirely!",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 603618244,MDU6SXNzdWU2MDM2MTgyNDQ=,30,Issues milestone column is the wrong type,9599,simonw,closed,0,,,,,2,2020-04-21T00:24:34Z,2020-04-21T00:45:23Z,2020-04-21T00:36:22Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issues?milestone=2857392 ![2A4C1185-2434-4F29-9EA0-3246E2F03F77](https://user-images.githubusercontent.com/9599/79811760-b7e08b00-832b-11ea-9ad7-684a6ae097a6.jpeg) It is TEXT when it should be an INTEGER - which is why the foreign key label is not correctly displayed.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 603624862,MDU6SXNzdWU2MDM2MjQ4NjI=,31,Issue and milestone should have foreign key to repo,9599,simonw,closed,0,,,,,3,2020-04-21T00:46:24Z,2020-04-22T01:20:19Z,2020-04-22T01:20:19Z,MEMBER,,"Currently the `repo` column on those tables is a string `simonw/datasette` rather than an ID referencing a row in `repos`. 
_Originally posted by @simonw in https://github.com/dogsheep/github-to-sqlite/issues/29#issuecomment-616883275_",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 604001627,MDExOlB1bGxSZXF1ZXN0NDA2Njc3MjA1,730,"Update pytest-asyncio requirement from ~=0.10.0 to >=0.10,<0.12",27856297,dependabot-preview[bot],closed,0,,,,,1,2020-04-21T13:32:35Z,2020-05-04T13:27:24Z,2020-05-04T13:27:23Z,CONTRIBUTOR,simonw/datasette/pulls/730,"Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)Commits
- 1026c39 0.11.0
- ab2b140 Test on Python 3.8, drop 3.3 and 3.4
- 6397a22 plugin: Use pytest 5.4.0 new Function API
- 21a0f94 Replace yield_fixture() by fixture()
- 964b295 Added min hypothesis version so that bugfix for https://github.com/Hypothesis...
- 4a11a20 Add max supported pytest version to < 5.4.0 to prevent fails until #141 is fi...
- b305594 Change event_loop to module scope in hypothesis tests, fixing #145.
- d5a0f47 Enable test_subprocess to be run on win, by changing to ProactorEventLoop in ...
- d07cd2d Fix required pytest version
- 86cd9a6 Handle BaseExceptions from loop.run_until_complete (#126)
- Additional commits viewable in compare view
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/730/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 604222295,MDU6SXNzdWU2MDQyMjIyOTU=,32,Issue comments don't appear to populate issues foreign key,9599,simonw,closed,0,,,,,3,2020-04-21T19:17:32Z,2020-04-22T01:17:44Z,2020-04-22T01:17:44Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github?sql=select+html_url%2C+id%2C+issue+from+issue_comments+order+by+updated_at+desc+limit+101 ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 605110015,MDU6SXNzdWU2MDUxMTAwMTU=,731,Option to automatically configure based on directory layout,9599,simonw,closed,0,,,,,9,2020-04-22T22:17:47Z,2020-04-27T16:32:44Z,2020-04-27T16:30:26Z,OWNER,,"My Datasette projects increasingly take on the following structure: - `metadata.json` with the metadata - One or more `something.db` database files - A `templates/` folder with some custom templates - A `plugins/` folder with some custom plugins Then I have to run Datasette like this: datasette *.db -m metadata.json --template-dir=templates --plugins-dir=plugins It would be really interesting if Datasette had a special mode where you could point it at a directory with the above layout and it would automatically configure itself based on the contents. Maybe even allow `datasette serve` to detect if it was passed a single argument that's a directory, not a file, and kick in to ""directory layout configuration mode"" in that case: datasette . ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/731/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 605147638,MDU6SXNzdWU2MDUxNDc2Mzg=,8,Should I have used MD5 instead of SHA256?,9599,simonw,closed,0,,,,,2,2020-04-23T00:02:08Z,2020-04-23T00:03:35Z,2020-04-23T00:03:35Z,MEMBER,,"https://docs.aws.amazon.com/AmazonS3/latest/API/RESTCommonResponseHeaders.html > Objects created by the PUT Object, POST Object, or Copy operation, or through the AWS Management Console, and are encrypted by SSE-S3 or plaintext, have ETags that are an MD5 digest of their object data. ",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 605546606,MDExOlB1bGxSZXF1ZXN0NDA3OTI5MTI4,734,"Update janus requirement from ~=0.4.0 to >=0.4,<0.6",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-23T13:43:45Z,2020-05-04T16:48:14Z,2020-05-04T16:48:04Z,CONTRIBUTOR,simonw/datasette/pulls/734,"Updates the requirements on [janus](https://github.com/aio-libs/janus) to permit the latest version.Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)Changelog
Sourced from janus's changelog.
0.5.0 (2020-04-23)
- Remove explicit loop arguments and forbid creating queues outside event loops #246
0.4.0 (2018-07-28)
- Add py.typed macro #89
- Drop python 3.4 support and fix minimal version python3.5.3 #88
- Add property with that indicates if queue is closed #86
0.3.2 (2018-07-06)
- Fixed python 3.7 support #97
0.3.1 (2018-01-30)
- Fixed bug with join() in case tasks are added by sync_q.put() #75
0.3.0 (2017-02-21)
- Expose unfinished_tasks property #34
0.2.4 (2016-12-05)
- Restore tarball deploying
0.2.3 (2016-07-12)
- Fix exception type
0.2.2 (2016-07-11)
- Update asyncio.async() to use asyncio.ensure_future() #6
0.2.1 (2016-03-24)
- Fix python setup.py test command #4
0.2.0 (2015-09-20)
... (truncated)
Commits
- 8e89b45 Bump to 0.5.0
- ec8592b Fix up Python 3.8 loop argument warnings (#246)
- 2543af6 Bump coverage from 5.0.4 to 5.1
- 03d1b36 Bump tox from 3.14.5 to 3.14.6
- 8219c38 Bump coverage from 5.0.3 to 5.0.4
- 85ec71d Bump pytest from 5.4.0 to 5.4.1
- 3b974c9 Bump pytest from 5.3.5 to 5.4.0
- 282dc12 Bump mypy from 0.761 to 0.770
- 1364fb3 Bump tox from 3.14.4 to 3.14.5
- dc519bb Bump tox from 3.14.3 to 3.14.4
- Additional commits viewable in compare view
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/734/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 605806386,MDU6SXNzdWU2MDU4MDYzODY=,735,"Error when I click on ""View and edit SQL""",30607,aborruso,closed,0,,,,,2,2020-04-23T19:31:32Z,2020-04-28T06:10:20Z,2020-04-27T19:00:30Z,NONE,,"Hi, when I do it [here](https://my-database.now.sh/commissioniComunePalermo/youtube), I have ""unrecognized token: ""["""" error. Is it normal? Thank you",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/735/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 605938063,MDU6SXNzdWU2MDU5MzgwNjM=,9,"upload command should be resumable, should only upload photos not already uploaded",9599,simonw,closed,0,,,,,2,2020-04-23T23:31:08Z,2020-04-23T23:39:14Z,2020-04-23T23:39:14Z,MEMBER,,Follow on from #4. ,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 606028272,MDU6SXNzdWU2MDYwMjgyNzI=,10,Speed up hashing step using threads,9599,simonw,closed,0,,,,,0,2020-04-24T04:20:08Z,2020-04-24T04:32:35Z,2020-04-24T04:32:35Z,MEMBER,,"This TODO from the code: https://github.com/dogsheep/photos-to-sqlite/blob/2e7f2c67cc18b02c75bb64992a05b0196e507252/photos_to_sqlite/cli.py#L82-L90",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 606032950,MDU6SXNzdWU2MDYwMzI5NTA=,11,Try running S3 uploads in a thread pool,9599,simonw,closed,0,,,,,0,2020-04-24T04:34:31Z,2020-04-24T16:45:41Z,2020-04-24T16:45:41Z,MEMBER,,"Since #10 provided such a speedup, can the same thing be done for the actual uploads? 
http://ls.pwd.io/2013/06/parallel-s3-uploads-using-boto-and-threads-in-python/ suggests it can really help performance.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 606033104,MDU6SXNzdWU2MDYwMzMxMDQ=,12,"If less than 500MB, show size in MB not GB",9599,simonw,open,0,,,,,1,2020-04-24T04:35:01Z,2020-04-24T04:35:25Z,,MEMBER,,"Just saw this: ``` Uploading 0.05 GB ```",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 606720674,MDU6SXNzdWU2MDY3MjA2NzQ=,736,strange behavior using accented characters,30607,aborruso,closed,0,,,,,3,2020-04-25T08:34:51Z,2020-04-28T06:09:28Z,2020-04-27T18:59:16Z,NONE,,"Hi, when I search `incompatibilità` [here](https://my-database.now.sh/commissioniComunePalermo/youtube), using full text search, it becomes `incompatibilitÃÂ ` and I have no result. If I encode the `à` char in the URL (`incompatibilit%C3%A0`) I have the right result. ![image](https://user-images.githubusercontent.com/30607/80275201-00a79380-86e0-11ea-865e-f7e1474e8098.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/736/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 607067303,MDExOlB1bGxSZXF1ZXN0NDA5MTIzODk3,737,"Custom pages mechanism, refs #648",9599,simonw,closed,0,,,,,4,2020-04-26T17:31:41Z,2020-04-26T18:46:43Z,2020-04-26T18:46:43Z,OWNER,simonw/datasette/pulls/737,"Refs #648. TODO: - [x] Pass a `view_name` to `render_template()` - [x] Mechanism for custom status code / headers / redirect - [x] Documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/737/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 607086780,MDU6SXNzdWU2MDcwODY3ODA=,738,Pass a request object to custom page templates,9599,simonw,closed,0,,,,,1,2020-04-26T18:57:48Z,2020-04-26T19:01:54Z,2020-04-26T19:01:54Z,OWNER,,"Follow-up to #648. I'm not passing a request object to `.render_template()` at the moment, which breaks any other custom plugins using e.g. 
`extra_template_vars()` that were expecting to be able to access the request.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/738/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 607107849,MDExOlB1bGxSZXF1ZXN0NDA5MTUzODcw,739,Configuration directory mode,9599,simonw,closed,0,,,,,3,2020-04-26T20:37:46Z,2020-04-27T16:30:25Z,2020-04-27T16:30:25Z,OWNER,simonw/datasette/pulls/739,"Refs #731 TODO: - [x] Decide how to combine explicit command-line options with items detected from the directory structure - [x] Add unit tests - [x] Implement `inspect-data.json` mechanism for populating `immutables` - [x] Add documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/739/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 607211058,MDU6SXNzdWU2MDcyMTEwNTg=,740,Don't throw 500 error on attempted directory browse,9599,simonw,closed,0,,,,,1,2020-04-27T03:50:11Z,2020-04-27T18:29:15Z,2020-04-27T18:29:15Z,OWNER,," This should be a 403 error instead, because the `--static` mechanism doesn't allow directory browsing.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/740/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 607223136,MDU6SXNzdWU2MDcyMjMxMzY=,741,"Replace ""datasette publish --extra-options"" with ""--setting""",9599,simonw,open,0,,,3268330,Datasette 1.0,9,2020-04-27T04:29:04Z,2022-05-12T19:21:16Z,,OWNER,,"See https://github.com/simonw/datasette-publish-now/issues/9#issuecomment-618155764 - the `--extra-options` mechanism is in practice just used to set `--config` options in data that you publish, but that means you end up with pretty messy looking commands: datasette publish my.db --extra-options=""--config default_page_size:50 --config sql_time_limit_ms:3500"" A neater design would be to support `--config` as an option for `datasette publish` directly: datasette publish my.db --config default_page_size:50 --config sql_time_limit_ms:3500 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/741/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 607243940,MDU6SXNzdWU2MDcyNDM5NDA=,742,"Speed up tests with scope=""session""?",9599,simonw,closed,0,,,,,1,2020-04-27T05:23:54Z,2020-04-27T18:24:53Z,2020-04-27T18:24:53Z,OWNER,,"Tests are pretty slow - could I speed them up with pytest `scope=""session""` on some of the fixtures? Eg https://travis-ci.org/github/simonw/datasette/jobs/679940036 ran 452 tests in 3m53s - the `test_html` ones seem particularly slow.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/742/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 607770595,MDU6SXNzdWU2MDc3NzA1OTU=,743,escape_fts() does not correctly escape * wildcards,9599,simonw,closed,0,,,,,4,2020-04-27T18:48:53Z,2020-04-27T19:11:30Z,2020-04-27T19:11:01Z,OWNER,,"Spotted in #732. This should not return any results... 
but it does: https://latest.datasette.io/fixtures/searchable?_search=bar%2A&_trace=1 The query from trace is: ``` ""sql"": ""select count(*) from searchable where rowid in (select rowid from searchable_fts where searchable_fts match escape_fts(:search))"", ""params"": { ""search"": ""bar*"" } ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/743/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 607888367,MDU6SXNzdWU2MDc4ODgzNjc=,13,Also upload movie files,9599,simonw,open,0,,,,,2,2020-04-27T22:11:25Z,2020-04-28T00:39:45Z,,MEMBER,,"The `upload` command currently only handles static images: https://github.com/dogsheep/photos-to-sqlite/blob/d939455af00e07866686457ee2fcb9b2d1b7194e/photos_to_sqlite/utils.py#L26-L33 Need to cover movies taken by my phone and DSLR too.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 608058890,MDU6SXNzdWU2MDgwNTg4OTA=,744,link_or_copy_directory() error - Invalid cross-device link,30607,aborruso,closed,0,,,,,28,2020-04-28T06:26:45Z,2020-05-28T14:32:53Z,2020-05-27T06:01:28Z,NONE,,"Hi, when I run ``` datasette publish heroku -n myapp --template-dir ./template mydb.db ``` I have this error ``` Traceback (most recent call last): File ""/home/aborruso/.local/lib/python3.7/site-packages/datasette/utils/__init__.py"", line 607, in link_or_copy_directory shutil.copytree(src, dst, copy_function=os.link) File ""/usr/lib/python3.7/shutil.py"", line 365, in copytree raise Error(errors) shutil.Error: [('/myfolder/youtubeComunePalermo/processing/./template/base.html', '/tmp/tmps9_4mzc4/templates/base.html', ""[Errno 18] Invalid cross-device link: '/myfolder/youtubeComunePalermo/processing/./template/base.html' -> '/tmp/tmps9_4mzc4/templates/base.html'""), ('/myfolder/youtubeComunePalermo/processing/./template/index.html', '/tmp/tmps9_4mzc4/templates/index.html', ""[Errno 18] Invalid cross-device link: '/myfolder/youtubeComunePalermo/processing/./template/index.html' -> '/tmp/tmps9_4mzc4/templates/index.html'"")] During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/aborruso/.local/bin/datasette"", line 8, inDependabot commands and options
sys.exit(cli()) File ""/home/aborruso/.local/lib/python3.7/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/home/aborruso/.local/lib/python3.7/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/home/aborruso/.local/lib/python3.7/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/aborruso/.local/lib/python3.7/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/aborruso/.local/lib/python3.7/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/aborruso/.local/lib/python3.7/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/home/aborruso/.local/lib/python3.7/site-packages/datasette/publish/heroku.py"", line 103, in heroku extra_metadata, File ""/usr/lib/python3.7/contextlib.py"", line 112, in __enter__ return next(self.gen) File ""/home/aborruso/.local/lib/python3.7/site-packages/datasette/publish/heroku.py"", line 191, in temporary_heroku_directory os.path.join(tmp.name, ""templates""), File ""/home/aborruso/.local/lib/python3.7/site-packages/datasette/utils/__init__.py"", line 609, in link_or_copy_directory shutil.copytree(src, dst) File ""/usr/lib/python3.7/shutil.py"", line 321, in copytree os.makedirs(dst) File
""/usr/lib/python3.7/os.py"", line 221, in makedirs mkdir(name, mode) FileExistsError: [Errno 17] File exists: '/tmp/tmps9_4mzc4/templates' ``` I'm attaching my very basic template folder. Thank you [template.zip](https://github.com/simonw/datasette/files/4543751/template.zip) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/744/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 608512747,MDU6SXNzdWU2MDg1MTI3NDc=,14,Annotate photos using the Google Cloud Vision API,9599,simonw,open,0,,,,,5,2020-04-28T18:09:03Z,2020-04-28T18:19:06Z,,MEMBER,,"It can detect faces, run OCR, do image labeling (it knows what a lemur is!) and do object localization where it identifies objects and returns bounding polygons for them.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/14/reactions"", ""total_count"": 3, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",, 608613033,MDU6SXNzdWU2MDg2MTMwMzM=,745,Extract the hash-URL mechanism out into a plugin,9599,simonw,closed,0,,,,,2,2020-04-28T21:00:38Z,2020-10-23T19:47:18Z,2020-10-23T19:47:10Z,OWNER,,"0.28 in May 2019 made this feature not-the-default: https://datasette.readthedocs.io/en/stable/changelog.html#v0-28 - see #418 I've not felt the need to use it myself since. I think I should move it into a plugin.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/745/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 608752766,MDExOlB1bGxSZXF1ZXN0NDEwNDY5Mjcy,746,"shutil.Error, not OSError",9599,simonw,closed,0,,,,,1,2020-04-29T03:30:51Z,2020-04-29T07:07:24Z,2020-04-29T07:07:23Z,OWNER,simonw/datasette/pulls/746,Refs #744,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/746/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 609950090,MDU6SXNzdWU2MDk5NTAwOTA=,33,Fall back to authentication via ENV,2029,garethr,closed,0,,,,,4,2020-04-30T12:58:14Z,2020-05-02T18:46:10Z,2020-05-02T18:45:37Z,NONE,,"Would you accept a PR that falls back to looking for an environment variable for the GitHub token? Specifically a change here: https://github.com/dogsheep/github-to-sqlite/blob/c34d5a18bfc41fa08755ba3d5cf9fe09ff204238/github_to_sqlite/cli.py#L271 I'd like to use `github-to-sqlite` in a GitHub Action workflow and this would be simpler than trying to fill out the prompt or generate a file with sensitive content. Wanted to check first, I'm happy to submit a PR with tests and updates to the docs. 
",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610192152,MDU6SXNzdWU2MTAxOTIxNTI=,747,Directory configuration mode should support metadata.yaml,9599,simonw,closed,0,,,,,4,2020-04-30T16:05:30Z,2020-04-30T19:04:19Z,2020-04-30T19:04:19Z,OWNER,,Refs #739 - `metadata.yml` or `metadata.yaml` should be detected in the same way as `metadata.json` is.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/747/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610284471,MDU6SXNzdWU2MTAyODQ0NzE=,46,Error running 'search' for the first time,9599,simonw,closed,0,,,,,0,2020-04-30T18:11:20Z,2020-04-30T18:11:58Z,2020-04-30T18:11:58Z,MEMBER,,"``` % twitter-to-sqlite search infodemic.db '#infodemic' Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/bin/twitter-to-sqlite"", line 11, in load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 867, in search for tweet in tweets: File ""/Users/simon/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 165, in fetch_timeline [since_type_id, since_key], sqlite3.OperationalError: no such table: since_ids ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/46/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610342575,MDU6SXNzdWU2MTAzNDI1NzU=,748,?_searchmode=raw should be documented on full-text search page,9599,simonw,closed,0,,,,,0,2020-04-30T19:50:06Z,2020-04-30T21:06:12Z,2020-04-30T21:06:12Z,OWNER,,"It's currently documented here: https://datasette.readthedocs.io/en/stable/json_api.html#special-table-arguments But it should also be described here: https://datasette.readthedocs.io/en/stable/full_text_search.html#the-table-view-api",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/748/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610408908,MDU6SXNzdWU2MTA0MDg5MDg=,34,Command for retrieving dependents for a 
repo,9599,simonw,closed,0,,,,,6,2020-04-30T21:47:51Z,2020-05-03T15:53:01Z,2020-05-03T15:53:01Z,MEMBER,,"I really, really want to start grabbing this data: https://github.com/simonw/datasette/network/dependents",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610511450,MDU6SXNzdWU2MTA1MTE0NTA=,35,Create index on issue_comments(user) and other foreign keys,9599,simonw,closed,0,,,,,3,2020-05-01T02:06:56Z,2020-05-02T18:26:24Z,2020-05-02T18:26:24Z,MEMBER,,"``` create index issue_comments_user on issue_comments(user) ``` I'm sure there are other user columns that could benefit from an index.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610517472,MDU6SXNzdWU2MTA1MTc0NzI=,103,sqlite3.OperationalError: too many SQL variables in insert_all when using rows with varying numbers of columns,32605365,b0b5h4rp13,closed,0,,,,,8,2020-05-01T02:26:14Z,2020-05-14T00:18:57Z,2020-05-14T00:18:57Z,CONTRIBUTOR,,"If using insert_all to put in 1000 rows of data with varying number of columns, it comes up with this message `sqlite3.OperationalError: too many SQL variables` if the number of columns is larger in later records (past the first row) I've reduced `SQLITE_MAX_VARS` by 100 to 899 at the top of `db.py` to add wiggle room, so that if the column count increases it wont go past SQLite's batch limit as calculated by this line of code based on the count of the first row's dict keys batch_size = max(1, min(batch_size, SQLITE_MAX_VARS // num_columns))",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610829227,MDU6SXNzdWU2MTA4MjkyMjc=,749,Cloud Run fails to serve database files larger than 32MB,9599,simonw,closed,0,,,,,4,2020-05-01T16:06:46Z,2020-12-03T00:31:15Z,2020-12-03T00:31:14Z,OWNER,,"https://cloud.google.com/run/quotas lists the maximum response size as 32MB. 
I spotted a bug where attempting to download a database file larger than that from a Cloud Run deployment (in this case it was https://github-to-sqlite.dogsheep.net/github.db after I [accidentally increased the size of that database](https://github.com/dogsheep/github-to-sqlite/commit/630bdba68a23c0ac453e015518ef0bf41107a952)) returned a 500 error because of this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/749/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610842926,MDU6SXNzdWU2MTA4NDI5MjY=,36,Add view for better display of dependent repos,9599,simonw,closed,0,,,,,2,2020-05-01T16:33:44Z,2020-05-02T16:50:31Z,2020-05-02T16:30:11Z,MEMBER,,"```sql select repos.full_name as repo, 'https://github.com/' || repos2.full_name as dependent, repos2.created_at as dependent_repo_created, repos2.updated_at as dependent_repo_updated, repos2.stargazers_count as dependent_repo_stars, repos2.watchers_count as dependent_repo_watchers from dependents join repos as repos2 on dependents.dependent = repos2.id join repos on dependents.repo = repos.id order by repos2.created_at desc ``` https://dogsheep.simonwillison.net/github?sql=select%0D%0A++repos.full_name+as+repo%2C%0D%0A++%27https%3A%2F%2Fgithub.com%2F%27+%7C%7C+repos2.full_name+as+dependent%2C%0D%0A++repos2.created_at+as+dependent_repo_created%2C%0D%0A++repos2.updated_at+as+dependent_repo_updated%2C%0D%0A++repos2.stargazers_count+as+dependent_repo_stars%2C%0D%0A++repos2.watchers_count+as+dependent_repo_watchers%0D%0Afrom%0D%0A++dependents%0D%0A++join+repos+as+repos2+on+dependents.dependent+%3D+repos2.id%0D%0A++join+repos+on+dependents.repo+%3D+repos.id%0D%0Aorder+by%0D%0A++repos2.created_at+desc",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610843136,MDU6SXNzdWU2MTA4NDMxMzY=,37,Mechanism for creating views if they don't yet exist,9599,simonw,closed,0,,,,,3,2020-05-01T16:34:10Z,2020-05-02T16:19:47Z,2020-05-02T16:19:31Z,MEMBER,,Needed for #36 #10 #12 ,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610853393,MDU6SXNzdWU2MTA4NTMzOTM=,104,"--schema option to ""sqlite-utils tables""",9599,simonw,closed,0,,,,,0,2020-05-01T16:55:49Z,2020-05-01T17:12:37Z,2020-05-01T17:12:37Z,OWNER,,Adds output showing the table schema.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610853576,MDU6SXNzdWU2MTA4NTM1NzY=,105,"""sqlite-utils views"" command",9599,simonw,closed,0,,,,,1,2020-05-01T16:56:11Z,2020-05-01T20:40:07Z,2020-05-01T20:38:36Z,OWNER,,Similar to `sqlite-utils tables`. 
See also #104.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611216862,MDU6SXNzdWU2MTEyMTY4NjI=,106,"create_view(..., ignore=True, replace=True) parameters",9599,simonw,closed,0,,,,,1,2020-05-02T15:45:21Z,2020-05-02T16:04:51Z,2020-05-02T16:02:10Z,OWNER,,"Two new parameters which specify what should happen if the view already exists. I want this for https://github.com/dogsheep/github-to-sqlite/issues/37 Here's the current `create_view()` implementation: https://github.com/simonw/sqlite-utils/blob/b4d953d3ccef28bb81cea40ca165a647b59971fa/sqlite_utils/db.py#L325-L332 `ignore=True` will not do anything if the view exists already. `replace=True` will drop and redefine the view - but only if its SQL definition differs, otherwise it will be left alone.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611222968,MDU6SXNzdWU2MTEyMjI5Njg=,107,sqlite-utils create-view CLI command,9599,simonw,closed,0,,,,,2,2020-05-02T16:15:13Z,2020-05-03T15:36:58Z,2020-05-03T15:36:37Z,OWNER,,Can go with #27 - `sqlite-utils create-table`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/107/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611252244,MDU6SXNzdWU2MTEyNTIyNDQ=,750,Add notlike table filter,9599,simonw,closed,0,,,,,3,2020-05-02T18:54:36Z,2020-05-02T19:10:44Z,2020-05-02T19:10:44Z,OWNER,,"I found myself wanting that for applying the opposite of this: https://github-to-sqlite.dogsheep.net/github/dependent_repos?dependent__like=%25simonw%2F%25&_sort_desc=dependent_stars ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/750/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611284481,MDU6SXNzdWU2MTEyODQ0ODE=,38,[Feature Request] Support Repo Name in Search 🥺,5779832,zzeleznick,closed,0,,,,,4,2020-05-02T22:08:51Z,2020-05-03T02:34:32Z,2020-05-02T23:15:11Z,NONE,,"## Description Per your [v2.2 release tweet](https://twitter.com/simonw/status/1256700238099693568) I played with the demo, but the output did not match my expectations. ## Expected Behavior Expected a search query for ""twitter"" contained within the `repo` column to return non-zero results. ## Actual Behavior 😭 [0 rows where repo contains ""twitter"" sorted by starred_at descending](https://github-to-sqlite.dogsheep.net/github/stars?repo__contains=twitter&_sort_desc=starred_at) ## Best Explanation Per the table schema (see appendix) `repo` is of type `INTEGER` which built from `repo_id` and does not expose the repo name in search. ## Desired Behavior Given that searching for ""206156866"" is less intuitive than ""twitter"", it would be great to support this via extending the search capabilities or by adding an additional column. 
✅ 104 rows where repo contains ""twitter"" ❌ [104 rows where repo contains ""206156866"" sorted by starred_at descending](https://github-to-sqlite.dogsheep.net/github/stars?repo__contains=206156866&_sort_desc=starred_at) ## Appendix ``` CREATE TABLE [stars] ( [user] INTEGER REFERENCES [users]([id]), [repo] INTEGER REFERENCES [repos]([id]), [starred_at] TEXT, PRIMARY KEY ([user], [repo]) ); CREATE INDEX [idx_stars_repo] ON [stars] ([repo]); CREATE INDEX [idx_stars_user] ON [stars] ([user]); ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611326701,MDU6SXNzdWU2MTEzMjY3MDE=,108,Documentation unit tests for CLI commands,9599,simonw,closed,0,,,,,2,2020-05-03T03:58:42Z,2020-05-03T04:13:57Z,2020-05-03T04:13:57Z,OWNER,,Have a test that ensures all CLI commands are documented.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611540797,MDU6SXNzdWU2MTE1NDA3OTc=,751,Ability to set custom default _size on a per-table basis,9599,simonw,closed,0,,,5471110,Datasette 0.43,4,2020-05-04T00:13:03Z,2020-05-28T05:00:22Z,2020-05-28T05:00:20Z,OWNER,,"I have some tables where I'd like the default page size to be 10, without affecting the rest of my Datasette instance.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/751/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611835285,MDU6SXNzdWU2MTE4MzUyODU=,752,Non-utf8 encoding in exceptionhandlers and custom-pages,2181410,clausjuhl,closed,0,,,,,1,2020-05-04T12:24:42Z,2020-05-04T17:42:20Z,2020-05-04T17:42:20Z,NONE,,"Hi Simon. Whenever a response is not piped through a router-view, the template is encoded in latin-1 (I think). This is especially a problem (for me) with the new custom_pages-functionality, but also problematic with the 404- and 500-handlers. Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/752/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611874514,MDExOlB1bGxSZXF1ZXN0NDEyOTUxMTkx,753,"Update pytest-asyncio requirement from ~=0.10.0 to >=0.10,<0.13",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-05-04T13:27:19Z,2020-05-04T17:41:01Z,2020-05-04T17:40:49Z,CONTRIBUTOR,simonw/datasette/pulls/753,"Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version. Commits
- `b8e2a45` 0.12.0
- `06580c6` Update changelog
- `b45de23` Fixed failing test case, 'test_asyncio_marker_without_loop'.
- `238cced` Put event_loop first among the fixtures of asyncio tests, fixes #154.
- `e5e3dc7` Added unittests for issue #154.
- `a7e5795` 0.12.0 open for business!
- `1026c39` 0.11.0
- `ab2b140` Test on Python 3.8, drop 3.3 and 3.4
- `6397a22` plugin: Use pytest 5.4.0 new Function API
- `21a0f94` Replace yield_fixture() by fixture()
- Additional commits viewable in compare view
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/753/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 611997130,MDU6SXNzdWU2MTE5OTcxMzA=,754,Clean up aiofiles warnings on 3.8,9599,simonw,closed,0,,,,,2,2020-05-04T16:14:59Z,2020-05-04T16:22:30Z,2020-05-04T16:22:30Z,OWNER,,"https://travis-ci.org/github/simonw/datasette/jobs/682624476 Lots of warnings like this: ``` /home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/utils.py:33 /home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/utils.py:33 /home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/utils.py:33: DeprecationWarning: ""@coroutine"" decorator is deprecated since Python 3.8, use ""async def"" instead def method(self, *args, **kwargs): /home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/__init__.py:27 /home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/__init__.py:27: DeprecationWarning: ""@coroutine"" decorator is deprecated since Python 3.8, use ""async def"" instead def _open(file, mode='r', buffering=-1, encoding=None, errors=None, newline=None, ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/754/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612082842,MDU6SXNzdWU2MTIwODI4NDI=,755,"Fix ""no such column: id"" output in tests",9599,simonw,closed,0,,,,,1,2020-05-04T18:37:49Z,2020-05-04T18:42:14Z,2020-05-04T18:42:14Z,OWNER,,"``` pytest ... tests/test_custom_pages.py ........ [ 33%] tests/test_database.py ......no such column: id ... [ 35%] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/755/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612089949,MDU6SXNzdWU2MTIwODk5NDk=,756,Add pipx to installation documentation,9599,simonw,closed,0,,,,,2,2020-05-04T18:49:01Z,2020-05-04T19:19:06Z,2020-05-04T19:10:33Z,OWNER,,"Add to this page: https://datasette.readthedocs.io/en/stable/installation.html Here's how to install plugins: https://twitter.com/simonw/status/1257348687979778050 ``` $ datasette plugins [] $ pipx inject datasette datasette-json-html injected package datasette-json-html into venv datasette done! ✨ 🌟 ✨ $ datasette plugins [ { ""name"": ""datasette-json-html"", ""static"": false, ""templates"": false, ""version"": ""0.6"" } ] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/756/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612151767,MDU6SXNzdWU2MTIxNTE3Njc=,15,Expose scores from ZCOMPUTEDASSETATTRIBUTES,9599,simonw,closed,0,,,,,7,2020-05-04T20:36:07Z,2020-12-20T04:44:22Z,2020-05-05T00:11:45Z,MEMBER,,"The Apple Photos database has a `ZCOMPUTEDASSETATTRIBUTES` that looks absurdly interesting... 
it has calculated scores for every photo: ",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612287234,MDU6SXNzdWU2MTIyODcyMzQ=,16,"Import machine-learning detected labels (dog, llama etc) from Apple Photos",9599,simonw,open,0,,,,,13,2020-05-05T02:45:43Z,2020-05-05T05:38:16Z,,MEMBER,,"Follow-on from #1. Apple Photos runs some very sophisticated machine learning on-device to figure out if photos are of dogs, llamas and so on. I really want to extract those labels out into my own database.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/16/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 1, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 612378203,MDU6SXNzdWU2MTIzNzgyMDM=,757,Question: Any fixed date for the release with the uft8-encoding fix?,2181410,clausjuhl,closed,0,,,,,3,2020-05-05T06:51:20Z,2020-05-06T18:41:29Z,2020-05-06T18:41:29Z,NONE,,Just a little impatient :),107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/757/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612382643,MDU6SXNzdWU2MTIzODI2NDM=,758,Question: Access to immutable database-path,2181410,clausjuhl,open,0,,,,,6,2020-05-05T07:01:18Z,2020-05-28T08:23:27Z,,NONE,,"Hi Simon Is there anywhere in the app-context where one can access the hashed urlpath of the database? Currently it's included in the template-context (`databases[0][""path"")` when rendering urls of the database (eg. `/db-44b06v9/cases`...), but where can I find the hashed url when rendering the index-page? I'm trying to avoid redirects. Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/758/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 612658444,MDU6SXNzdWU2MTI2NTg0NDQ=,109,"table.create_index(..., ignore=True)",9599,simonw,closed,0,,,,,1,2020-05-05T14:44:21Z,2020-05-05T14:46:53Z,2020-05-05T14:46:53Z,OWNER,,Option to silently do nothing if the index already exists.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/109/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612673948,MDU6SXNzdWU2MTI2NzM5NDg=,759,fts search on a column doesn't work anymore due to escape_fts,133845,Krazybug,closed,0,,,,,3,2020-05-05T15:03:44Z,2021-07-16T02:11:54Z,2020-05-06T17:50:57Z,NONE,,"Hi and first, thank you for this awesome work you make with this projet. On a db indexed in full text search, I can't query on indexed column anymore. This request ""cauvin language:ita"": is running smoothly on a old version of datasette but not on the current version. 
Compare the current version query `select uuid, title, authors, year, series, language, formats, publisher, tags, identifiers from summary where rowid in (select rowid from summary_fts where summary_fts match escape_fts(:search)) order by uuid limit 101` To an older version: `select title, authors, series, uuid, language, identifiers, tags, publisher, formats, year, links from summary where rowid in (select rowid from summary_fts where summary_fts match :search) order by uuid limit 101` _language_ is a searchable column but now the search string is known as ""cauvin language:ita"" literally as a search term. columns are not parsed. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/759/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612860531,MDU6SXNzdWU2MTI4NjA1MzE=,17,Only install osxphotos if running on macOS,9599,simonw,closed,0,,,,,3,2020-05-05T20:03:26Z,2020-05-05T20:20:05Z,2020-05-05T20:11:23Z,MEMBER,,The build is broken right now because you can't `pip install osxphotos` on Ubuntu.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612860758,MDU6SXNzdWU2MTI4NjA3NTg=,18,Switch CI solution to GitHub Actions with a macOS runner,9599,simonw,open,0,,,,,1,2020-05-05T20:03:50Z,2020-05-05T23:49:18Z,,MEMBER,,Refs #17.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 613002220,MDU6SXNzdWU2MTMwMDIyMjA=,19,apple-photos command should work even if upload has not run,9599,simonw,closed,0,,,,,1,2020-05-06T02:02:25Z,2020-05-19T20:59:59Z,2020-05-19T20:59:59Z,MEMBER,,"I want people to be able to query their Apple Photos metadata without having to first run `upload` to upload all of their files to their own S3 bucket. To do this I can have `apple-photos` calculate SHA256 hashes of each photo if the `uploads` table does not yet exist (or does not contain that photo).",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613006393,MDU6SXNzdWU2MTMwMDYzOTM=,20,Ability to serve thumbnailed Apple Photo from its place on disk,9599,simonw,closed,0,,,,,10,2020-05-06T02:17:50Z,2020-05-25T20:14:22Z,2020-05-25T20:09:41Z,MEMBER,,"A custom Datasette plugin that can be run locally on a Mac laptop which knows how to serve photos such that they can be seen in the browser. 
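As a rough illustration of the `escape_fts()` regression reported in datasette #759 above: once every token of the user's query is quoted, FTS column-filter syntax such as `language:ita` is matched literally instead of being parsed. The helper below is a hypothetical stand-in, not Datasette's actual `escape_fts()` code.

```python
def quote_fts_query(query):
    # Hypothetical FTS-escaping helper: wrap each whitespace-separated token
    # in double quotes so FTS matches it as a literal term.
    return " ".join(
        '"{}"'.format(token.replace('"', '""')) for token in query.split()
    )

print(quote_fts_query("cauvin language:ita"))
# "cauvin" "language:ita"
# The quoted form is matched as two literal terms, so the language: column
# filter in the original query is no longer interpreted by SQLite FTS.
```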
_Originally posted by @simonw in https://github.com/dogsheep/photos-to-sqlite/issues/19#issuecomment-624406285_",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613422636,MDU6SXNzdWU2MTM0MjI2MzY=,760,Way of seeing full schema for a database,9599,simonw,open,0,,,,,3,2020-05-06T15:46:08Z,2020-05-06T23:49:06Z,,OWNER,,"I find myself wanting to quickly figure out all of the BLOB columns in a database. A `/-/schema` page showing the full schema (actually since it's per-database probably `/dbname/-/schema` or `/-/schema/dbname`) would be really handy. It would need to be carefully constructed from various queries against `sqlite_master` - just doing `select * from sqlite_master where type='table'` isn't quite enough because I also want to show indexes, triggers etc.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/760/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 613467382,MDU6SXNzdWU2MTM0NjczODI=,761,Allow-list pragma_table_info(tablename) and similar,9599,simonw,closed,0,,,,,8,2020-05-06T16:54:14Z,2020-05-07T03:09:05Z,2020-05-06T17:18:38Z,OWNER,,"It would be great if `pragma_table_info(tablename)` was allowed to be used in queries. See also https://github.com/simonw/til/blob/master/sqlite/list-all-columns-in-a-database.md > `select * from pragma_table_info(tablename);` is currently disallowed for user-provided queries via a regex restriction - but could help here too. > > https://github.com/simonw/datasette/blob/d349d57cdf3d577afb62bdf784af342a4d5be660/datasette/utils/__init__.py#L174 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/760#issuecomment-624729459_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/761/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613491342,MDU6SXNzdWU2MTM0OTEzNDI=,762,Experiment with PRAGMA hard_heap_limit ,9599,simonw,open,0,,,,,0,2020-05-06T17:33:23Z,2020-05-07T03:08:44Z,,OWNER,,"This was added in SQLite 2020-01-22 (3.31.0): https://www.sqlite.org/changes.html#version_3_31_0 > Add the [sqlite3_hard_heap_limit64()](https://www.sqlite.org/c3ref/hard_heap_limit64.html) interface and the corresponding [PRAGMA hard_heap_limit](https://www.sqlite.org/pragma.html#pragma_hard_heap_limit) command. This sounds like it could be a nice extra safety measure.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/762/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 613755043,MDU6SXNzdWU2MTM3NTUwNDM=,110,Support decimal.Decimal type,134771,dvhthomas,closed,0,,,,,6,2020-05-07T03:57:19Z,2020-05-11T01:58:20Z,2020-05-11T01:50:11Z,NONE,,"Decimal types in Postgres cause a failure in db.py data type selection --- I have a Django app using a MoneyField, which uses a `numeric(14,0)` data type in Postgres (https://www.postgresql.org/docs/9.3/datatype-numeric.html). When attempting to export that table I get the following error: ```bash $ db-to-sqlite --table isaweb_proposal ""postgres://connection"" test.db .... 
column_type=COLUMN_TYPE_MAPPING[column_type], KeyError: <class 'decimal.Decimal'>
``` Looking at `sql_utils.db.py` at 292-ish it's clear that there is no matching type for what I assume SQLAlchemy interprets as Python decimal.Decimal. From the [SQLite docs](https://www.sqlite.org/datatype3.html#affinity_name_examples) it looks like DECIMAL in other DBs are considered numeric. I'm not quite sure if it's as simple as adding a data type to that list or if there are repercussions beyond it. Thanks for a great tool!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613777056,MDU6SXNzdWU2MTM3NzcwNTY=,39,issues foreign key to repo isn't working,9599,simonw,closed,0,,,,,1,2020-05-07T05:11:48Z,2020-08-18T14:24:46Z,2020-08-18T14:23:56Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issues?_facet=repo If the foreign key was working those would be repository names. 
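A minimal sketch of the kind of fix suggested in sqlite-utils #110 above: add `decimal.Decimal` to the column-type mapping. The dictionary below is illustrative only; the real `COLUMN_TYPE_MAPPING` in `sqlite_utils/db.py` is larger and may pick a different SQLite type.

```python
import decimal

# Illustrative subset of a column-type mapping; not the real sqlite-utils table.
COLUMN_TYPE_MAPPING = {
    int: "INTEGER",
    float: "FLOAT",
    str: "TEXT",
    bytes: "BLOB",
    # Assumed addition: SQLite has no DECIMAL type, so store Decimal values with
    # numeric affinity (FLOAT here), or fall back to TEXT to preserve precision.
    decimal.Decimal: "FLOAT",
}

print(COLUMN_TYPE_MAPPING[decimal.Decimal])  # no more KeyError for Decimal columns
```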
From the schema at the bottom of the page: ``` [repo] TEXT, ``` That's the wrong type and not a foreign key.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 614806683,MDExOlB1bGxSZXF1ZXN0NDE1Mjg2MTA1,763,Documentation + improvements for db.execute() and Results class,9599,simonw,closed,0,,,,,0,2020-05-08T15:16:02Z,2020-06-11T16:05:48Z,2020-05-08T16:05:46Z,OWNER,simonw/datasette/pulls/763,"Refs #685 Still TODO: - [x] Implement `results.first()` - [x] Implement `results.single_value()` - [x] Unit tests for the above ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/763/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 615474990,MDU6SXNzdWU2MTU0NzQ5OTA=,21,bpylist.archiver.CircularReference: archive has a cycle with uid(13),9599,simonw,closed,0,,,,,11,2020-05-10T20:58:06Z,2020-12-19T07:44:49Z,2020-05-10T21:57:13Z,MEMBER,,"``` % python -i $(which photos-to-sqlite) apple-photos photos.db Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/photoinfo.py"", line 611, in place return self._place # pylint: disable=access-member-before-definition AttributeError: 'PhotoInfo' object has no attribute '_place' During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/bin/photos-to-sqlite"", line 11, in load_entry_point('photos-to-sqlite', 'console_scripts', 'photos-to-sqlite')() File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/photos-to-sqlite/photos_to_sqlite/cli.py"", line 249, in apple_photos photo_row = osxphoto_to_row(sha256, photo) File ""/Users/simon/Dropbox/Development/photos-to-sqlite/photos_to_sqlite/utils.py"", line 91, in osxphoto_to_row place = photo.place File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/photoinfo.py"", line 614, in place self._place = PlaceInfo5(self._info[""reverse_geolocation""]) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 505, in __init__ self._plrevgeoloc = archiver.unarchive(revgeoloc_bplist) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 
16, in unarchive return Unarchive(plist).top_object() File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 256, in top_object return self.decode_object(self.top_uid) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 126, in decode_archive mapItem = archive.decode(""mapItem"") File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 140, in decode return self._unarchiver.decode_key(self._object, key) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 216, in decode_key return self.decode_object(val) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 180, in decode_archive sortedPlaceInfos = archive.decode(""sortedPlaceInfos"") File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 140, in decode return self._unarchiver.decode_key(self._object, key) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 216, in decode_key return self.decode_object(val) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 112, in decode_archive return [archive._decode_index(index) for index in uids] File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 112, in return [archive._decode_index(index) for index in uids] File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 137, in _decode_index return self._unarchiver.decode_object(index) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 217, in decode_archive placeType = archive.decode(""placeType"") File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 140, in decode return self._unarchiver.decode_key(self._object, key) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 216, in decode_key return self.decode_object(val) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 227, in decode_object raise CircularReference(index) 
bpylist.archiver.CircularReference: archive has a cycle with uid(13) ``` In the debugger I traced this back to: ``` 178 @staticmethod 179 def decode_archive(archive): 180 -> sortedPlaceInfos = archive.decode(""sortedPlaceInfos"") 181 finalPlaceInfos = archive.decode(""finalPlaceInfos"") 182 return PLRevGeoMapItem(sortedPlaceInfos, finalPlaceInfos) ```",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 615477131,MDU6SXNzdWU2MTU0NzcxMzE=,111,sqlite-utils drop-table and drop-view commands,9599,simonw,closed,0,,,,,2,2020-05-10T21:10:42Z,2020-05-11T01:58:36Z,2020-05-11T00:44:26Z,OWNER,,Would be useful to be able to drop views and tables from the CLI.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/111/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 615626118,MDU6SXNzdWU2MTU2MjYxMTg=,22,Try out ExifReader,9599,simonw,open,0,,,,,4,2020-05-11T06:32:13Z,2020-05-14T05:59:53Z,,MEMBER,,"https://pypi.org/project/ExifReader/ New fork that should be able to handle EXIF in HEIC files. Forked here: https://github.com/ianare/exif-py/issues/102#issuecomment-626376522 Refs #3 ",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 616012427,MDU6SXNzdWU2MTYwMTI0Mjc=,764,Add PyPI project urls to setup.py,9599,simonw,closed,0,,,5471110,Datasette 0.43,3,2020-05-11T16:23:08Z,2020-05-27T20:21:36Z,2020-05-11T18:28:55Z,OWNER,,"Spotted this example here: ```python project_urls={ ""Issues"": ""https://gitlab.com/Cyb3r-Jak3/ExifReader/issues"", ""Source Code"": ""https://gitlab.com/Cyb3r-Jak3/ExifReader/-/tree/publish"", ""CI"": ""https://gitlab.com/Cyb3r-Jak3/ExifReader/pipelines"", ""Releases"": ""https://github.com/Cyb3r-Jak3/ExifReader"" }, ``` Results in this on https://pypi.org/project/ExifReader/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/764/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 616087149,MDU6SXNzdWU2MTYwODcxNDk=,765,publish heroku should default to currently tagged version,9599,simonw,open,0,,,,,1,2020-05-11T18:24:06Z,2020-05-11T18:25:43Z,,OWNER,,"Had a report that deploying to Heroku was using the previously installed version of Datasette, not the latest. Could be because of this: https://github.com/simonw/datasette/blob/af6c6c5d6f929f951c0e63bfd1c82e37a071b50f/datasette/publish/heroku.py#L172-L179 Heroku documentation recommends pinning to specific versions https://devcenter.heroku.com/articles/python-pip So... 
we could ensure we default to an install value of `[""datasette>=current_tag""]`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/765/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 616271236,MDU6SXNzdWU2MTYyNzEyMzY=,112,"add_foreign_key(...., ignore=True)",9599,simonw,closed,0,,,5896742,2.19,4,2020-05-12T00:24:00Z,2020-09-20T22:17:34Z,2020-09-20T22:17:34Z,OWNER,,"When using this library I often find myself wanting to ""add this foreign key, but only if it doesn't exist yet"". The `ignore=True` parameter is increasingly being used for this else where in the library (e.g. in `create_view()`).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 617323873,MDU6SXNzdWU2MTczMjM4NzM=,766,Enable wildcard-searches by default,2181410,clausjuhl,open,0,,,,,2,2020-05-13T10:14:48Z,2021-03-05T16:35:21Z,,NONE,,"Hi Simon. It seems that datasette currently has wildcard-searches disabled by default (along with the boolean search-options, NEAR-queries and more, and despite the docs). If I try out the search-url provided in the [docs](https://datasette.readthedocs.io/en/stable/full_text_search.html#the-table-page-and-table-view-api) (https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort), it does not handle wildcard-searches, and I'm unable to make it work on my datasette-instance. I would argue that wildcard-searches is such a standard query, that it should be enabled by default. Requiring ""_searchmode=raw"" when using prefix-searches seems unnecessary. Plus: What happens to non-ascii searches when using ""_searchmode=raw""? Is the ""escape_fts""-function from datasette.utils ignored? Thanks! /Claus",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/766/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 620969465,MDU6SXNzdWU2MjA5Njk0NjU=,767,Allow to specify a URL fragment for canned queries,2657547,rixx,closed,0,,,5471110,Datasette 0.43,2,2020-05-19T13:17:42Z,2020-05-27T21:52:25Z,2020-05-27T21:52:25Z,CONTRIBUTOR,,"Canned queries are very useful to direct users to prepared data and views. I like to use them with charts using datasette-vega a lot, because people get a direct impression at first glance. datasette-vega doesn't show up by default though, and users have to click through to it. Also, datasette-vega does not always guess the best way to render columns correctly though, so it would be nice if I could specify a URL fragment in my canned queries to make sure people see what I want them to see. 
My current workaround is to include a fragement link in ``description_html`` and ask people to reload the page, like [here](https://data.rixx.de/songs/show_by_bpm#g.mark=bar&g.x_column=bpm_floor&g.x_type=ordinal&g.y_column=bpm_count&g.y_type=quantitative), which is a bit hacky.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/767/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621280529,MDU6SXNzdWU2MjEyODA1Mjk=,23,create-subset command for creating a publishable subset of a photos database,9599,simonw,closed,0,,,,,1,2020-05-19T20:58:20Z,2020-05-19T22:32:48Z,2020-05-19T22:32:37Z,MEMBER,,"I want to share a subset of my photos, without sharing everything. Idea: $ photos-to-sqlite create-subset photos.db public.db ""select sha256 from ... where ..."" So the command takes a SQL query that returns sha256 hashes, then creates a new file called `public.db` containing just the data corresponding to those photos.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621286870,MDU6SXNzdWU2MjEyODY4NzA=,113,Syntactic sugar for ATTACH DATABASE,9599,simonw,closed,0,,,,,2,2020-05-19T21:10:00Z,2021-02-19T05:09:12Z,2021-02-19T04:56:36Z,OWNER,,"https://www.sqlite.org/lang_attach.html Maybe something like this: ```python db.attach(""other_db"", ""other_db.db"") ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621323348,MDU6SXNzdWU2MjEzMjMzNDg=,24,Configurable URL for images,9599,simonw,open,0,,,,,1,2020-05-19T22:25:56Z,2020-05-20T06:00:29Z,,MEMBER,,"This is hard-coded at the moment, which is bad: https://github.com/dogsheep/photos-to-sqlite/blob/d5d69b9019703c47bc251444838578dd752801e2/photos_to_sqlite/cli.py#L269-L272",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 621332242,MDU6SXNzdWU2MjEzMzIyNDI=,25,Create a public demo,9599,simonw,closed,0,,,,,5,2020-05-19T22:47:20Z,2020-05-21T22:26:16Z,2020-05-20T05:54:18Z,MEMBER,,"So I can show people what this does, using some of my photos.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
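To close, a small sketch of the `ATTACH DATABASE` sugar proposed in sqlite-utils #113 above. The `attach()` helper is hypothetical; it simply wraps the underlying SQL statement.

```python
import sqlite3

def attach(conn, alias, filepath):
    # Hypothetical helper mirroring the proposed db.attach("other_db", "other_db.db")
    conn.execute("ATTACH DATABASE ? AS [{}]".format(alias), (str(filepath),))

conn = sqlite3.connect(":memory:")
attach(conn, "other_db", "other_db.db")
# Tables in the attached file are now addressable as other_db.tablename:
rows = conn.execute(
    "select name from other_db.sqlite_master where type = 'table'"
).fetchall()
print(rows)
```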