github
id | node_id | number | state | locked | title | user | body | created_at | updated_at | closed_at | merged_at | merge_commit_sha | assignee | milestone | draft | head | base | author_association | repo | url | merged_by | auto_merge |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
163561830 | MDExOlB1bGxSZXF1ZXN0MTYzNTYxODMw | 181 | closed | 0 | add "format sql" button to query page, uses sql-formatter | 1957344 | Cool project! This fixes #136 using the suggested [sql formatter](https://github.com/zeroturnaround/sql-formatter) library. I included the minified version in the bundle and added the relevant scripts to the codemirror includes instead of adding new files, though I could also add new files. I wanted to keep it all together, since the result of the format needs access to the editor in order to properly update the codemirror instance. | 2018-01-17T21:50:04Z | 2019-11-11T03:08:25Z | 2019-11-11T03:08:25Z | a9ac208088e536043890e0f7ff8a182398576a51 | 0 | 86ac746cfcbf2fa86863f8fab528494600eac1ae | a290f28caae61b47e76e825c06984f22fc41a694 | NONE | 107914493 | https://github.com/simonw/datasette/pull/181 | |||||
165029807 | MDExOlB1bGxSZXF1ZXN0MTY1MDI5ODA3 | 182 | closed | 0 | Add db filesize next to download link | 3433657 | Took a stab at #172, will this do the trick? | 2018-01-25T04:58:56Z | 2019-03-22T13:50:57Z | 2019-02-06T04:59:38Z | a8d9e69872dec9a551b25cd609ffdbf3896045bd | 0 | b62835205a830472abb66c708822c2dcdf4ab027 | 56623e48da5412b25fb39cc26b9c743b684dd968 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/182 | |||||
181247568 | MDExOlB1bGxSZXF1ZXN0MTgxMjQ3NTY4 | 202 | closed | 0 | Raise 404 on nonexistent table URLs | 45057 | Currently they just 500. Also cleaned the logic up a bit, I hope I didn't miss anything. This is issue #184. | 2018-04-12T15:47:06Z | 2018-04-13T19:22:56Z | 2018-04-13T18:19:15Z | 134150933ade84327cfd97a88d536f5bff37a136 | 0 | 71bbf4e4be8a9ab7bcc4ddfb33760c7d902f4a34 | bfb4e45a7bcb880758dbc18f66258de26c1d1904 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/202 | |||||
181642114 | MDExOlB1bGxSZXF1ZXN0MTgxNjQyMTE0 | 205 | closed | 0 | Support filtering with units and more | 45057 | The first commit: * Adds units to exported JSON * Adds units key to metadata skeleton * Adds some docs for units The second commit adds filtering by units by the first method I mentioned in #203: ![image](https://user-images.githubusercontent.com/45057/38767463-7193be16-3fd9-11e8-8a5f-ac4159415c6d.png) [Try it here](https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency?frequency__gt=50GHz&height__lt=50ft). I think it integrates pretty neatly. The third commit adds support for registering custom units with Pint from metadata.json. Probably pretty niche, but I need decibels! | 2018-04-14T10:47:51Z | 2018-04-14T15:24:04Z | 2018-04-14T15:24:04Z | ed059c70e87a2930206652621e23a55167aa57c1 | 0 | eb3a37c34813ecbbfdae015305fec1f2a4ec27a5 | 6b15a53cd3cd40880a5e2d38827d5fac10e4bb5f | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/205 | |||||
187770345 | MDExOlB1bGxSZXF1ZXN0MTg3NzcwMzQ1 | 258 | closed | 0 | Add new metadata key persistent_urls which removes the hash from all database urls | 247131 | Add new metadata key "persistent_urls" which removes the hash from all database urls when set to "true" This PR is just to gauge if this, or something like it, is something you would consider merging? I understand the reason why the substring of the hash is included in the url but there are some use cases where the urls should persist across deployments. For bookmarks for example or for scripts that use the JSON API. This is the initial commit for this feature. Tests and documentation updates to follow. | 2018-05-14T09:39:18Z | 2018-05-21T07:38:15Z | 2018-05-21T07:38:15Z | 457fcdfc82a0260db543d49006d49f8486f233b5 | 0 | 0d77a896ccb16b34c86fdeef7738f2d056e27e02 | 2b79f2bdeb1efa86e0756e741292d625f91cb93d | NONE | 107914493 | https://github.com/simonw/datasette/pull/258 | |||||
189860052 | MDExOlB1bGxSZXF1ZXN0MTg5ODYwMDUy | 281 | closed | 0 | Reduces image size using Alpine + Multistage (re: #278) | 487897 | Hey Simon! I got the image size down from 256MB to 110MB. Seems to be working okay, but you might want to test it a bit more. Example output of `docker run --rm -it <my-tag> datasette` ``` Serve! files=() on port 8001 [2018-05-23 05:23:08 +0000] [1] [INFO] Goin' Fast @ http://127.0.0.1:8001 [2018-05-23 05:23:08 +0000] [1] [INFO] Starting worker [1] ``` Related: https://github.com/simonw/datasette/issues/278 | 2018-05-23T05:27:05Z | 2018-05-26T02:10:38Z | 2018-05-26T02:10:38Z | 0d6c8fa841ae5d28e151e4ba43370289d1e2e22c | 0 | 3af65075c430d94647f8a1b1f215e82f563bc46f | 49f317752cfe89c5641165a490eef49e025752a7 | NONE | 107914493 | https://github.com/simonw/datasette/pull/281 | |||||
190901429 | MDExOlB1bGxSZXF1ZXN0MTkwOTAxNDI5 | 293 | closed | 0 | Support for external database connectors | 11912854 | I think it would be nice that Datasette could work with other file formats that aren't SQLite, like files with PyTables format. I've tried to accomplish that using external connectors published with entry points. These external connectors must have a structure similar to the structure [PyTables Datasette connector](https://github.com/PyTables/datasette-pytables) has. | 2018-05-28T11:02:45Z | 2018-09-11T14:32:45Z | 2018-09-11T14:32:45Z | ad2cb12473025ffab738d4df6bb47cd8b2e27859 | 0 | 59c94be46f9ccd806dd352fa28a6dba142d5ab82 | b7257a21bf3dfa7353980f343c83a616da44daa7 | FIRST_TIME_CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/293 | |||||
193361341 | MDExOlB1bGxSZXF1ZXN0MTkzMzYxMzQx | 307 | closed | 0 | Initial sketch of custom URL routing, refs #306 | 9599 | See #306 for background on this. | 2018-06-07T15:26:48Z | 2018-06-07T15:29:54Z | 2018-06-07T15:29:41Z | 8c6663d3cc8043fc6f5c796275e80b0445bdff12 | 0 | 018af454f286120452e33d2568dd40908474a8a8 | a246f476b4fe490f5450836b22961bc607e6b4b0 | OWNER | 107914493 | https://github.com/simonw/datasette/pull/307 | |||||
195339111 | MDExOlB1bGxSZXF1ZXN0MTk1MzM5MTEx | 311 | closed | 0 | ?_labels=1 to expand foreign keys (in csv and json), refs #233 | 9599 | Output looks something like this: { "rowid": 233, "TreeID": 121240, "qLegalStatus": { "value": 2, "label": "Private" }, "qSpecies": { "value": 16, "label": "Sycamore" }, "qAddress": "91 Commonwealth Ave", ... } | 2018-06-16T16:31:12Z | 2018-06-16T22:20:31Z | 2018-06-16T22:20:31Z | 9fe59e54ad65eb1c8239b1a78edb5219d3ab8ab0 | 0 | 40287b1ba09d6e75f0db1458fe78d8c055f128af | d0a578c0fc07b9d9208cd9de981bdf7385a26c49 | OWNER | 107914493 | https://github.com/simonw/datasette/pull/311 | |||||
196526861 | MDExOlB1bGxSZXF1ZXN0MTk2NTI2ODYx | 322 | closed | 0 | Feature/in operator | 2691848 | 2018-06-21T17:41:51Z | 2018-06-21T17:45:25Z | 2018-06-21T17:45:25Z | 80b7bcefa1c07202779d98c9e2214f3ebad704e3 | 0 | 1acc562a2f60a7289438df657db8fd6dd3a7391d | e7566cc59d4b02ef301054fd35fdde6c925a8e38 | NONE | 107914493 | https://github.com/simonw/datasette/pull/322 | ||||||
211860706 | MDExOlB1bGxSZXF1ZXN0MjExODYwNzA2 | 363 | open | 0 | Search all apps during heroku publish | 436032 | Adds the `-A` option to include apps from all organizations when searching app names for publish. | 2018-08-29T19:25:10Z | 2018-08-31T14:39:45Z | b684b04c30f6b8779a3d11f7599329092fb152f3 | 0 | 2dd363e01fa73b24ba72f539c0a854bc901d23a7 | b7257a21bf3dfa7353980f343c83a616da44daa7 | FIRST_TIME_CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/363 | ||||||
214653641 | MDExOlB1bGxSZXF1ZXN0MjE0NjUzNjQx | 364 | open | 0 | Support for other types of databases using external connectors | 11912854 | This PR is related to #293, but now all commits have been merged. The purpose is to support other file formats that aren't SQLite, like files with PyTables format. I've tried to accomplish that using external connectors published with entry points. The modifications in the original datasette code are minimal and many are in a separated file. | 2018-09-11T14:31:47Z | 2018-09-11T14:31:47Z | d84f3b1f585cb52b58aed0401c34214de2e8b47b | 0 | 592fd05f685859b271f0305c2fc8cdb7da58ebfb | b7257a21bf3dfa7353980f343c83a616da44daa7 | FIRST_TIME_CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/364 | ||||||
232172106 | MDExOlB1bGxSZXF1ZXN0MjMyMTcyMTA2 | 389 | closed | 0 | Bump dependency versions | 9599 | 2018-11-20T02:23:12Z | 2019-11-13T19:13:41Z | 2019-11-13T19:13:41Z | 9194c0165aef411e0784ba49939b1005306f1f38 | 0 | f8349b45916e68d2f89c57694bd0e6afaf1bd508 | 5e3a432a0caa23837fa58134f69e2f82e4f632a6 | OWNER | 107914493 | https://github.com/simonw/datasette/pull/389 | ||||||
247923347 | MDExOlB1bGxSZXF1ZXN0MjQ3OTIzMzQ3 | 404 | closed | 0 | Experiment: run Jinja in async mode | 9599 | See http://jinja.pocoo.org/docs/2.10/api/#async-support Tests all pass. Have not checked performance difference yet. Creating pull request to run tests in Travis. This is not ready to merge - I'm not yet sure if this is a good idea. | 2019-01-27T00:28:44Z | 2019-11-12T05:02:18Z | 2019-11-12T05:02:13Z | 773bcac907d17b16eef604ad943837da39a10090 | 0 | dd7f24a47f660e2f0fc1e97a13d28908c28dc245 | 909cc8fbdfc9c05e447f40e9a73489809602c3cd | OWNER | 107914493 | https://github.com/simonw/datasette/pull/404 | |||||
266035382 | MDExOlB1bGxSZXF1ZXN0MjY2MDM1Mzgy | 424 | closed | 0 | Column types in inspected metadata | 45057 | This PR does two things: * Adds the sqlite column type for each column to the inspected table info. * Stops binary columns from being rendered to HTML, unless a plugin handles it. There's a bit more detail in the changeset descriptions. These changes are intended as a precursor to a plugin which adds first-class support for Spatialite geographic primitives, and perhaps more useful geo-stuff. | 2019-03-31T18:46:33Z | 2019-04-29T18:30:50Z | 2019-04-29T18:30:46Z | a332d4e0b3fed7165a22880430664f1c3a00963d | 0 | 92e7b8c67fe5bcd484f19576f20c9235aca9050b | 0209a0a344503157351e625f0629b686961763c9 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/424 | |||||
270251021 | MDExOlB1bGxSZXF1ZXN0MjcwMjUxMDIx | 432 | closed | 0 | Refactor facets to a class and new plugin, refs #427 | 9599 | WIP for #427 | 2019-04-13T20:04:45Z | 2019-05-03T00:04:24Z | 2019-05-03T00:04:24Z | b78bc19269ed83b054a60c79c4fe08f4ca943942 | 0 | 5c198f7ca5d2aff49180820271ba8d06b79aefb1 | 9c77e6e355ec718d76178a7607721d10a66b6aef | OWNER | 107914493 | https://github.com/simonw/datasette/pull/432 | |||||
274174614 | MDExOlB1bGxSZXF1ZXN0Mjc0MTc0NjE0 | 437 | closed | 0 | Add inspect and prepare_sanic hooks | 45057 | This adds two new plugin hooks: The `inspect` hook allows plugins to add data to the inspect dictionary. The `prepare_sanic` hook allows plugins to hook into the web router. I've attached a warning to this hook in the docs in light of #272 but I want this hook now... On quick inspection, I don't think it's worthwhile to try and make this hook independent of the web framework (but it looks like Starlette would make the hook implementation a bit nicer). Ref #14 | 2019-04-28T11:53:34Z | 2019-06-24T16:38:57Z | 2019-06-24T16:38:56Z | 7aeaac7c478acf572bda61bdaa6ac3247dc15811 | 0 | f33a0a63a7442f0b665320ac3e2eb55de315f1f7 | 11b352b4d52fd02a422776edebb14f12e4994d3b | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/437 | |||||
274313625 | MDExOlB1bGxSZXF1ZXN0Mjc0MzEzNjI1 | 439 | closed | 0 | [WIP] Add primary key to the extra_body_script hook arguments | 45057 | This allows the row to be identified on row pages. The context here is that I want to access the row's data to plot it on a map. I considered passing the entire template context through to the hook function. This would expose the actual row data and potentially avoid a further fetch request in JS, but it does make the plugin API a lot more leaky. (At any rate, using the selected row data is tricky in my case because of Spatialite's infuriating custom binary representation...) | 2019-04-29T10:08:23Z | 2019-05-01T09:58:32Z | 2019-05-01T09:58:30Z | b3cbcfef4d11d2741cf00861734d726a4730afe5 | 0 | 76b2c8fa406063b436155a7d8995e07b7e718c13 | 11b352b4d52fd02a422776edebb14f12e4994d3b | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/439 | |||||
275909197 | MDExOlB1bGxSZXF1ZXN0Mjc1OTA5MTk3 | 450 | closed | 0 | Coalesce hidden table count to 0 | 45057 | For some reason I'm hitting a `None` here with a FTS table. I'm not entirely sure why but this makes the logic work the same as with non-hidden tables. | 2019-05-04T09:37:10Z | 2019-05-11T18:10:09Z | 2019-05-11T18:10:09Z | 5918489a2a2f14b58c5c71773a9d4fb6bb0e3e0a | 0 | f81d9df985e8d054fc16ab91f72878fe71656354 | 55643430f7ac8d27e99b00e7cf79db741003e811 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/450 | |||||
275923066 | MDExOlB1bGxSZXF1ZXN0Mjc1OTIzMDY2 | 452 | open | 0 | SQL builder utility classes | 45057 | This adds a straightforward set of classes to aid in the construction of SQL queries. My plan for this was to allow plugins to manipulate the Datasette-generated SQL in a more structured way. I'm not sure that's going to work, but I feel like this is still a step forward - it reduces the number of intermediate variables in `TableView.data` which aids readability, and also factors out a lot of the boring string concatenation. There are a fair number of minor structure changes in here too as I've tried to make the ordering of `TableView.data` a bit more logical. As far as I can tell, I haven't broken anything... | 2019-05-04T13:57:47Z | 2019-05-04T14:03:04Z | 45e7460d78c3f87c01f2e9e142cb7f646b23b156 | 0 | c63762280d3bd66ad6ea24933dafe218861efef2 | 55643430f7ac8d27e99b00e7cf79db741003e811 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/452 | ||||||
285698310 | MDExOlB1bGxSZXF1ZXN0Mjg1Njk4MzEw | 501 | closed | 0 | Test against Python 3.8-dev using Travis | 9599 | 2019-06-06T08:37:53Z | 2019-11-11T03:23:29Z | 2019-11-11T03:23:29Z | 1aac0cf0ab962060dd5cff19b8b179bb7fa0f00b | 0 | a5defb684fcc734f6325ca08beef9f49c3e7a298 | 5e8fbf7f6fbc0b63d0479da3806dd9ccd6aaa945 | OWNER | 107914493 | https://github.com/simonw/datasette/pull/501 | ||||||
290971295 | MDExOlB1bGxSZXF1ZXN0MjkwOTcxMjk1 | 524 | closed | 0 | Sort commits using isort, refs #516 | 9599 | Also added a lint unit test to ensure they stay sorted. #516 | 2019-06-24T05:04:48Z | 2023-08-23T01:31:08Z | 2023-08-23T01:31:08Z | 4e92ebe00a058e02b2d7543cff60ac2f78aa97c7 | 0 | dafae70ee7f74ce79b541a94385172be3ad0de83 | cdd24f3eaa207f67d948c1876725b0f84654a623 | OWNER | 107914493 | https://github.com/simonw/datasette/pull/524 | |||||
291534596 | MDExOlB1bGxSZXF1ZXN0MjkxNTM0NTk2 | 529 | closed | 0 | Use keyed rows - fixes #521 | 1383872 | Supports template syntax like this: ``` {% for row in display_rows %} <h2 class="scientist">{{ row["First_Name"] }} {{ row["Last_Name"] }}</h2> ... ``` | 2019-06-25T12:33:48Z | 2019-06-25T12:35:07Z | 2019-06-25T12:35:07Z | 3be9759418fdfe4a8ae8aec46fc2a937d45332d2 | 0 | 312e3394bd9f3eaef606fbe37eb409ec7462baaf | 9e97b725f11be3f4dca077fe5569078a62ec2761 | NONE | 107914493 | https://github.com/simonw/datasette/pull/529 | |||||
295127213 | MDExOlB1bGxSZXF1ZXN0Mjk1MTI3MjEz | 546 | open | 0 | Facet by delimiter | 9599 | Refs #510 | 2019-07-07T20:06:05Z | 2019-11-18T23:46:01Z | 68a6fb1a576a747b868771d00a10753f35aaa0cf | 0 | 47ac6c6e46da16716d295d7cda8f79cd0663ca5e | a9909c29ccac771c23c2ef22b89d10697b5256b9 | OWNER | 107914493 | https://github.com/simonw/datasette/pull/546 | ||||||
296735320 | MDExOlB1bGxSZXF1ZXN0Mjk2NzM1MzIw | 557 | closed | 0 | Get tests running on Windows using Travis CI | 9599 | Refs #511 | 2019-07-11T16:36:57Z | 2021-07-10T23:39:48Z | 2021-07-10T23:39:48Z | cddb9a9fecfa25147d80df05f1a6d6e1686ca30d | 0 | 47b5ab43be87217c4e40ad93b8aa2e9639fa371f | f2006cca80040871439055ae6ccbc14e589bdf4b | OWNER | 107914493 | https://github.com/simonw/datasette/pull/557 | |||||
301483613 | MDExOlB1bGxSZXF1ZXN0MzAxNDgzNjEz | 564 | open | 0 | First proof-of-concept of Datasette Library | 9599 | Refs #417. Run it like this: datasette -d ~/Library Uses a new plugin hook - available_databases() | 2019-07-26T10:22:26Z | 2023-02-07T15:14:11Z | 4f425d2b39d1be10d7ef5c146480a3eb494d5086 | 1 | 947645d84710677ea50762016081a9fbc6b014a8 | a9453c4dda70bbf5122835e68f63db6ecbe1a6fc | OWNER | 107914493 | https://github.com/simonw/datasette/pull/564 | ||||||
313007483 | MDExOlB1bGxSZXF1ZXN0MzEzMDA3NDgz | 56 | closed | 0 | Escape the table name in populate_fts and search. | 49260 | The table names weren't escaped using double quotes in the populate_fts method. Reproducible case: ``` >>> import sqlite_utils >>> db = sqlite_utils.Database("abc.db") >>> db["http://example.com"].insert_all([ ... {"id": 1, "age": 4, "name": "Cleo"}, ... {"id": 2, "age": 2, "name": "Pancakes"} ... ], pk="id") <Table http://example.com (id, age, name)> >>> db["http://example.com"].enable_fts(["name"]) Traceback (most recent call last): File "<input>", line 1, in <module> db["http://example.com"].enable_fts(["name"]) File "/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/sqlite_utils/db.py", line 705, in enable_fts self.populate_fts(columns) File "/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/sqlite_utils/db.py", line 715, in populate_fts self.db.conn.executescript(sql) sqlite3.OperationalError: unrecognized token: ":" >>> ``` | 2019-09-01T06:29:05Z | 2019-09-02T17:23:21Z | 2019-09-02T17:23:21Z | 79852e97ecb69b88da87da0cba2a55887cf83bda | 0 | 83ca4c802f5d5102e73ff366e61514ded81dc7a1 | cb70f7d10996b844154bf3da88779dd1f65590bc | CONTRIBUTOR | 140912432 | https://github.com/simonw/sqlite-utils/pull/56 | |||||
322529381 | MDExOlB1bGxSZXF1ZXN0MzIyNTI5Mzgx | 578 | closed | 0 | Added support for multi arch builds | 887095 | Minor changes in Dockerfile and new Makefile to support Docker multi architecture builds. `make` will build one image per architecture and push them as one Docker manifest to Docker Hub. Feel free to change `IMAGE_NAME` to `datasetteproject/datasette` to update your official Docker Hub image(s). | 2019-09-29T18:43:03Z | 2019-11-13T19:13:15Z | 2019-11-13T19:13:15Z | ae1aa0929b9e62a413ec9b4a40588e6aafe50573 | 0 | ce6372bc6210ae52ac1951647b8fbaee40d64fc1 | 0fc8afde0eb5ef677f4ac31601540d6168c8208d | NONE | 107914493 | https://github.com/simonw/datasette/pull/578 | |||||
323983732 | MDExOlB1bGxSZXF1ZXN0MzIzOTgzNzMy | 579 | open | 0 | New connection pooling | 9599 | See #569 | 2019-10-02T23:22:19Z | 2019-11-15T22:57:21Z | 025b4024b1b43ea034b7fd331c30740165ff75f2 | 0 | 32cbfd2acd28bcefb97c442ac8e3ee2c07401e19 | a9909c29ccac771c23c2ef22b89d10697b5256b9 | OWNER | 107914493 | https://github.com/simonw/datasette/pull/579 | ||||||
327541046 | MDExOlB1bGxSZXF1ZXN0MzI3NTQxMDQ2 | 595 | closed | 0 | bump uvicorn to 0.9.0 to be Python-3.8 friendly | 4312421 | as uvicorn-0.9 is needed to get websockets-8.0.2, which is needed to have Python-3.8 compatibility | 2019-10-13T10:00:04Z | 2019-11-12T04:46:48Z | 2019-11-12T04:46:48Z | 5a7185bcd15aab28e86338b3771c25af13a94a4c | 0 | e1d92ea94ca8f14879ef280cb7dadab7eed76e9c | fffd69ec031b83f46680f192ba57a27f0d1f0b8a | NONE | 107914493 | https://github.com/simonw/datasette/pull/595 | |||||
335980246 | MDExOlB1bGxSZXF1ZXN0MzM1OTgwMjQ2 | 8 | closed | 0 | stargazers command, refs #4 | 9599 | Needs tests. Refs #4. | 2019-11-03T00:37:36Z | 2020-05-02T20:00:27Z | 2020-05-02T20:00:26Z | db25bdf8cee4c3e2d730cf269eb9a903b51cdb41 | 0 | ea07274667a08c67907e8bfbbccb6f0fb95ce817 | ae9035f8fe5aff1c54bff4c6b4c2e808a44f0f2a | MEMBER | 207052882 | https://github.com/dogsheep/github-to-sqlite/pull/8 | |||||
346264926 | MDExOlB1bGxSZXF1ZXN0MzQ2MjY0OTI2 | 67 | closed | 0 | Run tests against 3.5 too | 9599 | 2019-11-27T14:20:35Z | 2019-12-31T01:29:44Z | 2019-12-31T01:29:43Z | 88375b0bc055067b996584f06ed85a9a90c5aa1a | 0 | 4c6e5a4486e0e17555774eb3279142234a8b4abc | 0a0cec3cf27861455e8cd1c4d84937825a18bb30 | OWNER | 140912432 | https://github.com/simonw/sqlite-utils/pull/67 | ||||||
347179081 | MDExOlB1bGxSZXF1ZXN0MzQ3MTc5MDgx | 644 | closed | 0 | Validate metadata json on startup | 6025893 | This PR adds a sanity check which builds up a marshmallow schema on-the-fly based on the structure of the database(s) on startup and then validates the metadata json against it. In case of invalid data, this will raise with a descriptive error e.g: ``` marshmallow.exceptions.ValidationError: {'databases': {'fixtures': {'tables': {'not_a_table': ['Unknown field.']}}}} ``` Closes #260 --- This was intended to be fairly self-contained, but then while I was working on it, I hit some problems getting the tests to pass in the context of the test suite as a whole. My tests passed in isolation, but then failed while doing a full test suite run. That's when the worms started coming out of the can :bug: After some sleuthing, it turned out this was essentially the result of several issues intersecting: * There are certain events in the application lifecycle where the metadata schema can be modified after it is loaded e.g: https://github.com/simonw/datasette/blob/a562f2965552fb2dbbbd74df245c9965ee23d886/datasette/app.py#L299-L320 This means that sometimes what goes in isn't always exactly what comes out when you call `/-/metadata`. * Because the test fixtures use session scope for performance reasons if one unit test performs an action which mutates the metadata, that can impact on other unit tests which run after it using the same fixture. * Because the `self._metadata` property was being set with a simple assignment `self._metadata = metadata`, that created an object reference to the test fixture data, so operating on `self._metadata` was actually modifying the test fixture `METADATA` meaning that depending on when it was loaded in the test suite lifecycle, `METADATA` had different content, which was somewhat unexpected. As such, I've added some band-aids in 3552024 and 6859fd8: * Switching the metadata object to a `deepcopy` of the input prevents us directly mutating the input fixture. * I've switched some of the tests to use a fixture with function scope instead of session scope so we're workin… | 2019-11-30T00:32:15Z | 2021-07-28T17:58:45Z | 2021-07-28T17:58:45Z | e71b642474d54f986bc8857346103d8a10d84e6d | 0 | 6859fd8c5eef26f397aa949dc4edf3747e8ab0a5 | a562f2965552fb2dbbbd74df245c9965ee23d886 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/644 | |||||
354869391 | MDExOlB1bGxSZXF1ZXN0MzU0ODY5Mzkx | 652 | closed | 0 | Quick (and uninformed and perhaps misguided) attempt to add a <base> url for hosting datasette at a particular host/URI | 132978 | As usual, I don't really know what I'm doing... so this is just a suggested approach. I've not written tests, I've not run the tests, I don't know if I've missed some absolute URLs that would need to have the leading slash dropped. BUT, I tested it with `--config base_url:http://127.0.0.1:8001/` on the command line and from what little I know about datasette it's at least working in some obvious cases. My changes are based on what I saw in https://github.com/simonw/datasette/commit/8da2db4b71096b19e7a9ef1929369b8483d448bf (thanks!) I'm happy to be more thorough on this if you think it's worth pursuing. Fixes #394 (he said, optimistically). | 2019-12-18T23:37:16Z | 2020-03-24T22:14:50Z | 2020-03-24T22:14:50Z | 8e674de58c17c89c8a4a90bc3ec6e02151b354e5 | 0 | eaa636841e38d40360a74596ef1a0df50f6a86a5 | a498d0fe6590f9bdbc4faf9e0dd5faeb3b06002c | NONE | 107914493 | https://github.com/simonw/datasette/pull/652 | |||||
369394043 | MDExOlB1bGxSZXF1ZXN0MzY5Mzk0MDQz | 80 | closed | 0 | on_create mechanism for after table creation | 9599 | I need this for `geojson-to-sqlite`, in particular https://github.com/simonw/geojson-to-sqlite/issues/6 | 2020-01-31T03:38:48Z | 2020-01-31T05:08:04Z | 2020-01-31T05:08:04Z | e6dc95d19348e72b28b42e73a18737cb2e4563e0 | 0 | 45bf0c25492c276bde0b85868ffb55f169375bd7 | f7289174e66ae4d91d57de94bbd9d09fabf7aff4 | OWNER | 140912432 | https://github.com/simonw/sqlite-utils/pull/80 | |||||
372273608 | MDExOlB1bGxSZXF1ZXN0MzcyMjczNjA4 | 33 | closed | 0 | Upgrade to sqlite-utils 2.2.1 | 9599 | 2020-02-07T07:32:12Z | 2020-03-20T19:21:42Z | 2020-03-20T19:21:41Z | 5338f6baab3ec1424431133968d8b64a656ce4c4 | 0 | 08f51271d6309aad698b9e8a7587fcebbbd67781 | 35c18a09fa664324dcb75e5e58ccb90644456d02 | MEMBER | 206156866 | https://github.com/dogsheep/twitter-to-sqlite/pull/33 | ||||||
375180832 | MDExOlB1bGxSZXF1ZXN0Mzc1MTgwODMy | 672 | open | 0 | --dirs option for scanning directories for SQLite databases | 9599 | Refs #417. | 2020-02-14T02:25:52Z | 2020-03-27T01:03:53Z | 0e0e544f1f23451f04d7ca576ace5b18ce168e6f | 0 | ee718b98b793df2a15b125cbf20816c9864bf7e9 | 6aa516d82dea9885cb4db8d56ec2ccfd4cd9b840 | OWNER | 107914493 | https://github.com/simonw/datasette/pull/672 | ||||||
391924509 | MDExOlB1bGxSZXF1ZXN0MzkxOTI0NTA5 | 703 | closed | 0 | WIP implementation of writable canned queries | 9599 | Refs #698. | 2020-03-21T22:23:51Z | 2020-06-03T00:08:14Z | 2020-06-02T23:57:35Z | 80c5a74a947e63673389604de12e80fa27305454 | 1 | 61e40e917efc43a8aea5298a22badbb6eaea3fa1 | 89c4ddd4828623888e91a1d2cb396cba12d4e7b4 | OWNER | 107914493 | https://github.com/simonw/datasette/pull/703 | |||||
406677205 | MDExOlB1bGxSZXF1ZXN0NDA2Njc3MjA1 | 730 | closed | 0 | Update pytest-asyncio requirement from ~=0.10.0 to >=0.10,<0.12 | 27856297 | Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version. <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/1026c39495a963ff3e5fee7da2ae9f3a5d21fb83"><code>1026c39</code></a> 0.11.0</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/ab2b14048a691479fa9f8811aaa558018c6db6e3"><code>ab2b140</code></a> Test on Python 3.8, drop 3.3 and 3.4</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/6397a2255e3e9ef858439b164018438a8106f454"><code>6397a22</code></a> plugin: Use pytest 5.4.0 new Function API</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/21a0f9476be84ca0c84af60057f0f24c5fb2fd71"><code>21a0f94</code></a> Replace yield_fixture() by fixture()</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/964b295ba280a6e217159706279b67f8f4cbb5f4"><code>964b295</code></a> Added min hypothesis version so that bugfix for <a href="https://github.com/Hypothesis">https://github.com/Hypothesis</a>...</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/4a11a206fbcf88ee18cbed2d01041e61c20a9a48"><code>4a11a20</code></a> Add max supported pytest version to < 5.4.0 to prevent fails until <a href="https://github-redirect.dependabot.com/pytest-dev/pytest-asyncio/issues/141">#141</a> is fi...</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/b3055940db49cc17e36b66631e3d863e15fe34e4"><code>b305594</code></a> Change event_loop to module scope in hypothesis tests, fixing <a href="https://github-redirect.dependabot.com/pytest-dev/pytest-asyncio/issues/145">#145</a>.</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/d5a0f4789e7fecb58d509409e2c537b206c4fde2"><code>d5a0f47</code></a> Enable test_subprocess to be run on win, by changing to ProactorEventLoop in ...</li> <li><a href="https://github.com/pytest-dev/pytest-asyncio/commit/d07cd2d447cf313446c4e00e25a35cb2adcb2c63"><code>d07cd2d</code><… | 2020-04-21T13:32:35Z | 2020-05-04T13:27:24Z | 2020-05-04T13:27:23Z | 460708c7107a7cf15971a9aa1040635f6bc1be6d | 0 | 11c67f82cdccc6e34cbff717e673451ac6172ef4 | 15e232180427e988174fdf88440c84b91d2d98d1 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/730 | |||||
410469272 | MDExOlB1bGxSZXF1ZXN0NDEwNDY5Mjcy | 746 | closed | 0 | shutil.Error, not OSError | 9599 | Refs #744 | 2020-04-29T03:30:51Z | 2020-04-29T07:07:24Z | 2020-04-29T07:07:23Z | e4e8b51b50e51b2515c6d8874d16c4607f79b80a | 0 | af3a5b91503f5d74aa111bbcd1ee531ee00f9ed7 | 89c4ddd4828623888e91a1d2cb396cba12d4e7b4 | OWNER | 107914493 | https://github.com/simonw/datasette/pull/746 | |||||
434085235 | MDExOlB1bGxSZXF1ZXN0NDM0MDg1MjM1 | 848 | closed | 0 | Reload support for config_dir mode. | 49260 | A reference implementation for adding support to reload when datasette is in the config_dir mode. This implementation is flawed since it is watching the entire directory and any changes to the database will reload the server and adding unrelated files to the directory will also reload the server. | 2020-06-14T02:34:46Z | 2020-07-03T02:44:54Z | 2020-07-03T02:44:53Z | 888538efdbf545c0df524ca590a17fb6c6fa2419 | 0 | 0d100d15aca93fae200b3bc2e29dfd60aaa4b384 | 57879dc8b346a435804a9e45ffaacbf2a0228bc6 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/848 | |||||
440735814 | MDExOlB1bGxSZXF1ZXN0NDQwNzM1ODE0 | 868 | open | 0 | initial windows ci setup | 702729 | Picking up the work done on #557 with a new PR. Seeing if I can get this working. | 2020-06-26T18:49:13Z | 2021-07-10T23:41:43Z | b99adb1720a0b53ff174db54d0e4a67357b47f33 | 0 | c99cabae638958ef057438a92cb9a182ba4f8188 | 180c7a5328457aefdf847ada366e296fef4744f1 | FIRST_TIME_CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/868 | ||||||
442505088 | MDExOlB1bGxSZXF1ZXN0NDQyNTA1MDg4 | 883 | open | 0 | Skip counting hidden tables | 3243482 | Potential fix for https://github.com/simonw/datasette/issues/859. Disabling table counts for hidden tables speeds up database page quite a bit. In my setup it reduced load time by 2/3 (~300 -> ~90ms) | 2020-07-01T07:38:08Z | 2020-07-02T00:25:44Z | 527624338acd38b97bb33b0a0b913d80e8345fee | 0 | 251884f58895faf8056b3dfdeae3bb92c5bc58ac | 676bb64c877d73f8ff496cef4632f5a8a5a9283c | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/883 | ||||||
448355680 | MDExOlB1bGxSZXF1ZXN0NDQ4MzU1Njgw | 30 | open | 0 | Handle empty bucket on first upload. Allow specifying the endpoint_url for services other than S3 (like b2 and digitalocean spaces) | 110038 | Finally got around to trying dogsheep-photos but I want to use backblaze's b2 service instead of AWS S3. Had to add a way to optionally specify the endpoint_url to connect to. Then with the bucket being empty the initial key retrieval would fail. Probably a better way to see that the bucket is empty than doing a test inside the paginator loop. Also probably a better way to specify the endpoint_url as we get and test for it twice using the same code in two different places but did not want to spend too much time worrying about it. | 2020-07-13T16:15:26Z | 2020-07-13T16:15:26Z | 583b26f244166aadf2dcc680e39d1ca59765da37 | 0 | 647d4b42c6f4d1fba4b99f73fe163946cea6ee36 | 45ce3f8bfb8c70f57ca5d8d82f22368fea1eb391 | FIRST_TIME_CONTRIBUTOR | 256834907 | https://github.com/dogsheep/dogsheep-photos/pull/30 | ||||||
474703007 | MDExOlB1bGxSZXF1ZXN0NDc0NzAzMDA3 | 952 | closed | 0 | Update black requirement from ~=19.10b0 to >=19.10,<21.0 | 27856297 | Updates the requirements on [black](https://github.com/psf/black) to permit the latest version. <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/psf/black/blob/master/CHANGES.md">black's changelog</a>.</em></p> <blockquote> <h3>20.8b1</h3> <h4><em>Packaging</em></h4> <ul> <li>explicitly depend on Click 7.1.2 or newer as <code>Black</code> no longer works with versions older than 7.0</li> </ul> <h3>20.8b0</h3> <h4><em>Black</em></h4> <ul> <li> <p>re-implemented support for explicit trailing commas: now it works consistently within any bracket pair, including nested structures (<a href="https://github-redirect.dependabot.com/psf/black/issues/1288">#1288</a> and duplicates)</p> </li> <li> <p><code>Black</code> now reindents docstrings when reindenting code around it (<a href="https://github-redirect.dependabot.com/psf/black/issues/1053">#1053</a>)</p> </li> <li> <p><code>Black</code> now shows colored diffs (<a href="https://github-redirect.dependabot.com/psf/black/issues/1266">#1266</a>)</p> </li> <li> <p><code>Black</code> is now packaged using 'py3' tagged wheels (<a href="https://github-redirect.dependabot.com/psf/black/issues/1388">#1388</a>)</p> </li> <li> <p><code>Black</code> now supports Python 3.8 code, e.g. star expressions in return statements (<a href="https://github-redirect.dependabot.com/psf/black/issues/1121">#1121</a>)</p> </li> <li> <p><code>Black</code> no longer normalizes capital R-string prefixes as those have a community-accepted meaning (<a href="https://github-redirect.dependabot.com/psf/black/issues/1244">#1244</a>)</p> </li> <li> <p><code>Black</code> now uses exit code 2 when specified configuration file doesn't exit (<a href="https://github-redirect.dependabot.com/psf/black/issues/1361">#1361</a>)</p> </li> <li> <p><code>Black</code> now works on AWS Lambda (<a href="https://github-redirect.dependabot.com/psf/black/issues/1141">#1141</a>)</p> </li> <li> <p>added <code>--force-exclude</code> argument (<a href="https://github-redirect.dependabot.com/p… | 2020-08-27T13:31:36Z | 2020-09-02T22:26:17Z | 2020-09-02T22:26:16Z | 37f8531b321855bdbc58960281957febaa59e4b9 | 0 | 7b1354706467136f5030504fe799201b13333a95 | 86aefc39c5aca01b00dbc57ba386a6743c21fb46 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/952 | |||||
496298180 | MDExOlB1bGxSZXF1ZXN0NDk2Mjk4MTgw | 986 | closed | 0 | Allow facet by primary keys, fixes #985 | 39452697 | Hello! This PR makes it possible to facet by primary keys. Did I get it right that just removing the condition on the UI side is enough? From testing, it works fine with primary keys, just as with normal keys. If so, should I also remove the unused `data-is-pk`? | 2020-10-01T14:18:55Z | 2020-10-01T16:51:45Z | 2020-10-01T16:51:45Z | 58906c597f1217381f5d746726bcb8bdfa8f52f8 | 0 | 76f7094bd33f037a1c689a173f0dbbb988e6dcdd | 141544613f9e76ddb74eee38d6f8ee1e0e70f833 | NONE | 107914493 | https://github.com/simonw/datasette/pull/986 | |||||
500798091 | MDExOlB1bGxSZXF1ZXN0NTAwNzk4MDkx | 1008 | open | 0 | Add json_loads and json_dumps jinja2 filters | 649467 | 2020-10-09T20:11:34Z | 2020-12-15T02:30:28Z | e33e91ca7c9b2fdeab9d8179ce0d603918b066aa | 0 | 40858989d47043743d6b1c9108528bec6a317e43 | 1bdbc8aa7f4fd7a768d456146e44da86cb1b36d1 | FIRST_TIME_CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1008 | |||||||
505076418 | MDExOlB1bGxSZXF1ZXN0NTA1MDc2NDE4 | 5 | open | 0 | Add fitbit-to-sqlite | 4632208 | 2020-10-16T20:04:05Z | 2020-10-16T20:04:05Z | 9b9a677a4fcb6a31be8c406b3050cfe1c6e7e398 | 0 | db64d60ee92448b1d2a7e190d9da20eb306326b0 | d0686ebed6f08e9b18b4b96c2b8170e043a69adb | FIRST_TIME_CONTRIBUTOR | 214746582 | https://github.com/dogsheep/dogsheep.github.io/pull/5 | |||||||
505453900 | MDExOlB1bGxSZXF1ZXN0NTA1NDUzOTAw | 1030 | open | 0 | Make `package` command deal with a configuration directory argument | 299380 | Currently if we run `datasette package` on a configuration directory we'll get an exception when we try to hard link to the directory. This PR copies the tree and makes the Dockerfile run inspect on all *.db files. | 2020-10-18T11:07:02Z | 2020-10-19T08:01:51Z | 124142e4d2710525b09ff2bd2a7a787cbed163a4 | 0 | e0825334692967fec195e104cb6aa11095807a8e | c37a0a93ecb847e66cfe7b6f9452ba210fcae91b | FIRST_TIME_CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1030 | ||||||
505769462 | MDExOlB1bGxSZXF1ZXN0NTA1NzY5NDYy | 1031 | closed | 0 | Fallback to databases in inspect-data.json when no -i options are passed | 299380 | Currently `Datasette.__init__` checks immutables against None to decide whether to fall back to inspect-data.json. This patch modifies the serve command to pass None when no -i options are passed so this fallback works correctly. | 2020-10-19T07:51:06Z | 2021-03-29T01:46:45Z | 2021-03-29T00:23:41Z | 3ee6b39e96ef684e1ac393bb269d804e957fee1d | 0 | 7e7eaa4e712b01de0b5a8a1b90145bdc1c3cd731 | c37a0a93ecb847e66cfe7b6f9452ba210fcae91b | FIRST_TIME_CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1031 | |||||
509590205 | MDExOlB1bGxSZXF1ZXN0NTA5NTkwMjA1 | 1049 | closed | 0 | Add template block prior to extra URL loaders | 82988 | To handle packages that require Javascript state setting prior to loading a package (eg [`thebelab`](https://thebelab.readthedocs.io/en/latest/examples/minimal_example.html), provide a template block before the URLs are loaded. | 2020-10-25T13:08:55Z | 2020-10-29T09:20:52Z | 2020-10-29T09:20:34Z | 99f994b14e2dbe22fda18b67dd5c824d359443fb | 0 | 50a743ad35684f09d3c3880f6af2019e59271237 | 42f4851e3e7885f1092f104d6c883cea40b12f02 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1049 | |||||
521054612 | MDExOlB1bGxSZXF1ZXN0NTIxMDU0NjEy | 13 | open | 0 | SQLite does not have case sensitive columns | 1689944 | This solves a weird issue when there is a record with a metadata key that differs only in letter case. See the test for details. | 2020-11-14T20:12:32Z | 2021-08-24T13:28:26Z | 38856acbc724ffdb8beb9e9f4ef0dbfa8ff51ad1 | 0 | 3e1b2945bc7c31be59e89c5fed86a5d2a59ebd5a | 71e36e1cf034b96de2a8e6652265d782d3fdf63b | FIRST_TIME_CONTRIBUTOR | 197882382 | https://github.com/dogsheep/healthkit-to-sqlite/pull/13 | ||||||
521287994 | MDExOlB1bGxSZXF1ZXN0NTIxMjg3OTk0 | 203 | open | 0 | changes to allow for compound foreign keys | 1049910 | Add support for compound foreign keys, as per issue #117 Not sure if this is the right approach. In particular I'm unsure about: - the new `ForeignKey` class, which replaces the namedtuple in order to ensure that `column` and `other_column` are forced into tuples. The class does the job, but doesn't feel very elegant. - I haven't rewritten `guess_foreign_table` to take account of multiple columns, so it just checks for the first column in the foreign key definition. This isn't ideal. - I haven't added any ability to the CLI to add compound foreign keys, it's only in the python API at the moment. The PR also contains a minor related change that columns and tables are always quoted in foreign key definitions. | 2020-11-16T00:30:10Z | 2023-01-25T18:47:18Z | 0507a9464314f84e9e58b1931c583df51d757d7c | 0 | 5e43e31c2b9bcf6b5d1460b0f848fed019ed42a6 | f1277f638f3a54a821db6e03cb980adad2f2fa35 | FIRST_TIME_CONTRIBUTOR | 140912432 | https://github.com/simonw/sqlite-utils/pull/203 | ||||||
532348919 | MDExOlB1bGxSZXF1ZXN0NTMyMzQ4OTE5 | 1130 | open | 0 | Fix footer not sticking to bottom in short pages | 3243482 | Fixes https://github.com/simonw/datasette/issues/1129 | 2020-12-04T07:29:01Z | 2021-06-15T13:27:48Z | af3aa34786f134af8073342a3c4bb74b968750fd | 0 | 8d4c69c6fb0ef741a19070f5172017ea3522e83c | 49d8fc056844d5a537d6cfd96dab0dd5686fe718 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1130 | ||||||
542406910 | MDExOlB1bGxSZXF1ZXN0NTQyNDA2OTEw | 10 | closed | 0 | BugFix for encoding and not update info. | 1277270 | Bugfix 1: Traceback (most recent call last): File "d:\anaconda3\lib\runpy.py", line 194, in _run_module_as_main return _run_code(code, main_globals, None, File "d:\anaconda3\lib\runpy.py", line 87, in _run_code exec(code, run_globals) File "D:\Anaconda3\Scripts\evernote-to-sqlite.exe\__main__.py", line 7, in <module> File "d:\anaconda3\lib\site-packages\click\core.py", line 829, in __call__ File "d:\anaconda3\lib\site-packages\click\core.py", line 782, in main rv = self.invoke(ctx) File "d:\anaconda3\lib\site-packages\click\core.py", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) return ctx.invoke(self.callback, **ctx.params) File "d:\anaconda3\lib\site-packages\click\core.py", line 610, in invoke return callback(*args, **kwargs) File "d:\anaconda3\lib\site-packages\evernote_to_sqlite\cli.py", line 30, in enex for tag, note in find_all_tags(fp, ["note"], progress_callback=bar.update): File "d:\anaconda3\lib\site-packages\evernote_to_sqlite\utils.py", line 11, in find_all_tags chunk = fp.read(1024 * 1024) UnicodeDecodeError: 'gbk' codec can't decode byte 0xa4 in position 383: illegal multibyte sequence Bugfix 2: Traceback (most recent call last): File "D:\Anaconda3\Scripts\evernote-to-sqlite-script.py", line 33, in <module> sys.exit(load_entry_point('evernote-to-sqlite==0.3', 'console_scripts', 'evernote-to-sqlite')()) File "D:\Anaconda3\lib\site-packages\click\core.py", line 829, in __call__ return self.main(*args, **kwargs) File "D:\Anaconda3\lib\site-packages\click\core.py", line 782, in main rv = self.invoke(ctx) File "D:\Anaconda3\lib\site-packages\click\core.py", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File "D:\Anaconda3\lib\site-packages\click\core.py", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File "D:\Anaconda3\lib\site-packages\click\core.py", line 
610, in invoke return callback(*args, **kw… | 2020-12-18T08:58:54Z | 2021-02-11T22:37:56Z | 2021-02-11T22:37:56Z | 4425daeccd43ce3c7bb45deaae577984f978e40f | 0 | 7b8b96b69f43cb2247875c3ca6d39878edf77a78 | 92254b71075c8806bca258c939e24af8397cdf98 | NONE | 303218369 | https://github.com/dogsheep/evernote-to-sqlite/pull/10 | |||||
543015825 | MDExOlB1bGxSZXF1ZXN0NTQzMDE1ODI1 | 31 | open | 0 | Update for Big Sur | 41546558 | Refactored out the SQL for extracting aesthetic scores to use osxphotos -- adds compatibility for Big Sur via osxphotos, which has been updated for the new table names in Big Sur. Have not yet refactored the SQL for extracting labels, which is still compatible with Big Sur. | 2020-12-20T04:36:45Z | 2023-08-08T15:52:52Z | 0e571b07430024d4ce00d5e8ba28591cefd27d6f | 0 | 39c12f8cda206ad621ec9940cce538570513e764 | edc80a0d361006f478f2904a90bfe6c730ed6194 | CONTRIBUTOR | 256834907 | https://github.com/dogsheep/dogsheep-photos/pull/31 | ||||||
545264436 | MDExOlB1bGxSZXF1ZXN0NTQ1MjY0NDM2 | 1159 | open | 0 | Improve the display of facets information | 552629 | This PR changes the display of facets to hopefully make them more readable. Before | After ---|--- ![image](https://user-images.githubusercontent.com/552629/103084609-b1ec2980-45df-11eb-85bc-68ab8df3e8d9.png) | ![image](https://user-images.githubusercontent.com/552629/103085220-620e6200-45e1-11eb-8189-5dd5d3e2569e.png) | 2020-12-24T11:01:47Z | 2023-07-31T18:57:59Z | 0276c5609da34bfb660f65212e1a367e637979d7 | 3268330 | 0 | c820abd0bcb34d1ea5a03be64a2158ae7c42920c | a882d679626438ba0d809944f06f239bcba8ee96 | FIRST_TIME_CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1159 | |||||
560204306 | MDExOlB1bGxSZXF1ZXN0NTYwMjA0MzA2 | 224 | closed | 0 | Add fts offset docs. | 37962604 | The limit can be passed as a string to the query builder to have an offset. I have tested it using the shorthand `limit=f"15, 30"`, the standard syntax should work too. | 2021-01-22T20:50:58Z | 2021-02-14T19:31:06Z | 2021-02-14T19:31:06Z | 4d6ff040770119fb2c1bcbc97678d9deca752f2f | 0 | 341f50d2d95ba1d69ad64ba8c0ec0ffa9a68d063 | 36dc7e3909a44878681c266b90f9be76ac749f2d | NONE | 140912432 | https://github.com/simonw/sqlite-utils/pull/224 | |||||
560760145 | MDExOlB1bGxSZXF1ZXN0NTYwNzYwMTQ1 | 1204 | open | 0 | WIP: Plugin includes | 9599 | Refs #1191 Next steps: - [ ] Get comfortable that this pattern is the right way to go - [ ] Implement it for all of the other pages, not just the table page - [ ] Add a new set of plugin tests that exercise ALL of these new hook locations - [ ] Document, then ship | 2021-01-25T03:59:06Z | 2021-12-17T07:10:49Z | 98f06a766317a40035962416cf3211d7a374866a | 1 | 05258469ae39bcaad17beb57c5b7eeab0d58a589 | 07e163561592c743e4117f72102fcd350a600909 | OWNER | 107914493 | https://github.com/simonw/datasette/pull/1204 | ||||||
561512503 | MDExOlB1bGxSZXF1ZXN0NTYxNTEyNTAz | 15 | open | 0 | added try / except to write_records | 9857779 | to keep the data write from failing if it came across an error during processing. In particular when trying to convert my HealthKit zip file (and that of my wife's) it would consistently error out with the following: ``` db.py 1709 insert_chunk result = self.db.execute(query, params) db.py 226 execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: too many SQL variables --------------------------------------------------------------------------------------------------------------------------------------------------------------------- db.py 1709 insert_chunk result = self.db.execute(query, params) db.py 226 execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: too many SQL variables --------------------------------------------------------------------------------------------------------------------------------------------------------------------- db.py 1709 insert_chunk result = self.db.execute(query, params) db.py 226 execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: table rBodyMass has no column named metadata_HKWasUserEntered --------------------------------------------------------------------------------------------------------------------------------------------------------------------- healthkit-to-sqlite 8 <module> sys.exit(cli()) core.py 829 __call__ return self.main(*args, **kwargs) core.py 782 main rv = self.invoke(ctx) core.py 1066 invoke return ctx.invoke(self.callback, **ctx.params) core.py 610 invoke return callback(*args, **kwargs) cli.py 57 cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf) utils.py 42 convert_xml_to_sqlite write_records(records, db) utils.py 143 write_records db[table].insert_all( db.py 1899 insert_all self.insert_chunk( db.py 1720 insert_chunk self.insert_chunk( db.py 1720 insert_chunk self.insert_chunk( db.py 1714 insert_chunk 
result = self.db.execute(query, params) db.py 226 execute return self.co… | 2021-01-26T03:56:21Z | 2021-01-26T03:56:21Z | 8527278a87e448f57c7c6bd76a2d85f12d0233dd | 0 | 7f1b168c752b5af7c1f9052dfa61e26afc83d574 | 71e36e1cf034b96de2a8e6652265d782d3fdf63b | FIRST_TIME_CONTRIBUTOR | 197882382 | https://github.com/dogsheep/healthkit-to-sqlite/pull/15 | ||||||
564215011 | MDExOlB1bGxSZXF1ZXN0NTY0MjE1MDEx | 225 | closed | 0 | fix for problem in Table.insert_all on search for columns per chunk of rows | 261237 | Hi, I ran into a problem when trying to create a database from my Apple Healthkit data using [healthkit-to-sqlite](https://github.com/dogsheep/healthkit-to-sqlite). The program crashed because of an invalid insert statement that was generated for table `rDistanceCycling`. The actual problem turned out to be in [sqlite-utils](https://github.com/simonw/sqlite-utils). `Table.insert_all` processes the data to be inserted in chunks of rows and checks for every chunk which columns are used, and it will collect all column names in the variable `all_columns`. The collection of columns is done using a nested list comprehension that is not completely correct. I'm using a Windows machine and had to make a few adjustments to the tests in order to be able to run them because they had a posix dependency. Thanks, kind regards, Frans ``` # this is a (condensed) chunk of data from my Apple healthkit export that caused the problem. 
# the 3 last items in the chunk have additional keys: metadata_HKMetadataKeySyncVersion and metadata_HKMetadataKeySyncIdentifier chunk = [{'sourceName': 'Apple\xa0Watch van Frans', 'sourceVersion': '7.0.1', 'device': '<<HKDevice: 0x281cf6c70>, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:7.0.1>', 'unit': 'km', 'creationDate': '2020-10-10 12:29:09 +0100', 'startDate': '2020-10-10 12:29:06 +0100', 'endDate': '2020-10-10 12:29:07 +0100', 'value': '0.00518016'}, {'sourceName': 'Apple\xa0Watch van Frans', 'sourceVersion': '7.0.1', 'device': '<<HKDevice: 0x281cf6c70>, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:7.0.1>', 'unit': 'km', 'creationDate': '2020-10-10 12:29:10 +0100', 'startDate': '2020-10-10 12:29:07 +0100', 'endDate': '2020-10-10 12:29:08 +0100', 'value': '0.00544049'}, {'sourceName': 'Apple\xa0Watch van Frans', 'sourceVersion': '6.2.6', 'device': '<<HKDevice: 0x281cf83e0>, name:Apple Watch, manu… | 2021-01-29T20:16:07Z | 2021-02-14T21:04:13Z | 2021-02-14T21:04:13Z | 1cba965a1ddc2bd77db3bc3912aa7e8467e2fa2f | 0 | 929ea7551135df0cc2ac9d67f4fbbecf701a11f6 | 36dc7e3909a44878681c266b90f9be76ac749f2d | NONE | 140912432 | https://github.com/simonw/sqlite-utils/pull/225 | |||||
577953727 | MDExOlB1bGxSZXF1ZXN0NTc3OTUzNzI3 | 5 | open | 0 | WIP: Add Gmail takeout mbox import | 306240 | WIP This PR adds the ability to import emails from a Gmail mbox export from Google Takeout. This is my first PR to a datasette/dogsheep repo. I've tested this on my personal Google Takeout mbox with ~520,000 emails going back to 2004. This took around ~20 minutes to process. To provide some feedback on the progress of the import I added the "rich" python module. I'm happy to remove that if adding a dependency is discouraged. However, I think it makes a nice addition to give feedback on the progress of a long import. Do we want to log emails that have errors when trying to import them? Dealing with encodings with emails is a bit tricky. I'm very open to feedback on how to deal with those better. As well as any other feedback for improvements. | 2021-02-22T21:30:40Z | 2021-07-28T07:18:56Z | 65182811d59451299e75f09b4366bb221bc32b20 | 0 | a3de045eba0fae4b309da21aa3119102b0efc576 | e54e544427f1cc3ea8189f0e95f54046301a8645 | FIRST_TIME_CONTRIBUTOR | 206649770 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5 | ||||||
580235427 | MDExOlB1bGxSZXF1ZXN0NTgwMjM1NDI3 | 241 | open | 0 | Extract expand - work in progress | 9599 | Refs #239. Still needs documentation and CLI implementation. | 2021-02-25T16:36:38Z | 2021-02-25T16:36:38Z | 0bb6c7a38994627a64e7b3375931528e96b8c222 | 1 | 8d641ab08ac449081e96f3e25bd6c0226870948a | 38e688fb8bcb58ae888b676fe3f7dd0529b4eecc | OWNER | 140912432 | https://github.com/simonw/sqlite-utils/pull/241 | ||||||
588601627 | MDExOlB1bGxSZXF1ZXN0NTg4NjAxNjI3 | 1254 | closed | 0 | Update Docker Spatialite version to 5.0.1 + add support for Spatialite topology functions | 3200608 | This requires adding the RT Topology library (Spatialite changed to RT Topology from LWGEOM between 4.4 and 5.0), as well as upgrading the GEOS version (which is the reason for switching to `python:3.7.10-slim-buster` as the base image.) `autoconf` and `libtool` are added to build RT Topology, and Spatialite is now built with `--disable-minizip` (minizip wasn't an option in 4.4 and I didn't want to add another dependency) and `--disable-dependency-tracking` which, according to Spatialite, "speeds up one-time builds" | 2021-03-09T20:49:08Z | 2021-03-10T18:27:45Z | 2021-03-09T22:04:23Z | bc09c84d6af4721b32f01f4d9186a6fbf9863081 | 0 | b103204155c2396d353fa195a320cee6aca258cf | d0fd833b8cdd97e1b91d0f97a69b494895d82bee | NONE | 107914493 | https://github.com/simonw/datasette/pull/1254 | |||||
592364255 | MDExOlB1bGxSZXF1ZXN0NTkyMzY0MjU1 | 16 | open | 0 | Add a fallback ID, print if no ID found | 1234956 | Fixes https://github.com/dogsheep/healthkit-to-sqlite/issues/14 | 2021-03-13T13:38:29Z | 2021-03-13T14:44:04Z | 16ab307b2138891f226a66e4954c5470de753a0f | 0 | 27b3d54ccfe7d861770a9d0b173f6503580fea4a | 71e36e1cf034b96de2a8e6652265d782d3fdf63b | FIRST_TIME_CONTRIBUTOR | 197882382 | https://github.com/dogsheep/healthkit-to-sqlite/pull/16 | ||||||
592548103 | MDExOlB1bGxSZXF1ZXN0NTkyNTQ4MTAz | 1260 | closed | 0 | Fix: code quality issues | 25361949 | ### Description Hi :wave: I work at [DeepSource](https://deepsource.io), I ran DeepSource analysis on the forked copy of this repo and found some interesting [code quality issues](https://deepsource.io/gh/withshubh/datasette/issues/?category=recommended) in the codebase, opening this PR so you can assess if our platform is right and helpful for you. ### Summary of changes - Replaced ternary syntax with if expression - Removed redundant `None` default - Used `is` to compare type of objects - Iterated dictionary directly - Removed unnecessary lambda expression - Refactored unnecessary `else` / `elif` when `if` block has a `return` statement - Refactored unnecessary `else` / `elif` when `if` block has a `raise` statement - Added .deepsource.toml to continuously analyze and detect code quality issues | 2021-03-14T13:56:10Z | 2021-03-29T00:22:41Z | 2021-03-29T00:22:41Z | bc868ae8c8152a25bcab7adb490c5b89411bdf3a | 0 | 90f5fb6d2fb36ddffc49acee924d042f2d5d1d58 | 8e18c7943181f228ce5ebcea48deb59ce50bee1f | NONE | 107914493 | https://github.com/simonw/datasette/pull/1260 | |||||
596627780 | MDExOlB1bGxSZXF1ZXN0NTk2NjI3Nzgw | 18 | open | 0 | Add datetime parsing | 1234956 | Parses the datetime columns so they are subsequently properly recognized as datetime. Fixes https://github.com/dogsheep/healthkit-to-sqlite/issues/17 | 2021-03-19T14:34:22Z | 2021-03-19T14:34:22Z | c87f4e8aa88ec277c6b5a000670c2cb42a10c03d | 0 | e0e7a0f99f844db33964b27c29b0b8d5f160202b | 71e36e1cf034b96de2a8e6652265d782d3fdf63b | FIRST_TIME_CONTRIBUTOR | 197882382 | https://github.com/dogsheep/healthkit-to-sqlite/pull/18 | ||||||
598213565 | MDExOlB1bGxSZXF1ZXN0NTk4MjEzNTY1 | 1271 | open | 0 | Use SQLite conn.interrupt() instead of sqlite_timelimit() | 9599 | Refs #1270, #1268, #1249 Before merging this I need to do some more testing (to make sure that expensive queries really are properly cancelled). I also need to delete a bunch of code relating to the old mechanism of cancelling queries. [See comment below: this doesn't actually cancel the query due to a thread-local confusion] | 2021-03-22T17:34:20Z | 2021-03-22T21:49:27Z | a4fd7e5a761523881c031b4fee266a366e1c97bd | 1 | fb2ad7ada0b86a7fe4a576fe23236757c41eb05e | c4f1ec7f33fd7d5b93f0f895dafb5351cc3bfc5b | OWNER | 107914493 | https://github.com/simonw/datasette/pull/1271 | ||||||
602261092 | MDExOlB1bGxSZXF1ZXN0NjAyMjYxMDky | 6 | closed | 0 | Add testres-db tool | 1151557 | 2021-03-28T15:43:23Z | 2022-02-16T05:12:05Z | 2022-02-16T05:12:05Z | eceb016506b5db29b9c21bc7fcf5e6e77259c7b4 | 0 | 91cfa6f7dcab032e2d21e80657c81e69119e2018 | 92c6bb77629feeed661c7b8d9183a11367de39e0 | NONE | 214746582 | https://github.com/dogsheep/dogsheep.github.io/pull/6 | ||||||
613178968 | MDExOlB1bGxSZXF1ZXN0NjEzMTc4OTY4 | 1296 | open | 0 | Dockerfile: use Ubuntu 20.10 as base | 82332573 | This PR changes the main Dockerfile to use ubuntu:20.10 as base image instead of python:3.9.2-slim-buster (itself based on debian:buster-slim). The Dockerfile is essentially the one from https://github.com/simonw/datasette/issues/1249#issuecomment-803698983 with some additional cleanups to slim it down. This fixes a couple of issues: 1. The SQLite version in Debian Buster (2.6.0) doesn't support generated columns 2. Installing SpatiaLite from the Debian sid repositories has the side effect of also installing updates to libc and libstdc++ from sid. As a bonus, the Docker image becomes smaller: ``` $ docker image ls REPOSITORY TAG IMAGE ID CREATED SIZE datasette 0.56-ubuntu f7aca255140a 5 hours ago 212MB datasetteproject/datasette 0.56 efb3b282f390 13 days ago 258MB ``` ### Reproduction of the first issue ``` $ curl -O https://latest.datasette.io/fixtures.db % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 100 260k 0 260k 0 0 489k 0 --:--:-- --:--:-- --:--:-- 489k $ docker run -v `pwd`:/mnt datasetteproject/datasette:0.56 datasette /mnt/fixtures.db Traceback (most recent call last): File "/usr/local/bin/datasette", line 8, in <module> sys.exit(cli()) File "/usr/local/lib/python3.9/site-packages/click/core.py", line 829, in __call__ return self.main(*args, **kwargs) File "/usr/local/lib/python3.9/site-packages/click/core.py", line 782, in main rv = self.invoke(ctx) File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File "/usr/local/lib/python3.9/site-packages/click/core.py", line 610, in invoke return callback(*args, … | 2021-04-12T00:23:32Z | 2021-07-20T08:52:13Z | 
2ba522dbd7168a104a33621598c5a2460aae3e74 | 0 | 8f00c312f6b8ab5cecbb8a698ab4ad659aabf4ef | c73af5dd72305f6a01ea94a2c76d52e5e26de38b | FIRST_TIME_CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1296 | ||||||
624635440 | MDExOlB1bGxSZXF1ZXN0NjI0NjM1NDQw | 1309 | closed | 0 | Bump black from 20.8b1 to 21.4b0 | 27856297 | Bumps [black](https://github.com/psf/black) from 20.8b1 to 21.4b0. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/psf/black/releases">black's releases</a>.</em></p> <blockquote> <h2>21.4b0</h2> <h4><em>Black</em></h4> <ul> <li> <p>Fixed a rare but annoying formatting instability created by the combination of optional trailing commas inserted by <code>Black</code> and optional parentheses looking at pre-existing "magic" trailing commas. This fixes issue <a href="https://github-redirect.dependabot.com/psf/black/issues/1629">#1629</a> and all of its many many duplicates. (<a href="https://github-redirect.dependabot.com/psf/black/issues/2126">#2126</a>)</p> </li> <li> <p><code>Black</code> now processes one-line docstrings by stripping leading and trailing spaces, and adding a padding space when needed to break up """". (<a href="https://github-redirect.dependabot.com/psf/black/issues/1740">#1740</a>)</p> </li> <li> <p><code>Black</code> now cleans up leading non-breaking spaces in comments (<a href="https://github-redirect.dependabot.com/psf/black/issues/2092">#2092</a>)</p> </li> <li> <p><code>Black</code> now respects <code>--skip-string-normalization</code> when normalizing multiline docstring quotes (<a href="https://github-redirect.dependabot.com/psf/black/issues/1637">#1637</a>)</p> </li> <li> <p><code>Black</code> no longer removes all empty lines between non-function code and decorators when formatting typing stubs. Now <code>Black</code> enforces a single empty line. 
(<a href="https://github-redirect.dependabot.com/psf/black/issues/1646">#1646</a>)</p> </li> <li> <p><code>Black</code> no longer adds an incorrect space after a parenthesized assignment expression in if/while statements (<a href="https://github-redirect.dependabot.com/psf/black/issues/1655">#1655</a>)</p> </li> <li> <p>Added <code>--skip-magic-trailing-comma</code> / <code>-C</code> to avoid using trailing commas as a reason to split lines (<a href="https://github-redire… | 2021-04-27T20:28:11Z | 2021-04-28T18:26:06Z | 2021-04-28T18:26:04Z | 1220c60d8a6bb8e621543ef78d669a2bccc2a3c8 | 0 | 20fc3fe2797b81a23cd464c1450d13086d53ea7f | a4bb2abce0764d49d255e5379f9e9c70981834ca | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1309 | |||||
625457579 | MDExOlB1bGxSZXF1ZXN0NjI1NDU3NTc5 | 1311 | closed | 0 | Bump black from 20.8b1 to 21.4b1 | 27856297 | Bumps [black](https://github.com/psf/black) from 20.8b1 to 21.4b1. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/psf/black/releases">black's releases</a>.</em></p> <blockquote> <h2>21.4b1</h2> <h4><em>Black</em></h4> <ul> <li> <p>Fix crash on docstrings ending with "\ ". (<a href="https://github-redirect.dependabot.com/psf/black/issues/2142">#2142</a>)</p> </li> <li> <p>Fix crash when atypical whitespace is cleaned out of dostrings (<a href="https://github-redirect.dependabot.com/psf/black/issues/2120">#2120</a>)</p> </li> <li> <p>Reflect the <code>--skip-magic-trailing-comma</code> and <code>--experimental-string-processing</code> flags in the name of the cache file. Without this fix, changes in these flags would not take effect if the cache had already been populated. (<a href="https://github-redirect.dependabot.com/psf/black/issues/2131">#2131</a>)</p> </li> <li> <p>Don't remove necessary parentheses from assignment expression containing assert / return statements. (<a href="https://github-redirect.dependabot.com/psf/black/issues/2143">#2143</a>)</p> </li> </ul> <h4><em>Packaging</em></h4> <ul> <li>Bump pathspec to >= 0.8.1 to solve invalid .gitignore exclusion handling</li> </ul> <h2>21.4b0</h2> <h4><em>Black</em></h4> <ul> <li> <p>Fixed a rare but annoying formatting instability created by the combination of optional trailing commas inserted by <code>Black</code> and optional parentheses looking at pre-existing "magic" trailing commas. This fixes issue <a href="https://github-redirect.dependabot.com/psf/black/issues/1629">#1629</a> and all of its many many duplicates. 
(<a href="https://github-redirect.dependabot.com/psf/black/issues/2126">#2126</a>)</p> </li> <li> <p><code>Black</code> now processes one-line docstrings by stripping leading and trailing spaces, and adding a padding space when needed to break up """". (<a href="https://github-redirect.dependabot.com/psf/black/issues/1740">#1740</a>)</p> </li> <li> <p><code>… | 2021-04-28T18:25:58Z | 2021-04-29T13:58:11Z | 2021-04-29T13:58:09Z | a8e260b47e0fb951790f155780354c8f8df88bc8 | 0 | baf303063a76800ec97abee46cd5f264e6a6447a | a4bb2abce0764d49d255e5379f9e9c70981834ca | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1311 | |||||
630578735 | MDExOlB1bGxSZXF1ZXN0NjMwNTc4NzM1 | 1318 | closed | 0 | Bump black from 21.4b2 to 21.5b0 | 49699333 | Bumps [black](https://github.com/psf/black) from 21.4b2 to 21.5b0. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/psf/black/releases">black's releases</a>.</em></p> <blockquote> <h2>21.5b0</h2> <h4><em>Black</em></h4> <ul> <li>Set <code>--pyi</code> mode if <code>--stdin-filename</code> ends in <code>.pyi</code> (<a href="https://github-redirect.dependabot.com/psf/black/issues/2169">#2169</a>)</li> <li>Stop detecting target version as Python 3.9+ with pre-PEP-614 decorators that are being called but with no arguments (<a href="https://github-redirect.dependabot.com/psf/black/issues/2182">#2182</a>)</li> </ul> <h4><em>Black-Primer</em></h4> <ul> <li>Add <code>--no-diff</code> to black-primer to suppress formatting changes (<a href="https://github-redirect.dependabot.com/psf/black/issues/2187">#2187</a>)</li> </ul> </blockquote> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/psf/black/blob/master/CHANGES.md">black's changelog</a>.</em></p> <blockquote> <h3>21.5b0</h3> <h4><em>Black</em></h4> <ul> <li>Set <code>--pyi</code> mode if <code>--stdin-filename</code> ends in <code>.pyi</code> (<a href="https://github-redirect.dependabot.com/psf/black/issues/2169">#2169</a>)</li> <li>Stop detecting target version as Python 3.9+ with pre-PEP-614 decorators that are being called but with no arguments (<a href="https://github-redirect.dependabot.com/psf/black/issues/2182">#2182</a>)</li> </ul> <h4><em>Black-Primer</em></h4> <ul> <li>Add <code>--no-diff</code> to black-primer to suppress formatting changes (<a href="https://github-redirect.dependabot.com/psf/black/issues/2187">#2187</a>)</li> </ul> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li>See full diff in <a href="https://github.com/psf/black/commits">compare view</a></li> </ul> </details> <br 
/> [![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-versi… | 2021-05-05T13:07:51Z | 2021-05-11T13:12:32Z | 2021-05-11T13:12:31Z | e864f5420abb7a5d135f8fe470183786b577ce9a | 0 | e06c09911be52202940808d7a08df2e9b71b3af2 | 1b697539f5b53cec3fe13c0f4ada13ba655c88c7 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1318 | |||||
645100848 | MDExOlB1bGxSZXF1ZXN0NjQ1MTAwODQ4 | 12 | open | 0 | Recovering of malformed ENEX file | 8431437 | Hey .. Awesome work developing this project, that I found very useful to me and saved me some work.. Thanks.. :) Some background to this PR... I've been searching around for a tool allowing me to transforming my personal collection of Evernote notes to a format easier to search and potentially easier import to future services. Now I discovered problem processing my large data ~5GB using the existing source using Pythons builtin xml-parser that unfortunately was unable to succeed without exception breaking the process. My first attempt I tried to adapt to more robust lxml package allowing huge data and with "recover", but even if it worked better it also failed processing the whole data. Even using the memory efficient etree.iterparse() it also unfortunately got into trouble. And with no luck finding any other libraries successfully parsing this enormous file I instead chose to build a "hugexmlparser" module that allows parsing this huge file using yield (on a byte-to-byte-level) and allows you to set a maximum size for <note> to cater for potential malformed or undesirable large attachments to export, should succeed covering potential exceptions. Some cases found where the parses discover malformed XML within <content> so also in those cases try to save as much as possible by escaping (to be dealt at a later stage, better than nothing), and if a missing end </note> before new (malformed?) it would add this after encounter a new start-tag. The code for the recovery process is a bit rough and for certain room for refactoring, but at the moment is seem to achieve what I wanted. Now with the above we pass this a minor changed version of save_note_recovery() assure the existing works. Also adding this as a new recover-enex command to click and kept the original options. A couple of new tests was added as well to check against using this command. 
Now this currently works for me, but I thought I might share a PR in case you find use for this yourself or it proves useful to others finding this rep… | 2021-05-15T07:49:31Z | 2021-05-15T19:57:50Z | 95f21ca163606db74babd036e6fa44b7d484d137 | 0 | a5839dadaa43694f208ad74a53670cebbe756956 | 0bc6ba503eecedb947d2624adbe1327dd849d7fe | FIRST_TIME_CONTRIBUTOR | 303218369 | https://github.com/dogsheep/evernote-to-sqlite/pull/12 | ||||||
655726387 | MDExOlB1bGxSZXF1ZXN0NjU1NzI2Mzg3 | 1347 | closed | 0 | Test docker platform blair only | 10801138 | 2021-05-28T02:47:09Z | 2021-05-28T02:47:28Z | 2021-05-28T02:47:28Z | e755dd8c8cf7149046a8b5fd44aec07c4b2416d3 | 0 | f730725fd260ba6578c472c344269d5d5df4e650 | 7b106e106000713bbee31b34d694b3dadbd4818c | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1347 | ||||||
655741428 | MDExOlB1bGxSZXF1ZXN0NjU1NzQxNDI4 | 1348 | open | 0 | DRAFT: add test and scan for docker images | 10801138 | **NOTE: I don't think this PR is ready, since the arm/v6 and arm/v7 images are failing pytest due to missing dependencies (gcc and friends). But it's pretty close.** Closes https://github.com/simonw/datasette/issues/1344 . Using a build-matrix for the platforms and [this test](https://github.com/simonw/datasette/issues/1344#issuecomment-849820019), we test all the platforms in parallel. I also threw in container scanning. ### Switch `pip install` to use either tags or commit shas Notably! This also [changes the Dockerfile](https://github.com/blairdrummond/datasette/blob/7fe5315d68e04fce64b5bebf4e2d7feec44f8546/Dockerfile#L20) so that it accepts tags or commit-shas. ``` # It's backwards compatible with tags, but also lets you use shas root@712071df17af:/# pip install git+git://github.com/simonw/datasette.git@0.56 Collecting git+git://github.com/simonw/datasette.git@0.56 Cloning git://github.com/simonw/datasette.git (to revision 0.56) to /tmp/pip-req-build-u6dhm945 Running command git clone -q git://github.com/simonw/datasette.git /tmp/pip-req-build-u6dhm945 Running command git checkout -q af5a7f1c09f6a902bb2a25e8edf39c7034d2e5de Collecting Jinja2<2.12.0,>=2.10.3 Downloading Jinja2-2.11.3-py2.py3-none-any.whl (125 kB) ``` This le… | 2021-05-28T03:02:12Z | 2021-05-28T03:06:16Z | eeea7cb835be0f0319cafccf50dffa6ad26826c5 | 0 | 56cba8fb837cd938c2f9d7423ee43d62a81c8f7c | 7b106e106000713bbee31b34d694b3dadbd4818c | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1348 | ||||||
672053811 | MDExOlB1bGxSZXF1ZXN0NjcyMDUzODEx | 65 | open | 0 | basic support for events | 231498 | A quick first pass at implementing the feature requested in https://github.com/dogsheep/github-to-sqlite/issues/64. Testing instructions: ``` $ github-to-sqlite events events.db user/khimaros ``` If the specified user is the authenticated user, it will also include private events. Caveat: pagination appears to be broken (I don't see `next` in the response JSON from GitHub). | 2021-06-17T00:51:30Z | 2022-10-03T22:35:03Z | 0a252a06a15e307c8a67b2e0aac0907e2566bf19 | 0 | 82da9f91deda81d92ec64c9eda960aa64340c169 | 0e45b72312a0756e5a562effbba08cb8de1e480b | FIRST_TIME_CONTRIBUTOR | 207052882 | https://github.com/dogsheep/github-to-sqlite/pull/65 | ||||||
673872974 | MDExOlB1bGxSZXF1ZXN0NjczODcyOTc0 | 7 | open | 0 | Add instagram-to-sqlite | 36654812 | The tool covers only chat imports at the time of opening this PR, but I'm planning to import everything else that I feel inquisitive about. Ref: https://github.com/gavindsouza/instagram-to-sqlite | 2021-06-19T12:26:16Z | 2021-07-28T07:58:59Z | 66e9828db4a8ddc4049ab9932e1304288e571821 | 0 | 4e4c6baf41778071a960d288b0ef02bd01cb6376 | 92c6bb77629feeed661c7b8d9183a11367de39e0 | FIRST_TIME_CONTRIBUTOR | 214746582 | https://github.com/dogsheep/dogsheep.github.io/pull/7 | ||||||
677554929 | MDExOlB1bGxSZXF1ZXN0Njc3NTU0OTI5 | 293 | closed | 0 | Test against Python 3.10-dev | 9599 | 2021-06-25T01:40:39Z | 2021-10-13T21:49:33Z | 2021-10-13T21:49:33Z | 0f64d20b044ecb86d9e4e5843f9590006d2f39c2 | 0 | ae0f46a78958c0118e98c2ab18bd1b57a0478326 | 747be6057d09a4e5d9d726e29d5cf99b10c59dea | OWNER | 140912432 | https://github.com/simonw/sqlite-utils/pull/293 | ||||||
678459554 | MDExOlB1bGxSZXF1ZXN0Njc4NDU5NTU0 | 1385 | closed | 0 | Fix + improve get_metadata plugin hook docs | 2670795 | This fixes documentation inaccuracies and adds a disclaimer about the signature of the `get_metadata` hook. Addresses the following comments: - https://github.com/simonw/datasette/issues/1384#issuecomment-869069926 - https://github.com/simonw/datasette/issues/1384#issuecomment-869075368 | 2021-06-27T05:43:20Z | 2021-09-13T18:53:11Z | 2021-09-13T18:53:11Z | d283ef6806aabcd749623ffe4e69011879f7bfad | 0 | 8d78c8c22ddfa10c041f7b5dd9118d4c8674729f | 67cbf0ae7243431bf13702e6e3ba466b619c4d6f | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1385 | |||||
692557381 | MDExOlB1bGxSZXF1ZXN0NjkyNTU3Mzgx | 1399 | open | 0 | Multiple sort | 87192257 | Closes #197. I have added support for sorting by multiple parameters as mentioned in the issue above, and together with that, a suggestion on how to implement such sorting in the user interface. | 2021-07-19T12:20:14Z | 2021-07-19T12:20:14Z | 3161cd1202824921054cf78d82c1d8c07b140451 | 0 | 739697660382e4d2974619b4a5605baef87d233a | c73af5dd72305f6a01ea94a2c76d52e5e26de38b | FIRST_TIME_CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1399 | ||||||
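Multi-column sorting of the kind this PR proposes typically boils down to validating each sort parameter against an allow-list of columns and combining them into a single ORDER BY clause. A minimal sketch of that idea (not the PR's actual code; the column names and `order_by_clause` helper are hypothetical):

```python
# Sketch: build a safe multi-column ORDER BY clause from request parameters.
# A leading "-" on a parameter means descending order.

ALLOWED_COLUMNS = {"name", "age", "city"}  # hypothetical allow-list

def order_by_clause(sort_params):
    """sort_params: list of 'column' or '-column' strings."""
    parts = []
    for param in sort_params:
        desc = param.startswith("-")
        column = param.lstrip("-")
        if column not in ALLOWED_COLUMNS:
            # Reject anything not explicitly allowed, to avoid SQL injection
            raise ValueError(f"Cannot sort by {column!r}")
        parts.append(f'"{column}" DESC' if desc else f'"{column}"')
    return "ORDER BY " + ", ".join(parts) if parts else ""

print(order_by_clause(["city", "-age"]))
# -> ORDER BY "city", "age" DESC
```

The allow-list check matters because column names cannot be passed as SQL parameters; they must be validated before interpolation.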
698423667 | MDExOlB1bGxSZXF1ZXN0Njk4NDIzNjY3 | 8 | open | 0 | Add Gmail takeout mbox import (v2) | 28565 | WIP This PR builds on #5 to continue implementing gmail import support. Building on @UtahDave's work, these commits add a few performance and bug fixes: * Decreased memory overhead for import by manually parsing mbox headers. * Fixed error where some messages in the mbox would yield a row with NULL in all columns. I will send more commits to fix any errors I encounter as I run the importer on my personal takeout data. | 2021-07-28T07:05:32Z | 2023-09-08T01:22:49Z | d2809fd3fd835358d01ad10401228a562539b29e | 0 | 8e6d487b697ce2e8ad885acf613a157bfba84c59 | e54e544427f1cc3ea8189f0e95f54046301a8645 | FIRST_TIME_CONTRIBUTOR | 206649770 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8 | ||||||
712412883 | MDExOlB1bGxSZXF1ZXN0NzEyNDEyODgz | 1434 | open | 0 | Enrich arbitrary query results with foreign key links and column descriptions | 9599 | Refs #1293, follows #942. | 2021-08-13T14:43:01Z | 2021-08-19T21:18:58Z | d4d4f5566b1d43075cb52ded5d19a9dcf4350761 | 0 | 281c0872d5b8a462c9d7b2b2d77a924da4ed25a7 | 2883098770fc66e50183b2b231edbde20848d4d6 | OWNER | 107914493 | https://github.com/simonw/datasette/pull/1434 | ||||||
716357982 | MDExOlB1bGxSZXF1ZXN0NzE2MzU3OTgy | 66 | open | 0 | Add --merged-by flag to pull-requests sub command | 30531572 | ## Description Proposing a solution to the API limitation for `merged_by` in pull_requests. Specifically the following called out in the readme: ``` Note that the merged_by column on the pull_requests table will only be populated for pull requests that are loaded using the --pull-request option - the GitHub API does not return this field for pull requests that are loaded in bulk. ``` This approach might cause larger repos to hit rate limits called out in https://github.com/dogsheep/github-to-sqlite/issues/51 but seems to work well in the repos I tested and included below. ## Old Behavior - Had to list out the pull-requests individually via multiple `--pull-request` flags ## New Behavior - `--merged-by` flag for getting 'merge_by' information out of pull-requests without having to specify individual PR numbers. # Testing Picking some repo that has more than one merger (datasette only has 1 😉 ) ``` $ github-to-sqlite pull-requests ./github.db opnsense/tools --merged-by $ echo "select id, url, merged_by from pull_requests;" | sqlite3 ./github.db 83533612|https://github.com/opnsense/tools/pull/39|1915288 102632885|https://github.com/opnsense/tools/pull/43|1915288 149114810|https://github.com/opnsense/tools/pull/57|1915288 160394495|https://github.com/opnsense/tools/pull/64|1915288 163308408|https://github.com/opnsense/tools/pull/67|1915288 169723264|https://github.com/opnsense/tools/pull/69|1915288 171381422|https://github.com/opnsense/tools/pull/72|1915288 179938195|https://github.com/opnsense/tools/pull/77|1915288 196233824|https://github.com/opnsense/tools/pull/82|1915288 215289964|https://github.com/opnsense/tools/pull/93| 219696100|https://github.com/opnsense/tools/pull/97|1915288 223664843|https://github.com/opnsense/tools/pull/99| 228446172|https://github.com/opnsense/tools/pull/103|1915288 
238930434|https://github.com/opnsense/tools/pull/110|1915288 255507110|https://github.com/opnsense/tools/pull/119|1915288 255980675|https://github.com/opnsense/tools/pull/120… | 2021-08-20T00:57:55Z | 2021-09-28T21:50:31Z | 6b4276d9469e4579c81588ac9e3d128026d919a0 | 0 | a92a31d5d446022baeaf7f3c9ea107094637e64d | ed3752022e45b890af63996efec804725e95d0d4 | FIRST_TIME_CONTRIBUTOR | 207052882 | https://github.com/dogsheep/github-to-sqlite/pull/66 | ||||||
718734191 | MDExOlB1bGxSZXF1ZXN0NzE4NzM0MTkx | 22 | open | 0 | Make sure that case-insensitive column names are unique | 32016596 | This closes #21. When two metadata entries have keys that match case-insensitively, creating a new column for the second entry in the database table fails, because a column with that case-insensitive name already exists. ```xml <Record type="HKQuantityTypeIdentifierDietaryCholesterol" sourceName="MyFitnessPal" sourceVersion="35120" unit="mg" creationDate="2021-07-04 20:55:27 +0200" startDate="2021-07-04 20:55:00 +0200" endDate="2021-07-04 20:55:00 +0200" value="124"> <MetadataEntry key="meal" value="Dinner"/> <MetadataEntry key="Meal" value="Dinner"/> </Record> ``` The code added in this PR checks whether a key already exists in a record and, if so, appends a number to it. The resulting column names then look like the example below. Interestingly, the column names viewed with Datasette are case-sensitive. ```text startDate, endDate, value, unit, sourceName, sourceVersion, creationDate, metadata_meal, metadata_Meal_2, metadata_Mahlzeit ``` | 2021-08-24T13:13:38Z | 2021-08-24T13:26:20Z | c757d372c10284cd6fa58d144549bc89691341c3 | 0 | b16fb556f84a0eed262a518ca7ec82a467155d23 | 9fe3cb17e03d6c73222b63e643638cf951567c4c | FIRST_TIME_CONTRIBUTOR | 197882382 | https://github.com/dogsheep/healthkit-to-sqlite/pull/22 | ||||||
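The deduplication idea described in the PR above can be sketched in a few lines. This is an illustrative helper rather than the PR's actual code; the `dedupe_keys` function name is hypothetical:

```python
# Sketch: make keys unique under case-insensitive comparison by
# appending a numeric suffix to later collisions.

def dedupe_keys(keys):
    seen = {}   # lowercased key -> number of times encountered so far
    result = []
    for key in keys:
        lower = key.lower()
        count = seen.get(lower, 0) + 1
        seen[lower] = count
        # First occurrence keeps its name; later ones get _2, _3, ...
        result.append(key if count == 1 else f"{key}_{count}")
    return result

print(dedupe_keys(["meal", "Meal", "Mahlzeit"]))
# -> ['meal', 'Meal_2', 'Mahlzeit']
```

This matches the example in the PR, where the colliding `Meal` entry becomes `metadata_Meal_2`.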
721686721 | MDExOlB1bGxSZXF1ZXN0NzIxNjg2NzIx | 67 | open | 0 | Replacing step ID key with step_id | 16374374 | Workflows that have an `id` in any step result in the following error when running `workflows`, e.g. `github-to-sqlite workflows github.db nixos/nixpkgs`: ```Traceback (most recent call last): File "/usr/local/bin/github-to-sqlite", line 8, in <module> sys.exit(cli()) File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 1137, in __call__ return self.main(*args, **kwargs) File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 1062, in main rv = self.invoke(ctx) File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 763, in invoke return __callback(*args, **kwargs) File "/usr/local/lib/python3.8/dist-packages/github_to_sqlite/cli.py", line 601, in workflows utils.save_workflow(db, repo_id, filename, content) File "/usr/local/lib/python3.8/dist-packages/github_to_sqlite/utils.py", line 865, in save_workflow db["steps"].insert_all( File "/usr/local/lib/python3.8/dist-packages/sqlite_utils/db.py", line 2596, in insert_all self.insert_chunk( File "/usr/local/lib/python3.8/dist-packages/sqlite_utils/db.py", line 2378, in insert_chunk result = self.db.execute(query, params) File "/usr/local/lib/python3.8/dist-packages/sqlite_utils/db.py", line 419, in execute 
return self.conn.execute(sql, parameters) … | 2021-08-28T01:26:41Z | 2021-08-28T01:27:00Z | 9f73c9bf29dec9a1482d9af56b9fac271869585c | 0 | 9b5acceb25cf48b00e9c6c8293358b036440deb2 | ed3752022e45b890af63996efec804725e95d0d4 | FIRST_TIME_CONTRIBUTOR | 207052882 | https://github.com/dogsheep/github-to-sqlite/pull/67 | ||||||
722480542 | MDExOlB1bGxSZXF1ZXN0NzIyNDgwNTQy | 1453 | closed | 0 | Bump black from 21.7b0 to 21.8b0 | 49699333 | Bumps [black](https://github.com/psf/black) from 21.7b0 to 21.8b0. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/psf/black/releases">black's releases</a>.</em></p> <blockquote> <h2>21.8b0</h2> <h3><em>Black</em></h3> <ul> <li>Add support for formatting Jupyter Notebook files (<a href="https://github-redirect.dependabot.com/psf/black/issues/2357">#2357</a>)</li> <li>Move from <code>appdirs</code> dependency to <code>platformdirs</code> (<a href="https://github-redirect.dependabot.com/psf/black/issues/2375">#2375</a>)</li> <li>Present a more user-friendly error if .gitignore is invalid (<a href="https://github-redirect.dependabot.com/psf/black/issues/2414">#2414</a>)</li> <li>The failsafe for accidentally added backslashes in f-string expressions has been hardened to handle more edge cases during quote normalization (<a href="https://github-redirect.dependabot.com/psf/black/issues/2437">#2437</a>)</li> <li>Avoid changing a function return type annotation's type to a tuple by adding a trailing comma (<a href="https://github-redirect.dependabot.com/psf/black/issues/2384">#2384</a>)</li> <li>Parsing support has been added for unparenthesized walruses in set literals, set comprehensions, and indices (<a href="https://github-redirect.dependabot.com/psf/black/issues/2447">#2447</a>).</li> <li>Pin <code>setuptools-scm</code> build-time dependency version (<a href="https://github-redirect.dependabot.com/psf/black/issues/2457">#2457</a>)</li> <li>Exclude typing-extensions version 3.10.0.1 due to it being broken on Python 3.10 (<a href="https://github-redirect.dependabot.com/psf/black/issues/2460">#2460</a>)</li> </ul> <h3><em>Blackd</em></h3> <ul> <li>Replace sys.exit(-1) with raise ImportError as it plays more nicely with tools that scan installed packages (<a 
href="https://github-redirect.dependabot.com/psf/black/issues/2440">#2440</a>)</li> </ul> <h3>Integrations</h3> <ul> <li>The provided pre-commit hooks no longer specify <code>language_version</code> to avoid overriding <co… | 2021-08-30T13:13:39Z | 2021-09-14T13:10:40Z | 2021-09-14T13:10:38Z | 41e89206c9421f58bbc49b9a3f43439c351595a9 | 0 | 4f492a79aec631904e3302857a0ab5ea10cbf1af | 67cbf0ae7243431bf13702e6e3ba466b619c4d6f | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1453 | |||||
726990680 | MDExOlB1bGxSZXF1ZXN0NzI2OTkwNjgw | 35 | open | 0 | Support for Datasette's --base-url setting | 2670795 | This makes it so you can use Dogsheep if you're using Datasette with the `--base-url /some-path/` setting. | 2021-09-03T17:47:45Z | 2021-09-03T17:47:45Z | 0f5931da2099303111c49ec726b78bae814f755e | 0 | e6679d287b2e97fc94f50da64e1a7b91c1fbbf67 | a895bc360f2738c7af43deda35c847f1ee5bff51 | FIRST_TIME_CONTRIBUTOR | 197431109 | https://github.com/dogsheep/dogsheep-beta/pull/35 | ||||||
727390835 | MDExOlB1bGxSZXF1ZXN0NzI3MzkwODM1 | 36 | open | 0 | Correct naming of tool in readme | 2129 | 2021-09-05T12:05:40Z | 2022-01-06T16:04:46Z | 358678c6b48072769f2985fe6be8fc5e54ed2e06 | 0 | bf26955c250e601a0d9e751311530940b704f81e | edc80a0d361006f478f2904a90bfe6c730ed6194 | FIRST_TIME_CONTRIBUTOR | 256834907 | https://github.com/dogsheep/dogsheep-photos/pull/36 | |||||||
729704537 | MDExOlB1bGxSZXF1ZXN0NzI5NzA0NTM3 | 1465 | open | 0 | add support for -o --get /path | 51016 | Fixes https://github.com/simonw/datasette/issues/1459 Adds support for `--open --get /path` to be used in combination. If `--open` is provided alone, datasette will open a web page to a default URL. If `--get <url>` is provided alone, datasette will output the result of doing a GET to that URL and then exit. If `--open --get <url>` are provided together, datasette will open a web page to that URL. TODO items: - [ ] update documentation - [ ] print out error message when `--root --open --get <url>` is used - [ ] adjust code to require that `<url>` start with a `/` when `-o --get <url>` is used - [ ] add test(s) note, '@CTB' is used in this PR to flag code that needs revisiting. | 2021-09-08T14:30:42Z | 2021-09-08T14:31:45Z | 064e9511923fc4e50566bf9430b4a5b26f169357 | 1 | 9b66a7d9ba55bad8a3b409ede8855f4b4fff1f88 | d57ab156b35ec642549fb69d08279850065027d2 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1465 | ||||||
729731441 | MDExOlB1bGxSZXF1ZXN0NzI5NzMxNDQx | 326 | closed | 0 | Test against 3.10-dev | 9599 | This tests against the latest 3.10 RC, https://www.python.org/downloads/release/python-3100rc2/ | 2021-09-08T15:01:15Z | 2021-10-13T21:49:28Z | 2021-10-13T21:49:28Z | c563260408e1b802cbbc81ec7c1e398350a1ca3a | 0 | 078a08765d8aefa5ce376a03b2643d4ebe1aa57e | 77c240df56068341561e95e4a412cbfa24dc5bc7 | OWNER | 140912432 | https://github.com/simonw/sqlite-utils/pull/326 | |||||
730020867 | MDExOlB1bGxSZXF1ZXN0NzMwMDIwODY3 | 1467 | closed | 0 | Add Authorization header when CORS flag is set | 3058200 | This PR adds the [`Access-Control-Allow-Headers`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Allow-Headers) flag when CORS mode is enabled. This would fix https://github.com/simonw/datasette-auth-tokens/issues/4. When making cross-origin requests, the server must respond with all allowable HTTP headers. A Datasette instance using auth tokens must accept the `Authorization` HTTP header in order for cross-origin authenticated requests to take place. Please let me know if there's a better way of doing this! I couldn't figure out a way to change the app's response from the plugin itself, so I'm starting here. If you'd rather this logic live in the plugin, I'd love any guidance you're able to give. | 2021-09-08T22:14:41Z | 2021-10-17T02:29:07Z | 2021-10-14T18:54:18Z | 15f258735ddee555028a075c09e1e8f74069be70 | 0 | 05109e8d61dedd477c4cedfb89b1da65610f70d1 | d57ab156b35ec642549fb69d08279850065027d2 | NONE | 107914493 | https://github.com/simonw/datasette/pull/1467 | |||||
737050557 | PR_kwDOCGYnMM4r7n-9 | 327 | closed | 0 | Extract expand: Support JSON Arrays | 101753 | Hi, I needed to extract data in JSON arrays to normalize data imports. I've quickly hacked the following together based on #241, which refers to #239, where you, @simonw, wrote: > Could this handle lists of objects too? That would be pretty amazing - if the column has a [{...}, {...}] list in it could turn that into a many-to-many. The way this works in my code is that many-to-many relationships are created for anything that maps to a dictionary in a list, and many-to-one relationships for everything else (assumed to be scalar values). I'm not sure what the best approach here would be. Are many-to-one relationships at all useful here? What do you think about this approach? I could try to add it to the CLI interface and documentation if wanted. Thanks for this awesome piece of software in any case! :sun_with_face: | 2021-09-19T10:34:30Z | 2022-12-29T09:05:36Z | 2022-12-29T09:05:36Z | f0105cde23452cb4c8a15fc6096154b15d9b7c5a | 0 | 2840c697aa9817462d864ed5f8a7696d749fe039 | 8d641ab08ac449081e96f3e25bd6c0226870948a | NONE | 140912432 | https://github.com/simonw/sqlite-utils/pull/327 | |||||
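The list-of-dicts-to-many-to-many idea discussed in the PR above can be sketched with the standard library alone: each dictionary in the JSON array becomes a row in a lookup table, linked back through a join table. This is not the PR's implementation; the `articles`/`tags` table and column names are hypothetical:

```python
# Sketch: normalize a JSON-array column into a many-to-many relationship.
import json
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, tags TEXT)")
db.execute(
    "INSERT INTO articles VALUES (1, ?)",
    (json.dumps([{"name": "python"}, {"name": "sqlite"}]),),
)

# Lookup table plus join table for the many-to-many relationship
db.execute("CREATE TABLE tags (id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
db.execute("CREATE TABLE articles_tags (article_id INTEGER, tag_id INTEGER)")

for article_id, tags_json in db.execute("SELECT id, tags FROM articles").fetchall():
    for tag in json.loads(tags_json):  # each dict in the list becomes a lookup row
        db.execute("INSERT OR IGNORE INTO tags (name) VALUES (?)", (tag["name"],))
        tag_id = db.execute(
            "SELECT id FROM tags WHERE name = ?", (tag["name"],)
        ).fetchone()[0]
        db.execute("INSERT INTO articles_tags VALUES (?, ?)", (article_id, tag_id))

print(sorted(
    name for (name,) in db.execute(
        "SELECT t.name FROM tags t JOIN articles_tags at ON at.tag_id = t.id"
    )
))
# -> ['python', 'sqlite']
```

After the loop, the original `tags` column could be dropped and the relationship queried through the join table instead.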
737690951 | PR_kwDOBm6k_c4r-EVH | 1475 | open | 0 | feat: allow joins using _through in both directions | 5268174 | Currently the `_through` clause can only work if the FK relationship is defined in a specific direction. I don't think there is any reason for this limitation, as an FK allows joining in both directions. This is an admittedly hacky change to implement bidirectional joins using `_through`. It does work for our use-case, but I don't know if there are other implications that I haven't thought of. Also if this change is desirable we probably want to make the code a little nicer. | 2021-09-20T15:28:20Z | 2021-09-20T15:28:20Z | aa2f1c103730c0ede4ab67978288d91bbe1e00a6 | 0 | edf3c4c3271c8f13ab4c28ad88b817e115477e41 | b28b6cd2fe97f7e193a235877abeec2c8eb0a821 | FIRST_TIME_CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1475 | ||||||
747742034 | PR_kwDODFdgUs4skaNS | 68 | open | 0 | Add support for retrieving teams / members | 68329 | Adds a method for retrieving all the teams within an organisation and all the members in those teams. The latter are stored in a join table `team_members` between `teams` and `users`. | 2021-10-01T15:55:02Z | 2021-10-01T15:59:53Z | f46e276c356c893370d5893296f4b69f08baf02c | 0 | cc838e87b1eb19b299f277a07802923104f35ce2 | ed3752022e45b890af63996efec804725e95d0d4 | FIRST_TIME_CONTRIBUTOR | 207052882 | https://github.com/dogsheep/github-to-sqlite/pull/68 | ||||||
754942128 | PR_kwDOBm6k_c4s_4Cw | 1484 | closed | 0 | GitHub Actions: Add Python 3.10 to the tests | 3709715 | 2021-10-11T06:03:03Z | 2021-10-11T06:03:31Z | 2021-10-11T06:03:28Z | 69027b8c3e0e2236acd817a6fa5d32f762e3e9aa | 0 | 02c3218ca093df8b595d8ba7d88a32a0207b6385 | 0d5cc20aeffa3537cfc9296d01ec24b9c6e23dcf | FIRST_TIME_CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1484 | ||||||
764281468 | PR_kwDOBm6k_c4tjgJ8 | 1495 | open | 0 | Allow routes to have extra options | 536941 | Right now, datasette routes can only be a 2-tuple of `(regex, view_fn)`. If it was possible for datasette to handle extra options, like [standard Django does](https://docs.djangoproject.com/en/3.2/topics/http/urls/#passing-extra-options-to-view-functions), it would add flexibility for plugin authors. For example, if extra options were enabled, then it would be easy to make a single table the home page (#1284). This plugin would accomplish it. ```python from datasette import hookimpl from datasette.views.table import TableView @hookimpl def register_routes(datasette): return [ (r"^/$", TableView.as_view(datasette), {'db_name': 'DB_NAME', 'table': 'TABLE_NAME'}) ] ``` | 2021-10-22T15:00:45Z | 2021-11-19T15:36:27Z | 44969c5654748fb26ad05ab37245678f245f32e5 | 0 | fe7fa14b39846b919dfed44514a7d18d67e01dfd | ff9ccfb0310501a3b4b4ca24d73246a8eb3e7914 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1495 | ||||||
768796296 | PR_kwDOCGYnMM4t0uaI | 333 | closed | 0 | Add functionality to read Parquet files. | 2118708 | I needed this for a project of mine, and I thought it'd be useful to have it in sqlite-utils (it's also mentioned in #248). The current implementation works (data is read and data types are inferred correctly). I've added a single straightforward test case, but @simonw please let me know if there are any non-obvious flags/combinations I should test too. | 2021-10-28T23:43:19Z | 2021-11-25T19:47:35Z | 2021-11-25T19:47:35Z | eda2b1f8d2670c6ca8512e3e7c0150866bd0bdc6 | 0 | 50ec2e49dee3b09a48a7aef55eceaa3f752a52e7 | fda4dad23a0494890267fbe8baf179e2b56ee914 | NONE | 140912432 | https://github.com/simonw/sqlite-utils/pull/333 | |||||
770511531 | PR_kwDOBm6k_c4t7RKr | 1500 | closed | 0 | Bump black from 21.9b0 to 21.10b0 | 49699333 | Bumps [black](https://github.com/psf/black) from 21.9b0 to 21.10b0. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/psf/black/releases">black's releases</a>.</em></p> <blockquote> <h2>21.10b0</h2> <h3><em>Black</em></h3> <ul> <li>Document stability policy, that will apply for non-beta releases (<a href="https://github-redirect.dependabot.com/psf/black/issues/2529">#2529</a>)</li> <li>Add new <code>--workers</code> parameter (<a href="https://github-redirect.dependabot.com/psf/black/issues/2514">#2514</a>)</li> <li>Fixed feature detection for positional-only arguments in lambdas (<a href="https://github-redirect.dependabot.com/psf/black/issues/2532">#2532</a>)</li> <li>Bumped typed-ast version minimum to 1.4.3 for 3.10 compatiblity (<a href="https://github-redirect.dependabot.com/psf/black/issues/2519">#2519</a>)</li> <li>Fixed a Python 3.10 compatibility issue where the loop argument was still being passed even though it has been removed (<a href="https://github-redirect.dependabot.com/psf/black/issues/2580">#2580</a>)</li> <li>Deprecate Python 2 formatting support (<a href="https://github-redirect.dependabot.com/psf/black/issues/2523">#2523</a>)</li> </ul> <h3><em>Blackd</em></h3> <ul> <li>Remove dependency on aiohttp-cors (<a href="https://github-redirect.dependabot.com/psf/black/issues/2500">#2500</a>)</li> <li>Bump required aiohttp version to 3.7.4 (<a href="https://github-redirect.dependabot.com/psf/black/issues/2509">#2509</a>)</li> </ul> <h3><em>Black-Primer</em></h3> <ul> <li>Add primer support for --projects (<a href="https://github-redirect.dependabot.com/psf/black/issues/2555">#2555</a>)</li> <li>Print primer summary after individual failures (<a href="https://github-redirect.dependabot.com/psf/black/issues/2570">#2570</a>)</li> </ul> <h3>Integrations</h3> <ul> <li>Allow to pass <code>target_version</code> in the 
vim plugin (<a href="https://github-redirect.dependabot.com/psf/black/issues/1319">#1319</a>)</li> <li>Install build tools in docker file and use mul… | 2021-11-01T13:11:23Z | 2021-11-17T13:14:00Z | 2021-11-17T13:13:58Z | bc0c2637d3dabbbf55a1cb86df620683a2486ae5 | 0 | 1b7f679b0d732162e8841c63fd4b8b0682627c10 | 2c31d1cd9cd3b63458ccbe391866499fa3f44978 | CONTRIBUTOR | 107914493 | https://github.com/simonw/datasette/pull/1500 | |||||
771790589 | PR_kwDOEhK-wc4uAJb9 | 15 | open | 0 | include note tags in the export | 436138 | When parsing the Evernote `<note>` elements, the script will now also parse any nested `<tag>` elements, writing them out into a separate sqlite table. Here is an example of how to query the data after the script has run: ``` select notes.*, (select group_concat(tag) from notes_tags where notes_tags.note_id=notes.id) as tags from notes; ``` My .enex source file is 3+ years old so I am assuming the structure hasn't changed. Interestingly, my _notebook names_ show up in the _tags_ list where the tag name is prefixed with `notebook_`, so this could maybe help work around the first limitation mentioned in the [evernote-to-sqlite blog post](https://simonwillison.net/2020/Oct/16/building-evernote-sqlite-exporter/). | 2021-11-02T20:04:31Z | 2021-11-02T20:04:31Z | ee36aba995b0a5385bdf9a451851dcfc316ff7f6 | 0 | 8cc3aa49c6e61496b04015c14048c5dac58d6b42 | fff89772b4404995400e33fe1d269050717ff4cf | FIRST_TIME_CONTRIBUTOR | 303218369 | https://github.com/dogsheep/evernote-to-sqlite/pull/15 | ||||||
774610166 | PR_kwDOCGYnMM4uK5z2 | 337 | closed | 0 | Default values for `--attach` and `--param` options | 771193 | It seems that `click` 8.x uses `None` as the default value for `multiple=True` options. This change makes the code forward-compatible with `click` 8.x. See this build failure for more info: https://hydra.nixos.org/build/156926608 | 2021-11-05T21:57:53Z | 2021-11-05T22:33:03Z | 2021-11-05T22:33:02Z | eb8bf28da1794638a5693043cd5268f506a674d3 | 0 | 095fc64c5399d75d44d304571a21293d06d817f0 | fda4dad23a0494890267fbe8baf179e2b56ee914 | NONE | 140912432 | https://github.com/simonw/sqlite-utils/pull/337 |
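The forward-compatibility concern described in the PR above (click 8.x returning `None` rather than an empty tuple for an unused `multiple=True` option) can also be handled defensively in the command itself, by normalizing the value before iterating. A minimal sketch under that assumption; the `attach_databases` helper and its option name are hypothetical:

```python
# Sketch: tolerate both click 7.x (empty tuple) and click 8.x (None)
# defaults for a multiple=True option by normalizing before iteration.

def attach_databases(attach=None):
    # `attach or []` turns both None and () into something safely iterable
    return [f"ATTACH {path}" for path in (attach or [])]

print(attach_databases(None))
# -> []
print(attach_databases(["other.db"]))
# -> ['ATTACH other.db']
```

The PR itself takes the other route, setting an explicit default on the option, which keeps the command body unchanged.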