pull_requests
436 rows where repo = 107914493
id | node_id | number | state | locked | title | user | body | created_at | updated_at | closed_at | merged_at | merge_commit_sha | assignee | milestone | draft | head | base | author_association | repo | url | merged_by | auto_merge |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
152360740 | MDExOlB1bGxSZXF1ZXN0MTUyMzYwNzQw | 81 | closed | 0 | :fire: Removes DS_Store | jefftriplett 50527 | 2017-11-13T22:07:52Z | 2017-11-14T02:24:54Z | 2017-11-13T22:16:55Z | 2017-11-13T22:16:55Z | 06a826c3188af82f27bb6b4e09cc89b782d30bd6 | 0 | c66d297eac556a7f4fd4dcdb15cfb9466fddac77 | d75f423b6fcfc074b7c6f8f7679da8876f181edd | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/81 | |||||
152522762 | MDExOlB1bGxSZXF1ZXN0MTUyNTIyNzYy | 89 | closed | 0 | SQL syntax highlighting with CodeMirror | tomdyson 15543 | Addresses #13 Future enhancements could include autocompletion of table and column names, e.g. with ```javascript extraKeys: {"Ctrl-Space": "autocomplete"}, hintOptions: {tables: { users: ["name", "score", "birthDate"], countries: ["name", "population", "size"] }} ``` (see https://codemirror.net/doc/manual.html#addon_sql-hint and source at http://codemirror.net/mode/sql/) | 2017-11-14T14:43:33Z | 2017-11-15T02:03:01Z | 2017-11-15T02:03:01Z | 2017-11-15T02:03:01Z | 8252daa4c14d73b4b69e3f2db4576bb39d73c070 | 0 | 7f6ad095e9c41bf24d73b7724d898965c419965b | 075d422c0a1c70259188dfbd940538c67419694a | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/89 | ||||
152631570 | MDExOlB1bGxSZXF1ZXN0MTUyNjMxNTcw | 94 | closed | 0 | Initial add simple prod ready Dockerfile refs #57 | macropin 247192 | Multi-stage build based off official python:3.6-slim Example usage: ``` docker run --rm -t -i -p 9000:8001 -v $(pwd)/db:/db datasette datasette serve /db/chinook.db ``` | 2017-11-14T22:09:09Z | 2017-11-15T03:08:04Z | 2017-11-15T03:08:04Z | 2017-11-15T03:08:04Z | 86755503d26b4a83c2ec59f08ec1b8de791fd954 | 0 | 147195c2fdfa2b984d8f9fc1c6cab6634970a056 | 075d422c0a1c70259188dfbd940538c67419694a | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/94 | ||||
152870030 | MDExOlB1bGxSZXF1ZXN0MTUyODcwMDMw | 104 | closed | 0 | [WIP] Add publish to heroku support | jacobian 21148 | Refs #90 | 2017-11-15T19:56:22Z | 2017-11-21T20:55:05Z | 2017-11-21T20:55:05Z | 2017-11-21T20:55:05Z | e47117ce1d15f11246a3120aa49de70205713d05 | 0 | de42240afd1e3829fd21cbe77a89ab0eaab20d78 | 0331666e346c68b86de4aa19fbb37f3a408d37ca | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/104 | ||||
152914480 | MDExOlB1bGxSZXF1ZXN0MTUyOTE0NDgw | 107 | closed | 0 | add support for ?field__isnull=1 | raynae 3433657 | Is this what you had in mind for [this issue](https://github.com/simonw/datasette/issues/64)? | 2017-11-15T23:36:36Z | 2017-11-17T15:12:29Z | 2017-11-17T13:29:22Z | 2017-11-17T13:29:22Z | ed2b3f25beac720f14869350baacc5f62b065194 | 0 | 14d5bb572fadbd45973580bd9ad2a16c2bf12909 | b7c4165346ee8b6a6fbd72d6ba2275a24a8a8ae3 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/107 | ||||
153201945 | MDExOlB1bGxSZXF1ZXN0MTUzMjAxOTQ1 | 114 | closed | 0 | Add spatialite, switch to debian and local build | ingenieroariel 54999 | Improves the Dockerfile to support spatial datasets, work with the local datasette code (Friendly with git tags and Dockerhub) and moves to slim debian, a small image easy to extend via apt packages for sqlite. | 2017-11-17T02:37:09Z | 2017-11-17T03:50:52Z | 2017-11-17T03:50:52Z | 2017-11-17T03:50:52Z | 8b4c600d98b85655b3a1454ebf64f858b5fe54c8 | 0 | 6c6b63d890529eeefcefb7ab126ea3bd7b2315c1 | b7c4165346ee8b6a6fbd72d6ba2275a24a8a8ae3 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/114 | ||||
153306882 | MDExOlB1bGxSZXF1ZXN0MTUzMzA2ODgy | 115 | closed | 0 | Add keyboard shortcut to execute SQL query | rgieseke 198537 | Very cool tool, thanks a lot! This PR adds a `Shift-Enter` short cut to execute the SQL query. I used CodeMirrors keyboard handling. | 2017-11-17T14:13:33Z | 2017-11-17T15:16:34Z | 2017-11-17T14:22:56Z | 2017-11-17T14:22:56Z | eda848b37f8452dba7913583ef101f39d9b130ba | 0 | bb514164e69400fc0be4e033c27f45f90b1ef651 | ed2b3f25beac720f14869350baacc5f62b065194 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/115 | ||||
153324301 | MDExOlB1bGxSZXF1ZXN0MTUzMzI0MzAx | 117 | closed | 0 | Don't prevent tabbing to `Run SQL` button | rgieseke 198537 | Mentioned in #115 Here you go! | 2017-11-17T15:27:50Z | 2017-11-19T20:30:24Z | 2017-11-18T00:53:43Z | 2017-11-18T00:53:43Z | 6d39429daa4655e3cf7a6a7671493292a20a30a1 | 0 | 7b4d00e87ed8ac931e6f5458599aece1a95d4e82 | eda848b37f8452dba7913583ef101f39d9b130ba | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/117 | ||||
153432045 | MDExOlB1bGxSZXF1ZXN0MTUzNDMyMDQ1 | 118 | closed | 0 | Foreign key information on row and table pages | simonw 9599 | 2017-11-18T03:13:27Z | 2017-11-18T03:15:57Z | 2017-11-18T03:15:50Z | 2017-11-18T03:15:50Z | 1b04662585ea1539014bfbd616a8112b650d5699 | 0 | 2fa60bc5e3c9d75c19e21a2384f52b58e1872fa8 | 6d39429daa4655e3cf7a6a7671493292a20a30a1 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/118 | |||||
154246816 | MDExOlB1bGxSZXF1ZXN0MTU0MjQ2ODE2 | 145 | closed | 0 | Fix pytest version conflict | simonw 9599 | https://travis-ci.org/simonw/datasette/jobs/305929426 pkg_resources.VersionConflict: (pytest 3.2.1 (/home/travis/virtualenv/python3.5.3/lib/python3.5/site-packages), Requirement.parse('pytest==3.2.3')) | 2017-11-22T20:15:34Z | 2017-11-22T20:17:54Z | 2017-11-22T20:17:52Z | 2017-11-22T20:17:52Z | f96e55bce55d26c4d5b198edc536e1b8e9bbea43 | 0 | e319478c4a34fb5afbff2b2a8c3b9ef9f859bb10 | fa8eb0bf1b113ab17ede9cd107b7c3bd5cde39c3 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/145 | ||||
157365811 | MDExOlB1bGxSZXF1ZXN0MTU3MzY1ODEx | 168 | closed | 0 | Upgrade to Sanic 0.7.0 | simonw 9599 | 2017-12-09T01:25:08Z | 2017-12-09T03:00:34Z | 2017-12-09T03:00:34Z | 2017-12-09T03:00:34Z | 446f4b832272b2286f6f65af19714eb64afb7aa6 | 0 | d9e13a5cc2b77637a6cdd8bd21b9b8fc1350051a | 61e3c5a1e904a6e1cbee86ba1494b5cb4b5820cf | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/168 | |||||
161982711 | MDExOlB1bGxSZXF1ZXN0MTYxOTgyNzEx | 178 | closed | 0 | If metadata exists, add it to heroku launch command | psychemedia 82988 | The heroku build does seem to make use of any provided `metadata.json` file. Add the `--metadata` switch to the Heroku web launch command if a `metadata.json` file is available. Addresses: https://github.com/simonw/datasette/issues/177 | 2018-01-09T21:42:21Z | 2018-01-15T09:42:46Z | 2018-01-14T21:05:16Z | 2018-01-14T21:05:16Z | 3a56a2cd7eea5d477d5d936b01098be5cba0d98e | 0 | 1bc9ed98c4f4fd91b70560ac8f507a2fddbd8317 | 306e1c6ac4f00cc25d676a6ee660938f5b27427c | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/178 | ||||
163523976 | MDExOlB1bGxSZXF1ZXN0MTYzNTIzOTc2 | 180 | closed | 0 | make html title more readable in query template | ryanpitts 56477 | tiny tweak to make this easier to visually parse—I think it matches your style in other templates | 2018-01-17T18:56:03Z | 2018-04-03T16:03:38Z | 2018-04-03T15:24:05Z | 2018-04-03T15:24:05Z | 446d47fdb005b3776bc06ad8d1f44b01fc2e938b | 0 | dc900b2f587c839e97389aaca70140fb06b4d40b | 56623e48da5412b25fb39cc26b9c743b684dd968 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/180 | ||||
163561830 | MDExOlB1bGxSZXF1ZXN0MTYzNTYxODMw | 181 | closed | 0 | add "format sql" button to query page, uses sql-formatter | bsmithgall 1957344 | Cool project! This fixes #136 using the suggested [sql formatter](https://github.com/zeroturnaround/sql-formatter) library. I included the minified version in the bundle and added the relevant scripts to the codemirror includes instead of adding new files, though I could also add new files. I wanted to keep it all together, since the result of the format needs access to the editor in order to properly update the codemirror instance. | 2018-01-17T21:50:04Z | 2019-11-11T03:08:25Z | 2019-11-11T03:08:25Z | a9ac208088e536043890e0f7ff8a182398576a51 | 0 | 86ac746cfcbf2fa86863f8fab528494600eac1ae | a290f28caae61b47e76e825c06984f22fc41a694 | NONE | datasette 107914493 | https://github.com/simonw/datasette/pull/181 | |||||
165029807 | MDExOlB1bGxSZXF1ZXN0MTY1MDI5ODA3 | 182 | closed | 0 | Add db filesize next to download link | raynae 3433657 | Took a stab at #172, will this do the trick? | 2018-01-25T04:58:56Z | 2019-03-22T13:50:57Z | 2019-02-06T04:59:38Z | a8d9e69872dec9a551b25cd609ffdbf3896045bd | 0 | b62835205a830472abb66c708822c2dcdf4ab027 | 56623e48da5412b25fb39cc26b9c743b684dd968 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/182 | |||||
179108961 | MDExOlB1bGxSZXF1ZXN0MTc5MTA4OTYx | 192 | closed | 0 | New ?_shape=objects/object/lists param for JSON API | simonw 9599 | Refs #122 | 2018-04-03T14:02:58Z | 2018-04-03T14:53:00Z | 2018-04-03T14:52:55Z | 2018-04-03T14:52:55Z | 0abd3abacb309a2bd5913a7a2df4e9256585b1bb | 0 | a759e09e8599e2cf54f6c5ab4d1cf8adf8608793 | dd0566ff8eda7fa2f0d92e51809581fae62cffed | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/192 | ||||
180188397 | MDExOlB1bGxSZXF1ZXN0MTgwMTg4Mzk3 | 196 | closed | 0 | _sort= and _sort_desc= parameters to table view | simonw 9599 | See #189 | 2018-04-09T00:07:21Z | 2018-04-09T05:10:29Z | 2018-04-09T05:10:23Z | 2018-04-09T05:10:23Z | c1d37fdf2be84fb07155bb1b1f61057444b03300 | 0 | fdd6b71e40c8aa9a93e95802a8b6291099d4db2c | b2188f044265c95f7e54860e28107c17d2a6ed2e | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/196 | ||||
181033024 | MDExOlB1bGxSZXF1ZXN0MTgxMDMzMDI0 | 200 | closed | 0 | Hide Spatialite system tables | russss 45057 | They were getting on my nerves. | 2018-04-11T21:26:58Z | 2018-04-12T21:34:48Z | 2018-04-12T21:34:48Z | 2018-04-12T21:34:48Z | d08a13314081ae2ce0313a17d3c07c1a7f2d94d5 | 0 | 765b5d677154c633b91e3e826dfffc53b7c4b5d3 | bfb4e45a7bcb880758dbc18f66258de26c1d1904 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/200 | ||||
181247568 | MDExOlB1bGxSZXF1ZXN0MTgxMjQ3NTY4 | 202 | closed | 0 | Raise 404 on nonexistent table URLs | russss 45057 | Currently they just 500. Also cleaned the logic up a bit, I hope I didn't miss anything. This is issue #184. | 2018-04-12T15:47:06Z | 2018-04-13T19:22:56Z | 2018-04-13T18:19:15Z | 134150933ade84327cfd97a88d536f5bff37a136 | 0 | 71bbf4e4be8a9ab7bcc4ddfb33760c7d902f4a34 | bfb4e45a7bcb880758dbc18f66258de26c1d1904 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/202 | |||||
181600926 | MDExOlB1bGxSZXF1ZXN0MTgxNjAwOTI2 | 204 | closed | 0 | Initial units support | russss 45057 | Add support for specifying units for a column in metadata.json and rendering them on display using [pint](https://pint.readthedocs.io/en/latest/). Example table metadata: ```json "license_frequency": { "units": { "frequency": "Hz", "channel_width": "Hz", "height": "m", "antenna_height": "m", "azimuth": "degrees" } } ``` [Example result](https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency/1) This works surprisingly well! I'd like to add support for using units when querying but this is PR is pretty usable as-is. (Pint doesn't seem to support decibels though - it thinks they're decibytes - which is an annoying omission.) (ref ticket #203) | 2018-04-13T21:32:49Z | 2018-04-14T09:44:33Z | 2018-04-14T03:32:54Z | 2018-04-14T03:32:54Z | ec6abc81e433c9bac1b9f085111785fc227e9e34 | 0 | 67c20a98a0cbb59a10247a49320c2feb7d0b1b41 | fb988ace7c7e2bee5ac142a0eab22431d0675a77 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/204 | ||||
181642114 | MDExOlB1bGxSZXF1ZXN0MTgxNjQyMTE0 | 205 | closed | 0 | Support filtering with units and more | russss 45057 | The first commit: * Adds units to exported JSON * Adds units key to metadata skeleton * Adds some docs for units The second commit adds filtering by units by the first method I mentioned in #203: ![image](https://user-images.githubusercontent.com/45057/38767463-7193be16-3fd9-11e8-8a5f-ac4159415c6d.png) [Try it here](https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency?frequency__gt=50GHz&height__lt=50ft). I think it integrates pretty neatly. The third commit adds support for registering custom units with Pint from metadata.json. Probably pretty niche, but I need decibels! | 2018-04-14T10:47:51Z | 2018-04-14T15:24:04Z | 2018-04-14T15:24:04Z | ed059c70e87a2930206652621e23a55167aa57c1 | 0 | eb3a37c34813ecbbfdae015305fec1f2a4ec27a5 | 6b15a53cd3cd40880a5e2d38827d5fac10e4bb5f | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/205 | |||||
181644805 | MDExOlB1bGxSZXF1ZXN0MTgxNjQ0ODA1 | 206 | closed | 0 | Fix sqlite error when loading rows with no incoming FKs | russss 45057 | This fixes `ERROR: conn=<sqlite3.Connection object at 0x10bbb9f10>, sql = 'select ', params = {'id': '1'}` caused by an invalid query loading incoming FKs when none exist. The error was ignored due to async but it still got printed to the console. | 2018-04-14T12:08:17Z | 2018-04-14T14:32:42Z | 2018-04-14T14:24:25Z | 2018-04-14T14:24:25Z | 1cc5161089e559c8b16049b20f7a5b3a43290c21 | 0 | 93b038e2469bee07d36ae8a943aab8b9d8610c1d | ec6abc81e433c9bac1b9f085111785fc227e9e34 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/206 | ||||
181647717 | MDExOlB1bGxSZXF1ZXN0MTgxNjQ3NzE3 | 207 | closed | 0 | Link foreign keys which don't have labels | russss 45057 | This renders unlabeled FKs as simple links. I can't see why this would cause any major problems. ![image](https://user-images.githubusercontent.com/45057/38768722-ea15a000-3fef-11e8-8664-ffd7aa4894ea.png) Also includes bonus fixes for two minor issues: * In foreign key link hrefs the primary key was escaped using HTML escaping rather than URL escaping. This broke some non-integer PKs. * Print tracebacks to console when handling 500 errors. | 2018-04-14T13:27:14Z | 2018-04-14T15:00:00Z | 2018-04-14T15:00:00Z | 2018-04-14T15:00:00Z | f2b940d6026677f6859d46a4f16fa402745d261d | 0 | d9858672da8f74e5530deead140e2e633e1c2627 | ec6abc81e433c9bac1b9f085111785fc227e9e34 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/207 | ||||
181654839 | MDExOlB1bGxSZXF1ZXN0MTgxNjU0ODM5 | 208 | closed | 0 | Return HTTP 405 on InvalidUsage rather than 500 | russss 45057 | This also stops it filling up the logs. This happens for HEAD requests at the moment - which perhaps should be handled better, but that's a different issue. | 2018-04-14T16:12:50Z | 2018-04-14T18:00:39Z | 2018-04-14T18:00:39Z | 2018-04-14T18:00:39Z | efbb4e83374a2c795e436c72fa79f70da72309b8 | 0 | 20e5fcf827046adf76968d0e58f47e0b7d9271c3 | 8d394586f55bc4b8ab70476968d08fb6ec8339e5 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/208 | ||||
181723303 | MDExOlB1bGxSZXF1ZXN0MTgxNzIzMzAz | 209 | closed | 0 | Don't duplicate simple primary keys in the link column | russss 45057 | When there's a simple (single-column) primary key, it looks weird to duplicate it in the link column. This change removes the second PK column and treats the link column as if it were the PK column from a header/sorting perspective. This might make it a bit more difficult to tell what the link for the row is, I'm not sure yet. I feel like the alternative is to change the link column to just have the text "view" or something, instead of repeating the PK. (I doubt it makes much more sense with compound PKs.) Bonus change in this PR: fix urlencoding of links in the displayed HTML. Before: ![image](https://user-images.githubusercontent.com/45057/38783830-e2ababb4-40ff-11e8-97fb-25e286a8c920.png) After: ![image](https://user-images.githubusercontent.com/45057/38783835-ebf6b48e-40ff-11e8-8c47-6a864cf21ccc.png) | 2018-04-15T21:56:15Z | 2018-04-18T08:40:37Z | 2018-04-18T01:13:04Z | 2018-04-18T01:13:04Z | 136a70d88741e2a5892c3de437064a9d14494d66 | 0 | 4acde4e187795214af6fc86f46af48982ec5de46 | bf5ec2d61148f9852441934dd206b3b1c07a512f | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/209 | ||||
181731956 | MDExOlB1bGxSZXF1ZXN0MTgxNzMxOTU2 | 210 | closed | 0 | Start of the plugin system, based on pluggy | simonw 9599 | Refs #14 | 2018-04-16T00:51:30Z | 2018-04-16T00:56:16Z | 2018-04-16T00:56:16Z | 2018-04-16T00:56:16Z | 33c7c53ff87c25445c68088ede49d062d9c31fe8 | 0 | d75e57060d9ef4ef6ebab3600e542885b7467272 | efbb4e83374a2c795e436c72fa79f70da72309b8 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/210 | ||||
181755220 | MDExOlB1bGxSZXF1ZXN0MTgxNzU1MjIw | 212 | closed | 0 | New --plugins-dir=plugins/ option | simonw 9599 | Refs #211 | 2018-04-16T05:19:28Z | 2018-04-16T05:22:18Z | 2018-04-16T05:22:01Z | 2018-04-16T05:22:01Z | b2955d9065ea019500c7d072bcd9d49d1967f051 | 0 | 33c6bcadb962457be6b0c7f369826b404e2bcef5 | 92396ae5bacedfcb3d7c81319ccdd04483fd7fd4 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/212 | ||||
182357613 | MDExOlB1bGxSZXF1ZXN0MTgyMzU3NjEz | 222 | closed | 0 | Fix for plugins in Python 3.5 | simonw 9599 | 2018-04-18T03:21:01Z | 2018-04-18T04:26:50Z | 2018-04-18T03:24:21Z | 2018-04-18T03:24:21Z | 4be6deb94776744071311777f0b18efb993c0cfa | 0 | 420cdcb88ee41c15a90fce30fdec5832c03295bd | 1c36d07dd432b9960f4f2d096739460b4fcf8877 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/222 | |||||
183135604 | MDExOlB1bGxSZXF1ZXN0MTgzMTM1NjA0 | 232 | closed | 0 | Fix a typo | lsb 45281 | It looks like this was the only instance of it: https://github.com/simonw/datasette/search?utf8=%E2%9C%93&q=SOLite&type= | 2018-04-20T18:20:04Z | 2018-04-21T00:19:08Z | 2018-04-21T00:19:08Z | 2018-04-21T00:19:08Z | a971718d2a5e1b61b5e5c27b0ef6c4ec65616e35 | 0 | b0f0f16a1c1f48aba62dfa30fa039dc6d3c07802 | 3a5d7951ce8f35118ffdd7f8d86e09b909e1218c | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/232 | ||||
185307407 | MDExOlB1bGxSZXF1ZXN0MTg1MzA3NDA3 | 246 | closed | 0 | ?_shape=array and _timelimit= | simonw 9599 | 2018-05-02T00:18:54Z | 2018-05-02T00:20:41Z | 2018-05-02T00:20:40Z | 2018-05-02T00:20:40Z | 690736436bac599ca042d1caa465c6d66d2651f9 | 0 | 3807d93b98573e142858c5871b8b4aadda71d28f | aa954382c3776d596f459897b0d984161293529d | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/246 | |||||
187668890 | MDExOlB1bGxSZXF1ZXN0MTg3NjY4ODkw | 257 | closed | 0 | Refactor views | simonw 9599 | * Split out view classes from main `app.py` * Run [black](https://github.com/ambv/black) against resulting code to apply opinionated source code formatting * Run [isort](https://github.com/timothycrosley/isort) to re-order my imports Refs #256 | 2018-05-13T13:00:50Z | 2018-05-14T03:04:25Z | 2018-05-14T03:04:24Z | 2018-05-14T03:04:24Z | 2b79f2bdeb1efa86e0756e741292d625f91cb93d | 0 | 0e2b41f3fa38456af32548c536f955c48c7637e8 | 4301a8f3ac69f2f54916e73cc90fcf216a9a3746 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/257 | ||||
187770345 | MDExOlB1bGxSZXF1ZXN0MTg3NzcwMzQ1 | 258 | closed | 0 | Add new metadata key persistent_urls which removes the hash from all database urls | philroche 247131 | Add new metadata key "persistent_urls" which removes the hash from all database urls when set to "true" This PR is just to gauge if this, or something like it, is something you would consider merging? I understand the reason why the substring of the hash is included in the url but there are some use cases where the urls should persist across deployments. For bookmarks for example or for scripts that use the JSON API. This is the initial commit for this feature. Tests and documentation updates to follow. | 2018-05-14T09:39:18Z | 2018-05-21T07:38:15Z | 2018-05-21T07:38:15Z | 457fcdfc82a0260db543d49006d49f8486f233b5 | 0 | 0d77a896ccb16b34c86fdeef7738f2d056e27e02 | 2b79f2bdeb1efa86e0756e741292d625f91cb93d | NONE | datasette 107914493 | https://github.com/simonw/datasette/pull/258 | |||||
188312411 | MDExOlB1bGxSZXF1ZXN0MTg4MzEyNDEx | 261 | closed | 0 | Facets improvements plus suggested facets | simonw 9599 | Refs #255 | 2018-05-16T03:52:39Z | 2018-05-16T15:27:26Z | 2018-05-16T15:27:25Z | 2018-05-16T15:27:25Z | 9959a9e4deec8e3e178f919e8b494214d5faa7fd | 0 | af0e91e7769891949198fb1e1760886424f34b16 | 2b79f2bdeb1efa86e0756e741292d625f91cb93d | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/261 | ||||
189318453 | MDExOlB1bGxSZXF1ZXN0MTg5MzE4NDUz | 277 | closed | 0 | Refactor inspect logic | russss 45057 | This pulls the logic for inspect out into a new file which makes it a bit easier to understand. This was going to be the first part of an implementation for #276, but it seems like that might take a while so I'm going to PR a few bits of refactoring individually. | 2018-05-21T08:49:31Z | 2018-05-22T16:07:24Z | 2018-05-22T14:03:07Z | 2018-05-22T14:03:07Z | 58b5a37dbbf13868a46bcbb284509434e66eca25 | 0 | 0b81e047ad27b67ba17e8c176e38a94cf4548115 | d59366d36e95b973d674e62edff0168d5bdd90eb | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/277 | ||||
189707374 | MDExOlB1bGxSZXF1ZXN0MTg5NzA3Mzc0 | 279 | closed | 0 | Add version number support with Versioneer | rgieseke 198537 | I think that's all for getting Versioneer support, I've been happily using it in a couple of projects ... ``` In [2]: datasette.__version__ Out[2]: '0.22+3.g6e12445' ``` Repo: https://github.com/warner/python-versioneer Versioneer Licence: Public Domain (CC0-1.0) Closes #273 | 2018-05-22T15:39:45Z | 2018-05-22T19:35:23Z | 2018-05-22T19:35:22Z | 2018-05-22T19:35:22Z | 49f317752cfe89c5641165a490eef49e025752a7 | 0 | d0d19453e8623cc98a2baa2cadaaff4cd2fee973 | 58b5a37dbbf13868a46bcbb284509434e66eca25 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/279 | ||||
189723716 | MDExOlB1bGxSZXF1ZXN0MTg5NzIzNzE2 | 280 | closed | 0 | Build Dockerfile with recent Sqlite + Spatialite | r4vi 565628 | This solves #278 without bloating the Dockerfile too much, the image size is now 495MB (original was ~240MB) but it could be reduced significantly if we only copied the output of the compilation of spatialite and friends to /usr/local/lib, instead of the entirety of it however that will take more time. In the python code change references to `import sqlite3` to `import pysqlite3` and it should use the compiled version of sqlite3.23.1. You don't need to try/except because pysqlite3 falls back to builtin sqlite3 if there is no compiled version. ```bash $ docker run --rm -it datasette spatialite SpatiaLite version ..: 4.4.0-RC0 Supported Extensions: - 'VirtualShape' [direct Shapefile access] - 'VirtualDbf' [direct DBF access] - 'VirtualXL' [direct XLS access] - 'VirtualText' [direct CSV/TXT access] - 'VirtualNetwork' [Dijkstra shortest path] - 'RTree' [Spatial Index - R*Tree] - 'MbrCache' [Spatial Index - MBR cache] - 'VirtualSpatialIndex' [R*Tree metahandler] - 'VirtualElementary' [ElemGeoms metahandler] - 'VirtualKNN' [K-Nearest Neighbors metahandler] - 'VirtualXPath' [XML Path Language - XPath] - 'VirtualFDO' [FDO-OGR interoperability] - 'VirtualGPKG' [OGC GeoPackage interoperability] - 'VirtualBBox' [BoundingBox tables] - 'SpatiaLite' [Spatial SQL - OGC] PROJ.4 version ......: Rel. 4.9.3, 15 August 2016 GEOS version ........: 3.5.1-CAPI-1.9.1 r4246 TARGET CPU ..........: x86_64-linux-gnu the SPATIAL_REF_SYS table already contains some row(s) SQLite version ......: 3.23.1 Enter ".help" for instructions SQLite version 3.23.1 2018-04-10 17:39:29 Enter ".help" for instructions Enter SQL statements terminated with a ";" spatialite> ``` ```bash $ docker run --rm -it datasette python -c "import pysqlite3; print(pysqlite3.sqlite_version)" 3.23.1 ``` | 2018-05-22T16:33:50Z | 2018-06-28T11:26:23Z | 2018-05-23T17:43:35Z | 2018-05-23T17:43:35Z | bd30c696e18927207358ee9d63174a5c41c8297e | 0 | 5cf78eded61cacec435b854e18f1e94511cf2da8 | 58b5a37dbbf13868a46bcbb284509434e66eca25 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/280 | ||||
189860052 | MDExOlB1bGxSZXF1ZXN0MTg5ODYwMDUy | 281 | closed | 0 | Reduces image size using Alpine + Multistage (re: #278) | iMerica 487897 | Hey Simon! I got the image size down from 256MB to 110MB. Seems to be working okay, but you might want to test it a bit more. Example output of `docker run --rm -it <my-tag> datasette` ``` Serve! files=() on port 8001 [2018-05-23 05:23:08 +0000] [1] [INFO] Goin' Fast @ http://127.0.0.1:8001 [2018-05-23 05:23:08 +0000] [1] [INFO] Starting worker [1] ``` Related: https://github.com/simonw/datasette/issues/278 | 2018-05-23T05:27:05Z | 2018-05-26T02:10:38Z | 2018-05-26T02:10:38Z | 0d6c8fa841ae5d28e151e4ba43370289d1e2e22c | 0 | 3af65075c430d94647f8a1b1f215e82f563bc46f | 49f317752cfe89c5641165a490eef49e025752a7 | NONE | datasette 107914493 | https://github.com/simonw/datasette/pull/281 | |||||
190901429 | MDExOlB1bGxSZXF1ZXN0MTkwOTAxNDI5 | 293 | closed | 0 | Support for external database connectors | jsancho-gpl 11912854 | I think it would be nice that Datasette could work with other file formats that aren't SQLite, like files with PyTables format. I've tried to accomplish that using external connectors published with entry points. These external connectors must have a structure similar to the structure [PyTables Datasette connector](https://github.com/PyTables/datasette-pytables) has. | 2018-05-28T11:02:45Z | 2018-09-11T14:32:45Z | 2018-09-11T14:32:45Z | ad2cb12473025ffab738d4df6bb47cd8b2e27859 | 0 | 59c94be46f9ccd806dd352fa28a6dba142d5ab82 | b7257a21bf3dfa7353980f343c83a616da44daa7 | FIRST_TIME_CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/293 | |||||
193361341 | MDExOlB1bGxSZXF1ZXN0MTkzMzYxMzQx | 307 | closed | 0 | Initial sketch of custom URL routing, refs #306 | simonw 9599 | See #306 for background on this. | 2018-06-07T15:26:48Z | 2018-06-07T15:29:54Z | 2018-06-07T15:29:41Z | 8c6663d3cc8043fc6f5c796275e80b0445bdff12 | 0 | 018af454f286120452e33d2568dd40908474a8a8 | a246f476b4fe490f5450836b22961bc607e6b4b0 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/307 | |||||
195339111 | MDExOlB1bGxSZXF1ZXN0MTk1MzM5MTEx | 311 | closed | 0 | ?_labels=1 to expand foreign keys (in csv and json), refs #233 | simonw 9599 | Output looks something like this: { "rowid": 233, "TreeID": 121240, "qLegalStatus": { "value" 2, "label": "Private" } "qSpecies": { "value": 16, "label": "Sycamore" } "qAddress": "91 Commonwealth Ave", ... } | 2018-06-16T16:31:12Z | 2018-06-16T22:20:31Z | 2018-06-16T22:20:31Z | 9fe59e54ad65eb1c8239b1a78edb5219d3ab8ab0 | 0 | 40287b1ba09d6e75f0db1458fe78d8c055f128af | d0a578c0fc07b9d9208cd9de981bdf7385a26c49 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/311 | |||||
195413241 | MDExOlB1bGxSZXF1ZXN0MTk1NDEzMjQx | 315 | closed | 0 | Streaming mode for downloading all rows as a CSV | simonw 9599 | Refs #266 | 2018-06-18T03:06:59Z | 2018-06-18T03:29:13Z | 2018-06-18T03:21:02Z | 2018-06-18T03:21:02Z | fc3660cfad7668dbce6ead12766e048fc1f78b11 | 0 | b15f412e04ce9ff21983986e661fbe4396f97b43 | 0d7ba1ba676828dc7c8dda78ebe7921f7986fc18 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/315 | ||||
196526861 | MDExOlB1bGxSZXF1ZXN0MTk2NTI2ODYx | 322 | closed | 0 | Feature/in operator | 4e1e0603 2691848 | 2018-06-21T17:41:51Z | 2018-06-21T17:45:25Z | 2018-06-21T17:45:25Z | 80b7bcefa1c07202779d98c9e2214f3ebad704e3 | 0 | 1acc562a2f60a7289438df657db8fd6dd3a7391d | e7566cc59d4b02ef301054fd35fdde6c925a8e38 | NONE | datasette 107914493 | https://github.com/simonw/datasette/pull/322 | ||||||
196628304 | MDExOlB1bGxSZXF1ZXN0MTk2NjI4MzA0 | 324 | closed | 0 | Speed up Travis by reusing pip wheel cache across builds | simonw 9599 | From https://atchai.com/blog/faster-ci/ - refs #323 | 2018-06-22T03:20:08Z | 2018-06-24T01:03:47Z | 2018-06-24T01:03:47Z | 2018-06-24T01:03:47Z | 47e689a89b3f5f0969595b17d2ec59ea3caffb3b | 0 | 7d7f5f61fd6dca3385386a657a13057680d8ddd7 | e7566cc59d4b02ef301054fd35fdde6c925a8e38 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/324 | ||||
201075532 | MDExOlB1bGxSZXF1ZXN0MjAxMDc1NTMy | 341 | closed | 0 | Bump aiohttp to fix compatibility with Python 3.7 | simonw 9599 | Tests failed here: https://travis-ci.org/simonw/datasette/jobs/403223333 | 2018-07-12T17:41:24Z | 2018-07-12T18:07:38Z | 2018-07-12T18:07:38Z | 2018-07-12T18:07:38Z | 31a5d8fa77be68d4f837f0a80a611675dce49f4b | 0 | 8d34ed776168dcac530859c51f22e8b48829a513 | 130dc8823ebdcc1834fc7c4a03c996b13fc1e444 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/341 | ||||
201451332 | MDExOlB1bGxSZXF1ZXN0MjAxNDUxMzMy | 345 | closed | 0 | Allow app names for `datasette publish heroku` | russss 45057 | Lets you supply the `-n` parameter for Heroku deploys, which also lets you update existing Heroku deployments. | 2018-07-14T13:12:34Z | 2018-07-14T14:09:54Z | 2018-07-14T14:04:44Z | 2018-07-14T14:04:44Z | 58fec99ab0a31bcf25968f2aa05d37de8139b83c | 0 | 864cf8c1b01a581d6dc9711efe7cb4f2a6ac87e8 | 31a5d8fa77be68d4f837f0a80a611675dce49f4b | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/345 | ||||
204029142 | MDExOlB1bGxSZXF1ZXN0MjA0MDI5MTQy | 349 | closed | 0 | publish_subcommand hook + default plugins mechanism, used for publish heroku/now | simonw 9599 | This change introduces a new plugin hook, publish_subcommand, which can be used to implement new subcommands for the "datasette publish" command family. I've used this new hook to refactor out the "publish now" and "publish heroku" implementations into separate modules. I've also added unit tests for these two publishers, mocking the subprocess.call and subprocess.check_output functions. As part of this, I introduced a mechanism for loading default plugins. These are defined in the new "default_plugins" list inside datasette/app.py Closes #217 (Plugin support for "datasette publish") Closes #348 (Unit tests for "datasette publish") Refs #14, #59, #102, #103, #146, #236, #347 | 2018-07-26T05:03:22Z | 2018-07-26T05:28:54Z | 2018-07-26T05:16:00Z | 2018-07-26T05:16:00Z | dbbe707841973b50a76d2703003ae2c40e7ad1fd | 0 | 7abdfd55daa9c617da02fd768b8e7476e89f0f94 | 3ac21c749881d0fb1c35b0f9b7a819e29f61c5c1 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/349 | ||||
204851511 | MDExOlB1bGxSZXF1ZXN0MjA0ODUxNTEx | 353 | closed | 0 | render_cell(value) plugin hook | simonw 9599 | Closes #352. | 2018-07-30T15:57:08Z | 2018-08-05T00:14:57Z | 2018-08-05T00:14:57Z | 2018-08-05T00:14:57Z | 4ac913224061f2dc4f673efab1a5ac6bc748854f | 0 | 2e538d924f3b17f82e94e8e8b5a05abcf9e1e697 | 295d005ca48747faf046ed30c3c61e7563c61ed2 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/353 | ||||
206863803 | MDExOlB1bGxSZXF1ZXN0MjA2ODYzODAz | 358 | closed | 0 | Bump versions of pytest, pluggy and beautifulsoup4 | simonw 9599 | 2018-08-08T00:44:38Z | 2018-08-08T01:11:13Z | 2018-08-08T01:11:13Z | 2018-08-08T01:11:13Z | e1db8194e8c1d7f361fd0c1c3fc1b91d6aa920e5 | 0 | 848ed0e0420d2e8c95a96b4cf73082da4c65d8f6 | fe5b6ea95a973534fe8a44907c0ea2449aae7602 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/358 | |||||
208719043 | MDExOlB1bGxSZXF1ZXN0MjA4NzE5MDQz | 361 | closed | 0 | Import pysqlite3 if available, closes #360 | simonw 9599 | 2018-08-16T00:52:21Z | 2018-08-16T00:58:57Z | 2018-08-16T00:58:57Z | 2018-08-16T00:58:57Z | aae49fef3b75848628d824077ec063834e3e5167 | 0 | da41daa168af8f29a1beb5278aed833cf3dc48ce | e1db8194e8c1d7f361fd0c1c3fc1b91d6aa920e5 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/361 | |||||
211860706 | MDExOlB1bGxSZXF1ZXN0MjExODYwNzA2 | 363 | open | 0 | Search all apps during heroku publish | kevboh 436032 | Adds the `-A` option to include apps from all organizations when searching app names for publish. | 2018-08-29T19:25:10Z | 2018-08-31T14:39:45Z | b684b04c30f6b8779a3d11f7599329092fb152f3 | 0 | 2dd363e01fa73b24ba72f539c0a854bc901d23a7 | b7257a21bf3dfa7353980f343c83a616da44daa7 | FIRST_TIME_CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/363 | ||||||
214653641 | MDExOlB1bGxSZXF1ZXN0MjE0NjUzNjQx | 364 | open | 0 | Support for other types of databases using external connectors | jsancho-gpl 11912854 | This PR is related to #293, but now all commits have been merged. The purpose is to support other file formats that aren't SQLite, like files with PyTables format. I've tried to accomplish that using external connectors published with entry points. The modifications in the original datasette code are minimal and many are in a separated file. | 2018-09-11T14:31:47Z | 2018-09-11T14:31:47Z | d84f3b1f585cb52b58aed0401c34214de2e8b47b | 0 | 592fd05f685859b271f0305c2fc8cdb7da58ebfb | b7257a21bf3dfa7353980f343c83a616da44daa7 | FIRST_TIME_CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/364 | ||||||
216651317 | MDExOlB1bGxSZXF1ZXN0MjE2NjUxMzE3 | 365 | closed | 0 | fix small doc typo | jaywgraves 418191 | 2018-09-19T14:02:02Z | 2019-12-19T02:30:33Z | 2018-09-19T17:15:43Z | 2018-09-19T17:15:43Z | 1bcd54a834a2f9730d21095df855f6708c85c200 | 0 | d3fb6a80c5878c73befa2a35e11a9ce28a6e1ab6 | b7257a21bf3dfa7353980f343c83a616da44daa7 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/365 | |||||
226314862 | MDExOlB1bGxSZXF1ZXN0MjI2MzE0ODYy | 367 | closed | 0 | Mark codemirror files as vendored | jaap3 48517 | GitHub lists datasette as a Javascript project, primarily because of the vendored codemirror files. This is somewhat confusing when you're looking for datasette, knowing it's written in Python. Luckily it's possible exclude certain files from GitHub's code statistics: https://github.com/github/linguist#using-gitattributes | 2018-10-27T18:41:25Z | 2019-05-03T21:12:09Z | 2019-05-03T21:11:20Z | 2019-05-03T21:11:20Z | 66c87cee0c7344c7877373c60b180c766c206101 | 0 | 8e23e4548eedd58b0ff0e69c9a5010fc1a0136d5 | 6b398c2971801d9a20cfdb7998f59020d5534e22 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/367 | ||||
226315513 | MDExOlB1bGxSZXF1ZXN0MjI2MzE1NTEz | 368 | closed | 0 | Update installation instructions | jaap3 48517 | I was writing this as a response to your tweet, but decided I might just make it a pull request. I feel like it might be confusing to those unfamiliar with Python's `-m` flag and the built-in `venv` module to omit the space between the flag and its argument. By adding a space and prefixing the second occurrence of `venv` with a `./` it's maybe a bit clearer what the arguments are and what they do. By also using `python3 -m pip` it becomes even clearer that `-m` is a special flag that makes the python executable do neat things. | 2018-10-27T18:52:31Z | 2019-05-03T18:18:43Z | 2019-05-03T18:18:42Z | 2019-05-03T18:18:42Z | f853d5592ec7f901a50381de22a26a9ab098f885 | 0 | 8f8d6072820a13e2c698d9c326998b63810779c6 | 553314dcd699a84aa7cc806377150ca0d57a6024 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/368 | ||||
232172106 | MDExOlB1bGxSZXF1ZXN0MjMyMTcyMTA2 | 389 | closed | 0 | Bump dependency versions | simonw 9599 | 2018-11-20T02:23:12Z | 2019-11-13T19:13:41Z | 2019-11-13T19:13:41Z | 9194c0165aef411e0784ba49939b1005306f1f38 | 0 | f8349b45916e68d2f89c57694bd0e6afaf1bd508 | 5e3a432a0caa23837fa58134f69e2f82e4f632a6 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/389 | ||||||
235194286 | MDExOlB1bGxSZXF1ZXN0MjM1MTk0Mjg2 | 390 | closed | 0 | tiny typo in customization docs | jaywgraves 418191 | was looking to add some custom templates to my use of datasette and saw this small typo. | 2018-12-01T13:44:42Z | 2019-12-19T02:30:35Z | 2018-12-16T21:32:56Z | 2018-12-16T21:32:56Z | ed78922ae38b51513319b60ac39990b7c2aca810 | 0 | f8c01373dad3b8dcd10577a2e541f88ef34c77bc | 3de8fac1d322cbab6c8c55899e0e8511b36337d0 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/390 | ||||
241418443 | MDExOlB1bGxSZXF1ZXN0MjQxNDE4NDQz | 392 | closed | 0 | Fix some regex DeprecationWarnings | simonw 9599 | 2018-12-29T02:10:28Z | 2018-12-29T02:22:28Z | 2018-12-29T02:22:28Z | 2018-12-29T02:22:28Z | a2bfcfc1b1c60dac3526364af17c2fa2f3d41a0a | 0 | d245982aedaf7c54bf41d60ea7f0e0cf419c2b2f | eac08f0dfc61a99e8887442fc247656d419c76f8 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/392 | |||||
247923347 | MDExOlB1bGxSZXF1ZXN0MjQ3OTIzMzQ3 | 404 | closed | 0 | Experiment: run Jinja in async mode | simonw 9599 | See http://jinja.pocoo.org/docs/2.10/api/#async-support Tests all pass. Have not checked performance difference yet. Creating pull request to run tests in Travis. This is not ready to merge - I'm not yet sure if this is a good idea. | 2019-01-27T00:28:44Z | 2019-11-12T05:02:18Z | 2019-11-12T05:02:13Z | 773bcac907d17b16eef604ad943837da39a10090 | 0 | dd7f24a47f660e2f0fc1e97a13d28908c28dc245 | 909cc8fbdfc9c05e447f40e9a73489809602c3cd | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/404 | |||||
250628275 | MDExOlB1bGxSZXF1ZXN0MjUwNjI4Mjc1 | 407 | closed | 0 | Heroku --include-vcs-ignore | simonw 9599 | Should mean `datasette publish heroku` can work under Travis, unlike this failure: https://travis-ci.org/simonw/fivethirtyeight-datasette/builds/488047550 ``` 2.25s$ datasette publish heroku fivethirtyeight.db -m metadata.json -n fivethirtyeight-datasette tar: unrecognized option '--exclude-vcs-ignores' Try 'tar --help' or 'tar --usage' for more information. ▸ Command failed: tar cz -C /tmp/tmpuaxm7i8f --exclude-vcs-ignores --exclude ▸ .git --exclude .gitmodules . > ▸ /tmp/f49440e0-1bf3-4d3f-9eb0-fbc2967d1fd4.tar.gz ▸ tar: unrecognized option '--exclude-vcs-ignores' ▸ Try 'tar --help' or 'tar --usage' for more information. ▸ The command "datasette publish heroku fivethirtyeight.db -m metadata.json -n fivethirtyeight-datasette" exited with 0. ``` The fix for that issue is to call the heroku command like this: heroku builds:create -a app_name --include-vcs-ignore | 2019-02-06T04:06:20Z | 2019-02-06T04:31:30Z | 2019-02-06T04:15:47Z | 2019-02-06T04:15:46Z | 195a5b36349d0d24a6bbb758cebb719b6de303b6 | 0 | 01978ddb9682c828cafddaf9ca625e08ba3ba3a4 | 436b8bc1d17c2ab415800ab209204f94e7f7929e | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/407 | ||||
255725057 | MDExOlB1bGxSZXF1ZXN0MjU1NzI1MDU3 | 413 | closed | 0 | Update spatialite.rst | joelondon 28597217 | a line of sql added to create the idx_<table_name> in the python recipe | 2019-02-25T00:08:35Z | 2019-03-15T05:06:45Z | 2019-03-15T05:06:45Z | 2019-03-15T05:06:45Z | 9e8c36793bfbb17c2f67371cc7f9aa8b9202fdc4 | 0 | e87565604a169a34eadadfc99e96a8f503123e8c | 1f91065b20cbc691f464bccfd8eef7d1ce4b14a8 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/413 | ||||
261418285 | MDExOlB1bGxSZXF1ZXN0MjYxNDE4Mjg1 | 416 | closed | 0 | URL hashing now optional: turn on with --config hash_urls:1 (#418) | simonw 9599 | 2019-03-15T04:26:06Z | 2019-03-17T22:55:04Z | 2019-03-17T22:55:04Z | 2019-03-17T22:55:04Z | 6f6d0ff2b41f1cacaf42287b1b230b646bcba9ee | 0 | 0d02a99c9665669540aebff981246d8c743072b3 | afe9aa3ae03c485c5d6652741438d09445a486c1 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/416 | |||||
266035382 | MDExOlB1bGxSZXF1ZXN0MjY2MDM1Mzgy | 424 | closed | 0 | Column types in inspected metadata | russss 45057 | This PR does two things: * Adds the sqlite column type for each column to the inspected table info. * Stops binary columns from being rendered to HTML, unless a plugin handles it. There's a bit more detail in the changeset descriptions. These changes are intended as a precursor to a plugin which adds first-class support for Spatialite geographic primitives, and perhaps more useful geo-stuff. | 2019-03-31T18:46:33Z | 2019-04-29T18:30:50Z | 2019-04-29T18:30:46Z | a332d4e0b3fed7165a22880430664f1c3a00963d | 0 | 92e7b8c67fe5bcd484f19576f20c9235aca9050b | 0209a0a344503157351e625f0629b686961763c9 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/424 | |||||
269364924 | MDExOlB1bGxSZXF1ZXN0MjY5MzY0OTI0 | 426 | closed | 0 | Upgrade to Jinja2==2.10.1 | simonw 9599 | https://nvd.nist.gov/vuln/detail/CVE-2019-10906 This is only a security issue of concern if evaluating templates from untrusted sources, which isn't something I would ever expect a Datasette user to do. | 2019-04-10T23:03:08Z | 2019-04-22T21:23:22Z | 2019-04-10T23:13:31Z | 2019-04-10T23:13:31Z | 9cd3b44277e6a8ea9273bf659379ff0414e0b8ae | 0 | 629453383c7f911eddfc891f22c39b7d6e9661aa | 78e45ead4d771007c57b307edf8fc920101f8733 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/426 | ||||
270191084 | MDExOlB1bGxSZXF1ZXN0MjcwMTkxMDg0 | 430 | closed | 0 | ?_where= parameter on table views, closes #429 | simonw 9599 | 2019-04-13T01:15:09Z | 2019-04-13T01:37:23Z | 2019-04-13T01:37:23Z | 2019-04-13T01:37:23Z | bc6a9b45646610f362b4287bc4110440991aa4d6 | 0 | 3ee087c7b60da7ec3e5d2f73611fc6ea99ff82fc | e11cb4c66442abca2a6b6159521a6cf4da8739c1 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/430 | |||||
270251021 | MDExOlB1bGxSZXF1ZXN0MjcwMjUxMDIx | 432 | closed | 0 | Refactor facets to a class and new plugin, refs #427 | simonw 9599 | WIP for #427 | 2019-04-13T20:04:45Z | 2019-05-03T00:04:24Z | 2019-05-03T00:04:24Z | b78bc19269ed83b054a60c79c4fe08f4ca943942 | 0 | 5c198f7ca5d2aff49180820271ba8d06b79aefb1 | 9c77e6e355ec718d76178a7607721d10a66b6aef | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/432 | |||||
271338405 | MDExOlB1bGxSZXF1ZXN0MjcxMzM4NDA1 | 434 | closed | 0 | "datasette publish cloudrun" command to publish to Google Cloud Run | rprimet 10352819 | This is a very rough draft to start a discussion on a possible datasette cloud run publish plugin (see issue #400). The main change was to dynamically set the listening port in `make_dockerfile` to satisfy cloud run's [requirements](https://cloud.google.com/run/docs/reference/container-contract). This was done by running `datasette` through `sh` to get environment variable substitution. Not sure if that's the right approach? | 2019-04-17T14:41:18Z | 2019-05-03T21:50:44Z | 2019-05-03T13:59:02Z | 2019-05-03T13:59:02Z | 75a21fc2a136ccfc9da7bbf521cf288e63c9707f | 0 | 74c20d0d2eac13892ac20db0e66fcb3437544aa6 | bf229c9bd88179c8ec16bd65fd4fb28ab4241c2e | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/434 | ||||
274174614 | MDExOlB1bGxSZXF1ZXN0Mjc0MTc0NjE0 | 437 | closed | 0 | Add inspect and prepare_sanic hooks | russss 45057 | This adds two new plugin hooks: The `inspect` hook allows plugins to add data to the inspect dictionary. The `prepare_sanic` hook allows plugins to hook into the web router. I've attached a warning to this hook in the docs in light of #272 but I want this hook now... On quick inspection, I don't think it's worthwhile to try and make this hook independent of the web framework (but it looks like Starlette would make the hook implementation a bit nicer). Ref #14 | 2019-04-28T11:53:34Z | 2019-06-24T16:38:57Z | 2019-06-24T16:38:56Z | 7aeaac7c478acf572bda61bdaa6ac3247dc15811 | 0 | f33a0a63a7442f0b665320ac3e2eb55de315f1f7 | 11b352b4d52fd02a422776edebb14f12e4994d3b | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/437 | |||||
274313625 | MDExOlB1bGxSZXF1ZXN0Mjc0MzEzNjI1 | 439 | closed | 0 | [WIP] Add primary key to the extra_body_script hook arguments | russss 45057 | This allows the row to be identified on row pages. The context here is that I want to access the row's data to plot it on a map. I considered passing the entire template context through to the hook function. This would expose the actual row data and potentially avoid a further fetch request in JS, but it does make the plugin API a lot more leaky. (At any rate, using the selected row data is tricky in my case because of Spatialite's infuriating custom binary representation...) | 2019-04-29T10:08:23Z | 2019-05-01T09:58:32Z | 2019-05-01T09:58:30Z | b3cbcfef4d11d2741cf00861734d726a4730afe5 | 0 | 76b2c8fa406063b436155a7d8995e07b7e718c13 | 11b352b4d52fd02a422776edebb14f12e4994d3b | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/439 | |||||
274468836 | MDExOlB1bGxSZXF1ZXN0Mjc0NDY4ODM2 | 441 | closed | 0 | Add register_output_renderer hook | russss 45057 | This changeset refactors out the JSON renderer and then adds a hook and dispatcher system to allow custom output renderers to be registered. The CSV output renderer is untouched because supporting streaming renderers through this system would be significantly more complex, and probably not worthwhile. We can't simply allow hooks to be called at request time because we need a list of supported file extensions when the request is being routed in order to resolve ambiguous database/table names. So, renderers need to be registered at startup. I've tried to make this API independent of Sanic's request/response objects so that this can remain stable during the switch to ASGI. I'm using dictionaries to keep it simple and to make adding additional options in the future easy. Fixes #440 | 2019-04-29T18:03:21Z | 2019-05-01T23:01:57Z | 2019-05-01T23:01:57Z | 2019-05-01T23:01:57Z | cf406c075433882b656e340870adf7757976fa4c | 0 | c9f941f06eb0268841de49407725917c74a8a2dc | 11b352b4d52fd02a422776edebb14f12e4994d3b | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/441 | ||||
274478761 | MDExOlB1bGxSZXF1ZXN0Mjc0NDc4NzYx | 442 | closed | 0 | Suppress rendering of binary data | russss 45057 | Binary columns (including spatialite geographies) get shown as ugly binary strings in the HTML by default. Nobody wants to see that mess. Show the size of the column in bytes instead. If you want to decode the binary data, you can use a plugin to do it. | 2019-04-29T18:36:41Z | 2019-05-03T18:26:48Z | 2019-05-03T16:44:49Z | 2019-05-03T16:44:49Z | d555baf508de71a5e3dc9a9aed2c13f6f202956d | 0 | bbbd9ea5ad774f088bd963106fa5756bfd77c799 | 11b352b4d52fd02a422776edebb14f12e4994d3b | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/442 | ||||
275275610 | MDExOlB1bGxSZXF1ZXN0Mjc1Mjc1NjEw | 443 | closed | 0 | Pass view_name to extra_body_script hook | russss 45057 | At the moment it's not easy to tell whether the hook is being called in (for example) the row or table view, as in both cases the `database` and `table` parameters are provided. This passes the `view_name` added in #441 to the `extra_body_script` hook. | 2019-05-02T08:38:36Z | 2019-05-03T13:12:20Z | 2019-05-03T13:12:20Z | 2019-05-03T13:12:20Z | bf229c9bd88179c8ec16bd65fd4fb28ab4241c2e | 0 | 83b6b82d4787b30d34eb26c22ad1ff9c5c118134 | efc93b8ab5a21e3802f75f08d5e41409f5684b5d | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/443 | ||||
275281307 | MDExOlB1bGxSZXF1ZXN0Mjc1MjgxMzA3 | 444 | closed | 0 | Add a max-line-length setting for flake8 | russss 45057 | This stops my automatic editor linting from flagging lines which are too long. It's been lingering in my checkout for ages. 160 is an arbitrary large number - we could alter it if we have any opinions (but I find the line length limit to be my least favourite part of PEP8). | 2019-05-02T08:58:57Z | 2019-05-04T09:44:48Z | 2019-05-03T13:11:28Z | 2019-05-03T13:11:28Z | 470cf0b05d4fda0d2563f81c7e32af13fe346ccc | 0 | 4f0d265951d7e95920298b46eff39bb9cc783984 | efc93b8ab5a21e3802f75f08d5e41409f5684b5d | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/444 | ||||
275558612 | MDExOlB1bGxSZXF1ZXN0Mjc1NTU4NjEy | 445 | closed | 0 | Extract facet code out into a new plugin hook, closes #427 | simonw 9599 | Datasette previously only supported one type of faceting: exact column value counting. With this change, faceting logic is extracted out into one or more separate classes which can implement other patterns of faceting - this is discussed in #427, but potential upcoming facet types include facet-by-date, facet-by-JSON-array, facet-by-many-2-many and more. A new plugin hook, register_facet_classes, can be used by plugins to add in additional facet classes. Each class must implement two methods: suggest(), which scans columns in the table to decide if they might be worth suggesting for faceting, and facet_results(), which executes the facet operation and returns results ready to be displayed in the UI. | 2019-05-03T00:02:41Z | 2019-05-03T18:17:18Z | 2019-05-03T00:11:27Z | 2019-05-03T00:11:27Z | ea66c45df96479ef66a89caa71fff1a97a862646 | 0 | 1b47d4d8736627c260eb4e8303e552b0e9620a01 | efc93b8ab5a21e3802f75f08d5e41409f5684b5d | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/445 | ||||
275801463 | MDExOlB1bGxSZXF1ZXN0Mjc1ODAxNDYz | 447 | closed | 0 | Use dist: xenial and python: 3.7 on Travis | simonw 9599 | 2019-05-03T18:07:07Z | 2019-05-03T18:17:05Z | 2019-05-03T18:16:53Z | 2019-05-03T18:16:53Z | 553314dcd699a84aa7cc806377150ca0d57a6024 | 0 | cd22e389d09b4fd5ed28205ba38a20faf1ed14f1 | 01b3de5b66742f0f661183e9e2ef66be3600e831 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/447 | |||||
275861559 | MDExOlB1bGxSZXF1ZXN0Mjc1ODYxNTU5 | 449 | closed | 0 | Apply black to everything | simonw 9599 | I've been hesitating on this for literally months, because I'm not at all excited about the giant diff that will result. But I've been using black on many of my other projects (most actively [sqlite-utils](https://github.com/simonw/sqlite-utils)) and the productivity boost is undeniable: I don't have to spend a single second thinking about code formatting any more! So it's worth swallowing the one-off pain and moving on in a new, black-enabled world. | 2019-05-03T21:57:26Z | 2019-05-04T02:17:14Z | 2019-05-04T02:15:15Z | 2019-05-04T02:15:15Z | 35d6ee2790e41e96f243c1ff58be0c9c0519a8ce | 0 | 9683aeb2394a4b7e44499b8a0240af3baafda832 | 66c87cee0c7344c7877373c60b180c766c206101 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/449 | ||||
275909197 | MDExOlB1bGxSZXF1ZXN0Mjc1OTA5MTk3 | 450 | closed | 0 | Coalesce hidden table count to 0 | russss 45057 | For some reason I'm hitting a `None` here with a FTS table. I'm not entirely sure why but this makes the logic work the same as with non-hidden tables. | 2019-05-04T09:37:10Z | 2019-05-11T18:10:09Z | 2019-05-11T18:10:09Z | 5918489a2a2f14b58c5c71773a9d4fb6bb0e3e0a | 0 | f81d9df985e8d054fc16ab91f72878fe71656354 | 55643430f7ac8d27e99b00e7cf79db741003e811 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/450 | |||||
275923066 | MDExOlB1bGxSZXF1ZXN0Mjc1OTIzMDY2 | 452 | open | 0 | SQL builder utility classes | russss 45057 | This adds a straightforward set of classes to aid in the construction of SQL queries. My plan for this was to allow plugins to manipulate the Datasette-generated SQL in a more structured way. I'm not sure that's going to work, but I feel like this is still a step forward - it reduces the number of intermediate variables in `TableView.data` which aids readability, and also factors out a lot of the boring string concatenation. There are a fair number of minor structure changes in here too as I've tried to make the ordering of `TableView.data` a bit more logical. As far as I can tell, I haven't broken anything... | 2019-05-04T13:57:47Z | 2019-05-04T14:03:04Z | 45e7460d78c3f87c01f2e9e142cb7f646b23b156 | 0 | c63762280d3bd66ad6ea24933dafe218861efef2 | 55643430f7ac8d27e99b00e7cf79db741003e811 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/452 | ||||||
277524072 | MDExOlB1bGxSZXF1ZXN0Mjc3NTI0MDcy | 458 | closed | 0 | setup: add tests to package exclusion | hellerve 7725188 | This PR fixes #456 by adding `tests` to the package exclusion list. Cheers | 2019-05-09T19:47:21Z | 2020-07-21T01:14:42Z | 2019-05-10T01:54:51Z | 2019-05-10T01:54:51Z | 9f8d9fe262866ff3463f8e61214dcc6897bd5a9c | 0 | 9c65ff1ba8c855e4ade5bc7ae29a69215b3979d0 | f825e2012109247fa246e2b938f8174069e574f1 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/458 | ||||
280204276 | MDExOlB1bGxSZXF1ZXN0MjgwMjA0Mjc2 | 479 | closed | 0 | doc typo fix | IgnoredAmbience 98555 | Fix typo in performance doc page | 2019-05-19T22:54:25Z | 2019-05-20T16:42:29Z | 2019-05-20T16:42:29Z | 2019-05-20T16:42:29Z | 70d2858067d3c4da0e17c1d39e03de89190e94b6 | 0 | 708e13ab87f8c8620796c3e8f2b0aa1b2fc26875 | e513a80afba30bca9eeebd71c5e6aa6d8a811f33 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/479 | ||||
280205502 | MDExOlB1bGxSZXF1ZXN0MjgwMjA1NTAy | 480 | closed | 0 | Split pypi and docker travis tasks | glasnt 813732 | Resolves #478 This *should* work, but because this is a change that'll only really be testable on a) this repo, b) master branch, this might fail fast if I didn't get the configurations right. Looking at #478 it should just be as simple as splitting out the docker and pypi processes into separate jobs, but it might end up being more complicated than that, depending on what pre-processes the pypi deployment needs, and how travisci treats deployment steps without scripts in general. | 2019-05-19T23:14:37Z | 2019-07-07T20:03:20Z | 2019-07-07T20:03:20Z | 2019-07-07T20:03:20Z | d95048031edb02bbc9892879507f55a4f29c5459 | Datasette 0.29 4471010 | 0 | 8b667898b6c2dd57fa68310c6d3c62d77b68f321 | 4246e138f9512686413e97878659ef953337e57b | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/480 | |||
284390197 | MDExOlB1bGxSZXF1ZXN0Mjg0MzkwMTk3 | 497 | closed | 0 | Upgrade pytest to 4.6.1 | simonw 9599 | 2019-06-03T01:45:34Z | 2019-06-03T02:06:32Z | 2019-06-03T02:06:27Z | 2019-06-03T02:06:27Z | 5e8fbf7f6fbc0b63d0479da3806dd9ccd6aaa945 | 0 | bf2ab0306e6d3ce7524fecf015e2cec7ae45e994 | 803f750309bf0cd5b7501228c1efcf9a35686d74 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/497 | |||||
284743794 | MDExOlB1bGxSZXF1ZXN0Mjg0NzQzNzk0 | 500 | closed | 0 | Fix typo in install step: should be install -e | tmcw 32314 | 2019-06-03T21:50:51Z | 2019-06-11T18:48:43Z | 2019-06-11T18:48:40Z | 2019-06-11T18:48:40Z | aa911122feab13f8e65875c98edb00fd3832b7b8 | 0 | ff98f44d7f10ff65fc172df9155c77f169ab4c7f | 5e8fbf7f6fbc0b63d0479da3806dd9ccd6aaa945 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/500 | |||||
285698310 | MDExOlB1bGxSZXF1ZXN0Mjg1Njk4MzEw | 501 | closed | 0 | Test against Python 3.8-dev using Travis | simonw 9599 | 2019-06-06T08:37:53Z | 2019-11-11T03:23:29Z | 2019-11-11T03:23:29Z | 1aac0cf0ab962060dd5cff19b8b179bb7fa0f00b | 0 | a5defb684fcc734f6325ca08beef9f49c3e7a298 | 5e8fbf7f6fbc0b63d0479da3806dd9ccd6aaa945 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/501 | ||||||
290897104 | MDExOlB1bGxSZXF1ZXN0MjkwODk3MTA0 | 518 | closed | 0 | Port Datasette from Sanic to ASGI + Uvicorn | simonw 9599 | Most of the code here was fleshed out in comments on #272 (Port Datasette to ASGI) - this pull request will track the final pieces: - [x] Update test harness to more correctly simulate the `raw_path` issue - [x] Use `raw_path` so table names containing `/` can work correctly - [x] Bug: JSON not served with correct content-type - [x] Get ?_trace=1 working again - [x] Replacement for `@app.listener("before_server_start")` - [x] Bug: `/fixtures/table%2Fwith%2Fslashes.csv?_format=json` downloads as CSV - [x] Replace Sanic request and response objects with my own classes, so I can remove Sanic dependency - [x] Final code tidy-up before merging to master | 2019-06-23T15:18:42Z | 2019-06-24T13:42:50Z | 2019-06-24T03:13:09Z | 2019-06-24T03:13:09Z | ba8db9679f3bd2454c9e76e7e6c352126848b57a | simonw 9599 | Datasette 1.0 3268330 | 0 | b794554a26ddc81bd772c4422d80d5ee863e92b0 | 35429f90894321eda7f2db31b9ea7976f31f73ac | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/518 | ||
290971295 | MDExOlB1bGxSZXF1ZXN0MjkwOTcxMjk1 | 524 | closed | 0 | Sort commits using isort, refs #516 | simonw 9599 | Also added a lint unit test to ensure they stay sorted. #516 | 2019-06-24T05:04:48Z | 2023-08-23T01:31:08Z | 2023-08-23T01:31:08Z | 4e92ebe00a058e02b2d7543cff60ac2f78aa97c7 | 0 | dafae70ee7f74ce79b541a94385172be3ad0de83 | cdd24f3eaa207f67d948c1876725b0f84654a623 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/524 | |||||
291534596 | MDExOlB1bGxSZXF1ZXN0MjkxNTM0NTk2 | 529 | closed | 0 | Use keyed rows - fixes #521 | nathancahill 1383872 | Supports template syntax like this: ``` {% for row in display_rows %} <h2 class="scientist">{{ row["First_Name"] }} {{ row["Last_Name"] }}</h2> ... ``` | 2019-06-25T12:33:48Z | 2019-06-25T12:35:07Z | 2019-06-25T12:35:07Z | 3be9759418fdfe4a8ae8aec46fc2a937d45332d2 | 0 | 312e3394bd9f3eaef606fbe37eb409ec7462baaf | 9e97b725f11be3f4dca077fe5569078a62ec2761 | NONE | datasette 107914493 | https://github.com/simonw/datasette/pull/529 | |||||
293962405 | MDExOlB1bGxSZXF1ZXN0MjkzOTYyNDA1 | 533 | closed | 0 | Support cleaner custom templates for rows and tables, closes #521 | simonw 9599 | - [x] Rename `_rows_and_columns.html` to `_table.html` - [x] Unit test - [x] Documentation | 2019-07-03T00:40:18Z | 2019-07-03T03:23:06Z | 2019-07-03T03:23:06Z | 2019-07-03T03:23:06Z | b9ede4c1898616512b5d204f9c941deff473cbe4 | 0 | 1add905532b7bc4f681318b8f22b9b74cca2b2a0 | 76882830548e16905348ee75acb0044cb8e1fd20 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/533 | ||||
293992382 | MDExOlB1bGxSZXF1ZXN0MjkzOTkyMzgy | 535 | closed | 0 | Added asgi_wrapper plugin hook, closes #520 | simonw 9599 | 2019-07-03T03:58:00Z | 2019-07-03T04:06:26Z | 2019-07-03T04:06:26Z | 2019-07-03T04:06:26Z | 4d2fdafe39159c9a8aa83f7e9bfe768bbbbb56a3 | 0 | 93bfa26bfd25a3cc911d637596e364d3474325bd | b9ede4c1898616512b5d204f9c941deff473cbe4 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/535 | |||||
293994443 | MDExOlB1bGxSZXF1ZXN0MjkzOTk0NDQz | 536 | closed | 0 | Switch to ~= dependencies, closes #532 | simonw 9599 | 2019-07-03T04:12:16Z | 2019-07-03T04:32:55Z | 2019-07-03T04:32:55Z | 2019-07-03T04:32:55Z | f0d32da0a9af87bcb15e34e35424f0c0053be83a | 0 | 391d109dc3f9230dc4ee4afd20041e480e90e739 | 4d2fdafe39159c9a8aa83f7e9bfe768bbbbb56a3 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/536 | |||||
294400446 | MDExOlB1bGxSZXF1ZXN0Mjk0NDAwNDQ2 | 539 | closed | 0 | Secret plugin configuration options | simonw 9599 | Refs #538 | 2019-07-04T03:21:20Z | 2019-07-04T05:36:45Z | 2019-07-04T05:36:45Z | 2019-07-04T05:36:45Z | a2d45931935f6bb73605a94afedf9e78308c95d6 | 0 | fd6164b03ebe450a9a00df2e5be2dc7bbfbd9a3f | f0d32da0a9af87bcb15e34e35424f0c0053be83a | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/539 | ||||
294992578 | MDExOlB1bGxSZXF1ZXN0Mjk0OTkyNTc4 | 542 | closed | 0 | extra_template_vars plugin hook | simonw 9599 | Refs #541 | 2019-07-05T22:19:17Z | 2019-07-06T00:05:57Z | 2019-07-06T00:05:56Z | 2019-07-06T00:05:56Z | fcfcae21e67cc15090942b1d2a47b5f016279337 | 0 | e81c7abb40c8ecf8d9d23cbcdde045e0a3b4ab14 | a18e0964ecd04593f227616538a80dee08768057 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/542 | ||||
295065796 | MDExOlB1bGxSZXF1ZXN0Mjk1MDY1Nzk2 | 544 | closed | 0 | --plugin-secret option | simonw 9599 | Refs #543 - [x] Zeit Now v1 support - [x] Solve escaping of ENV in Dockerfile - [x] Heroku support - [x] Unit tests - [x] Cloud Run support - [x] Documentation | 2019-07-06T22:18:20Z | 2019-07-08T02:06:31Z | 2019-07-08T02:06:31Z | 2019-07-08T02:06:31Z | 973f8f139df6ad425354711052cfc2256de2e522 | Datasette 0.29 4471010 | 0 | ccf80604e931fba1893b5bab11de386fed82009e | fcfcae21e67cc15090942b1d2a47b5f016279337 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/544 | |||
295127213 | MDExOlB1bGxSZXF1ZXN0Mjk1MTI3MjEz | 546 | open | 0 | Facet by delimiter | simonw 9599 | Refs #510 | 2019-07-07T20:06:05Z | 2019-11-18T23:46:01Z | 68a6fb1a576a747b868771d00a10753f35aaa0cf | 0 | 47ac6c6e46da16716d295d7cda8f79cd0663ca5e | a9909c29ccac771c23c2ef22b89d10697b5256b9 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/546 | ||||||
295711504 | MDExOlB1bGxSZXF1ZXN0Mjk1NzExNTA0 | 554 | closed | 0 | Fix static mounts using relative paths and prevent traversal exploits | abdusco 3243482 | While debugging why my static mounts using a relative path (`--static mystatic:rel/path/to/dir`) not working, I noticed that the requests fail no matter what, returning 404 errors. The reason is that datasette tries to prevent traversal exploits by checking if the path is relative to its registered directory. This check fails when the mount is a relative directory, because `/abs/dir/file` obviously not under `dir/file`. https://github.com/simonw/datasette/blob/81fa8b6cdc5457b42a224779e5291952314e8d20/datasette/utils/asgi.py#L303-L306 This also has the consequence of returning any requested file, because when `/abs/dir/../../evil.file` resolves `aiofiles` happily returns it to the client after it resolves the path itself. The solution is to make sure we're checking relativity of paths after they're fully resolved. I've implemented the mentioned changes and also updated the tests. | 2019-07-09T11:32:02Z | 2019-07-11T16:29:26Z | 2019-07-11T16:13:19Z | 2019-07-11T16:13:19Z | 74ecf8a7cc45cabf369e510c7214f5ed85c8c6d8 | 0 | fa7ddea3ea6c9378bee7d5f5c93fe05d735a0afb | 81fa8b6cdc5457b42a224779e5291952314e8d20 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/554 | ||||
295748268 | MDExOlB1bGxSZXF1ZXN0Mjk1NzQ4MjY4 | 556 | closed | 0 | Add support for running datasette as a module | abdusco 3243482 | This PR allows running datasette using `python -m datasette` command in addition to just running the executable. This function is quite useful when debugging a plugin in a project because IDEs like PyCharm can easily start a debug session when datasette is run as a module in contrast to trying to attach a debugger to a running process. ![image](https://user-images.githubusercontent.com/3243482/60890448-fc4ede80-a263-11e9-8b42-d2a3db8d1a59.png) | 2019-07-09T13:13:30Z | 2019-07-11T16:07:45Z | 2019-07-11T16:07:44Z | 2019-07-11T16:07:44Z | 9ca860e54fe480d0a365c0c1d8d085926d12be1e | 0 | 056a7eac9480cb814d9c453b983e6b2b831e0ca1 | 81fa8b6cdc5457b42a224779e5291952314e8d20 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/556 | ||||
296735320 | MDExOlB1bGxSZXF1ZXN0Mjk2NzM1MzIw | 557 | closed | 0 | Get tests running on Windows using Travis CI | simonw 9599 | Refs #511 | 2019-07-11T16:36:57Z | 2021-07-10T23:39:48Z | 2021-07-10T23:39:48Z | cddb9a9fecfa25147d80df05f1a6d6e1686ca30d | 0 | 47b5ab43be87217c4e40ad93b8aa2e9639fa371f | f2006cca80040871439055ae6ccbc14e589bdf4b | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/557 | |||||
297243073 | MDExOlB1bGxSZXF1ZXN0Mjk3MjQzMDcz | 559 | closed | 0 | Bump to uvicorn 0.8.4 | simonw 9599 | https://github.com/encode/uvicorn/commits/0.8.4 Query strings will now be included in log files: https://github.com/encode/uvicorn/pull/384 | 2019-07-12T22:30:29Z | 2019-07-13T22:34:58Z | 2019-07-13T22:34:58Z | 2019-07-13T22:34:58Z | d224ee2c98ac39c2c6e21a0ac0c62e5c3e1ccd11 | 0 | 029e3d53634cc38690d5b56427a3c87851a61b09 | f2006cca80040871439055ae6ccbc14e589bdf4b | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/559 | ||||
298962551 | MDExOlB1bGxSZXF1ZXN0Mjk4OTYyNTUx | 561 | closed | 0 | Fix typos | minho42 15278512 | 2019-07-18T15:13:35Z | 2019-07-26T10:25:45Z | 2019-07-26T10:25:45Z | 2019-07-26T10:25:45Z | 27cb29365c9f5f6f1492968d1268497193ed75a2 | 0 | 41341195075adc5093d33633d980657ecdac043c | a9453c4dda70bbf5122835e68f63db6ecbe1a6fc | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/561 | |||||
301483613 | MDExOlB1bGxSZXF1ZXN0MzAxNDgzNjEz | 564 | open | 0 | First proof-of-concept of Datasette Library | simonw 9599 | Refs #417. Run it like this: datasette -d ~/Library Uses a new plugin hook - available_databases() | 2019-07-26T10:22:26Z | 2023-02-07T15:14:11Z | 4f425d2b39d1be10d7ef5c146480a3eb494d5086 | 1 | 947645d84710677ea50762016081a9fbc6b014a8 | a9453c4dda70bbf5122835e68f63db6ecbe1a6fc | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/564 | ||||||
313384926 | MDExOlB1bGxSZXF1ZXN0MzEzMzg0OTI2 | 571 | closed | 0 | detect_fts now works with alternative table escaping | simonw 9599 | Fixes #570 | 2019-09-03T00:23:39Z | 2019-09-03T00:32:28Z | 2019-09-03T00:32:28Z | 2019-09-03T00:32:28Z | 2dc5c8dc259a0606162673d394ba8cc1c6f54428 | 0 | a85239f69261c10f1a9f90514c8b5d113cb94585 | f04deebec4f3842f7bd610cd5859de529f77d50e | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/571 |
CREATE TABLE [pull_requests] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [state] TEXT,
   [locked] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [body] TEXT,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [merged_at] TEXT,
   [merge_commit_sha] TEXT,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [draft] INTEGER,
   [head] TEXT,
   [base] TEXT,
   [author_association] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [url] TEXT,
   [merged_by] INTEGER REFERENCES [users]([id]),
   [auto_merge] TEXT
);
CREATE INDEX [idx_pull_requests_merged_by] ON [pull_requests] ([merged_by]);
CREATE INDEX [idx_pull_requests_repo] ON [pull_requests] ([repo]);
CREATE INDEX [idx_pull_requests_milestone] ON [pull_requests] ([milestone]);
CREATE INDEX [idx_pull_requests_assignee] ON [pull_requests] ([assignee]);
CREATE INDEX [idx_pull_requests_user] ON [pull_requests] ([user]);
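A minimal sketch of querying a table with this shape using Python's built-in `sqlite3` module. The simplified schema and the single inserted row (PR #81 from the listing above) are illustrative; a real database produced by github-to-sqlite would also contain the referenced `users`, `repos`, and `milestones` tables.

```python
import sqlite3

# In-memory database with a cut-down version of the pull_requests schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE [pull_requests] (
       [id] INTEGER PRIMARY KEY,
       [number] INTEGER,
       [state] TEXT,
       [title] TEXT,
       [merged_at] TEXT,
       [repo] INTEGER
    )
    """
)

# One row drawn from the listing above (PR #81).
conn.execute(
    "INSERT INTO pull_requests (id, number, state, title, merged_at, repo) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    (
        152360740,
        81,
        "closed",
        ":fire: Removes DS_Store",
        "2017-11-13T22:16:55Z",
        107914493,
    ),
)

# Merged pull requests are those with a non-null merged_at timestamp.
merged = conn.execute(
    "SELECT count(*) FROM pull_requests WHERE merged_at IS NOT NULL"
).fetchone()[0]
print(merged)  # 1
```

The same `merged_at IS NOT NULL` filter is how you would distinguish merged from merely closed pull requests when exporting this table as JSON or CSV.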