pull_requests_fts
608 rows
Link | rowid ▼ | title | body | pull_requests_fts | rank |
---|---|---|---|---|---|
152360740 | 152360740 | :fire: Removes DS_Store | | 43406 | |
152522762 | 152522762 | SQL syntax highlighting with CodeMirror | Addresses #13 Future enhancements could include autocompletion of table and column names, e.g. with ```javascript extraKeys: {"Ctrl-Space": "autocomplete"}, hintOptions: {tables: { users: ["name", "score", "birthDate"], countries: ["name", "population", "size"] }} ``` (see https://codemirror.net/doc/manual.html#addon_sql-hint and source at http://codemirror.net/mode/sql/) | 43406 | |
152631570 | 152631570 | Initial add simple prod ready Dockerfile refs #57 | Multi-stage build based off official python:3.6-slim Example usage: ``` docker run --rm -t -i -p 9000:8001 -v $(pwd)/db:/db datasette datasette serve /db/chinook.db ``` | 43406 | |
152870030 | 152870030 | [WIP] Add publish to heroku support | Refs #90 | 43406 | |
152914480 | 152914480 | add support for ?field__isnull=1 | Is this what you had in mind for [this issue](https://github.com/simonw/datasette/issues/64)? | 43406 | |
153201945 | 153201945 | Add spatialite, switch to debian and local build | Improves the Dockerfile to support spatial datasets, work with the local datasette code (Friendly with git tags and Dockerhub) and moves to slim debian, a small image easy to extend via apt packages for sqlite. | 43406 | |
153306882 | 153306882 | Add keyboard shortcut to execute SQL query | Very cool tool, thanks a lot! This PR adds a `Shift-Enter` shortcut to execute the SQL query. I used CodeMirror's keyboard handling. | 43406 | |
153324301 | 153324301 | Don't prevent tabbing to `Run SQL` button | Mentioned in #115. Here you go! | 43406 | |
153432045 | 153432045 | Foreign key information on row and table pages | | 43406 | |
154246816 | 154246816 | Fix pytest version conflict | https://travis-ci.org/simonw/datasette/jobs/305929426 pkg_resources.VersionConflict: (pytest 3.2.1 (/home/travis/virtualenv/python3.5.3/lib/python3.5/site-packages), Requirement.parse('pytest==3.2.3')) | 43406 | |
157365811 | 157365811 | Upgrade to Sanic 0.7.0 | | 43406 | |
161982711 | 161982711 | If metadata exists, add it to heroku launch command | The heroku build doesn't seem to make use of any provided `metadata.json` file. Add the `--metadata` switch to the Heroku web launch command if a `metadata.json` file is available. Addresses: https://github.com/simonw/datasette/issues/177 | 43406 | |
163523976 | 163523976 | make html title more readable in query template | Tiny tweak to make this easier to visually parse. I think it matches your style in other templates. | 43406 | |
163561830 | 163561830 | add "format sql" button to query page, uses sql-formatter | Cool project! This fixes #136 using the suggested [sql formatter](https://github.com/zeroturnaround/sql-formatter) library. I included the minified version in the bundle and added the relevant scripts to the codemirror includes instead of adding new files, though I could also add new files. I wanted to keep it all together, since the result of the format needs access to the editor in order to properly update the codemirror instance. | 43406 | |
165029807 | 165029807 | Add db filesize next to download link | Took a stab at #172, will this do the trick? | 43406 | |
179108961 | 179108961 | New ?_shape=objects/object/lists param for JSON API | Refs #122 | 43406 | |
180188397 | 180188397 | _sort= and _sort_desc= parameters to table view | See #189 | 43406 | |
181033024 | 181033024 | Hide Spatialite system tables | They were getting on my nerves. | 43406 | |
181247568 | 181247568 | Raise 404 on nonexistent table URLs | Currently they just 500. Also cleaned the logic up a bit, I hope I didn't miss anything. This is issue #184. | 43406 | |
181600926 | 181600926 | Initial units support | Add support for specifying units for a column in metadata.json and rendering them on display using [pint](https://pint.readthedocs.io/en/latest/). Example table metadata: ```json "license_frequency": { "units": { "frequency": "Hz", "channel_width": "Hz", "height": "m", "antenna_height": "m", "azimuth": "degrees" } } ``` [Example result](https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency/1) This works surprisingly well! I'd like to add support for using units when querying, but this PR is pretty usable as-is. (Pint doesn't seem to support decibels though - it thinks they're decibytes - which is an annoying omission.) (ref ticket #203) | 43406 | |
181642114 | 181642114 | Support filtering with units and more | The first commit: * Adds units to exported JSON * Adds units key to metadata skeleton * Adds some docs for units The second commit adds filtering by units by the first method I mentioned in #203: ![image](https://user-images.githubusercontent.com/45057/38767463-7193be16-3fd9-11e8-8a5f-ac4159415c6d.png) [Try it here](https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency?frequency__gt=50GHz&height__lt=50ft). I think it integrates pretty neatly. The third commit adds support for registering custom units with Pint from metadata.json. Probably pretty niche, but I need decibels! | 43406 | |
181644805 | 181644805 | Fix sqlite error when loading rows with no incoming FKs | This fixes `ERROR: conn=<sqlite3.Connection object at 0x10bbb9f10>, sql = 'select ', params = {'id': '1'}` caused by an invalid query loading incoming FKs when none exist. The error was ignored due to async but it still got printed to the console. | 43406 | |
181647717 | 181647717 | Link foreign keys which don't have labels | This renders unlabeled FKs as simple links. I can't see why this would cause any major problems. ![image](https://user-images.githubusercontent.com/45057/38768722-ea15a000-3fef-11e8-8664-ffd7aa4894ea.png) Also includes bonus fixes for two minor issues: * In foreign key link hrefs the primary key was escaped using HTML escaping rather than URL escaping. This broke some non-integer PKs. * Print tracebacks to console when handling 500 errors. | 43406 | |
181654839 | 181654839 | Return HTTP 405 on InvalidUsage rather than 500 | This also stops it filling up the logs. This happens for HEAD requests at the moment - which perhaps should be handled better, but that's a different issue. | 43406 | |
181723303 | 181723303 | Don't duplicate simple primary keys in the link column | When there's a simple (single-column) primary key, it looks weird to duplicate it in the link column. This change removes the second PK column and treats the link column as if it were the PK column from a header/sorting perspective. This might make it a bit more difficult to tell what the link for the row is, I'm not sure yet. I feel like the alternative is to change the link column to just have the text "view" or something, instead of repeating the PK. (I doubt it makes much more sense with compound PKs.) Bonus change in this PR: fix urlencoding of links in the displayed HTML. Before: ![image](https://user-images.githubusercontent.com/45057/38783830-e2ababb4-40ff-11e8-97fb-25e286a8c920.png) After: ![image](https://user-images.githubusercontent.com/45057/38783835-ebf6b48e-40ff-11e8-8c47-6a864cf21ccc.png) | 43406 | |
181731956 | 181731956 | Start of the plugin system, based on pluggy | Refs #14 | 43406 | |
181755220 | 181755220 | New --plugins-dir=plugins/ option | Refs #211 | 43406 | |
182357613 | 182357613 | Fix for plugins in Python 3.5 | | 43406 | |
183135604 | 183135604 | Fix a typo | It looks like this was the only instance of it: https://github.com/simonw/datasette/search?utf8=%E2%9C%93&q=SOLite&type= | 43406 | |
185307407 | 185307407 | ?_shape=array and _timelimit= | | 43406 | |
187668890 | 187668890 | Refactor views | * Split out view classes from main `app.py` * Run [black](https://github.com/ambv/black) against resulting code to apply opinionated source code formatting * Run [isort](https://github.com/timothycrosley/isort) to re-order my imports Refs #256 | 43406 | |
187770345 | 187770345 | Add new metadata key persistent_urls which removes the hash from all database urls | Add new metadata key "persistent_urls" which removes the hash from all database urls when set to "true". This PR is just to gauge if this, or something like it, is something you would consider merging? I understand the reason why the substring of the hash is included in the url, but there are some use cases where the urls should persist across deployments: bookmarks, for example, or scripts that use the JSON API. This is the initial commit for this feature. Tests and documentation updates to follow. | 43406 | |
188312411 | 188312411 | Facets improvements plus suggested facets | Refs #255 | 43406 | |
189318453 | 189318453 | Refactor inspect logic | This pulls the logic for inspect out into a new file which makes it a bit easier to understand. This was going to be the first part of an implementation for #276, but it seems like that might take a while so I'm going to PR a few bits of refactoring individually. | 43406 | |
189707374 | 189707374 | Add version number support with Versioneer | I think that's all for getting Versioneer support, I've been happily using it in a couple of projects ... ``` In [2]: datasette.__version__ Out[2]: '0.22+3.g6e12445' ``` Repo: https://github.com/warner/python-versioneer Versioneer Licence: Public Domain (CC0-1.0) Closes #273 | 43406 | |
189723716 | 189723716 | Build Dockerfile with recent Sqlite + Spatialite | This solves #278 without bloating the Dockerfile too much. The image size is now 495MB (original was ~240MB), but it could be reduced significantly if we copied only the output of the compilation of spatialite and friends to /usr/local/lib instead of the entirety of it; however, that will take more time. In the Python code, change references from `import sqlite3` to `import pysqlite3` and it will use the compiled SQLite 3.23.1. You don't need a try/except because pysqlite3 falls back to the builtin sqlite3 if there is no compiled version. ```bash $ docker run --rm -it datasette spatialite SpatiaLite version ..: 4.4.0-RC0 Supported Extensions: - 'VirtualShape' [direct Shapefile access] - 'VirtualDbf' [direct DBF access] - 'VirtualXL' [direct XLS access] - 'VirtualText' [direct CSV/TXT access] - 'VirtualNetwork' [Dijkstra shortest path] - 'RTree' [Spatial Index - R*Tree] - 'MbrCache' [Spatial Index - MBR cache] - 'VirtualSpatialIndex' [R*Tree metahandler] - 'VirtualElementary' [ElemGeoms metahandler] - 'VirtualKNN' [K-Nearest Neighbors metahandler] - 'VirtualXPath' [XML Path Language - XPath] - 'VirtualFDO' [FDO-OGR interoperability] - 'VirtualGPKG' [OGC GeoPackage interoperability] - 'VirtualBBox' [BoundingBox tables] - 'SpatiaLite' [Spatial SQL - OGC] PROJ.4 version ......: Rel. 4.9.3, 15 August 2016 GEOS version ........: 3.5.1-CAPI-1.9.1 r4246 TARGET CPU ..........: x86_64-linux-gnu the SPATIAL_REF_SYS table already contains some row(s) SQLite version ......: 3.23.1 Enter ".help" for instructions SQLite version 3.23.1 2018-04-10 17:39:29 Enter ".help" for instructions Enter SQL statements terminated with a ";" spatialite> ``` ```bash $ docker run --rm -it datasette python -c "import pysqlite3; print(pysqlite3.sqlite_version)" 3.23.1 ``` | 43406 | |
189860052 | 189860052 | Reduces image size using Alpine + Multistage (re: #278) | Hey Simon! I got the image size down from 256MB to 110MB. Seems to be working okay, but you might want to test it a bit more. Example output of `docker run --rm -it <my-tag> datasette` ``` Serve! files=() on port 8001 [2018-05-23 05:23:08 +0000] [1] [INFO] Goin' Fast @ http://127.0.0.1:8001 [2018-05-23 05:23:08 +0000] [1] [INFO] Starting worker [1] ``` Related: https://github.com/simonw/datasette/issues/278 | 43406 | |
190901429 | 190901429 | Support for external database connectors | I think it would be nice if Datasette could work with other file formats that aren't SQLite, like files in PyTables format. I've tried to accomplish that using external connectors published with entry points. These external connectors must have a structure similar to the one the [PyTables Datasette connector](https://github.com/PyTables/datasette-pytables) has. | 43406 | |
193361341 | 193361341 | Initial sketch of custom URL routing, refs #306 | See #306 for background on this. | 43406 | |
195339111 | 195339111 | ?_labels=1 to expand foreign keys (in csv and json), refs #233 | Output looks something like this: { "rowid": 233, "TreeID": 121240, "qLegalStatus": { "value": 2, "label": "Private" }, "qSpecies": { "value": 16, "label": "Sycamore" }, "qAddress": "91 Commonwealth Ave", ... } | 43406 | |
195413241 | 195413241 | Streaming mode for downloading all rows as a CSV | Refs #266 | 43406 | |
196526861 | 196526861 | Feature/in operator | | 43406 | |
196628304 | 196628304 | Speed up Travis by reusing pip wheel cache across builds | From https://atchai.com/blog/faster-ci/ - refs #323 | 43406 | |
201075532 | 201075532 | Bump aiohttp to fix compatibility with Python 3.7 | Tests failed here: https://travis-ci.org/simonw/datasette/jobs/403223333 | 43406 | |
201451332 | 201451332 | Allow app names for `datasette publish heroku` | Lets you supply the `-n` parameter for Heroku deploys, which also lets you update existing Heroku deployments. | 43406 | |
204029142 | 204029142 | publish_subcommand hook + default plugins mechanism, used for publish heroku/now | This change introduces a new plugin hook, publish_subcommand, which can be used to implement new subcommands for the "datasette publish" command family. I've used this new hook to refactor out the "publish now" and "publish heroku" implementations into separate modules. I've also added unit tests for these two publishers, mocking the subprocess.call and subprocess.check_output functions. As part of this, I introduced a mechanism for loading default plugins. These are defined in the new "default_plugins" list inside datasette/app.py Closes #217 (Plugin support for "datasette publish") Closes #348 (Unit tests for "datasette publish") Refs #14, #59, #102, #103, #146, #236, #347 | 43406 | |
204851511 | 204851511 | render_cell(value) plugin hook | Closes #352. | 43406 | |
205770996 | 205770996 | Make .indexes compatible with older SQLite versions | Older SQLite versions return a different set of columns from the PRAGMA we are using. | 43406 | |
206863803 | 206863803 | Bump versions of pytest, pluggy and beautifulsoup4 | | 43406 | |
208719043 | 208719043 | Import pysqlite3 if available, closes #360 | | 43406 | |
211860706 | 211860706 | Search all apps during heroku publish | Adds the `-A` option to include apps from all organizations when searching app names for publish. | 43406 | |
214653641 | 214653641 | Support for other types of databases using external connectors | This PR is related to #293, but now all commits have been merged. The purpose is to support other file formats that aren't SQLite, like files with PyTables format. I've tried to accomplish that using external connectors published with entry points. The modifications in the original datasette code are minimal and many are in a separated file. | 43406 | |
216651317 | 216651317 | fix small doc typo | | 43406 | |
226314862 | 226314862 | Mark codemirror files as vendored | GitHub lists datasette as a Javascript project, primarily because of the vendored codemirror files. This is somewhat confusing when you're looking for datasette, knowing it's written in Python. Luckily it's possible to exclude certain files from GitHub's code statistics: https://github.com/github/linguist#using-gitattributes | 43406 | |
226315513 | 226315513 | Update installation instructions | I was writing this as a response to your tweet, but decided I might just make it a pull request. I feel like it might be confusing to those unfamiliar with Python's `-m` flag and the built-in `venv` module to omit the space between the flag and its argument. By adding a space and prefixing the second occurrence of `venv` with a `./` it's maybe a bit clearer what the arguments are and what they do. By also using `python3 -m pip` it becomes even clearer that `-m` is a special flag that makes the python executable do neat things. | 43406 | |
232172106 | 232172106 | Bump dependency versions | | 43406 | |
235194286 | 235194286 | tiny typo in customization docs | was looking to add some custom templates to my use of datasette and saw this small typo. | 43406 | |
241418443 | 241418443 | Fix some regex DeprecationWarnings | | 43406 | |
247576942 | 247576942 | Fts5 | | 43406 | |
247861419 | 247861419 | Run Travis tests against Python 3.8-dev | | 43406 | |
247923347 | 247923347 | Experiment: run Jinja in async mode | See http://jinja.pocoo.org/docs/2.10/api/#async-support Tests all pass. Have not checked performance difference yet. Creating pull request to run tests in Travis. This is not ready to merge - I'm not yet sure if this is a good idea. | 43406 | |
249680944 | 249680944 | :pencil: Updates my_database.py to my_database.db | I noticed that both `.py` and `.db` were used in the docs and assumed you'd prefer `.db`. | 43406 | |
250628275 | 250628275 | Heroku --include-vcs-ignore | Should mean `datasette publish heroku` can work under Travis, unlike this failure: https://travis-ci.org/simonw/fivethirtyeight-datasette/builds/488047550 ``` 2.25s$ datasette publish heroku fivethirtyeight.db -m metadata.json -n fivethirtyeight-datasette tar: unrecognized option '--exclude-vcs-ignores' Try 'tar --help' or 'tar --usage' for more information. ▸ Command failed: tar cz -C /tmp/tmpuaxm7i8f --exclude-vcs-ignores --exclude ▸ .git --exclude .gitmodules . > ▸ /tmp/f49440e0-1bf3-4d3f-9eb0-fbc2967d1fd4.tar.gz ▸ tar: unrecognized option '--exclude-vcs-ignores' ▸ Try 'tar --help' or 'tar --usage' for more information. ▸ The command "datasette publish heroku fivethirtyeight.db -m metadata.json -n fivethirtyeight-datasette" exited with 0. ``` The fix for that issue is to call the heroku command like this: heroku builds:create -a app_name --include-vcs-ignore | 43406 | |
255658112 | 255658112 | Support for numpy types, closes #11 | | 43406 | |
255725057 | 255725057 | Update spatialite.rst | A line of SQL added to create the idx_<table_name> index in the Python recipe. | 43406 | |
261418285 | 261418285 | URL hashing now optional: turn on with --config hash_urls:1 (#418) | | 43406 | |
266035382 | 266035382 | Column types in inspected metadata | This PR does two things: * Adds the sqlite column type for each column to the inspected table info. * Stops binary columns from being rendered to HTML, unless a plugin handles it. There's a bit more detail in the changeset descriptions. These changes are intended as a precursor to a plugin which adds first-class support for Spatialite geographic primitives, and perhaps more useful geo-stuff. | 43406 | |
269364924 | 269364924 | Upgrade to Jinja2==2.10.1 | https://nvd.nist.gov/vuln/detail/CVE-2019-10906 This is only a security issue of concern if evaluating templates from untrusted sources, which isn't something I would ever expect a Datasette user to do. | 43406 | |
270191084 | 270191084 | ?_where= parameter on table views, closes #429 | | 43406 | |
270251021 | 270251021 | Refactor facets to a class and new plugin, refs #427 | WIP for #427 | 43406 | |
271338405 | 271338405 | "datasette publish cloudrun" command to publish to Google Cloud Run | This is a very rough draft to start a discussion on a possible datasette cloud run publish plugin (see issue #400). The main change was to dynamically set the listening port in `make_dockerfile` to satisfy cloud run's [requirements](https://cloud.google.com/run/docs/reference/container-contract). This was done by running `datasette` through `sh` to get environment variable substitution. Not sure if that's the right approach? | 43406 | |
274174614 | 274174614 | Add inspect and prepare_sanic hooks | This adds two new plugin hooks: The `inspect` hook allows plugins to add data to the inspect dictionary. The `prepare_sanic` hook allows plugins to hook into the web router. I've attached a warning to this hook in the docs in light of #272 but I want this hook now... On quick inspection, I don't think it's worthwhile to try and make this hook independent of the web framework (but it looks like Starlette would make the hook implementation a bit nicer). Ref #14 | 43406 | |
274313625 | 274313625 | [WIP] Add primary key to the extra_body_script hook arguments | This allows the row to be identified on row pages. The context here is that I want to access the row's data to plot it on a map. I considered passing the entire template context through to the hook function. This would expose the actual row data and potentially avoid a further fetch request in JS, but it does make the plugin API a lot more leaky. (At any rate, using the selected row data is tricky in my case because of Spatialite's infuriating custom binary representation...) | 43406 | |
274468836 | 274468836 | Add register_output_renderer hook | This changeset refactors out the JSON renderer and then adds a hook and dispatcher system to allow custom output renderers to be registered. The CSV output renderer is untouched because supporting streaming renderers through this system would be significantly more complex, and probably not worthwhile. We can't simply allow hooks to be called at request time because we need a list of supported file extensions when the request is being routed in order to resolve ambiguous database/table names. So, renderers need to be registered at startup. I've tried to make this API independent of Sanic's request/response objects so that this can remain stable during the switch to ASGI. I'm using dictionaries to keep it simple and to make adding additional options in the future easy. Fixes #440 | 43406 | |
274478761 | 274478761 | Suppress rendering of binary data | Binary columns (including spatialite geographies) get shown as ugly binary strings in the HTML by default. Nobody wants to see that mess. Show the size of the column in bytes instead. If you want to decode the binary data, you can use a plugin to do it. | 43406 | |
275275610 | 275275610 | Pass view_name to extra_body_script hook | At the moment it's not easy to tell whether the hook is being called in (for example) the row or table view, as in both cases the `database` and `table` parameters are provided. This passes the `view_name` added in #441 to the `extra_body_script` hook. | 43406 | |
275281307 | 275281307 | Add a max-line-length setting for flake8 | This stops my automatic editor linting from flagging lines which are too long. It's been lingering in my checkout for ages. 160 is an arbitrary large number - we could alter it if we have any opinions (but I find the line length limit to be my least favourite part of PEP8). | 43406 | |
275558612 | 275558612 | Extract facet code out into a new plugin hook, closes #427 | Datasette previously only supported one type of faceting: exact column value counting. With this change, faceting logic is extracted out into one or more separate classes which can implement other patterns of faceting - this is discussed in #427, but potential upcoming facet types include facet-by-date, facet-by-JSON-array, facet-by-many-2-many and more. A new plugin hook, register_facet_classes, can be used by plugins to add in additional facet classes. Each class must implement two methods: suggest(), which scans columns in the table to decide if they might be worth suggesting for faceting, and facet_results(), which executes the facet operation and returns results ready to be displayed in the UI. | 43406 | |
275801463 | 275801463 | Use dist: xenial and python: 3.7 on Travis | | 43406 | |
275861559 | 275861559 | Apply black to everything | I've been hesitating on this for literally months, because I'm not at all excited about the giant diff that will result. But I've been using black on many of my other projects (most actively [sqlite-utils](https://github.com/simonw/sqlite-utils)) and the productivity boost is undeniable: I don't have to spend a single second thinking about code formatting any more! So it's worth swallowing the one-off pain and moving on in a new, black-enabled world. | 43406 | |
275909197 | 275909197 | Coalesce hidden table count to 0 | For some reason I'm hitting a `None` here with a FTS table. I'm not entirely sure why but this makes the logic work the same as with non-hidden tables. | 43406 | |
275923066 | 275923066 | SQL builder utility classes | This adds a straightforward set of classes to aid in the construction of SQL queries. My plan for this was to allow plugins to manipulate the Datasette-generated SQL in a more structured way. I'm not sure that's going to work, but I feel like this is still a step forward - it reduces the number of intermediate variables in `TableView.data` which aids readability, and also factors out a lot of the boring string concatenation. There are a fair number of minor structure changes in here too as I've tried to make the ordering of `TableView.data` a bit more logical. As far as I can tell, I haven't broken anything... | 43406 | |
277524072 | 277524072 | setup: add tests to package exclusion | This PR fixes #456 by adding `tests` to the package exclusion list. Cheers | 43406 | |
280204276 | 280204276 | doc typo fix | Fix typo in performance doc page | 43406 | |
280205502 | 280205502 | Split pypi and docker travis tasks | Resolves #478. This *should* work, but because this is a change that'll only really be testable on a) this repo, b) master branch, this might fail fast if I didn't get the configurations right. Looking at #478 it should just be as simple as splitting out the docker and pypi processes into separate jobs, but it might end up being more complicated than that, depending on what pre-process steps the pypi deployment needs, and how Travis CI treats deployment steps without scripts in general. | 43406 | |
284390197 | 284390197 | Upgrade pytest to 4.6.1 | | 43406 | |
284743794 | 284743794 | Fix typo in install step: should be install -e | | 43406 | |
285698310 | 285698310 | Test against Python 3.8-dev using Travis | | 43406 | |
290897104 | 290897104 | Port Datasette from Sanic to ASGI + Uvicorn | Most of the code here was fleshed out in comments on #272 (Port Datasette to ASGI) - this pull request will track the final pieces: - [x] Update test harness to more correctly simulate the `raw_path` issue - [x] Use `raw_path` so table names containing `/` can work correctly - [x] Bug: JSON not served with correct content-type - [x] Get ?_trace=1 working again - [x] Replacement for `@app.listener("before_server_start")` - [x] Bug: `/fixtures/table%2Fwith%2Fslashes.csv?_format=json` downloads as CSV - [x] Replace Sanic request and response objects with my own classes, so I can remove Sanic dependency - [x] Final code tidy-up before merging to master | 43406 | |
290971295 | 290971295 | Sort imports using isort, refs #516 | Also added a lint unit test to ensure they stay sorted. #516 | 43406 | |
291534596 | 291534596 | Use keyed rows - fixes #521 | Supports template syntax like this: ``` {% for row in display_rows %} <h2 class="scientist">{{ row["First_Name"] }} {{ row["Last_Name"] }}</h2> ... ``` | 43406 | |
292879204 | 292879204 | db.add_foreign_keys() method | Refs #31. Still TODO: - [x] Unit tests - [x] Documentation | 43406 | |
293117183 | 293117183 | sqlite-utils index-foreign-keys / db.index_foreign_keys() | Refs #33 - [x] `sqlite-utils index-foreign-keys` command - [x] `db.index_foreign_keys()` method - [x] unit tests - [x] documentation | 43406 | |
293962405 | 293962405 | Support cleaner custom templates for rows and tables, closes #521 | - [x] Rename `_rows_and_columns.html` to `_table.html` - [x] Unit test - [x] Documentation | 43406 | |
293992382 | 293992382 | Added asgi_wrapper plugin hook, closes #520 | | 43406 | |
293994443 | 293994443 | Switch to ~= dependencies, closes #532 | | 43406 | |
294400446 | 294400446 | Secret plugin configuration options | Refs #538 | 43406 | |
294992578 | 294992578 | extra_template_vars plugin hook | Refs #541 | 43406 | |
295065796 | 295065796 | --plugin-secret option | Refs #543 - [x] Zeit Now v1 support - [x] Solve escaping of ENV in Dockerfile - [x] Heroku support - [x] Unit tests - [x] Cloud Run support - [x] Documentation | 43406 | |
295127213 | 295127213 | Facet by delimiter | Refs #510 | 43406 | |
```sql
CREATE VIRTUAL TABLE [pull_requests_fts] USING FTS5 (
    [title],
    [body],
    content=[pull_requests]
);
```
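Because the table is declared with content=[pull_requests], its rowid values map to rows in the underlying pull_requests table, and standard FTS5 queries apply. A minimal sketch of a search against it (the search term and limit are illustrative, not from the data above):

```sql
-- Full-text search over the indexed [title] and [body] columns,
-- ordered by FTS5's built-in bm25-based rank (lower is a better match).
select rowid, title, rank
from [pull_requests_fts]
where [pull_requests_fts] match 'spatialite'
order by rank
limit 10;
```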