id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason
1822813627,I_kwDOBm6k_c5spe27,2108,some (many?) SQL syntax errors are not throwing errors with a .csv endpoint,536941,open,0,,,0,2023-07-26T16:57:45Z,2023-07-26T16:58:07Z,,CONTRIBUTOR,,"here's a CTE query that should always fail with a syntax error: ```sql with foo as (nonsense) select * from foo; ``` when we make this query against the default endpoint, we do indeed get a 400 status code the problem is returned to the user: https://global-power-plants.datasettes.com/global-power-plants?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B but, if we use the csv endpoint, we get a 200 status code and no indication of a problem: https://global-power-plants.datasettes.com/global-power-plants.csv?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B same with this bad sql ```sql select a, from foo; ``` https://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++a%2C%0D%0Afrom%0D%0A++foo%3B vs https://global-power-plants.datasettes.com/global-power-plants.csv?sql=select%0D%0A++a%2C%0D%0Afrom%0D%0A++foo%3B but, datasette catches this bad sql at both endpoints: ```sql slect a from foo; ``` https://global-power-plants.datasettes.com/global-power-plants?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B https://global-power-plants.datasettes.com/global-power-plants.csv?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1783304750,I_kwDOBm6k_c5qSxIu,2094,JS Plugin Hooks for the Code Editor,15178711,open,0,,,0,2023-07-01T00:51:57Z,2023-07-01T00:51:57Z,,CONTRIBUTOR,,"When #2052 merges, I'd like to add support to add extensions/functions to the Datasette code editor. I'd eventually like to build a JS plugin for [`sqlite-docs`](https://github.com/asg017/sqlite-docs), to add things like: - Inline documentation for tables/columns on hover - Inline docs for custom functions that are loaded in - More detailed autocomplete for tables/columns/functions I did some hacking to see what this would look like, see here: There can be a new hook that allows JS plugins to add new ""extension"" in the CodeMirror editorview here: https://github.com/simonw/datasette/blob/8cd60fd1d899952f1153460469b3175465f33f80/datasette/static/cm-editor-6.0.1.js#L25 Will need some more planning. For example, the Codemirror bundle in Datasette has functions that we could re-export for plugins to use (so we don't load 2 version of `""@codemirror/autocomplete""`, for example. 
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2094/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1620515757,I_kwDOBm6k_c5glxut,2039,Subtle bug with `--load-extension` and `--static` flags with absolute Windows paths with`C:\`,15178711,open,0,,,0,2023-03-12T21:18:52Z,2023-03-12T21:18:52Z,,CONTRIBUTOR,,"From the Datasette discord: A user tried running the following command on windows: ``` datasette --load-extension=""C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll"" ``` This failed with `""The specified module could not be found""`, because the entrypoint option introduced in #1789 splits the input differently. Instead of loading the extension found at `""C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll""`, it instead tried to load the extension at `""C""` with entrypoint `""\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll"". This is hard because most absolute windows paths have a colon in them, like `C:\foo.txt` or `D:\bar.txt`. I'd image the `--static` flag is also vulnerable to this type of bug. The ""solution"" is to use a relative path instead, but that doesn't feel that great. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2039/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1605959201,I_kwDOBm6k_c5fuP4h,2032,datasette errors when foreign key integrity is enabled,193185,open,0,,,0,2023-03-02T01:27:51Z,2023-03-02T01:31:58Z,,CONTRIBUTOR,,"By default, [SQLite does not enforce foreign key constraints](https://www.sqlite.org/foreignkeys.html#fk_enable). I typically enable these checks by running: ```sql PRAGMA foreign_keys = ON; ``` inside of a `prepare_connection` hook. If a plugin causes the schema to change (eg datasette-scraper creating a new table, or datasette-edit-schema changing a column), then https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/internal_db.py#L71-L77 will fail with: ``` FOREIGN KEY constraint failed ``` This could be resolved by either: - deleting from the `tables` column last - changing the schema so that the foreign keys have [ON DELETE CASCADE](https://www.sqlite.org/foreignkeys.html#fk_actions) Let me know if you'd be open to a PR that addresses this -- since foreign key constraints aren't enabled by default, I guess it's questionable whether this is a bug. I think I can workaround this by inspecting the database parameter in `prepare_connection` and trying not to enable fkey checks on the `_internal` database.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2032/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1571711808,I_kwDOBm6k_c5drmtA,2018,`check_visibility` gives confusing (wrong?) results if permission is `None`,193185,open,0,,,0,2023-02-06T01:03:08Z,2023-02-06T01:03:46Z,,CONTRIBUTOR,,"I'm trying to gate access to an edit UI on the user having `update-row` on the underlying view or table. 
I expected [datasette.check_visibility](https://docs.datasette.io/en/latest/internals.html#await-check-visibility-actor-action-none-resource-none-permissions-none) to be a good way to do this: ```python visible, private = await datasette.check_visibility( request.actor, permissions=[ (""update-row"", (database, table)), ], ) if not visible: return None ``` But `visible` is returning true, even when there is no explicit `update-row` permission. (In this case, `request.actor` is `None`.) Based on [the update-row permissions docs](https://docs.datasette.io/en/latest/authentication.html#update-row), I expected this to be default deny, and so no explicit permission would result in false. I think the root cause is that `check_visibility` calls `ensure_permissions` and expects it to throw if the permission is not available. But `ensure_permissions` does not throw when `permission_allowed` returns None: https://github.com/simonw/datasette/blob/1.0a2/datasette/app.py#L825-L829",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2018/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1565179870,I_kwDOBm6k_c5dSr_e,2013,Datasette uses non-standard quoting for identifiers,193185,open,0,,,0,2023-02-01T00:05:39Z,2023-02-01T00:06:30Z,,CONTRIBUTOR,,"Related to #2001, but where #2001 was about literals, this is about identifiers From https://www.sqlite.org/lang_keywords.html: > ""keyword"" A keyword in double-quotes is an identifier. > [keyword] A keyword enclosed in square brackets is an identifier. This is not standard SQL. This quoting mechanism is used by MS Access and SQL Server and is included in SQLite for compatibility. Datasette uses this quoting here -- https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/__init__.py#L345-L349, in some of the other DB access code, and in some of the test fixtures. Migrating to standard double quote identifiers would make it easier to get Datasette working with alternative backends",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2013/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1509783085,I_kwDOBm6k_c5Z_XYt,1969,sql-formatter javascript is not now working with CloudFlare rocketloader,536941,open,0,,,0,2022-12-23T21:14:06Z,2023-01-10T01:56:33Z,,CONTRIBUTOR,,"This is probably not a bug with datasette, but I thought you might want to know, @simonw. I noticed today that my CloudFlare proxied datasette instance lost the ""Format SQL"" option. I'm pretty sure it was there last week. In the CloudFlare settings, if I turn off [Rocket Loader](https://developers.cloudflare.com/fundamentals/speed/rocket-loader/), I get the ""Format SQL"" option back. Rocket Loader works by asynchronously loading the javascript, so maybe there was a recent change that doesn't play well with the asynch loading? 
I'm up to date with https://github.com/simonw/datasette/commit/e03aed00026cc2e59c09ca41f69a247e1a85cc89",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1969/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1469821027,I_kwDOBm6k_c5Xm7Bj,1921,Document methods to get canned queries,25778,open,0,,,0,2022-11-30T15:26:33Z,2022-11-30T23:34:21Z,,CONTRIBUTOR,,"Two methods will get canned queries for a Datasette instance: [`Datasette.get_canned_queries`](https://github.com/simonw/datasette/blob/main/datasette/app.py#L575) will return all canned queries for a database that an `actor` can see. [`Datasette.get_canned_query`](https://github.com/simonw/datasette/blob/main/datasette/app.py#L592) will return a single canned query by name. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1921/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1469796454,I_kwDOBm6k_c5Xm1Bm,1920,Document Datasette.metadata() method,25778,open,0,,,0,2022-11-30T15:10:36Z,2022-11-30T15:10:36Z,,CONTRIBUTOR,,"Code is here: https://github.com/simonw/datasette/blob/main/datasette/app.py#L503 This will be the official way to access metadata from plugins.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1920/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1006781949,I_kwDOBm6k_c48AkX9,1478,Documentation Request: Feature alternative ID instead of default ID,192568,open,0,,,0,2021-09-24T19:56:13Z,2021-09-25T16:18:54Z,,CONTRIBUTOR,,"My data already has an ID that comes from a federal agency. Would love to have documentation on how to modify the template to: - Remove the generated ID from the table - Link the federal ID to the detail page - and to ensure that the JSON file uses that as the ID. I'd be happy to include the database ID in the export, but not as a key. I don't want to remove the ID from the database, though, because my experience with the federal agency is that data often has anomalies. I don't want all hell to break loose if they end up applying the same ID to multiple rows (which they haven't done yet). I just don't want it to display in the table or the data exports. Perhaps this isn't a template issue, maybe more of a db manipulation... Margie",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1478/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
991206402,MDExOlB1bGxSZXF1ZXN0NzI5NzA0NTM3,1465,add support for -o --get /path,51016,open,0,,,0,2021-09-08T14:30:42Z,2021-09-08T14:31:45Z,,CONTRIBUTOR,simonw/datasette/pulls/1465,"Fixes https://github.com/simonw/datasette/issues/1459 Adds support for `--open --get /path` to be used in combination. If `--open` is provided alone, datasette will open a web page to a default URL. If `--get ` is provided alone, datasette will output the result of doing a GET to that URL and then exit. If `--open --get ` are provided together, datasette will open a web page to that URL. 
TODO items: - [ ] update documentation - [ ] print out error message when `--root --open --get ` is used - [ ] adjust code to require that `` start with a `/` when `-o --get ` is used - [ ] add test(s) note, '@CTB' is used in this PR to flag code that needs revisiting.",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1465/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1,
989109888,MDU6SXNzdWU5ODkxMDk4ODg=,1460,Override column metadata with metadata from another column,72577720,open,0,,,0,2021-09-06T12:13:33Z,2021-09-06T12:13:33Z,,CONTRIBUTOR,,"I have a table from the PUDL project (https://github.com/catalyst-cooperative/pudl) that looks like this: ``` CREATE TABLE fuel_ferc1 ( id INTEGER NOT NULL, record_id TEXT, utility_id_ferc1 INTEGER, report_year INTEGER, plant_name_ferc1 TEXT, fuel_type_code_pudl VARCHAR(7), fuel_unit VARCHAR(7), fuel_qty_burned FLOAT, fuel_mmbtu_per_unit FLOAT, fuel_cost_per_unit_burned FLOAT, fuel_cost_per_unit_delivered FLOAT, fuel_cost_per_mmbtu FLOAT, PRIMARY KEY (id), FOREIGN KEY(plant_name_ferc1, utility_id_ferc1) REFERENCES plants_ferc1 (plant_name_ferc1, utility_id_ferc1), CONSTRAINT fuel_ferc1_fuel_type_code_pudl_enum CHECK (fuel_type_code_pudl IN ('coal', 'oil', 'gas', 'solar', 'wind', 'hydro', 'nuclear', 'waste', 'unknown')), CONSTRAINT fuel_ferc1_fuel_unit_enum CHECK (fuel_unit IN ('ton', 'mcf', 'bbl', 'gal', 'kgal', 'gramsU', 'kgU', 'klbs', 'btu', 'mmbtu', 'mwdth', 'mwhth', 'unknown')) ); ``` Note that `fuel_unit` is a unit that **pint** can understand, and that `fuel_qty_burned` is a column of data that could be expressed in terms of actual units, not merely as a dimensionless number. Ditto the `fuel_cost_per_unit_...` columns. Is there a way to give a column a default metadata unit (such as *tons* or *USD/ton*) and then let that be overridden when the metadata in another column says *barrels* or *USD/gramsU*? @catalyst-cooperative",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1460/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
988552851,MDU6SXNzdWU5ODg1NTI4NTE=,1456,conda install results in non-functioning `datasette serve` due to out-of-date asgiref,51016,open,0,,,0,2021-09-05T16:59:55Z,2021-09-05T16:59:55Z,,CONTRIBUTOR,,"Over in https://github.com/ctb/2021-sourmash-datasette, I discovered that the following commands fail: ``` conda create -n datasette4 -y datasette=0.58.1 conda activate datasette4 datasette gathertax.db ``` with `ImportError: cannot import name 'WebSocketScope' from 'asgiref.typing'`. This appears to be because asgiref 3.3.4 doesn't have WebSocketScope, but later versions do - a simple ``` pip install asgiref==3.4.1 ``` fixes the problem for me, at least to the point where I can run datasette and poke around as usual. 
I note that over in the conda-forge recipe, https://github.com/conda-forge/datasette-feedstock/blob/master/recipe/meta.yaml pins asgiref to < 3.4.0, but I'm not sure why - so I'm not sure how to best resolve this issue :).",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1456/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
860734722,MDU6SXNzdWU4NjA3MzQ3MjI=,1302,Fix disappearing facets,192568,open,0,,,0,2021-04-18T18:42:33Z,2021-04-20T07:40:15Z,,CONTRIBUTOR,,"1. Clone https://github.com/mroswell/list-N 2. Run `datasette disinfectants.db -o` 3. Select the `Safer_or_Toxic` facet. 4. Select `Toxic`. 5. Close out the `Safer_or_Toxic` facet. 6. Examine `Suggested facets` list. `Safer_or_Toxic` is GONE. 7. Try some other facets. When you select an element, and then close the list, in some cases, the facet properly returns to the `Suggested facet` list... Arrays and dates properly return to the list, but fields with strings don't return to the list. Since my site is devoted to whether disinfectants are Safer or Toxic, having the suggested facet disappear from the suggested facet list is very confusing* to end-users. This, along with a few other issues, unfortunately proved beyond my own programming ability to address. So I hired a Senior-level developer to address a number of issues, including this disappearing act. 8. Open a new terminal. Run `datasette disinfectants.db -m metadata.json --static static:static/ --template-dir templates/ --plugins-dir plugins/ -p 8001 -o` 9. Repeat steps 3-6, but this time, the Safer_or_Toxic facet returns to the list (and the related URL parameters are removed). I'm not sure how to do a pull request for this, because the plugin contains other functionality that goes beyond this bug. I wanted the facets sorted in a certain order (both in the suggested facet list, and the detail lists) (... the detail lists were hopping around all over the place before...) I wanted the duplicate facets removed (leaving only the one where you can facet by individual item in an array.) I wanted the arrays to be presented in a prettier fashion (I did that in the template... That could be moved over to the plugin at some point) I'm thinking it'll be very helpful if applicable parts of my project's plugin (sort_suggested_facets_plugin.py) will be able to be incorporated back into datasette, but I leave that to you to consider. (* The disappearing facet bug was especially confusing because I'm removing the filters and sql from the table page, at the request of the organization. The filters and sql detail created a lot of confusion for end users who try to find disinfectants used by Hospitals, for instance, as an '=' won't find them, since they are part of the Use_site array.) My disappearing-facet confusion was documented in my own issue: https://github.com/mroswell/list-N/issues/57 (addressed by the plugin). Other facet-related issues here: https://github.com/mroswell/list-N/issues/54 (addressed by the plugin); https://github.com/mroswell/list-N/issues/15 (addressed by template); https://github.com/mroswell/list-N/issues/53 (not yet addressed). 
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1302/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 855451460,MDU6SXNzdWU4NTU0NTE0NjA=,1297,"Documentation: json1, and introspection endpoints",192568,open,0,,,0,2021-04-12T00:38:00Z,2021-04-12T01:29:33Z,,CONTRIBUTOR,,"https://docs.datasette.io/en/stable/facets.html notes that: > If your SQLite installation provides the json1 extension (you can check using /-/versions) Datasette will automatically detect columns that contain JSON arrays... When I check -/versions I see two sections relevant to json1: ``` ""extensions"": { ""json1"": null }, ""compile_options"": [ ... ""ENABLE_JSON1"", ``` The ENABLE_JSON1 makes me think json1 is likely available. But the `""json1"": null` made me think it wasn't available (because of the `null`). It would help if the documentation provided clarity about how to know if json1 is installed. It would also be helpful if the `/-/versions` information signalled somehow that that is to be appended to the hostname or domain name (or whatever you want to call it, or simply show it, using `example.com/-/versions` instead of `/-/versions`. Likewise on that last point, for https://docs.datasette.io/en/stable/introspection.html#introspection , at least at some point on that page detailing where those introspection endpoints go. (Sometimes documentation can be so abbreviated that it's hard for new users to figure out what's going on.) ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1297/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 853672224,MDU6SXNzdWU4NTM2NzIyMjQ=,1294,"""You can check out any time you like. But you can never leave!""",192568,open,0,,,0,2021-04-08T17:02:15Z,2021-04-08T18:35:50Z,,CONTRIBUTOR,,"(Feel free to rename this one.) - The column gear lets you ""Show not-blank rows."" Then it places a parameter in the URL, which a web developer would notice, but a lot of users won't notice, or know to delete it. Would be good to toggle ""Show not-blank rows"" with ""Show all rows."" (Also would be quite helpful to have a ""Show blank rows | Show all rows"" option) - The column gear lets you ""Sort ascending"" and ""Sort descending"" but then you're stuck with some sort of sorted version thereafter, unless you know to sort the ID column, or to remove the full _sort parameter and its value in the URL. Would be good to offer a ""Remove sort"" option in the gear. - These requests are in the same camp as: https://github.com/simonw/datasette-vega/issues/36 - I suspect there are other url parameter instances where similar analysis would be helpful, but the three above are the use cases I've run across. UPDATE: - It would be helpful to have a ""Previous page"" available for all but the first table page.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1294/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 847700726,MDU6SXNzdWU4NDc3MDA3MjY=,1285,Feature Request or Plugin Request: Numeric Range Facets,192568,open,0,,,0,2021-04-01T01:50:20Z,2021-04-01T02:28:19Z,,CONTRIBUTOR,,"It would be great to offer facets for numeric data ranges. The ranges could pull from typical GIS methods of creating choropleth maps. 
https://gisgeography.com/choropleth-maps-data-classification/ Of the following, for mapping, I've always preferred a Jenks Natural Breaks, or a cross between Jenks and Pretty breaks. - Equal Intervals - Quantile (equal count) - Standard Deviation - Natural Breaks (Jenks) Classification - Pretty Breaks - Some sort of Aggregate Jenks Classification (this isn't standard, but it would be nice to be able to set classification ranges that work across tables.) Here are some links for Natural Breaks, in case this method is unfamiliar. - https://en.wikipedia.org/wiki/Jenks_natural_breaks_optimization - http://wiki.gis.com/wiki/index.php/Jenks_Natural_Breaks_Classification - https://medium.com/analytics-vidhya/jenks-natural-breaks-best-range-finder-algorithm-8d1907192051 Per that last link, there is a Jenks Python module... They also describe it as data-intensive for larger datasets. Maybe this is a good plugin idea. An example of equal Intervals would be 0 – < 10 10 – < 20 20 – < 30 30 – < 40 It's kind of confusing to have that less-than sign in there. it could also be displayed as: 0 – 10 10 – 20 20 – 30 30 – 40 But then it's not completely clear which category 10 is in, for instance. (Best to right-justify.. and use an ""en dash"" between numbers.) ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1285/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
843884745,MDU6SXNzdWU4NDM4ODQ3NDU=,1283,advanced #export causes unexpected scrolling,192568,open,0,,,0,2021-03-29T22:46:57Z,2021-03-29T22:46:57Z,,CONTRIBUTOR,,"1. Visit a datasette table page 2. Click on the ""(advanced)"" link. This adds a fragment identifier ""#export"" to the URL, and scrolls down to the ""Advanced export"" div with the ""export"" id. 3. Manually scroll back up, and click on a suggested facet. The fragment identifier is still present, and the app scrolls back down to the ""Advanced export"" div. I think this is unwanted behavior. The user remedy seems to be to manually remove the ""#export"" from the URL. This behavior happens in my project, and in: https://covid-19.datasettes.com/covid/economist_excess_deaths (for instance) but not in this table: https://global-power-plants.datasettes.com/global-power-plants/global-power-plants",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1283/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
756875827,MDU6SXNzdWU3NTY4NzU4Mjc=,1129,Fix footer to the bottom of the page,3243482,open,0,,,0,2020-12-04T07:28:07Z,2020-12-04T16:04:29Z,,CONTRIBUTOR,,"Footer doesn't stick to the bottom if the body content isn't long enough to reach the end of viewport. ![before & after](https://user-images.githubusercontent.com/3243482/101134785-f6595a80-361b-11eb-81ce-b8b5cb9c5bc2.png) This can be fixed using flexbox. 
```css body { min-height: 100vh; display: flex; flex-direction: column; } .content { flex-grow: 1; } ``` ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1129/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
440325850,MDExOlB1bGxSZXF1ZXN0Mjc1OTIzMDY2,452,SQL builder utility classes,45057,open,0,,,0,2019-05-04T13:57:47Z,2019-05-04T14:03:04Z,,CONTRIBUTOR,simonw/datasette/pulls/452,"This adds a straightforward set of classes to aid in the construction of SQL queries. My plan for this was to allow plugins to manipulate the Datasette-generated SQL in a more structured way. I'm not sure that's going to work, but I feel like this is still a step forward - it reduces the number of intermediate variables in `TableView.data` which aids readability, and also factors out a lot of the boring string concatenation. There are a fair number of minor structure changes in here too as I've tried to make the ordering of `TableView.data` a bit more logical. As far as I can tell, I haven't broken anything...",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/452/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
377166793,MDU6SXNzdWUzNzcxNjY3OTM=,372,Docker build tools,82988,open,0,,,0,2018-11-04T16:02:35Z,2018-11-04T16:02:35Z,,CONTRIBUTOR,,"In terms of small pieces lightly joined, I note that there are several tools starting to appear for building generating Dockerfiles and building Docker containers from simpler components such as `requirements.txt` files. If plugin/extensions builders want to include additional packages, then things like incremental builds of composable builds that add additional items into a base `datasette` container may be required. Examples of Dockerfile generators / container builders: - [openshift/source-to-image (s2i)](https://github.com/openshift/source-to-image) - [jupyter/repo2docker](https://github.com/jupyter/repo2docker) - [stencila/dockter](https://github.com/stencila/dockter) Discussions / threads (via Binderhub gitter) on: - [why `repo2docker` not `s2i`](http://words.yuvi.in/post/why-not-s2i/) - [why `dockter` not `repo2docker`](https://twitter.com/choldgraf/status/1058499607309647872) - [composability in `s2i`](https://trello.com/c/AexIVZNf/1008-8-composable-builds-builds-evg) Relates to things like: - https://github.com/simonw/datasette/pull/280",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/372/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",,
314834783,MDU6SXNzdWUzMTQ4MzQ3ODM=,219,Expose units in the JSON API?,45057,open,0,,,0,2018-04-16T22:04:25Z,2018-04-16T22:04:25Z,,CONTRIBUTOR,,"From #203: it would be nice for the JSON API to (optionally) return columns rendered with units in them - if, for example, you're consuming the JSON to render the rows on a map. I'm not entirely sure how useful this will be though - at the moment my map queries are custom SQL queries (a few have joins in, the rest might be fetching large amounts of data so it makes sense to limit columns fetched). 
Perhaps the SQL function is a better approach in general.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/219/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,