id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason
1558644003,I_kwDOBm6k_c5c5wUj,2006,Teach `datasette publish` to pin to `datasette<1.0` in a 0.x release,9599,open,0,,3268330,2,2023-01-26T19:17:40Z,2023-01-26T19:20:53Z,,OWNER,,"I just realized that when I ship Datasette 1.0 there may be automated deployments out there which could deploy the 1.0 version by accident, potentially breaking any customizations that aren't compatible with the 1.0 changes. I can hopefully help avoid that by shipping one last entry in the `0.x` series that ensures `datasette publish` pins to `<1.0` when it installs Datasette itself.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2006/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1174717287,I_kwDOBm6k_c5GBMNn,1674,Tweak design of /.json,9599,open,0,,3268330,1,2022-03-20T22:58:01Z,2022-03-20T22:58:40Z,,OWNER,,"https://latest.datasette.io/.json

Currently:

```json
{
    ""_memory"": {
        ""name"": ""_memory"",
        ""hash"": null,
        ""color"": ""a6c7b9"",
        ""path"": ""/_memory"",
        ""tables_and_views_truncated"": [],
        ""tables_and_views_more"": false,
        ""tables_count"": 0,
        ""table_rows_sum"": 0,
        ""show_table_row_counts"": false,
        ""hidden_table_rows_sum"": 0,
        ""hidden_tables_count"": 0,
        ""views_count"": 0,
        ""private"": false
    },
    ""fixtures"": {
        ""name"": ""fixtures"",
        ""hash"": ""645005884646eb941c89997fbd1c0dd6be517cb1b493df9816ae497c0c5afbaa"",
        ""color"": ""645005"",
        ""path"": ""/fixtures"",
        ""tables_and_views_truncated"": [
            {
                ""name"": ""compound_three_primary_keys"",
                ""columns"": [
                    ""pk1"",
                    ""pk2"",
                    ""pk3"",
                    ""content""
                ],
                ""primary_keys"": [
                    ""pk1"",
                    ""pk2"",
                    ""pk3""
                ],
                ""count"": 1001,
                ""hidden"": false,
                ""fts_table"": null,
                ""num_relationships_for_sorting"": 0,
                ""private"": false
            },
```

As-of this issue the `""path""` key is confusing, it doesn't match what https://latest.datasette.io/-/databases returns:

- #1668",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1674/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1131295060,I_kwDOBm6k_c5DbjFU,1634,Update Dockerfile generated by `datasette publish`,9599,open,0,,3268330,4,2022-02-11T00:07:26Z,2022-03-11T17:38:08Z,,OWNER,,"The generated `Dockerfile` currently looks something like this:

```Dockerfile
FROM python:3.8
COPY . /app
WORKDIR /app

ENV DATASETTE_SECRET 'edab49cbc5d5f6f33238f54852037e3fee710821960b73edd2ce743454182ae2'
RUN pip install -U datasette datasette-auth-passwords datasette-tiddlywiki datasette-graphql
RUN datasette inspect fixtures.db other.db --inspect-file inspect-data.json
ENV PORT 8080
EXPOSE 8080
CMD datasette serve --host 0.0.0.0 -i fixtures.db -i other.db --cors --inspect-file inspect-data.json --metadata metadata.json --create --port $PORT /data/*.db
```

This is still on Python 3.8, and it generates a pretty large image compared to the `Dockerfile` used for https://hub.docker.com/datasetteproject/datasette - https://github.com/simonw/datasette/blob/0.60.2/Dockerfile

Here's the code that generates it: https://github.com/simonw/datasette/blob/7d24fd405f3c60e4c852c5d746c91aa2ba23cf5b/datasette/utils/__init__.py#L389-L400",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1634/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 2, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1065429936,I_kwDOBm6k_c4_gSuw,1532,Use datasette-table Web Component to guide the design of the JSON API for 1.0,9599,open,0,,3268330,4,2021-11-28T20:37:18Z,2022-03-16T20:13:34Z,,OWNER,,"I realized that one of the reasons I'm having trouble committing to nailing down the JSON API for 1.0 is that I don't use it much myself - I use the `?_shape=array` one quite often, but I don't have any projects that are using the default, more fully-featured API.

As an experiment I built a Web Component for embedding Datasette tables on pages - https://github.com/simonw/datasette-table - and I think it's actually going to be a really useful tool for helping me dog food the v1.0 API design.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1532/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
730210880,MDU6SXNzdWU3MzAyMTA4ODA=,1055,query.html and table.html should share the same table implementation,9599,open,0,,3268330,0,2020-10-27T07:58:21Z,2020-10-27T07:58:29Z,,OWNER,,In #998 I made a change that affected the table page but didn't affect the query page because I incorrectly assumed they shared rendering logic.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1055/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
749283032,MDU6SXNzdWU3NDkyODMwMzI=,1101,register_output_renderer() should support streaming data,9599,open,0,,3268330,13,2020-11-24T02:17:09Z,2023-01-21T22:07:19Z,,OWNER,,"> I'd like to implement this by first extending the `register_output_renderer()` hook to support streaming huge responses, then switching CSV to use the plugin hook in addition to TSV using it.

_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1096#issuecomment-732542285_",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1101/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,