issue_comments
616 rows where author_association = "CONTRIBUTOR" sorted by reactions
issue 330
- Upgrade to CodeMirror 6, add SQL autocomplete 21
- feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 14
- Database page loads too slowly with many large tables (due to table counts) 13
- Stream all results for arbitrary SQL and canned queries 10
- docker image is duplicating db files somehow 10
- Handle spatialite geometry columns better 7
- base_url configuration setting 7
- Exceeding Cloud Run memory limits when deploying a 4.8G database 7
- create-index should run analyze after creating index 7
- Add new spatialite helper methods 7
- Add register_output_renderer hook 6
- Helper methods for working with SpatiaLite 6
- Plugin hook for dynamic metadata 6
- clean checkout & clean environment has test failures 6
- [WIP] Add publish to heroku support 5
- Scripted exports 5
- datasette publish lambda plugin 4
- Build Dockerfile with recent Sqlite + Spatialite 4
- Documentation with recommendations on running Datasette in production without using Docker 4
- bpylist.archiver.CircularReference: archive has a cycle with uid(13) 4
- Add insert --truncate option 4
- Make it easier to insert geometries, with documentation and maybe code 4
- Advanced class-based `conversions=` mechanism 4
- query result page is using 400mb of browser memory 40x size of html page and 400x size of csv data 4
- Proposal: datasette query 4
- Writable canned queries fail with useless non-error against immutable databases 4
- Ability to merge databases and tables 4
- API to insert a single record into an existing table 4
- Exclude virtual tables from datasette inspect 4
- Call for birthday presents: if you're using Datasette, let us know how you're using it here 4
- array facet: don't materialize unnecessary columns 4
- Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File 4
- Ship a Docker image of the whole thing 3
- Support for units 3
- Don't duplicate simple primary keys in the link column 3
- Add version number support with Versioneer 3
- Integration with JupyterLab 3
- Datasette Library 3
- Exposing Datasette via Jupyter-server-proxy 3
- Don't auto-format SQL on page load 3
- Try out ExifReader 3
- Consider dropping explicit CSRF protection entirely? 3
- Handle case where subsequent records (after first batch) include extra columns 3
- For 1.0 update trove classifier in setup.py 3
- Support linking to compound foreign keys 3
- register_output_renderer() should support streaming data 3
- Feature or Documentation Request: Individual table as home page template 3
- Make row available to `render_cell` plugin hook 3
- Serve all db files in a folder 3
- Allow routes to have extra options 3
- Datasette feature for publishing snapshots of query results 3
- CLI eats my cursor 3
- don't use immutable=1, only mode=ro 3
- Datasette is not compatible with SQLite's strict quoting compilation option 3
- Plugin system 3
- De-tangling Metadata before Datasette 1.0 3
- Dockerfile should build more recent SQLite with FTS5 and spatialite support 2
- Add NHS England Hospitals example to wiki 2
- add support for ?field__isnull=1 2
- Travis should push tagged images to Docker Hub for each release 2
- Column types in inspected metadata 2
- "datasette publish cloudrun" command to publish to Google Cloud Run 2
- [WIP] Add primary key to the extra_body_script hook arguments 2
- Define mechanism for plugins to return structured data 2
- Rename metadata.json to config.json 2
- Mechanism for turning nested JSON into foreign keys / many-to-many 2
- Every datasette plugin on the ecosystem page should have a screenshot 2
- Fix static mounts using relative paths and prevent traversal exploits 2
- Handle spaces in DB names 2
- allow leading comments in SQL input field 2
- Test failures on openSUSE 15.1: AssertionError: Explicit other_table and other_column 2
- base_url doesn't entirely work for running Datasette inside Binder 2
- Option to automatically configure based on directory layout 2
- Expose scores from ZCOMPUTEDASSETATTRIBUTES 2
- Incorrect URLs when served behind a proxy with base_url set 2
- Skip counting hidden tables 2
- asgi_wrapper plugin hook is crashing at startup 2
- Improved (and better documented) support for transactions 2
- insert_all(..., alter=True) should work for new columns introduced after the first 100 records 2
- datasette.urls.static_plugins(...) method 2
- Refactor .csv to be an output renderer - and teach register_output_renderer to stream all rows 2
- Fix footer not sticking to bottom in short pages 2
- Remove unneeded exists=True for -a/--auth flag. 2
- Improve the display of facets information 2
- Archive import appears to be broken on recent exports 2
- Fix archive imports 2
- Ability for plugins to collaborate when adding extra HTML to blocks in default templates 2
- Use Data from SQLite in other commands 2
- Tests are very slow. 2
- Installing datasette via docker: Path 'fixtures.db' does not exist 2
- Async support 2
- Better default display of arrays of items 2
- Dockerfile: use Ubuntu 20.10 as base 2
- DRAFT: A new plugin hook for dynamic metadata 2
- unordered list is not rendering bullet points in description_html on database page 2
- `publish cloudrun` should deploy a more recent SQLite version 2
- Rename Datasette.__init__(config=) parameter to settings= 2
- base logo link visits `undefined` rather than href url 2
- Update pyyaml requirement from ~=5.3 to >=5.3,<7.0 2
- Add new `"sql_file"` key to Canned Queries in metadata? 2
- Writable canned queries fail to load custom templates 2
- Redesign CSV export to improve usability 2
- Allow to set `facets_array` in metadata (like current `facets`) 2
- if csv export is truncated in non streaming mode set informative response header 2
- Add SpatiaLite helpers to CLI 2
- Document how to add a primary key to a rowid table using `sqlite-utils transform --pk` 2
- Add SpatiaLite helpers to CLI 2
- `with db:` for transactions 2
- [plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins? 2
- ?_trace=1 fails with datasette-geojson for some reason 2
- Utilities for duplicating tables and creating a table with the results of a query 2
- feature request: pivot command 2
- google cloudrun updated their limits on maxscale based on memory and cpu count 2
- Add new entrypoint option to `--load-extension` 2
- SITE-BUSTING ERROR: "render_template() called before await ds.invoke_startup()" 2
- [insert_all, upsert_all] IntegrityError: constraint failed 2
- Clicking within the CodeMirror area below the SQL (i.e. when there's only a single line) doesn't cause the editor to get focused 2
- render_cell plugin hook's row object is not a sqlite.Row 2
- Show referring tables and rows when the referring foreign key is compound 2
- Repeated calls to `Table.convert()` fail 2
- Add paths for homebrew on Apple silicon 2
- Aliased ROWID option for tables created from alter=True commands 2
- Plugin hook for adding new output formats 2
- Bump sphinx, furo, blacken-docs dependencies 2
- Fix query for suggested facets with column named value 2
- Datasette Plugins 1
- Ability to plot a simple graph 1
- :fire: Removes DS_Store 1
- Consider data-package as a format for metadata 1
- Plot rows on a map with Leaflet and Leaflet.markercluster 1
- Ability to bundle and serve additional static files 1
- Document the querystring argument for setting a different time limit 1
- datasette skeleton command for kick-starting database and table metadata 1
- More metadata options for template authors 1
- Hide Spatialite system tables 1
- Raise 404 on nonexistent table URLs 1
- Support filtering with units and more 1
- Figure out a way to have /-/version return current git commit hash 1
- inspect should record column types 1
- Render boolean fields better by default 1
- datasette publish heroku fails without name provided 1
- Default built image size over Zeit Now 100MiB limit 1
- Interface should show same JSON shape options for custom SQL queries 1
- datasette publish digitalocean plugin 1
- Linked Data(sette) 1
- Default to opening files in mutable mode, special option for immutable files 1
- ?_where=sql-fragment parameter for table views 1
- Datasette doesn't reload when database file changes 1
- Refactor facets to a class and new plugin, refs #427 1
- Add inspect and prepare_sanic hooks 1
- Coalesce hidden table count to 0 1
- Installing installs the tests package 1
- Exporting sqlite database(s)? 1
- Get Datasette tests passing on Windows in GitHub Actions 1
- "about" parameter in metadata does not appear when alone 1
- Show total/unfiltered row count when filtering 1
- "Too many SQL variables" on large inserts 1
- First proof-of-concept of Datasette Library 1
- Escape the table name in populate_fts and search. 1
- Add triggers while enabling FTS 1
- importing CSV to SQLite as library 1
- Databases on index page should display in order they were passed to "datasette serve"? 1
- Offer to format readonly SQL 1
- `import` command fails on empty files 1
- updating metadata.json without recreating the app 1
- Provide a cookiecutter template for creating new plugins 1
- Validate metadata json on startup 1
- Display of the column definitions 1
- Use inspect-file, if possible, for total row count 1
- --reload sould reload server if code in --plugins-dir changes 1
- Import EXIF data into SQLite - lens used, ISO, aperture etc 1
- Update pytest-asyncio requirement from ~=0.10.0 to >=0.10,<0.12 1
- Replace "datasette publish --extra-options" with "--setting" 1
- sqlite3.OperationalError: too many SQL variables in insert_all when using rows with varying numbers of columns 1
- Import machine-learning detected labels (dog, llama etc) from Apple Photos 1
- Only install osxphotos if running on macOS 1
- Enable wildcard-searches by default 1
- Allow to specify a URL fragment for canned queries 1
- Redesign register_facet_classes plugin hook 1
- Having trouble getting writable canned queries to work 1
- sqlite-utils insert: options for column types 1
- Add pull requests 1
- Support column descriptions in metadata.json 1
- Update black requirement from ~=19.10b0 to >=19.10,<21.0 1
- Bug when first record contains fewer columns than subsequent records 1
- favorites --stop_after=N stops after min(N, 200) 1
- github-to-sqlite should handle rate limits better 1
- Wide tables should scroll horizontally within the page 1
- Update utils.py to fix sqlite3.OperationalError 1
- "Edit SQL" button on canned queries 1
- Include LICENSE in sdist 1
- Add template block prior to extra URL loaders 1
- Allow iterables other than Lists in m2m records 1
- changes to allow for compound foreign keys 1
- Accessing a database's `.json` is slow for very large SQLite files 1
- Fix --metadata doc usage 1
- --load-extension=spatialite not working with datasetteproject/datasette docker image 1
- Make it easier to theme Datasette with CSS 1
- Use YAML examples in documentation by default, not JSON 1
- Update for Big Sur 1
- Modernize code to Python 3.6+ 1
- Add Prettier to contributing documentation 1
- Mechanism for storing metadata in _metadata tables 1
- Prettier package not actually being cached 1
- Install Prettier via package.json 1
- ?_size=10 option for the arbitrary query page would be useful 1
- A lot of open(file) functions are used without a context manager thus producing ResourceWarning: unclosed file <_io.TextIOWrapper 1
- Use context manager instead of plain open 1
- photo-to-sqlite: command not found 1
- ensure immutable databses when starting in configuration directory mode with 1
- Ability to increase size of the SQL editor window 1
- Custom pages don't work with base_url setting 1
- Minor type in IP adress 1
- Allow canned query params to specify default values 1
- Plugin hook that could support 'order by random()' for table view 1
- Support for HTTP Basic Authentication 1
- Might there be some way to comment metadata.json? 1
- Ability to run CI against multiple SQLite versions 1
- improve table horizontal scroll experience 1
- Publishing to cloudrun with immutable mode? 1
- Bump black from 20.8b1 to 21.4b0 1
- Bump black from 20.8b1 to 21.4b1 1
- Bump black from 20.8b1 to 21.4b2 1
- Bump black from 21.4b2 to 21.5b0 1
- DRAFT: add test and scan for docker images 1
- Research: syntactic sugar for using --get with SQL queries, maybe "datasette query" 1
- Ensure db.path is a string before trying to insert into internal database 1
- Idea: import CSV to memory, run SQL, export in a single command 1
- Support db as first parameter before subcommand, or as environment variable 1
- Serve using UNIX domain socket 1
- "invalid reference format" publishing Docker image 1
- render_cell() hook should support returning an awaitable 1
- Bump black from 21.7b0 to 21.8b0 1
- Error: Use either --since or --since_id, not both - still broken 1
- Add scientists to target groups 1
- Test against 3.10-dev 1
- Bump black from 21.9b0 to 21.10b0 1
- Bump black from 21.9b0 to 21.11b0 1
- Deploy a live instance of demos/apache-proxy 1
- Support STRICT tables 1
- Command for creating an empty database 1
- Allow passing a file of code to "sqlite-utils convert" 1
- add hash id to "_memory" url if hashed url mode is turned on and crossdb is also turned on 1
- introduce new option for datasette package to use a slim base image 1
- when hashed urls are turned on, the _memory db has improperly long-lived cache expiry 1
- don't set far expiry if hash is '000' 1
- consider adding deletion step of cloudbuild artifacts to gcloud publish 1
- Maybe let plugins define custom serve options? 1
- Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.18 1
- Add KNN and data_licenses to hidden tables list 1
- Move canned queries closer to the SQL input area 1
- Try again with SQLite codemirror support 1
- Tweak mobile keyboard settings 1
- Support for generated columns 1
- Optional Pandas integration 1
- Mechanism for disabling faceting on large tables only 1
- Update jinja2 requirement from <3.1.0,>=2.10.3 to >=2.10.3,<3.2.0 1
- Bump black from 22.1.0 to 22.3.0 1
- Show foreign key label when filtering 1
- .extract() doesn't set foreign key when extracted columns contain NULL value 1
- Research: demonstrate if parallel SQL queries are worthwhile 1
- Bump furo from 2022.4.7 to 2022.6.4.1 1
- Reading rows from a file => AttributeError: '_io.StringIO' object has no attribute 'readinto' 1
- Mechanism for ensuring a table has all the columns 1
- Research an upgrade to CodeMirror 6 1
- search_sql add include_rank option 1
- Update aiofiles requirement from <0.9,>=0.4 to >=0.4,<22.2 1
- Featured table(s) on the homepage 1
- missing next and next_url in JSON responses from an instance deployed on Fly 1
- Expose `sql` and `params` arguments to various plugin hooks 1
- [SPIKE] Don't truncate query CSVs 1
- Tiny typographical error in install/uninstall docs 1
- fix: enable-fts permanently save triggers 1
- feat: recreate fts triggers after table transform 1
- API explorer tool 1
- conn.execute: UnicodeEncodeError: 'utf-8' codec can't encode character 1
- Allow surrogates in parameters 1
- Cannot enable FTS5 despite it being available 1
- Autocomplete text entry for filter values that correspond to facets 1
- Serve schema JSON to the SQL editor to enable autocomplete 1
- Incorrect link from the API explorer to the JSON API documentation 1
- /db/table/-/upsert 1
- Bump sphinx from 5.3.0 to 6.0.0 1
- rows_from_file() raises confusing error if file-like object is not in binary mode 1
- Bump sphinx from 5.3.0 to 6.1.0 1
- Bump sphinx from 5.3.0 to 6.1.1 1
- Document datasette.urls.row and row_blob 1
- Bump sphinx from 5.3.0 to 6.1.2 1
- Make CustomJSONEncoder a documented public API 1
- rewrite_sql hook 1
- Feature request: trim all leading and trailing white space for all columns for all tables in a database 1
- Bump black from 22.12.0 to 23.1.0 1
- Transformation type `--type DATETIME` 1
- `Table.convert()` skips falsey values 1
- Error: Invalid setting 'hash_urls' in settings.json in 0.64.1 1
- Microsoft line endings 1
- `datasette install -r requirements.txt` 1
- Bump furo from 2022.12.7 to 2023.3.23 1
- rows: --transpose or psql extended view-like functionality 1
- Bump sphinx from 6.1.3 to 6.2.0 1
- Bump sphinx from 6.1.3 to 6.2.1 1
- Bump sphinx from 6.1.3 to 7.0.0 1
- Bump sphinx from 6.1.3 to 7.0.1 1
- Bump furo from 2023.3.27 to 2023.5.20 1
- Filter table by a large bunch of ids 1
- Support storing incrementally piped values 1
- `--settings settings.json` option 1
- Bump blacken-docs from 1.14.0 to 1.15.0 1
- Tables starting with an underscore should be treated as hidden 1
- Bump sphinx from 6.1.3 to 7.1.0 1
- Bump furo from 2023.3.27 to 2023.7.26 1
- Bump sphinx from 6.1.3 to 7.1.1 1
- Bump sphinx from 6.1.3 to 7.1.2 1
- Bump the python-packages group with 1 update 1
- Bump the python-packages group with 2 updates 1
- Bump the python-packages group with 3 updates 1
- If a row has a primary key of `null` various things break 1
- Bump the python-packages group with 3 updates 1
- Fix hupper.start_reloader entry point 1
- Proposal: Make the `_internal` database persistent, customizable, and hidden 1
- Bump the python-packages group with 2 updates 1
- `datasette.yaml` plugin support 1
- Plugin Hooks for "compile to SQL" languages 1
- Raise an exception if a "plugins" block exists in metadata.json 1
- Detailed upgrade instructions for metadata.yaml -> datasette.yaml 1
- Bump the python-packages group with 1 update 1
- Bump the python-packages group with 1 update 1
- No suggested facets when a column named 'value' is included 1
- Add more STRICT table support 1
- feature request: gzip compression of database downloads 1
user 67
- fgregg 82
- eyeseast 74
- russss 39
- dependabot[bot] 36
- abdusco 26
- asg017 25
- psychemedia 24
- bgrins 24
- mroswell 22
- chapmanjacobd 20
- cldellow 18
- brandonrobertz 15
- jacobian 14
- RhetTbull 14
- wragge 12
- rixx 11
- hydrosquall 11
- rgieseke 10
- bobwhitelock 9
- amjith 6
- jefftriplett 6
- tsibley 6
- simonwiles 6
- mcarpenter 6
- jaywgraves 6
- davidbgk 5
- dependabot-preview[bot] 5
- bollwyvl 4
- ctb 4
- r4vi 4
- jsfenfen 4
- glasnt 4
- kbaikov 4
- JBPressac 4
- benpickles 3
- ghing 3
- macropin 3
- blairdrummond 3
- kevindkeogh 3
- daniel-butler 3
- jayvdb 2
- ingenieroariel 2
- camallen 2
- gfrmin 2
- mcint 2
- frosencrantz 2
- adipasquale 2
- davidleejy 2
- raynae 2
- tkhattra 2
- MichaelTiemannOSC 2
- danp 1
- tomdyson 1
- adamwolf 1
- rubenv 1
- spookylukey 1
- aslakr 1
- meatcar 1
- mikepqr 1
- adamjonas 1
- louispotok 1
- chris48s 1
- eumiro 1
- rprimet 1
- mattiaborsoi 1
- abeyerpath 1
- b0b5h4rp13 1
id | html_url | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions ▼ | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
344125441 | https://github.com/simonw/datasette/pull/81#issuecomment-344125441 | https://api.github.com/repos/simonw/datasette/issues/81 | MDEyOklzc3VlQ29tbWVudDM0NDEyNTQ0MQ== | jefftriplett 50527 | 2017-11-14T02:24:54Z | 2017-11-14T02:24:54Z | CONTRIBUTOR | Oops, if I jumped the gun. I saw the project in my github activity feed and saw some low hanging fruit :) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
:fire: Removes DS_Store 273595473 | |
344145265 | https://github.com/simonw/datasette/issues/57#issuecomment-344145265 | https://api.github.com/repos/simonw/datasette/issues/57 | MDEyOklzc3VlQ29tbWVudDM0NDE0NTI2NQ== | macropin 247192 | 2017-11-14T04:45:38Z | 2017-11-14T04:45:38Z | CONTRIBUTOR | I'm happy to contribute this. Just let me know if you want a Dockerfile for development or production purposes, or both. If it's prod then we can just pip install the source from pypi, otherwise for dev we'll need a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ship a Docker image of the whole thing 273127694 | |
344147583 | https://github.com/simonw/datasette/issues/57#issuecomment-344147583 | https://api.github.com/repos/simonw/datasette/issues/57 | MDEyOklzc3VlQ29tbWVudDM0NDE0NzU4Mw== | macropin 247192 | 2017-11-14T05:03:47Z | 2017-11-14T05:03:47Z | CONTRIBUTOR | Let me know if you'd like a PR. The image is usable as
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ship a Docker image of the whole thing 273127694 | |
344151223 | https://github.com/simonw/datasette/issues/57#issuecomment-344151223 | https://api.github.com/repos/simonw/datasette/issues/57 | MDEyOklzc3VlQ29tbWVudDM0NDE1MTIyMw== | macropin 247192 | 2017-11-14T05:32:28Z | 2017-11-14T05:33:03Z | CONTRIBUTOR | The pattern is called "multi-stage builds". And the result is a svelte 226MB image (201MB for 3.6-slim) vs 700MB+ for the full image. It's possible to get it even smaller, but that takes a lot more work. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ship a Docker image of the whole thing 273127694 | |
344430689 | https://github.com/simonw/datasette/issues/88#issuecomment-344430689 | https://api.github.com/repos/simonw/datasette/issues/88 | MDEyOklzc3VlQ29tbWVudDM0NDQzMDY4OQ== | tomdyson 15543 | 2017-11-14T23:08:22Z | 2017-11-14T23:08:22Z | CONTRIBUTOR |
Sorry about that - here's a working version on Netlify: |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add NHS England Hospitals example to wiki 273775212 | |
344710204 | https://github.com/simonw/datasette/pull/104#issuecomment-344710204 | https://api.github.com/repos/simonw/datasette/issues/104 | MDEyOklzc3VlQ29tbWVudDM0NDcxMDIwNA== | jacobian 21148 | 2017-11-15T19:57:50Z | 2017-11-15T19:57:50Z | CONTRIBUTOR | A first basic stab at making this work, just to prove the approach. Right now this requires a Heroku CLI plugin, which seems pretty unreasonable. I think this can be replaced with direct API calls, which could clean up a lot of things. But I wanted to prove it worked first, and it does. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[WIP] Add publish to heroku support 274284246 | |
344811268 | https://github.com/simonw/datasette/pull/107#issuecomment-344811268 | https://api.github.com/repos/simonw/datasette/issues/107 | MDEyOklzc3VlQ29tbWVudDM0NDgxMTI2OA== | raynae 3433657 | 2017-11-16T04:17:45Z | 2017-11-16T04:17:45Z | CONTRIBUTOR | Thanks for the guidance. I added a unit test and made a slight change to utils.py. I didn't realize this, but evidently string.format only complains if you supply fewer arguments than there are format placeholders, so the original commit worked, but was adding a superfluous named param. I added a conditional that prevents the named param from being created and ensures the correct number of args are passed to string.format. It has the side effect of hiding the SQL query in /templates/table.html when there are no other where clauses--not sure if that's the desired outcome here. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
add support for ?field__isnull=1 274343647 | |
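The comment above hinges on a real quirk of `str.format`: it raises only when a placeholder is left unfilled, and silently ignores surplus arguments. A quick illustration of that behaviour (independent of the Datasette patch itself; the template string is made up):

```python
# str.format raises only when there are fewer arguments than
# placeholders; surplus positional or named arguments are ignored.
template = "where {col} is null"

# Extra named argument: silently ignored.
assert template.format(col="age", unused="x") == "where age is null"

# Missing named placeholder: raises KeyError.
try:
    "{a} {b}".format(a=1)
except KeyError:
    print("missing named placeholder raises KeyError")

# Missing positional placeholder: raises IndexError.
try:
    "{} {}".format(1)
except IndexError:
    print("missing positional placeholder raises IndexError")
```

This is why the superfluous named param in the original commit went unnoticed until the conditional was added.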
345002908 | https://github.com/simonw/datasette/issues/46#issuecomment-345002908 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NTAwMjkwOA== | ingenieroariel 54999 | 2017-11-16T17:47:49Z | 2017-11-16T17:47:49Z | CONTRIBUTOR | I'll try to find alternatives to the Dockerfile option - I also think we should not use that old one without sources or license. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
345117690 | https://github.com/simonw/datasette/pull/107#issuecomment-345117690 | https://api.github.com/repos/simonw/datasette/issues/107 | MDEyOklzc3VlQ29tbWVudDM0NTExNzY5MA== | raynae 3433657 | 2017-11-17T01:29:41Z | 2017-11-17T01:29:41Z | CONTRIBUTOR | Thanks for bearing with me. I was getting a message about my branch diverging when I tried to push after rebasing, so I merged master into isnull, seems like that did the trick. Let me know if I should make any corrections. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
add support for ?field__isnull=1 274343647 | |
345452669 | https://github.com/simonw/datasette/pull/104#issuecomment-345452669 | https://api.github.com/repos/simonw/datasette/issues/104 | MDEyOklzc3VlQ29tbWVudDM0NTQ1MjY2OQ== | jacobian 21148 | 2017-11-18T16:18:45Z | 2017-11-18T16:18:45Z | CONTRIBUTOR | I'd like to do a bit of cleanup, and some error checking in case heroku/heroku-builds isn't installed. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[WIP] Add publish to heroku support 274284246 | |
345503897 | https://github.com/simonw/datasette/issues/105#issuecomment-345503897 | https://api.github.com/repos/simonw/datasette/issues/105 | MDEyOklzc3VlQ29tbWVudDM0NTUwMzg5Nw== | rgieseke 198537 | 2017-11-19T09:38:08Z | 2017-11-19T09:38:08Z | CONTRIBUTOR | Thanks, I wrote this very simple reader because the default approach as described on the Datahub pages seemed too complicated. I had metadata from the This could also be useful for getting from Data Package to SQL db: https://github.com/frictionlessdata/tableschema-sql-py I maintain a few climate science related datasets at https://github.com/openclimatedata/ The Data Retriever (mainly ecological data) by @ethanwhite et al. is also using the Data Package format for metadata and has some tooling for different dbs: https://frictionlessdata.io/articles/the-data-retriever/ https://github.com/weecology/retriever The Open Power System Data project also has a couple of datasets that show nicely how CSV is great for assembling, and they already make SQLite files available. It's one of the first data sets I tried with Datasette, perfect for the use case of getting an API for putting power stations on a map ... |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Consider data-package as a format for metadata 274314940 | |
345652450 | https://github.com/simonw/datasette/issues/27#issuecomment-345652450 | https://api.github.com/repos/simonw/datasette/issues/27 | MDEyOklzc3VlQ29tbWVudDM0NTY1MjQ1MA== | rgieseke 198537 | 2017-11-20T10:19:39Z | 2017-11-20T10:19:39Z | CONTRIBUTOR | If Data Package metadata gets adopted (#105) the views spec work might also be worth a look: |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ability to plot a simple graph 267886330 | |
346116745 | https://github.com/simonw/datasette/pull/104#issuecomment-346116745 | https://api.github.com/repos/simonw/datasette/issues/104 | MDEyOklzc3VlQ29tbWVudDM0NjExNjc0NQ== | jacobian 21148 | 2017-11-21T18:23:25Z | 2017-11-21T18:23:25Z | CONTRIBUTOR | @simonw ready for a review and merge if you want. There's still some nasty duplicated code in cli.py and utils.py, which is just going to get worse if/when we start adding any other deploy targets (and I want to do one for cloud.gov, at least). I think there's an opportunity for some refactoring here. I'm happy to do that now as part of this PR, or if you merge this first I'll do it in a different one. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[WIP] Add publish to heroku support 274284246 | |
346124073 | https://github.com/simonw/datasette/pull/104#issuecomment-346124073 | https://api.github.com/repos/simonw/datasette/issues/104 | MDEyOklzc3VlQ29tbWVudDM0NjEyNDA3Mw== | jacobian 21148 | 2017-11-21T18:49:55Z | 2017-11-21T18:49:55Z | CONTRIBUTOR | Actually hang on, don't merge - there are some bugs that #141 masked when I tested this out elsewhere. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[WIP] Add publish to heroku support 274284246 | |
346124764 | https://github.com/simonw/datasette/pull/104#issuecomment-346124764 | https://api.github.com/repos/simonw/datasette/issues/104 | MDEyOklzc3VlQ29tbWVudDM0NjEyNDc2NA== | jacobian 21148 | 2017-11-21T18:52:14Z | 2017-11-21T18:52:14Z | CONTRIBUTOR | OK, now this should work. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[WIP] Add publish to heroku support 274284246 | |
346244871 | https://github.com/simonw/datasette/issues/14#issuecomment-346244871 | https://api.github.com/repos/simonw/datasette/issues/14 | MDEyOklzc3VlQ29tbWVudDM0NjI0NDg3MQ== | jacobian 21148 | 2017-11-22T05:06:30Z | 2017-11-22T05:06:30Z | CONTRIBUTOR | I'd also suggest taking a look at stevedore, which has a ton of tools for doing plugin stuff. I've had good luck with it in the past. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette Plugins 267707940 | |
360535979 | https://github.com/simonw/datasette/issues/179#issuecomment-360535979 | https://api.github.com/repos/simonw/datasette/issues/179 | MDEyOklzc3VlQ29tbWVudDM2MDUzNTk3OQ== | psychemedia 82988 | 2018-01-25T17:18:24Z | 2018-01-25T17:18:24Z | CONTRIBUTOR | To summarise that thread:
It could also be useful to allow users to import a python file containing custom functions that can then be loaded into scope and made available to custom templates. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
More metadata options for template authors 288438570 | |
380608372 | https://github.com/simonw/datasette/pull/200#issuecomment-380608372 | https://api.github.com/repos/simonw/datasette/issues/200 | MDEyOklzc3VlQ29tbWVudDM4MDYwODM3Mg== | russss 45057 | 2018-04-11T21:55:46Z | 2018-04-11T21:55:46Z | CONTRIBUTOR |
Or just see if there's a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Hide Spatialite system tables 313494458 | |
380966565 | https://github.com/simonw/datasette/issues/203#issuecomment-380966565 | https://api.github.com/repos/simonw/datasette/issues/203 | MDEyOklzc3VlQ29tbWVudDM4MDk2NjU2NQ== | russss 45057 | 2018-04-12T22:43:08Z | 2018-04-12T22:43:08Z | CONTRIBUTOR | Looks like pint is pretty good at this.

```python
In [1]: import pint

In [2]: ureg = pint.UnitRegistry()

In [3]: q = 3e6 * ureg('Hz')

In [4]: '{:~P}'.format(q.to_compact())
Out[4]: '3.0 MHz'

In [5]: q = 0.3 * ureg('m')

In [5]: '{:~P}'.format(q.to_compact())
Out[5]: '300.0 mm'

In [6]: q = 5 * ureg('')

In [7]: '{:~P}'.format(q.to_compact())
Out[7]: '5'
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support for units 313837303 | |
381237440 | https://github.com/simonw/datasette/pull/202#issuecomment-381237440 | https://api.github.com/repos/simonw/datasette/issues/202 | MDEyOklzc3VlQ29tbWVudDM4MTIzNzQ0MA== | russss 45057 | 2018-04-13T19:22:53Z | 2018-04-13T19:22:53Z | CONTRIBUTOR | I spotted you'd mentioned that in #184 but only after I'd written the patch! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Raise 404 on nonexistent table URLs 313785206 | |
381315675 | https://github.com/simonw/datasette/issues/203#issuecomment-381315675 | https://api.github.com/repos/simonw/datasette/issues/203 | MDEyOklzc3VlQ29tbWVudDM4MTMxNTY3NQ== | russss 45057 | 2018-04-14T09:14:45Z | 2018-04-14T09:27:30Z | CONTRIBUTOR |
<s>From a machine-readable perspective I'm not sure why it would be useful to decorate the values with units</s>. Edit: Should have had some coffee first. It's clearly useful for stuff like map rendering! I agree that the unit metadata should definitely be exposed in the JSON.
I'm thinking about a couple of approaches here. I think the simplest one is: if the column has a unit attached, optionally accept units in query fields:

```python
column_units = ureg("Hz")  # Create a unit object for the column's unit
query_variable = ureg("4 GHz")  # Supplied query variable

# Now we can convert the query units into column units before querying
supplied_value.to(column_units).magnitude
```

If the user doesn't supply units, pint just returns the plain number and we can query as usual, assuming it's the base unit:

```python
query_variable = ureg("50")
query_variable
isinstance(query_variable, numbers.Number)
```

This also lets us do some nice unit conversion on querying:

```python
column_units = ureg("m")
query_variable = ureg("50 ft")
supplied_value.to(column_units)
```

The alternative would be to provide a dropdown of units next to the query field (so a "Hz" field would give you "kHz", "MHz", "GHz"). Although this would be clearer to the user, it isn't so easy - we'd need to know more about the context of the field to give you sensible SI prefixes (I'm not so interested in nanoHertz, for example). You also lose the bonus of being able to convert - although pint will happily show you all the compatible units, it again suffers from a lack of context:

```python
ureg("m").compatible_units()
``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support for units 313837303 | |
381332222 | https://github.com/simonw/datasette/pull/205#issuecomment-381332222 | https://api.github.com/repos/simonw/datasette/issues/205 | MDEyOklzc3VlQ29tbWVudDM4MTMzMjIyMg== | russss 45057 | 2018-04-14T14:16:35Z | 2018-04-14T14:16:35Z | CONTRIBUTOR | I've added some tests and that docs link. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support filtering with units and more 314319372 | |
381361734 | https://github.com/simonw/datasette/issues/125#issuecomment-381361734 | https://api.github.com/repos/simonw/datasette/issues/125 | MDEyOklzc3VlQ29tbWVudDM4MTM2MTczNA== | russss 45057 | 2018-04-14T21:26:30Z | 2018-04-14T21:26:30Z | CONTRIBUTOR | FWIW I am now doing this on my WTR app (instead of silently limiting maps to 1000). Telefonica now has about 4000 markers and good old BT has 22,000 or so. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plot rows on a map with Leaflet and Leaflet.markercluster 275135393 | |
381441392 | https://github.com/simonw/datasette/pull/209#issuecomment-381441392 | https://api.github.com/repos/simonw/datasette/issues/209 | MDEyOklzc3VlQ29tbWVudDM4MTQ0MTM5Mg== | russss 45057 | 2018-04-15T21:59:15Z | 2018-04-15T21:59:15Z | CONTRIBUTOR | I suspected this would cause some test failures, but I'll wait for opinions before attempting to fix them. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Don't duplicate simple primary keys in the link column 314455877 | |
381738137 | https://github.com/simonw/datasette/pull/209#issuecomment-381738137 | https://api.github.com/repos/simonw/datasette/issues/209 | MDEyOklzc3VlQ29tbWVudDM4MTczODEzNw== | russss 45057 | 2018-04-16T20:27:43Z | 2018-04-16T20:27:43Z | CONTRIBUTOR | Tests now fixed, honest. The failing test on Travis looks like an intermittent sqlite failure which should resolve itself on a retry... |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Don't duplicate simple primary keys in the link column 314455877 | |
381763651 | https://github.com/simonw/datasette/issues/203#issuecomment-381763651 | https://api.github.com/repos/simonw/datasette/issues/203 | MDEyOklzc3VlQ29tbWVudDM4MTc2MzY1MQ== | russss 45057 | 2018-04-16T21:59:17Z | 2018-04-16T21:59:17Z | CONTRIBUTOR | Ah, I had no idea you could bind python functions into sqlite! I think the primary purpose of this issue has been served now - I'm going to close this and create a new issue for the only bit of this that hasn't been touched yet, which is (optionally) exposing units in the JSON API. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support for units 313837303 | |
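An editorial aside on the comment above: binding a Python function into SQLite is done with `sqlite3.Connection.create_function`. Here is a minimal sketch; the `convert_units` SQL name and the conversion table are invented for illustration (a real implementation would delegate to a units library such as pint, as discussed in this thread):

```python
import sqlite3

# Hypothetical conversion factors for illustration only; a real
# implementation would delegate to a units library such as pint.
FACTORS = {("Hz", "MHz"): 1e-6, ("m", "mm"): 1e3}

def convert_units(value, from_unit, to_unit):
    return value * FACTORS[(from_unit, to_unit)]

conn = sqlite3.connect(":memory:")
# Register the Python function so it becomes callable from SQL
conn.create_function("convert_units", 3, convert_units)
row = conn.execute("SELECT convert_units(3000000, 'Hz', 'MHz')").fetchone()
```

Once registered this way, the function can appear anywhere in a query, including in WHERE clauses, which is what makes it useful for unit-aware filtering.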
381905593 | https://github.com/simonw/datasette/pull/209#issuecomment-381905593 | https://api.github.com/repos/simonw/datasette/issues/209 | MDEyOklzc3VlQ29tbWVudDM4MTkwNTU5Mw== | russss 45057 | 2018-04-17T08:50:28Z | 2018-04-17T08:50:28Z | CONTRIBUTOR | I've added another commit which puts a class on each `<td>`. Unfortunately the tests are still failing on 3.6, which is weird. I can't reproduce locally... |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Don't duplicate simple primary keys in the link column 314455877 | |
390250253 | https://github.com/simonw/datasette/issues/273#issuecomment-390250253 | https://api.github.com/repos/simonw/datasette/issues/273 | MDEyOklzc3VlQ29tbWVudDM5MDI1MDI1Mw== | rgieseke 198537 | 2018-05-18T15:49:52Z | 2018-05-18T15:49:52Z | CONTRIBUTOR | Shouldn't versioneer do that? E.g. `0.21+2.g1076c97`. You'd need to install via |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out a way to have /-/version return current git commit hash 324451322 | |
390795067 | https://github.com/simonw/datasette/issues/276#issuecomment-390795067 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDM5MDc5NTA2Nw== | russss 45057 | 2018-05-21T21:55:57Z | 2018-05-21T21:55:57Z | CONTRIBUTOR | Well, we do have the capability to detect spatialite so my intention certainly wasn't to require it. I can see the advantage of having it as a plugin but it does touch a number of points in the code. I think I'm going to attack this by refactoring the necessary bits and seeing where that leads (which was my plan anyway). I think my main concern is - if I add certain plugin hooks for this, is anything else ever going to use them? I'm not sure I have an answer to that question yet, either way. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
391050113 | https://github.com/simonw/datasette/issues/276#issuecomment-391050113 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDM5MTA1MDExMw== | russss 45057 | 2018-05-22T16:13:00Z | 2018-05-22T16:13:00Z | CONTRIBUTOR | Yup, I'll have a think about it. My current thoughts are for spatialite we'll need to hook into the following places:
The rendering and querying hooks could also potentially be used to move the units support into a plugin. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
391059008 | https://github.com/simonw/datasette/pull/280#issuecomment-391059008 | https://api.github.com/repos/simonw/datasette/issues/280 | MDEyOklzc3VlQ29tbWVudDM5MTA1OTAwOA== | r4vi 565628 | 2018-05-22T16:40:27Z | 2018-05-22T16:40:27Z | CONTRIBUTOR |
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Build Dockerfile with recent Sqlite + Spatialite 325373747 | |
391073009 | https://github.com/simonw/datasette/pull/279#issuecomment-391073009 | https://api.github.com/repos/simonw/datasette/issues/279 | MDEyOklzc3VlQ29tbWVudDM5MTA3MzAwOQ== | rgieseke 198537 | 2018-05-22T17:23:26Z | 2018-05-22T17:23:26Z | CONTRIBUTOR |
Yes! That's the default versioneer behaviour.
Should work now - it can be a two-item (for a tagged version), three-item or four-item tuple.

```
In [2]: datasette.version
Out[2]: '0.12+292.ga70c2a8.dirty'

In [3]: datasette.version_info
Out[3]: ('0', '12+292', 'ga70c2a8', 'dirty')
``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add version number support with Versioneer 325352370 | |
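For reference, the four-item tuple shown in the comment above corresponds to splitting the version string on dots. A quick sketch (an illustration of the relationship, not necessarily how versioneer itself derives the tuple):

```python
# A versioneer-style version string splits on '.' into the
# (major, minor+distance, commit, dirty-flag) tuple shown above.
version = '0.12+292.ga70c2a8.dirty'
version_info = tuple(version.split('.'))
```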
391073267 | https://github.com/simonw/datasette/pull/279#issuecomment-391073267 | https://api.github.com/repos/simonw/datasette/issues/279 | MDEyOklzc3VlQ29tbWVudDM5MTA3MzI2Nw== | rgieseke 198537 | 2018-05-22T17:24:16Z | 2018-05-22T17:24:16Z | CONTRIBUTOR | Sorry, just realised you rely on |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add version number support with Versioneer 325352370 | |
391077700 | https://github.com/simonw/datasette/pull/279#issuecomment-391077700 | https://api.github.com/repos/simonw/datasette/issues/279 | MDEyOklzc3VlQ29tbWVudDM5MTA3NzcwMA== | rgieseke 198537 | 2018-05-22T17:38:17Z | 2018-05-22T17:38:17Z | CONTRIBUTOR | Alright, that should work now -- let me know if you would prefer any different behaviour. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add version number support with Versioneer 325352370 | |
391141391 | https://github.com/simonw/datasette/pull/280#issuecomment-391141391 | https://api.github.com/repos/simonw/datasette/issues/280 | MDEyOklzc3VlQ29tbWVudDM5MTE0MTM5MQ== | r4vi 565628 | 2018-05-22T21:08:39Z | 2018-05-22T21:08:39Z | CONTRIBUTOR | I'm going to clean this up for consistency tomorrow morning, so hold off merging until then please.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Build Dockerfile with recent Sqlite + Spatialite 325373747 | |
391290271 | https://github.com/simonw/datasette/pull/280#issuecomment-391290271 | https://api.github.com/repos/simonw/datasette/issues/280 | MDEyOklzc3VlQ29tbWVudDM5MTI5MDI3MQ== | r4vi 565628 | 2018-05-23T09:53:38Z | 2018-05-23T09:53:38Z | CONTRIBUTOR | Running:
is now returning FTS5 enabled in the versions output:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Build Dockerfile with recent Sqlite + Spatialite 325373747 | |
391355030 | https://github.com/simonw/datasette/pull/280#issuecomment-391355030 | https://api.github.com/repos/simonw/datasette/issues/280 | MDEyOklzc3VlQ29tbWVudDM5MTM1NTAzMA== | r4vi 565628 | 2018-05-23T13:53:27Z | 2018-05-23T15:22:45Z | CONTRIBUTOR | No objections; it's good to go @simonw.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Build Dockerfile with recent Sqlite + Spatialite 325373747 | |
391505930 | https://github.com/simonw/datasette/issues/276#issuecomment-391505930 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDM5MTUwNTkzMA== | russss 45057 | 2018-05-23T21:41:37Z | 2018-05-23T21:41:37Z | CONTRIBUTOR |
Ah I didn't mean that - I meant altering the SELECT query to fetch the data so that it ran a spatialite function to transform that specific column. I think that's less useful as a general-purpose plugin hook though, and it's not that hard to parse the WKB in Python (my default approach would be to use shapely, which is great, but geomet looks like an interesting pure-python alternative). |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
392825746 | https://github.com/simonw/datasette/issues/276#issuecomment-392825746 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDM5MjgyNTc0Ng== | russss 45057 | 2018-05-29T15:42:53Z | 2018-05-29T15:42:53Z | CONTRIBUTOR | I haven't had time to look further into this, but if doing this as a plugin results in useful hooks then I think we should do it that way. We could always require the plugin as a standard dependency. I think this is going to result in quite a bit of refactoring anyway so it's a good time to add hooks regardless. On the other hand, if we have to add lots of specialist hooks for it then maybe it's worth integrating into the core. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
393106520 | https://github.com/simonw/datasette/issues/276#issuecomment-393106520 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDM5MzEwNjUyMA== | russss 45057 | 2018-05-30T10:09:25Z | 2018-05-30T10:09:25Z | CONTRIBUTOR | I don't think it's unreasonable to only support spatialite geometries in a coordinate reference system which is at least transformable to WGS84. It would be nice to support different CRSes in the database so conversion to spatialite from the source data is lossless. I think the working CRS for datasette should be WGS84 though (leaflet requires it, for example) - it's just a case of calling |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
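The on-the-fly conversion described in the comment above is presumably SpatiaLite's `ST_Transform`. A hedged sketch of the kind of query involved — the `places` table and `geometry` column names are hypothetical, and running it requires the SpatiaLite extension to be loaded:

```python
# Sketch only: reproject a geometry column to WGS84 (SRID 4326) at
# query time and serialise it for a Leaflet map as GeoJSON.
# Table and column names are hypothetical.
sql = """
    SELECT AsGeoJSON(ST_Transform(geometry, 4326)) AS geojson
    FROM places
"""
```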
401310732 | https://github.com/simonw/datasette/issues/276#issuecomment-401310732 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDQwMTMxMDczMg== | psychemedia 82988 | 2018-06-29T10:05:04Z | 2018-06-29T10:07:25Z | CONTRIBUTOR | @russss Different map projections can presumably be handled on the client side using a leaflet plugin to transform the geometry (eg kartena/Proj4Leaflet), although the leaflet side would need to detect or be informed of the original projection? Another possibility would be to provide an easy way/guidance for users to create an FK'd table containing the WGS84 projection of a non-WGS84 geometry in the original/principal table? This could then act as a proxy for serving GeoJSON to the leaflet map? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
401312981 | https://github.com/simonw/datasette/issues/276#issuecomment-401312981 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDQwMTMxMjk4MQ== | russss 45057 | 2018-06-29T10:14:54Z | 2018-06-29T10:14:54Z | CONTRIBUTOR |
Well, as @simonw mentioned, GeoJSON only supports WGS84, and GeoJSON (and/or TopoJSON) is the standard we probably want to aim for. On-the-fly reprojection in spatialite is not an issue anyway, and in general I think you want to be serving stuff to web maps in WGS84 or Web Mercator. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
405022335 | https://github.com/simonw/datasette/issues/344#issuecomment-405022335 | https://api.github.com/repos/simonw/datasette/issues/344 | MDEyOklzc3VlQ29tbWVudDQwNTAyMjMzNQ== | russss 45057 | 2018-07-14T13:00:48Z | 2018-07-14T13:00:48Z | CONTRIBUTOR | Looks like this was a red herring actually, and heroku had a blip when I was testing it... |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
datasette publish heroku fails without name provided 341229113 | |
405026441 | https://github.com/simonw/datasette/issues/343#issuecomment-405026441 | https://api.github.com/repos/simonw/datasette/issues/343 | MDEyOklzc3VlQ29tbWVudDQwNTAyNjQ0MQ== | russss 45057 | 2018-07-14T14:17:14Z | 2018-07-14T14:17:14Z | CONTRIBUTOR | This probably depends on #294. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Render boolean fields better by default 341228846 | |
405026800 | https://github.com/simonw/datasette/issues/294#issuecomment-405026800 | https://api.github.com/repos/simonw/datasette/issues/294 | MDEyOklzc3VlQ29tbWVudDQwNTAyNjgwMA== | russss 45057 | 2018-07-14T14:24:31Z | 2018-07-14T14:24:31Z | CONTRIBUTOR | I had a quick look at this in relation to #343 and I feel like it might be worth modelling the inspected table metadata internally as an object rather than a dict. (We'd still have to serialise it back to JSON.) There are a few places where we rely on the structure of this metadata dict for various reasons, including in templates (and potentially also in user templates). It would be nice to have a reasonably well defined API for accessing metadata internally so that it's clearer what we're breaking. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
inspect should record column types 327365110 | |
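One way to sketch the object-not-dict idea from the comment above is a dataclass: it gives a well-defined internal API while still serialising back to the plain JSON form that templates expect. The field names here are invented for illustration; the real structure would mirror whatever datasette's inspect currently records:

```python
from dataclasses import dataclass, field, asdict
from typing import List

# Hypothetical shape for inspected table metadata — a sketch only.
@dataclass
class TableMetadata:
    name: str
    columns: List[str] = field(default_factory=list)
    primary_keys: List[str] = field(default_factory=list)
    row_count: int = 0

    def to_json_dict(self) -> dict:
        # Serialise back to the plain dict/JSON form for templates
        return asdict(self)

meta = TableMetadata(name="places", columns=["id", "name"], primary_keys=["id"])
```

The advantage is that renaming or removing a field becomes an explicit, greppable API change rather than a silent dict-key break in a template.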
422821483 | https://github.com/simonw/datasette/issues/329#issuecomment-422821483 | https://api.github.com/repos/simonw/datasette/issues/329 | MDEyOklzc3VlQ29tbWVudDQyMjgyMTQ4Mw== | jaywgraves 418191 | 2018-09-19T14:17:42Z | 2018-09-19T14:17:42Z | CONTRIBUTOR | I'm using the docker image (0.23.2) and notice some differences/bugs between the docs and the published version with canned queries. (Submitted a tiny doc fix also.) I was able to build the docker container locally. I would like to run this in our Kubernetes cluster but don't want to publish a version in our internal registry if I don't have to. Thanks! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Travis should push tagged images to Docker Hub for each release 336465018 | |
422915450 | https://github.com/simonw/datasette/issues/329#issuecomment-422915450 | https://api.github.com/repos/simonw/datasette/issues/329 | MDEyOklzc3VlQ29tbWVudDQyMjkxNTQ1MA== | jaywgraves 418191 | 2018-09-19T18:45:02Z | 2018-09-20T10:50:50Z | CONTRIBUTOR | That works for me. Was able to pull the public image and no errors on my canned query. (~although a small rendering bug. I'll create an issue and if I have time today, a PR to fix~ this turned out to be my error.) Thanks for the quick response! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Travis should push tagged images to Docker Hub for each release 336465018 | |
429737929 | https://github.com/simonw/datasette/issues/366#issuecomment-429737929 | https://api.github.com/repos/simonw/datasette/issues/366 | MDEyOklzc3VlQ29tbWVudDQyOTczNzkyOQ== | gfrmin 416374 | 2018-10-15T07:32:57Z | 2018-10-15T07:32:57Z | CONTRIBUTOR | Very hacky solution is to write now.json file forcing the usage of v1 of Zeit cloud, see https://github.com/slygent/datasette/commit/3ab824793ec6534b6dd87078aa46b11c4fa78ea3 This does work, at least. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Default built image size over Zeit Now 100MiB limit 369716228 | |
435768450 | https://github.com/simonw/datasette/issues/369#issuecomment-435768450 | https://api.github.com/repos/simonw/datasette/issues/369 | MDEyOklzc3VlQ29tbWVudDQzNTc2ODQ1MA== | gfrmin 416374 | 2018-11-05T06:31:59Z | 2018-11-05T06:31:59Z | CONTRIBUTOR | That would be ideal, but you know better than me whether the CSV streaming trick works for custom SQL queries. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Interface should show same JSON shape options for custom SQL queries 374953006 | |
435862009 | https://github.com/simonw/datasette/issues/371#issuecomment-435862009 | https://api.github.com/repos/simonw/datasette/issues/371 | MDEyOklzc3VlQ29tbWVudDQzNTg2MjAwOQ== | psychemedia 82988 | 2018-11-05T12:48:35Z | 2018-11-05T12:48:35Z | CONTRIBUTOR | I think you need to register a domain name you own separately in order to get a non-IP address address? https://www.digitalocean.com/docs/networking/dns/ |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
datasette publish digitalocean plugin 377156339 | |
436037692 | https://github.com/simonw/datasette/issues/370#issuecomment-436037692 | https://api.github.com/repos/simonw/datasette/issues/370 | MDEyOklzc3VlQ29tbWVudDQzNjAzNzY5Mg== | psychemedia 82988 | 2018-11-05T21:15:47Z | 2018-11-05T21:18:37Z | CONTRIBUTOR | In terms of integration with
The |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Integration with JupyterLab 377155320 | |
436042445 | https://github.com/simonw/datasette/issues/370#issuecomment-436042445 | https://api.github.com/repos/simonw/datasette/issues/370 | MDEyOklzc3VlQ29tbWVudDQzNjA0MjQ0NQ== | psychemedia 82988 | 2018-11-05T21:30:42Z | 2018-11-05T21:31:48Z | CONTRIBUTOR | Another route would be something like creating a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Integration with JupyterLab 377155320 | |
459915995 | https://github.com/simonw/datasette/issues/160#issuecomment-459915995 | https://api.github.com/repos/simonw/datasette/issues/160 | MDEyOklzc3VlQ29tbWVudDQ1OTkxNTk5NQ== | psychemedia 82988 | 2019-02-02T00:43:16Z | 2019-02-02T00:58:20Z | CONTRIBUTOR | Do you have any simple working examples of how to use If Use case is here: https://github.com/psychemedia/jupyterserverproxy-datasette-demo Trying to do a really simple |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ability to bundle and serve additional static files 278208011 | |
474280581 | https://github.com/simonw/datasette/issues/417#issuecomment-474280581 | https://api.github.com/repos/simonw/datasette/issues/417 | MDEyOklzc3VlQ29tbWVudDQ3NDI4MDU4MQ== | psychemedia 82988 | 2019-03-19T10:06:42Z | 2019-03-19T10:06:42Z | CONTRIBUTOR | This would be really interesting but several possibilities in use arise, I think? For example:
CSV files may also have messy names compared to the table you want. Or for an update CSV, may have the form |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette Library 421546944 | |
474282321 | https://github.com/simonw/datasette/issues/412#issuecomment-474282321 | https://api.github.com/repos/simonw/datasette/issues/412 | MDEyOklzc3VlQ29tbWVudDQ3NDI4MjMyMQ== | psychemedia 82988 | 2019-03-19T10:09:46Z | 2019-03-19T10:09:46Z | CONTRIBUTOR | Does this also relate to https://github.com/simonw/datasette/issues/283 and the ability to |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Linked Data(sette) 411257981 | |
483017176 | https://github.com/simonw/datasette/issues/431#issuecomment-483017176 | https://api.github.com/repos/simonw/datasette/issues/431 | MDEyOklzc3VlQ29tbWVudDQ4MzAxNzE3Ng== | psychemedia 82988 | 2019-04-14T16:58:37Z | 2019-04-14T16:58:37Z | CONTRIBUTOR | Hmm... nope... I see an updated timestamp from |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette doesn't reload when database file changes 432870248 | |
483202658 | https://github.com/simonw/datasette/issues/429#issuecomment-483202658 | https://api.github.com/repos/simonw/datasette/issues/429 | MDEyOklzc3VlQ29tbWVudDQ4MzIwMjY1OA== | psychemedia 82988 | 2019-04-15T10:48:01Z | 2019-04-15T10:48:01Z | CONTRIBUTOR | Minor UI observation:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
?_where=sql-fragment parameter for table views 432636432 | |
487537452 | https://github.com/simonw/datasette/pull/437#issuecomment-487537452 | https://api.github.com/repos/simonw/datasette/issues/437 | MDEyOklzc3VlQ29tbWVudDQ4NzUzNzQ1Mg== | russss 45057 | 2019-04-29T10:58:49Z | 2019-04-29T10:58:49Z | CONTRIBUTOR | I've just spotted that this implements #215. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add inspect and prepare_sanic hooks 438048318 | |
487542486 | https://github.com/simonw/datasette/pull/439#issuecomment-487542486 | https://api.github.com/repos/simonw/datasette/issues/439 | MDEyOklzc3VlQ29tbWVudDQ4NzU0MjQ4Ng== | russss 45057 | 2019-04-29T11:20:30Z | 2019-04-29T11:20:30Z | CONTRIBUTOR | Actually I think this is not the whole story because of the rowid issue. I'm going to think about this one a bit more. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[WIP] Add primary key to the extra_body_script hook arguments 438240541 | |
487686655 | https://github.com/simonw/datasette/pull/441#issuecomment-487686655 | https://api.github.com/repos/simonw/datasette/issues/441 | MDEyOklzc3VlQ29tbWVudDQ4NzY4NjY1NQ== | russss 45057 | 2019-04-29T18:14:25Z | 2019-04-29T18:14:25Z | CONTRIBUTOR | Subsidiary note which I forgot in the commit message: I've decided to give each view a short string name to aid in differentiating which view a hook is being called from. Since hooks are functions and not subclasses, and can get called from different places in the URL hierarchy, it's sometimes difficult to distinguish what data you're actually operating on. I think this will come in handy for other hooks as well. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add register_output_renderer hook 438437973 | |
487689477 | https://github.com/simonw/datasette/pull/424#issuecomment-487689477 | https://api.github.com/repos/simonw/datasette/issues/424 | MDEyOklzc3VlQ29tbWVudDQ4NzY4OTQ3Nw== | russss 45057 | 2019-04-29T18:22:40Z | 2019-04-29T18:22:40Z | CONTRIBUTOR | This is pretty conflicty because I forgot how to use git fetch. If you're interested in merging this I'll rewrite it against an actual modern checkout... |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Column types in inspected metadata 427429265 | |
487692377 | https://github.com/simonw/datasette/pull/424#issuecomment-487692377 | https://api.github.com/repos/simonw/datasette/issues/424 | MDEyOklzc3VlQ29tbWVudDQ4NzY5MjM3Nw== | russss 45057 | 2019-04-29T18:30:46Z | 2019-04-29T18:30:46Z | CONTRIBUTOR | Actually no, I ended up not using the inspected column types in my plugin, and the binary column issue can be solved a lot more simply, so I'll close this. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Column types in inspected metadata 427429265 | |
487723476 | https://github.com/simonw/datasette/pull/441#issuecomment-487723476 | https://api.github.com/repos/simonw/datasette/issues/441 | MDEyOklzc3VlQ29tbWVudDQ4NzcyMzQ3Ng== | russss 45057 | 2019-04-29T20:05:23Z | 2019-04-29T20:05:23Z | CONTRIBUTOR | This is the minimal example (I also included it in the docs):

```python
from datasette import hookimpl


def render_test(args, data, view_name):
    return {
        'body': 'Hello World',
        'content_type': 'text/plain'
    }


@hookimpl
def register_output_renderer():
    return {
        'extension': 'test',
        'callback': render_test
    }
```

I'm working on the GeoJSON one now and it should be ready soon. (I forgot I was going to run into the same problem as before - that Spatialite's stupid binary format isn't WKB and I have no way of altering the query to change that - but I've just managed to write some code to rearrange the bytes from Spatialite blob-geometry into WKB...) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add register_output_renderer hook 438437973 | |
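The byte-rearranging mentioned in the comment above can be sketched as follows. This assumes the documented SpatiaLite BLOB layout (start byte 0x00, endianness flag, 4-byte SRID, 32-byte MBR, 0x7C marker, geometry body, 0xFE terminator) and only handles simple, non-multi geometries — multi-geometries use internal sub-geometry markers that a plain slice does not cover:

```python
import struct

def spatialite_blob_to_wkb(blob):
    # SpatiaLite layout: 0x00 | endian(1) | SRID(4) | MBR(32) | 0x7C | body | 0xFE
    # WKB is the endianness byte followed by the geometry body.
    # Only valid for simple (non-multi) geometries.
    assert blob[0] == 0x00 and blob[38] == 0x7C and blob[-1] == 0xFE
    return blob[1:2] + blob[39:-1]

# Build a SpatiaLite-style blob for POINT(1 2), SRID 4326, little-endian
blob = (
    b"\x00" + b"\x01" + struct.pack("<i", 4326)
    + struct.pack("<4d", 1.0, 2.0, 1.0, 2.0)  # MBR: minx, miny, maxx, maxy
    + b"\x7c" + struct.pack("<i", 1)          # geometry class 1 = POINT
    + struct.pack("<2d", 1.0, 2.0)            # x, y
    + b"\xfe"
)
wkb = spatialite_blob_to_wkb(blob)
```

The resulting bytes parse as standard WKB (endianness byte, geometry type, coordinates), so libraries like shapely can load them directly.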
487724539 | https://github.com/simonw/datasette/pull/441#issuecomment-487724539 | https://api.github.com/repos/simonw/datasette/issues/441 | MDEyOklzc3VlQ29tbWVudDQ4NzcyNDUzOQ== | russss 45057 | 2019-04-29T20:08:32Z | 2019-04-29T20:08:32Z | CONTRIBUTOR | I also just realised that I should be passing the datasette object into the hook function...as I just found I need it. So hold off merging until I've fixed that. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add register_output_renderer hook 438437973 | |
487735247 | https://github.com/simonw/datasette/pull/441#issuecomment-487735247 | https://api.github.com/repos/simonw/datasette/issues/441 | MDEyOklzc3VlQ29tbWVudDQ4NzczNTI0Nw== | russss 45057 | 2019-04-29T20:39:43Z | 2019-04-29T20:39:43Z | CONTRIBUTOR | I updated the hook to pass the datasette object through now. You can see the working GeoJSON render function here - the hook function is here. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add register_output_renderer hook 438437973 | |
487748271 | https://github.com/simonw/datasette/pull/441#issuecomment-487748271 | https://api.github.com/repos/simonw/datasette/issues/441 | MDEyOklzc3VlQ29tbWVudDQ4Nzc0ODI3MQ== | russss 45057 | 2019-04-29T21:20:17Z | 2019-04-29T21:20:17Z | CONTRIBUTOR | Also I just pushed a change to add registered output renderers to the templates: |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add register_output_renderer hook 438437973 | |
487859345 | https://github.com/simonw/datasette/pull/439#issuecomment-487859345 | https://api.github.com/repos/simonw/datasette/issues/439 | MDEyOklzc3VlQ29tbWVudDQ4Nzg1OTM0NQ== | russss 45057 | 2019-04-30T08:21:19Z | 2019-04-30T08:21:19Z | CONTRIBUTOR | I think the best approach to this is to pass through the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[WIP] Add primary key to the extra_body_script hook arguments 438240541 | |
488247617 | https://github.com/simonw/datasette/pull/441#issuecomment-488247617 | https://api.github.com/repos/simonw/datasette/issues/441 | MDEyOklzc3VlQ29tbWVudDQ4ODI0NzYxNw== | russss 45057 | 2019-05-01T09:57:50Z | 2019-05-01T09:57:50Z | CONTRIBUTOR | Just for the record, this PR is now finished and ready to merge from my perspective. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add register_output_renderer hook 438437973 | |
488595724 | https://github.com/simonw/datasette/pull/432#issuecomment-488595724 | https://api.github.com/repos/simonw/datasette/issues/432 | MDEyOklzc3VlQ29tbWVudDQ4ODU5NTcyNA== | russss 45057 | 2019-05-02T08:50:53Z | 2019-05-02T08:50:53Z | CONTRIBUTOR |
I was thinking that it might be handy for datasette to have a request object which wraps the Sanic Request. This could include the datasette-specific querystring decoding. This would mean that we could expose the request object to plugin hooks without coupling them to Sanic. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Refactor facets to a class and new plugin, refs #427 432893491 | |
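The wrapper idea above might look something like this minimal sketch. All names here are invented for illustration; the real object would carry whatever datasette-specific decoding is needed:

```python
class DatasetteRequest:
    """Framework-agnostic request wrapper — a hypothetical sketch only."""

    def __init__(self, raw_request):
        self._raw = raw_request  # e.g. the underlying Sanic request

    @property
    def path(self):
        return self._raw.path

    def arg(self, key, default=None):
        # Datasette-specific querystring decoding would live here;
        # plugin hooks only ever see this wrapper, never Sanic directly.
        values = self._raw.args.get(key)
        return values[0] if values else default

# Usage with a stand-in for the underlying framework request:
class FakeRaw:
    path = "/db/table"
    args = {"_size": ["10"]}

req = DatasetteRequest(FakeRaw())
```

Swapping the web framework later would then only require a new adapter behind the same wrapper, leaving plugins untouched.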
489060765 | https://github.com/simonw/datasette/issues/419#issuecomment-489060765 | https://api.github.com/repos/simonw/datasette/issues/419 | MDEyOklzc3VlQ29tbWVudDQ4OTA2MDc2NQ== | russss 45057 | 2019-05-03T11:07:42Z | 2019-05-03T11:07:42Z | CONTRIBUTOR | Are you planning on removing inspect entirely? I didn't spot this work before I started on datasette-geo, but ironically I think it has a use case which really needs the inspect functionality (or some replacement). Datasette-geo uses it to store the bounding box of all the geographic features in the table. This is needed when rendering the map because it avoids having to send loads of tile requests for areas which are empty. Even with relatively small datasets, calculating the bounding box seems to take around 5 seconds, so I don't think it's really feasible to do this on page load. One possible fix would be to do this on startup, and then in a thread which watches the database for changes. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Default to opening files in mutable mode, special option for immutable files 421551434 | |
489105665 | https://github.com/simonw/datasette/pull/434#issuecomment-489105665 | https://api.github.com/repos/simonw/datasette/issues/434 | MDEyOklzc3VlQ29tbWVudDQ4OTEwNTY2NQ== | eyeseast 25778 | 2019-05-03T14:01:30Z | 2019-05-03T14:01:30Z | CONTRIBUTOR | This is exactly what I needed. Thank you. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette publish cloudrun" command to publish to Google Cloud Run 434321685 | |
489163939 | https://github.com/simonw/datasette/pull/434#issuecomment-489163939 | https://api.github.com/repos/simonw/datasette/issues/434 | MDEyOklzc3VlQ29tbWVudDQ4OTE2MzkzOQ== | rprimet 10352819 | 2019-05-03T16:49:45Z | 2019-05-03T16:50:03Z | CONTRIBUTOR |
Yes, I was able to reproduce this; I used to get prompted for a run region interactively. Not sure which course of action is best: making |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette publish cloudrun" command to publish to Google Cloud Run 434321685 | |
489221481 | https://github.com/simonw/datasette/issues/446#issuecomment-489221481 | https://api.github.com/repos/simonw/datasette/issues/446 | MDEyOklzc3VlQ29tbWVudDQ4OTIyMTQ4MQ== | russss 45057 | 2019-05-03T19:58:31Z | 2019-05-03T19:58:31Z | CONTRIBUTOR | In this particular case I don't think there's an issue making all those required. However, I suspect we might have to allow optional values at some point - my preferred solution to russss/datasette-geo#2 would need one. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Define mechanism for plugins to return structured data 440134714 | |
489222223 | https://github.com/simonw/datasette/issues/446#issuecomment-489222223 | https://api.github.com/repos/simonw/datasette/issues/446 | MDEyOklzc3VlQ29tbWVudDQ4OTIyMjIyMw== | russss 45057 | 2019-05-03T20:01:19Z | 2019-05-03T20:01:29Z | CONTRIBUTOR | Also I have a slight preference against (ab)using |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Define mechanism for plugins to return structured data 440134714 | |
489342728 | https://github.com/simonw/datasette/pull/450#issuecomment-489342728 | https://api.github.com/repos/simonw/datasette/issues/450 | MDEyOklzc3VlQ29tbWVudDQ4OTM0MjcyOA== | russss 45057 | 2019-05-04T16:37:35Z | 2019-05-04T16:37:35Z | CONTRIBUTOR | For a bit more context: this fixes a crash with |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Coalesce hidden table count to 0 440304714 | |
499320973 | https://github.com/simonw/datasette/issues/394#issuecomment-499320973 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDQ5OTMyMDk3Mw== | kevindkeogh 13896256 | 2019-06-06T02:07:59Z | 2019-06-06T02:07:59Z | CONTRIBUTOR | Hey was this ever merged? Trying to run this behind nginx, and encountering this issue. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
base_url configuration setting 396212021 | |
499923145 | https://github.com/simonw/datasette/issues/394#issuecomment-499923145 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDQ5OTkyMzE0NQ== | kevindkeogh 13896256 | 2019-06-07T15:10:57Z | 2019-06-07T15:11:07Z | CONTRIBUTOR | Putting this here in case anyone else encounters the same issue with nginx, I was able to resolve it by passing the header in the nginx proxy config (i.e., |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
base_url configuration setting 396212021 | |
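The exact header from the comment above is elided in this export; for context, proxying Datasette under a subdirectory with `base_url` typically uses an nginx stanza along these lines (a sketch with assumed paths and port, not the commenter's exact config):

```nginx
location /datasette/ {
    proxy_pass http://127.0.0.1:8001/datasette/;
    proxy_set_header Host $host;
}
```

The `base_url` setting on the Datasette side must match the subdirectory (`/datasette/` here) or generated links will break, which is the failure mode the thread describes.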
504662904 | https://github.com/simonw/datasette/issues/514#issuecomment-504662904 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwNDY2MjkwNA== | russss 45057 | 2019-06-22T12:45:21Z | 2019-06-22T12:45:39Z | CONTRIBUTOR | On most modern Linux distros, systemd is the easiest answer. Example systemd unit file (save to

```ini
[Service]
Type=simple
User=<username>
WorkingDirectory=/path/to/data
ExecStart=/path/to/datasette serve -h 0.0.0.0 ./my.db
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Activate it with:
Logs are best viewed using |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
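The activation commands in the comment above are elided in this export; assuming the unit file was saved as `/etc/systemd/system/datasette.service` (an assumption, not the commenter's text), the standard systemd sequence would be:

```
sudo systemctl daemon-reload
sudo systemctl enable --now datasette.service
sudo journalctl -u datasette -f   # follow the service logs
```

`journalctl -u <unit>` is also the likely subject of the truncated "Logs are best viewed using" sentence above.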
504663766 | https://github.com/simonw/datasette/issues/514#issuecomment-504663766 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwNDY2Mzc2Ng== | russss 45057 | 2019-06-22T12:57:59Z | 2019-06-22T12:57:59Z | CONTRIBUTOR |
I wasn't even aware it was possible to add a systemd service at an arbitrary path, but it seems a little messy to me. Maybe worth noting that systemd does support per-user services which don't require root access. Cool but probably overkill for most people (especially when you're going to need root to listen on port 80 anyway, directly or via a reverse proxy). |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
504684831 | https://github.com/simonw/datasette/issues/514#issuecomment-504684831 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwNDY4NDgzMQ== | russss 45057 | 2019-06-22T17:38:23Z | 2019-06-22T17:38:23Z | CONTRIBUTOR |
It's the working directory (cwd) of the spawned process. In this case if you set it to the directory your data is in, you can use relative paths to the db (and metadata/templates/etc) in the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
504690927 | https://github.com/simonw/datasette/issues/514#issuecomment-504690927 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwNDY5MDkyNw== | russss 45057 | 2019-06-22T19:06:07Z | 2019-06-22T19:06:07Z | CONTRIBUTOR | I'd rather not turn this into a systemd support thread, but you're trying to execute the package directory there. Your datasette executable is probably at |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
504809397 | https://github.com/simonw/datasette/issues/523#issuecomment-504809397 | https://api.github.com/repos/simonw/datasette/issues/523 | MDEyOklzc3VlQ29tbWVudDUwNDgwOTM5Nw== | rixx 2657547 | 2019-06-24T01:38:14Z | 2019-06-24T01:38:14Z | CONTRIBUTOR | Ah, apologies – I had found and read those issues, but I was under the impression that they referred only to the filtered row count, not the unfiltered total row count. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Show total/unfiltered row count when filtering 459627549 | |
509013413 | https://github.com/simonw/datasette/issues/507#issuecomment-509013413 | https://api.github.com/repos/simonw/datasette/issues/507 | MDEyOklzc3VlQ29tbWVudDUwOTAxMzQxMw== | psychemedia 82988 | 2019-07-07T16:31:57Z | 2019-07-07T16:31:57Z | CONTRIBUTOR | Chrome and Firefox both support headless screengrabs from the command line, but I don't know how parameterised they can be? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Every datasette plugin on the ecosystem page should have a screenshot 455852801 | |
509618339 | https://github.com/simonw/datasette/pull/554#issuecomment-509618339 | https://api.github.com/repos/simonw/datasette/issues/554 | MDEyOklzc3VlQ29tbWVudDUwOTYxODMzOQ== | abdusco 3243482 | 2019-07-09T12:16:32Z | 2019-07-09T12:16:32Z | CONTRIBUTOR | I've also added another fix for using static mounts with absolute paths on Windows. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Fix static mounts using relative paths and prevent traversal exploits 465728430 | |
509629331 | https://github.com/simonw/datasette/pull/554#issuecomment-509629331 | https://api.github.com/repos/simonw/datasette/issues/554 | MDEyOklzc3VlQ29tbWVudDUwOTYyOTMzMQ== | abdusco 3243482 | 2019-07-09T12:51:35Z | 2019-07-09T12:51:35Z | CONTRIBUTOR | I wanted to add a test for it too, but I've realized it's impossible to test a server process as we cannot get its exit code.

```python
# tests/test_cli.py
def test_static_mounts_on_windows():
    if sys.platform != "win32":
        return
    runner = CliRunner()
    result = runner.invoke(cli, ["serve", "--static", "s:C:\\"])
    assert result.exit_code == 0
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Fix static mounts using relative paths and prevent traversal exploits 465728430 | |
510730200 | https://github.com/simonw/datasette/issues/511#issuecomment-510730200 | https://api.github.com/repos/simonw/datasette/issues/511 | MDEyOklzc3VlQ29tbWVudDUxMDczMDIwMA== | abdusco 3243482 | 2019-07-12T03:23:22Z | 2019-07-12T03:23:22Z | CONTRIBUTOR | @simonw yes it works fine on Windows, but the test suite doesn't run properly; for that I had to use WSL |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Get Datasette tests passing on Windows in GitHub Actions 456578474 | |
527209840 | https://github.com/simonw/sqlite-utils/pull/56#issuecomment-527209840 | https://api.github.com/repos/simonw/sqlite-utils/issues/56 | MDEyOklzc3VlQ29tbWVudDUyNzIwOTg0MA== | amjith 49260 | 2019-09-02T17:23:21Z | 2019-09-02T17:23:21Z | CONTRIBUTOR | I have updated the other PR with the changes from this one and added tests. I have also changed the escaping from double quotes to brackets. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Escape the table name in populate_fts and search. 487847945 | |
527211047 | https://github.com/simonw/sqlite-utils/pull/57#issuecomment-527211047 | https://api.github.com/repos/simonw/sqlite-utils/issues/57 | MDEyOklzc3VlQ29tbWVudDUyNzIxMTA0Nw== | amjith 49260 | 2019-09-02T17:30:43Z | 2019-09-02T17:30:43Z | CONTRIBUTOR | I have merged the other PR (#56) into this one. I have incorporated your suggestions. Cheers! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add triggers while enabling FTS 487987958 | |
533818697 | https://github.com/simonw/sqlite-utils/issues/61#issuecomment-533818697 | https://api.github.com/repos/simonw/sqlite-utils/issues/61 | MDEyOklzc3VlQ29tbWVudDUzMzgxODY5Nw== | amjith 49260 | 2019-09-21T18:09:01Z | 2019-09-21T18:09:28Z | CONTRIBUTOR | @witeshadow The library version doesn't have helpers around CSV (at least not from what I can see in the code). But here's a snippet that makes it easy to insert from CSV using the library.

```python
import csv
from sqlite_utils import Database

# CSV Reader
csv_file = open("filename.csv")  # open the csv file
reader = csv.reader(csv_file)  # Create a CSV reader
headers = next(reader)  # First line is the header
docs = (dict(zip(headers, row)) for row in reader)
```

Now you can use the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
importing CSV to SQLite as library 491219910 | |
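As an aside on the snippet above: the stdlib's `csv.DictReader` performs the same header-zipping in one step. A self-contained sketch with made-up data:

```python
import csv
import io

# DictReader reads the first line as the header and yields one
# dict per row, exactly like the dict(zip(headers, row)) pattern above
csv_text = "id,name\n1,Cleo\n2,Pancakes\n"
docs = list(csv.DictReader(io.StringIO(csv_text)))
# docs == [{"id": "1", "name": "Cleo"}, {"id": "2", "name": "Pancakes"}]
```

Either form produces an iterable of dicts suitable for bulk insertion.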
541052329 | https://github.com/simonw/datasette/issues/585#issuecomment-541052329 | https://api.github.com/repos/simonw/datasette/issues/585 | MDEyOklzc3VlQ29tbWVudDU0MTA1MjMyOQ== | rixx 2657547 | 2019-10-11T12:53:51Z | 2019-10-11T12:53:51Z | CONTRIBUTOR | I think this would be good, yeah – currently, databases are explicitly sorted by name in the IndexView, we could just remove that part (and use an |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Databases on index page should display in order they were passed to "datasette serve"? 503217375 | |
541118904 | https://github.com/simonw/datasette/issues/507#issuecomment-541118904 | https://api.github.com/repos/simonw/datasette/issues/507 | MDEyOklzc3VlQ29tbWVudDU0MTExODkwNA== | rixx 2657547 | 2019-10-11T15:48:49Z | 2019-10-11T15:48:49Z | CONTRIBUTOR | Headless Chrome and Firefox via Selenium are a solid choice in my experience. You may be interested in how pretix and pretalx solve this problem: They use pytest to create those screenshots on release to make sure they are up to date. See this writeup and this repo. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Every datasette plugin on the ecosystem page should have a screenshot 455852801 | |
541119038 | https://github.com/simonw/datasette/issues/512#issuecomment-541119038 | https://api.github.com/repos/simonw/datasette/issues/512 | MDEyOklzc3VlQ29tbWVudDU0MTExOTAzOA== | rixx 2657547 | 2019-10-11T15:49:13Z | 2019-10-11T15:49:13Z | CONTRIBUTOR | How open are you to changing the config variable names (with appropriate deprecation, of course)? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"about" parameter in metadata does not appear when alone 457147936 | |
541562581 | https://github.com/simonw/datasette/pull/590#issuecomment-541562581 | https://api.github.com/repos/simonw/datasette/issues/590 | MDEyOklzc3VlQ29tbWVudDU0MTU2MjU4MQ== | rixx 2657547 | 2019-10-14T08:57:46Z | 2019-10-14T08:57:46Z | CONTRIBUTOR | Ah, thank you – I saw the need for unit tests but wasn't sure what the best way to add one would be. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spaces in DB names 505818256 | |
541587823 | https://github.com/simonw/datasette/pull/590#issuecomment-541587823 | https://api.github.com/repos/simonw/datasette/issues/590 | MDEyOklzc3VlQ29tbWVudDU0MTU4NzgyMw== | rixx 2657547 | 2019-10-14T09:58:23Z | 2019-10-14T09:58:23Z | CONTRIBUTOR | Added tests. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spaces in DB names 505818256 | |
544008463 | https://github.com/simonw/datasette/pull/601#issuecomment-544008463 | https://api.github.com/repos/simonw/datasette/issues/601 | MDEyOklzc3VlQ29tbWVudDU0NDAwODQ2Mw== | rixx 2657547 | 2019-10-18T23:39:21Z | 2019-10-18T23:39:21Z | CONTRIBUTOR | That looks right, and I completely agree with the intent. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Don't auto-format SQL on page load 509340359 | |
544008944 | https://github.com/simonw/datasette/pull/601#issuecomment-544008944 | https://api.github.com/repos/simonw/datasette/issues/601 | MDEyOklzc3VlQ29tbWVudDU0NDAwODk0NA== | rixx 2657547 | 2019-10-18T23:40:48Z | 2019-10-18T23:40:48Z | CONTRIBUTOR | The only negative impact that comes to mind is that now you have no way to get the read-only query to be formatted nicely, I think, so maybe a second PR adding the formatting functionality even to the read-only page would be good? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Don't auto-format SQL on page load 509340359 | |
544214418 | https://github.com/simonw/datasette/pull/601#issuecomment-544214418 | https://api.github.com/repos/simonw/datasette/issues/601 | MDEyOklzc3VlQ29tbWVudDU0NDIxNDQxOA== | rixx 2657547 | 2019-10-20T02:29:49Z | 2019-10-20T02:29:49Z | CONTRIBUTOR | Submitted in #602! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Don't auto-format SQL on page load 509340359 | |
549246007 | https://github.com/simonw/datasette/pull/602#issuecomment-549246007 | https://api.github.com/repos/simonw/datasette/issues/602 | MDEyOklzc3VlQ29tbWVudDU0OTI0NjAwNw== | rixx 2657547 | 2019-11-04T07:29:33Z | 2019-11-04T07:29:33Z | CONTRIBUTOR | Not sure – I'm always a bit weirded out when elements that I clicked disappear on me. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Offer to format readonly SQL 509535510 | |
552134876 | https://github.com/dogsheep/twitter-to-sqlite/issues/29#issuecomment-552134876 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29 | MDEyOklzc3VlQ29tbWVudDU1MjEzNDg3Ng== | jacobian 21148 | 2019-11-09T20:33:38Z | 2019-11-09T20:33:38Z | CONTRIBUTOR | ❤️ thanks! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
`import` command fails on empty files 518725064 | |
556749086 | https://github.com/simonw/datasette/issues/394#issuecomment-556749086 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDU1Njc0OTA4Ng== | jsfenfen 639012 | 2019-11-21T01:15:34Z | 2019-11-21T01:21:45Z | CONTRIBUTOR | Hey @simonw, is the url_prefix config option available in another branch? It looks like you've written some tests for it above. In 0.32 I get "url_prefix is not a valid option". I think this would be really helpful! This would be really handy for proxying datasette in another domain's subdirectory. I believe this will allow folks to run upstream authentication, but the links break if the url_prefix doesn't match. I'd prefer not to host a proxied version of datasette on a subdomain (e.g. datasette.myurl.com, b/c then I gotta worry about sharing authorization cookies with the subdomain, which I'd just as soon not do, but...) Edit: I see the wip-url-prefix branch, I may try with that https://github.com/simonw/datasette/commit/8da2db4b71096b19e7a9ef1929369b8483d448bf |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
base_url configuration setting 396212021 |
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
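The schema above can be exercised directly with Python's stdlib `sqlite3`; this sketch creates the table, inserts two made-up rows, and runs the filter/sort this page applies (`author_association = "CONTRIBUTOR"`, ordered by reactions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE [users] ([id] INTEGER PRIMARY KEY);
CREATE TABLE [issues] ([id] INTEGER PRIMARY KEY);
CREATE TABLE [issue_comments] (
   [html_url] TEXT, [issue_url] TEXT, [id] INTEGER PRIMARY KEY,
   [node_id] TEXT, [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT, [updated_at] TEXT, [author_association] TEXT,
   [body] TEXT, [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT);
""")
conn.executemany(
    "INSERT INTO issue_comments (id, author_association, reactions) VALUES (?, ?, ?)",
    [(1, "CONTRIBUTOR", '{"total_count": 2}'),
     (2, "OWNER", '{"total_count": 0}')],
)
rows = conn.execute(
    "SELECT id FROM issue_comments "
    "WHERE author_association = 'CONTRIBUTOR' ORDER BY reactions DESC"
).fetchall()
# Only the CONTRIBUTOR row matches the page's filter
```

Note that `reactions` is stored as a JSON string, so the ordering here is lexical; sorting numerically by reaction count would require SQLite's `json_extract`.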