issue_comments
10,495 rows sorted by author_association
Suggested facets: reactions, performed_via_github_app, created_at (date)
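A listing like this can also be retrieved programmatically: Datasette exposes every table as JSON by appending `.json` to the table URL, with `?_sort=` and `?_facet=` parameters matching the view shown here. A minimal sketch of building that request URL — the instance base URL below is a placeholder assumption, not the real host of this data:

```python
from urllib.parse import urlencode

# Hypothetical Datasette instance hosting the issue_comments table.
BASE = "https://example-datasette.example.com/github"

params = {
    "_sort": "author_association",  # the sort order shown in this listing
    "_facet": "issue",              # request per-issue facet counts
    "_size": "100",                 # rows per page
}
url = f"{BASE}/issue_comments.json?{urlencode(params)}"
print(url)
```

The JSON response includes a `rows` key with the page of comments and, because of `_facet=issue`, a facet block of issue values with their comment counts — the same numbers that appear after each title below.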
issue (facet: >1000 distinct values; top issue titles by comment count)
- Redesign default .json format 55
- Show column metadata plus links for foreign keys on arbitrary query results 51
- ?_extra= support (draft) 49
- Rethink how .ext formats (vs. ?_format=) works before 1.0 48
- Upgrade to CodeMirror 6, add SQL autocomplete 48
- JavaScript plugin hooks mechanism similar to pluggy 47
- Updated Dockerfile with SpatiaLite version 5.0 45
- Complete refactor of TableView and table.html template 45
- Port Datasette to ASGI 42
- Authentication (and permissions) as a core concept 40
- invoke_startup() is not run in some conditions, e.g. gunicorn/uvicorn workers, breaking lots of things 36
- Deploy a live instance of demos/apache-proxy 34
- await datasette.client.get(path) mechanism for executing internal requests 33
- Maintain an in-memory SQLite table of connected databases and their tables 32
- Research: demonstrate if parallel SQL queries are worthwhile 32
- Ability to sort (and paginate) by column 31
- Server hang on parallel execution of queries to named in-memory databases 31
- Default API token authentication mechanism 30
- Port as many tests as possible to async def tests against ds_client 29
- link_or_copy_directory() error - Invalid cross-device link 28
- Add ?_extra= mechanism for requesting extra properties in JSON 27
- Export to CSV 27
- base_url configuration setting 27
- Documentation with recommendations on running Datasette in production without using Docker 27
- Optimize all those calls to index_list and foreign_key_list 27
- Support cross-database joins 26
- Ability for a canned query to write to the database 26
- table.transform() method for advanced alter table 26
- New pattern for views that return either JSON or HTML, available for plugins 26
- Proof of concept for Datasette on AWS Lambda with EFS 25
- WIP: Add Gmail takeout mbox import 25
- Make it easier to insert geometries, with documentation and maybe code 25
- DeprecationWarning: pkg_resources is deprecated as an API 25
- Redesign register_output_renderer callback 24
- API explorer tool 24
- De-tangling Metadata before Datasette 1.0 24
- Stream all results for arbitrary SQL and canned queries 23
- "datasette insert" command and plugin hook 23
- Option for importing CSV data using the SQLite .import mechanism 23
- If a row has a primary key of `null` various things break 23
- Datasette Plugins 22
- .json and .csv exports fail to apply base_url 22
- Use YAML examples in documentation by default, not JSON 22
- Idea: import CSV to memory, run SQL, export in a single command 22
- Plugin hook for dynamic metadata 22
- base_url is omitted in JSON and CSV views 22
- API to insert a single record into an existing table 22
- UI to create reduced scope tokens from the `/-/create-token` page 22
- Handle spatialite geometry columns better 21
- table.extract(...) method and "sqlite-utils extract" command 21
- Database page loads too slowly with many large tables (due to table counts) 21
- ?sort=colname~numeric to sort by column cast to real 21
- Mechanism for storing metadata in _metadata tables 21
- Switch documentation theme to Furo 21
- "flash messages" mechanism 20
- Move CI to GitHub Issues 20
- load_template hook doesn't work for include/extends 20
- Introduce concept of a database `route`, separate from its name 20
- CSV files with too many values in a row cause errors 20
- register_permissions(datasette) plugin hook 20
- feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 20
- API tokens with view-table but not view-database/view-instance cannot access the table 20
- Better way of representing binary data in .csv output 19
- Introspect if table is FTS4 or FTS5 19
- A proper favicon 19
- `datasette create-token` ability to create tokens with a reduced set of permissions 19
- Package as standalone binary 18
- Ability to ship alpha and beta releases 18
- Magic parameters for canned queries 18
- Support column descriptions in metadata.json 18
- datasette.client internal requests mechanism 18
- Extract columns cannot create foreign key relation: sqlite3.OperationalError: table sqlite_master may not be modified 18
- Figure out why SpatiaLite 5.0 hangs the database page on Linux 18
- Publish to Docker Hub failing with "libcrypt.so.1: cannot open shared object file" 18
- Update screenshots in documentation to match latest designs 18
- Consider using CSP to protect against future XSS 17
- Facets 16
- ?_col= and ?_nocol= support for toggling columns on table view 16
- Support "allow" block on root, databases and tables, not just queries 16
- Action menu for table columns 16
- Make it possible to download BLOB data from the Datasette UI 16
- sqlite-utils extract could handle nested objects 16
- `--batch-size 1` doesn't seem to commit for every item 16
- create-index should run analyze after creating index 16
- Add new spatialite helper methods 16
- Intermittent "Too many open files" error running tests 16
- Update a single record in an existing table 16
- Autocomplete text entry for filter values that correspond to facets 16
- Resolve the difference between `wrap_view()` and `BaseView` 16
- Bug: Sort by column with NULL in next_page URL 15
- Mechanism for customizing the SQL used to select specific columns in the table view 15
- The ".upsert()" method is misnamed 15
- Prototype for Datasette on PostgreSQL 15
- --dirs option for scanning directories for SQLite databases 15
- Document (and reconsider design of) Database.execute() and Database.execute_against_connection_in_thread() 15
- latest.datasette.io is no longer updating 15
- "sqlite-utils convert" command to replace the separate "sqlite-transform" tool 15
- --lines and --text and --convert and --import 15
- Documentation should clarify /stable/ vs /latest/ 15
- Tests reliably failing on Python 3.7 15
- Transformation type `--type DATETIME` 15
- Ability to customize presentation of specific columns in HTML view 14
- Allow plugins to define additional URL routes and views 14
- Mechanism for turning nested JSON into foreign keys / many-to-many 14
- "Invalid SQL" page should let you edit the SQL 14
- .execute_write() and .execute_write_fn() methods on Database 14
- Upload all my photos to a secure S3 bucket 14
- Canned query permissions mechanism 14
- Incorrect URLs when served behind a proxy with base_url set 14
- "datasette -p 0 --root" gives the wrong URL 14
- Plugin hook for loading templates 14
- Support STRICT tables 14
- Advanced class-based `conversions=` mechanism 14
- "permissions" property in metadata for configuring arbitrary permissions 14
- Design plugin hook for extras 14
- `handle_exception` plugin hook for custom error handling 14
- Refactor out the keyset pagination code 14
- Potential feature: special support for `?a=1&a=2` on the query page 14
- Dockerfile should build more recent SQLite with FTS5 and spatialite support 13
- Fix all the places that currently use .inspect() data 13
- Plugin hook: filters_from_request 13
- Get Datasette tests passing on Windows in GitHub Actions 13
- If you apply ?_facet_array=tags then &_facet=tags does nothing 13
- Mechanism for adding arbitrary pages like /about 13
- Import machine-learning detected labels (dog, llama etc) from Apple Photos 13
- Mechanism for skipping CSRF checks on API posts 13
- table.transform() method 13
- register_output_renderer() should support streaming data 13
- Policy on documenting "public" datasette.utils functions 13
- Async support 13
- Serve using UNIX domain socket 13
- `register_commands()` plugin hook to register extra CLI commands 13
- Fix compatibility with Python 3.10 13
- Optional Pandas integration 13
- Refactor TableView to use asyncinject 13
- Writable canned queries fail with useless non-error against immutable databases 13
- google cloudrun updated their limits on maxscale based on memory and cpu count 13
- Mechanism for ensuring a table has all the columns 13
- docker image is duplicating db files somehow 13
- Write API in Datasette core 13
- Errors when using table filters behind a proxy 13
- Call for birthday presents: if you're using Datasette, let us know how you're using it here 13
- Make sure CORS works for write APIs 13
- Make detailed notes on how table, query and row views work right now 13
- .transform() instead of modifying sqlite_master for add_foreign_keys 13
- Add “updated” to metadata 12
- Metadata should be a nested arbitrary KV store 12
- Mechanism for ranking results from SQLite full-text search 12
- Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1 12
- Package datasette for installation using homebrew 12
- Datasette Library 12
- _facet_array should work against views 12
- Full text search of all tables at once? 12
- Port Datasette from Sanic to ASGI + Uvicorn 12
- Populate "endpoint" key in ASGI scope 12
- --cp option for datasette publish and datasette package for shipping additional files and directories 12
- base_url doesn't entirely work for running Datasette inside Binder 12
- Having view-table permission but NOT view-database should still grant access to /db/table 12
- Efficiently calculate list of databases/tables a user can view 12
- Ability for plugins to collaborate when adding extra HTML to blocks in default templates 12
- Support creating descending order indexes 12
- Rethink approach to [ and ] in column names (currently throws error) 12
- Research: CTEs and union all to calculate facets AND query at the same time 12
- Traces should include SQL executed by subtasks created with `asyncio.gather` 12
- Ensure "pip install datasette" still works with Python 3.6 12
- Tilde encoding: use ~ instead of - for dash-encoding 12
- Document how to use a `--convert` function that runs initialization code first 12
- Code examples in the documentation should be formatted with Black 12
- Implement ?_extra and new API design for TableView 12
- Misleading progress bar against utf-16-le CSV input 12
- sqlite-utils query --functions mechanism for registering extra functions 12
- SQL query field can't begin by a comment 12
- API for bulk inserting records into a table 12
- `/db/-/create` API for creating tables 12
- WIP new JSON for queries 12
- Implement command-line tool interface 11
- Option to expose expanded foreign keys in JSON/CSV 11
- datasette publish lambda plugin 11
- Mechanism for checking if a SQLite database file is safe to open 11
- Expand plugins documentation to multiple pages 11
- Mechanism for plugins to add action menu items for various things 11
- --since feature can be confused by retweets 11
- bpylist.archiver.CircularReference: archive has a cycle with uid(13) 11
- Datasette secret mechanism - initially for signed cookies 11
- Writable canned queries live demo on Glitch 11
- base_url doesn't seem to work when adding criteria and clicking "apply" 11
- POST to /db/canned-query that returns JSON should be supported (for API clients) 11
- datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates 11
- Writable canned queries with magic parameters fail if POST body is empty 11
- Database class mechanism for cross-connection in-memory databases 11
- Race condition errors in new refresh_schemas() mechanism 11
- "Query parameters" form shows wrong input fields if query contains "03:31" style times 11
- render_cell() hook should support returning an awaitable 11
- sqlite-utils index-foreign-keys fails due to pre-existing index 11
- `sqlite-utils insert --convert` option 11
- Research how much of a difference analyze / sqlite_stat1 makes 11
- Table+query JSON and CSV links broken when using `base_url` setting 11
- Options for how `r.parsedate()` should handle invalid dates 11
- Research: how much overhead does the n=1 time limit have? 11
- Expose convert recipes to `sqlite-utils --functions` 11
- `prepare_jinja2_environment()` hook should take `datasette` argument 11
- Ensure insert API has good tests for rowid and compound primary key tables 11
- datasette package --spatialite throws error during build 11
- datasette --root running in Docker doesn't reliably show the magic URL 11
- New JSON design for query views 11
- Set up some example datasets on a Cloudflare-backed domain 10
- Filter UI on table page 10
- Support for units 10
- Build Dockerfile with recent Sqlite + Spatialite 10
- Table view should support filtering via many-to-many relationships 10
- Default to opening files in mutable mode, special option for immutable files 10
- New design for facet abstraction, including querystring and metadata.json 10
- Improvements to table label detection 10
- Syntactic sugar for creating m2m records 10
- Option to display binary data 10
- extracts= should support multiple-column extracts 10
- Documented internals API for use in plugins 10
- Mechanism for writing to database via a queue 10
- See if I can get Datasette working on Zeit Now v2 10
- Ability to serve thumbnailed Apple Photo from its place on disk 10
- Release Datasette 0.44 10
- Rename master branch to main 10
- Plugin hook for instance/database/table metadata 10
- Refactor default views to use register_routes 10
- CLI utility for inserting binary files into SQLite 10
- FTS table with 7 rows has _fts_docsize table with 9,141 rows 10
- Switch to .blob render extension for BLOB downloads 10
- Navigation menu plus plugin hook 10
- Adopt Prettier for JavaScript code formatting 10
- --no-headers option for CSV and TSV 10
- Add support for Jinja2 version 3.0 10
- Test Datasette Docker images built for different architectures 10
- Research: syntactic sugar for using --get with SQL queries, maybe "datasette query" 10
- `default_allow_sql` setting (a re-imagining of the old `allow_sql` setting) 10
- Add reference page to documentation using Sphinx autodoc 10
- [Enhancement] Please allow 'insert-files' to insert content as text. 10
- Docker configuration for exercising Datasette behind Apache mod_proxy 10
- Python library methods for calling ANALYZE 10
- Scripted exports 10
- Remove Hashed URL mode 10
- CLI eats my cursor 10
- If user can see table but NOT database/instance nav links should not display 10
- test_recreate failing on Windows Python 3.11 10
- `.json` errors should be returned as JSON 10
- Failing test: httpx.InvalidURL: URL too long 10
- Use sqlean if available in environment 10
- Config file with support for defining canned queries 9
- Datasette serve should accept paths/URLs to CSVs and other file formats 9
- Figure out some interesting example SQL queries 9
- bump uvicorn to 0.9.0 to be Python-3.8 friendly 9
- Handle really wide tables better 9
- Refactor TableView.data() method 9
- Set up a live demo Datasette instance 9
- Move hashed URL mode out to a plugin 9
- ?_searchmode=raw option for running FTS searches without escaping characters 9
- Option to automatically configure based on directory layout 9
- Replace "datasette publish --extra-options" with "--setting" 9
- New WIP writable canned queries 9
- Example permissions plugin 9
- Research feasibility of 100% test coverage 9
- canned_queries() plugin hook 9
- Consider dropping explicit CSRF protection entirely? 9
- Add insert --truncate option 9
- .delete_where() does not auto-commit (unlike .insert() or .upsert()) 9
- Improve performance of extract operations 9
- Figure out how to run an environment that exercises the base_url proxy setting 9
- sqlite-utils search command 9
- GENERATED column support 9
- Datasette on Amazon Linux on ARM returns 404 for static assets 9
- "Stream all rows" is not at all obvious 9
- Better internal database_name for _internal database 9
- Improve the display of facets information 9
- Mechanism for minifying JavaScript that ships with Datasette 9
- Mechanism for executing JavaScript unit tests 9
- Use _counts to speed up counts 9
- Use force_https_urls on when deploying with Cloud Run 9
- Ability to increase size of the SQL editor window 9
- Custom pages don't work with base_url setting 9
- CSV ?_stream=on redundantly calculates facets for every page 9
- "invalid reference format" publishing Docker image 9
- Show count of facet values if ?_facet_size=max 9
- Manage /robots.txt in Datasette core, block robots by default 9
- Exceeding Cloud Run memory limits when deploying a 4.8G database 9
- Test against pysqlite3 running SQLite 3.37 9
- Allow to set `facets_array` in metadata (like current `facets`) 9
- Add SpatiaLite helpers to CLI 9
- Get Datasette compatible with Pyodide 9
- Add --ignore option to more commands 9
- Ability to set a custom favicon 9
- i18n support 9
- Add new entrypoint option to `--load-extension` 9
- Ability to load JSON records held in a file with a single top level key that is a list of objects 9
- Tool for simulating permission checks against actors 9
- Serve schema JSON to the SQL editor to enable autocomplete 9
- Release Datasette 1.0a0 9
- Refactor test suite to use mostly `async def` tests 9
- 500 "attempt to write a readonly database" error caused by "PRAGMA schema_version" 9
- `table.upsert_all` fails to write rows when `not_null` is present 9
- Plugin system 9
- Get `add_foreign_keys()` to work without modifying `sqlite_master` 9
- [feature request]`datasette install plugins.json` options 9
- Bump sphinx, furo, blacken-docs dependencies 9
- `datasette publish` needs support for the new config/metadata split 9
- Make URLs immutable 8
- datasette publish heroku 8
- Ability to bundle and serve additional static files 8
- Add GraphQL endpoint 8
- prepare_context() plugin hook 8
- Wildcard support in query parameters 8
- URL hashing now optional: turn on with --config hash_urls:1 (#418) 8
- "datasette publish cloudrun" command to publish to Google Cloud Run 8
- Add register_output_renderer hook 8
- sqlite-utils create-table command 8
- Enforce import sort order with isort 8
- Add a universal navigation bar which can be modified by plugins 8
- Command to fetch stargazers for one or more repos 8
- allow leading comments in SQL input field 8
- Helper methods for working with SpatiaLite 8
- datasette publish cloudrun --memory option 8
- Commits in GitHub API can have null author 8
- extra_template_vars() sending wrong view_name for index 8
- Import photo metadata from Apple Photos into SQLite 8
- Visually distinguish integer and text columns 8
- sqlite3.OperationalError: too many SQL variables in insert_all when using rows with varying numbers of columns 8
- Allow-list pragma_table_info(tablename) and similar 8
- Rename project to dogsheep-photos 8
- Consolidate request.raw_args and request.args 8
- Group permission checks by request on /-/permissions debug page 8
- Upgrade CodeMirror 8
- Mechanism for defining custom display of results 8
- the JSON object must be str, bytes or bytearray, not 'Undefined' 8
- Wide tables should scroll horizontally within the page 8
- OPTIONS requests return a 500 error 8
- Bring date parsing into Datasette core 8
- Establish pattern for release branches to support bug fixes 8
- GitHub Actions workflow to build and sign macOS binary executables 8
- Make original path available to render hooks 8
- --sniff option for sniffing delimiters 8
- --crossdb option for joining across databases 8
- sqlite-utils memory command for directly querying CSV/JSON data 8
- absolute_url() behind a proxy assembles incorrect http://127.0.0.1:8001/ URLs 8
- Tests failing with FileNotFoundError in runner.isolated_filesystem 8
- Rename Datasette.__init__(config=) parameter to settings= 8
- Allow passing a file of code to "sqlite-utils convert" 8
- Documented JavaScript variables on different templates made available for plugins 8
- Support for generated columns 8
- Get rid of the no-longer necessary ?_format=json hack for tables called x.json 8
- query result page is using 400mb of browser memory 40x size of html page and 400x size of csv data 8
- Refactor and simplify Datasette routing and views 8
- Filters fail to work correctly against calculated numeric columns returned by SQL views because type affinity rules do not apply 8
- "Error: near "(": syntax error" when using sqlite-utils indexes CLI 8
- Table/database that is private due to inherited permissions does not show padlock 8
- /db/table/-/upsert API 8
- Some plugins show "home" breadcrumbs twice in the top left 8
- /db/table/-/upsert 8
- array facet: don't materialize unnecessary columns 8
- Datasette should serve Access-Control-Max-Age 8
- Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File 8
- Cascade for restricted token view-table/view-database/view-instance operations 8
- Deploy failing with "plugins/alternative_route.py: Not a directory" 8
- ?_group_count=country - return counts by specific column(s) 7
- Ship a Docker image of the whole thing 7
- add "format sql" button to query page, uses sql-formatter 7
- Windows installation error 7
- Keyset pagination doesn't work correctly for compound primary keys 7
- Validate metadata.json on startup 7
- inspect should record column types 7
- Travis should push tagged images to Docker Hub for each release 7
- Improve and document foreign_keys=... argument to insert/create/etc 7
- ?_where=sql-fragment parameter for table views 7
- Define mechanism for plugins to return structured data 7
- Rename metadata.json to config.json 7
- Utility mechanism for plugins to render templates 7
- Syntax for ?_through= that works as a form field 7
- Problem with square bracket in CSV column name 7
- Update SQLite bundled with Docker container 7
- index.html is not reliably loaded from a plugin 7
- .columns_dict doesn't work for all possible column types 7
- Only set .last_rowid and .last_pk for single update/inserts, not for .insert_all()/.upsert_all() with multiple records 7
- Expose scores from ZCOMPUTEDASSETATTRIBUTES 7
- publish heroku does not work on Windows 10 7
- Demo is failing to deploy 7
- Support reverse pagination (previous page, has-previous-items) 7
- Docker container is no longer being pushed (it's stuck on 0.45) 7
- insert_all(..., alter=True) should work for new columns introduced after the first 100 records 7
- Push to Docker Hub failed - but it shouldn't run for alpha releases anyway 7
- Simplify imports of common classes 7
- SQLITE_MAX_VARS maybe hard-coded too low 7
- Commands for making authenticated API calls 7
- Pagination 7
- Support the dbstat table 7
- Much, much faster extract() implementation 7
- Documented HTML hooks for JavaScript plugin authors 7
- Redesign application homepage 7
- "Edit SQL" button on canned queries 7
- Fix last remaining links to "/" that do not respect base_url 7
- .extract() shouldn't extract null values 7
- export.xml file name varies with different language settings 7
- Documentation and unit tests for urls.row() urls.row_blob() methods 7
- "View all" option for facets, to provide a (paginated) list of ALL of the facet counts plus a link to view them 7
- changes to allow for compound foreign keys 7
- Update for Big Sur 7
- table.pks_and_rows_where() method returning primary keys along with the rows 7
- Invalid SQL: "no such table: pragma_database_list" on database page 7
- Latest Datasette tags missing from Docker Hub 7
- "More" link for facets that shows _facet_size=max results 7
- ?_nocol= does not interact well with default facets 7
- sqlite-utils memory should handle TSV and JSON in addition to CSV 7
- Introspection property for telling if a table is a rowid table 7
- Add Gmail takeout mbox import (v2) 7
- Query page .csv and .json links are not correctly URL-encoded on Vercel under unknown specific conditions 7
- Win32 "used by another process" error with datasette publish 7
- New pattern for async view classes 7
- Extra options to `lookup()` which get passed to `insert()` 7
- Columns starting with an underscore behave poorly in filters 7
- Test failure in test_rebuild_fts 7
- Support for CHECK constraints 7
- `.execute_write(... block=True)` should be the default behaviour 7
- Maybe let plugins define custom serve options? 7
- Link to stable docs from older versions 7
- Add SpatiaLite helpers to CLI 7
- Use dash encoding for table names and row primary keys in URLs 7
- I forgot to include the changelog in the 3.25.1 release 7
- Remove hashed URL mode 7
- Extract out `check_permissions()` from `BaseView` 7
- `--nolock` feature for opening locked databases 7
- truncate_cells_html does not work for links? 7
- progressbar for inserts/upserts of all fileformats, closes #485 7
- Ability to merge databases and tables 7
- Expose `sql` and `params` arguments to various plugin hooks 7
- Upgrade Datasette Docker to Python 3.11 7
- don't use immutable=1, only mode=ro 7
- Figure out design for JSON errors (consider RFC 7807) 7
- Hacker News Datasette write demo 7
- First working version 7
- table.create(..., replace=True) 7
- CLI equivalents to `transform(add_foreign_keys=)` 7
- Detailed upgrade instructions for metadata.yaml -> datasette.yaml 7
- Addressable pages for every row in a table 6
- Default HTML/CSS needs to look reasonable and be responsive 6
- Support Django-style filters in querystring arguments 6
- Detect foreign keys and use them to link HTML pages together 6
- [WIP] Add publish to heroku support 6
- Nasty bug: last column not being correctly displayed 6
- Figure out how to bundle a more up-to-date SQLite 6
- Don't duplicate simple primary keys in the link column 6
- Load plugins from a `--plugins-dir=plugins/` directory 6
- Ability for plugins to define extra JavaScript and CSS 6
- inspect() should detect many-to-many relationships 6
- Deploy demo of Datasette on every commit that passes tests 6
- Plugin hook for loading metadata.json 6
- Faceted browse against a JSON list of tags 6
- CSV export in "Advanced export" pane doesn't respect query 6
- Additional Column Constraints? 6
- Easier way of creating custom row templates 6
- Experiment with type hints 6
- Command for running a search and saving tweets for that search 6
- Ways to improve fuzzy search speed on larger data sets? 6
- Improve UI of "datasette publish cloudrun" to reduce chances of accidentally over-writing a service 6
- Mechanism for indicating foreign key relationships in the table and query page URLs 6
- updating metadata.json without recreating the app 6
- Provide a cookiecutter template for creating new plugins 6
- upsert_all() throws issue when upserting to empty table 6
- "Templates considered" comment broken in >=0.35 6
- Documentation for the "request" object 6
- Support YAML in metadata - metadata.yaml 6
- Command for retrieving dependents for a repo 6
- Question: Access to immutable database-path 6
- Support decimal.Decimal type 6
- allow_by_query setting for configuring permissions with a SQL statement 6
- python tests/fixtures.py command has a bug 6
- Mechanism for specifying allow_sql permission in metadata.json 6
- Way to enable a default=False permission for anonymous users 6
- Ability to set ds_actor cookie such that it expires 6
- startup() plugin hook 6
- "Too many open files" error running tests 6
- datasette.add_message() doesn't work inside plugins 6
- Feature: pull request reviews and comments 6
- Datasette sdist is missing templates (hence broken when installing from Homebrew) 6
- End-user documentation 6
- extra_ plugin hooks should take the same arguments 6
- Private/secret databases: database files that are only visible to plugins 6
- Mechanism for differentiating between "by me" and "liked by me" 6
- Progress bar for sqlite-utils insert 6
- Rendering glitch with column headings on mobile 6
- Change "--config foo:bar" to "--setting foo bar" 6
- Add Link: pagination HTTP headers 6
- Figure out how to display images from <en-media> tags inline in Datasette 6
- Method for datasette.client() to forward on authentication 6
- Fallback to databases in inspect-data.json when no -i options are passed 6
- Better display of binary data on arbitrary query results page 6
- Table actions menu on view pages, not on query pages 6
- load_template() plugin hook 6
- PrefixedUrlString mechanism broke everything 6
- Support order by relevance against FTS4 6
- Support linking to compound foreign keys 6
- Support for generated columns 6
- sqlite-utils analyze-tables command and table.analyze_column() method 6
- More flexible CORS support in core, to encourage good security practices 6
- Share button for copying current URL 6
- Update Docker Spatialite version to 5.0.1 + add support for Spatialite topology functions 6
- `sqlite-utils indexes` command 6
- Error: Use either --since or --since_id, not both 6
- `db.query()` method (renamed `db.execute_returning_dicts()`) 6
- "searchmode": "raw" in table metadata 6
- `table.search(..., quote=True)` parameter and `sqlite-utils search --quote` option 6
- sqlite-utils insert errors should show SQL and parameters, if possible 6
- Mechanism to cause specific branches to deploy their own demos 6
- clean checkout & clean environment has test failures 6
- ReadTheDocs build failed for 0.59.2 release 6
- Command for creating an empty database 6
- Idea: hover to reveal details of linked row 6
- Writable canned queries fail to load custom templates 6
- filters_from_request plugin hook, now used in TableView 6
- Release Datasette 0.60 6
- introduce new option for datasette package to use a slim base image 6
- Drop support for Python 3.6 6
- Support mutating row in `--convert` without returning it 6
- Reconsider policy on blocking queries containing the string "pragma" 6
- datasette one.db one.db opens database twice, as one and one_2 6
- `deterministic=True` fails on versions of SQLite prior to 3.8.3 6
- Ship Datasette 0.61 6
- Proposal: datasette query 6
- .db downloads should be served with an ETag 6
- db[table].create(..., transform=True) and create-table --transform 6
- Upgrade `--load-extension` to accept entrypoints like Datasette 6
- Ability to set a custom facet_size per table 6
- Exclude virtual tables from datasette inspect 6
- Interactive demo of Datasette 1.0 write APIs 6
- register_permissions() plugin hook 6
- `datasette.create_token(...)` method for creating signed API tokens 6
- `publish cloudrun` reuses image tags, which can lead to very surprising deploy problems 6
- Folder support 6
- GitHub Action to lint Python code with ruff 6
- Try out Trogon for a tui interface 6
- Make as many examples in the CLI docs as possible copy-and-pastable 6
- Table renaming: db.rename_table() and sqlite-utils rename-table 6
- Plugin hook for database queries that are run 6
- DATASETTE_LOAD_PLUGINS environment variable for loading specific plugins 6
- Consider a request/response wrapping hook slightly higher level than asgi_wrapper() 6
- `table.transform()` should preserve `rowid` values 6
- Plugin hook: `actors_from_ids()` 6
- "Test DATASETTE_LOAD_PLUGINS" test shows errors but did not fail the CI run 6
- Experiment with patterns for concurrent long running queries 5
- Create neat example database 5
- Redesign JSON output, ditch jsono, offer variants controlled by parameter instead 5
- Option to open readonly but not immutable 5
- datasette publish can fail if /tmp is on a different device 5
- Refactor views 5
- Add links to example Datasette instances to appropriate places in docs 5
- Ability to enable/disable specific features via --config 5
- Custom URL routing with independent tests 5
- datasette inspect takes a very long time on large dbs 5
- Get Datasette working with Zeit Now v2's 100MB image size limit 5
- Hashed URLs should be optional 5
- Plugin for allowing CORS from specified hosts 5
- Design changes to homepage to support mutable files 5
- Option to facet by date using month or year 5
- extra_template_vars plugin hook 5
- Ability to list views, and to access db["view_name"].rows / rows_where / etc 5
- Rethink progress bars for various commands 5
- [enhancement] Method to delete a row in python 5
- Testing utilities should be available to plugins 5
- If you have databases called foo.db and foo-bar.db you cannot visit /foo-bar 5
- Don't auto-format SQL on page load 5
- stargazers command, refs #4 5
- Add this view for seeing new releases 5
- Escape_fts5_query-hookimplementation does not work with queries to standard tables 5
- on_create mechanism for after table creation 5
- Datasette.render_template() method 5
- Rethink how sanity checks work 5
- Release automation: automate the bit that posts the GitHub release 5
- table.disable_fts() method and "sqlite-utils disable-fts ..." command 5
- twitter-to-sqlite user-timeline [screen_names] --sql / --attach 5
- Option in metadata.json to set default sort order for a table 5
- Feature: record history of follower counts 5
- Custom CSS class on body for styling canned queries 5
- Repos have a big blob of JSON in the organization column 5
- Annotate photos using the Google Cloud Vision API 5
- Create a public demo 5
- Unit test that checks that all plugin hooks have corresponding unit tests 5
- Ability to sign in to Datasette as a root account 5
- CSRF protection 5
- Consider using enable_callback_tracebacks(True) 5
- Fix the demo - it breaks because of the tags table change 5
- Mechanism for passing additional options to `datasette my.db` that affect plugins 5
- sqlite-utils insert: options for column types 5
- Features for enabling and disabling WAL mode 5
- Add homebrew installation to documentation 5
- 'datasette --get' option, refs #926 5
- Path parameters for custom pages 5
- Try out CodeMirror SQL hints 5
- Handle case where subsequent records (after first batch) include extra columns 5
- Better handling of encodings other than utf-8 for "sqlite-utils insert" 5
- For 1.0 update trove classifier in setup.py 5
- How should datasette.client interact with base_url 5
- Add documentation on serving Datasette behind a proxy using base_url 5
- Add search highlighting snippets 5
- datasette.urls.static_plugins(...) method 5
- Refactor .csv to be an output renderer - and teach register_output_renderer to stream all rows 5
- Default menu links should check a real permission 5
- Rethink how table.search() method works 5
- Foreign key links break for compound foreign keys 5
- Rename datasette.config() method to datasette.setting() 5
- Show pysqlite3 version on /-/versions 5
- UNIQUE constraint failed: workouts.id 5
- Feature Request: Gmail 5
- Archive import appears to be broken on recent exports 5
- Release notes for Datasette 0.54 5
- 500 error caused by faceting if a column called `n` exists 5
- Allow canned query params to specify default values 5
- Research using CTEs for faster facet counts 5
- Better default display of arrays of items 5
- Upgrade to Python 3.9.4 5
- Make row available to `render_cell` plugin hook 5
- Add Docker multi-arch support with Buildx 5
- ?_facet_size=X to increase number of facets results on the page 5
- `table.xindexes` using `PRAGMA index_xinfo(table)` 5
- DRAFT: A new plugin hook for dynamic metadata 5
- feature: support "events" 5
- Serve all db files in a folder 5
- .transform(types=) turns rowid into a concrete column 5
- Stop using generated columns in fixtures.db 5
- `datasette publish cloudrun --cpu X` option 5
- Ability to search for text across all columns in a table 5
- Ability to insert file contents as text, in addition to blob 5
- Upgrade to httpx 0.20.0 (request() got an unexpected keyword argument 'allow_redirects') 5
- Allow routes to have extra options 5
- Way to test SQLite 3.37 (and potentially other versions) in CI 5
- Redesign CSV export to improve usability 5
- Add KNN and data_licenses to hidden tables list 5
- Move canned queries closer to the SQL input area 5
- Improvements to help make Datasette a better tool for learning SQL 5
- JSON link on row page is 404 if base_url setting is used 5
- Creating tables with custom datatypes 5
- Test failures with SQLite 3.37.0+ due to column affinity case 5
- Implement redirects from old % encoding to new dash encoding 5
- Adopt a code of conduct 5
- Display autodoc type information more legibly 5
- `sqlite3.NotSupportedError`: deterministic=True requires SQLite 3.8.3 or higher 5
- Datasette feature for publishing snapshots of query results 5
- Research running SQL in table view in parallel using `asyncio.gather()` 5
- Support `rows_where()`, `delete_where()` etc for attached alias databases 5
- CSV `extras_key=` and `ignore_extras=` equivalents for CLI tool 5
- Reading rows from a file => AttributeError: '_io.StringIO' object has no attribute 'readinto' 5
- Upgrade to 3.10.6-slim-bullseye Docker base image 5
- minor a11y: <select> has no visual indicator when tabbed to 5
- 500 error in github-to-sqlite demo 5
- feature request: pivot command 5
- Link from documentation to source code 5
- Move "datasette --get" from Getting Started to CLI Reference 5
- Database() constructor currently defaults is_mutable to False 5
- `sqlite-utils transform` should set empty strings to null when converting text columns to integer/float 5
- NoneType' object has no attribute 'actor' 5
- Create a new table from one or more records, `sqlite-utils` style 5
- Design URLs for the write API 5
- Make it easier to fix URL proxy problems 5
- upsert of new row with check constraints fails 5
- ignore:true/replace:true options for /db/-/create API 5
- More useful error message if enable_load_extension is not available 5
- `Table.convert()` skips falsey values 5
- Expand foreign key references in row view as well 5
- codespell test failure 5
- Plugin hook for adding new output formats 5
- Plan for getting the new JSON format query views working 5
- Build HTML version of /content?sql=... 5
- Add writable canned query demo to latest.datasette.io 5
- Datasette --get --actor option 5
- Bump sphinx, furo, blacken-docs dependencies 5
- Don't show foreign key links to tables the user cannot access 5
- Raise an exception if a "plugins" block exists in metadata.json 5
- Add spatialite arm64 linux path 5
- Protect against malicious SQL that causes damage even though our DB is immutable 4
- Homepage UI for editing metadata file 4
- Switch to ujson 4
- Pick a name 4
- datasette publish hyper 4
- Support for title/source/license metadata 4
- Enforce pagination (or at least limits) for arbitrary custom SQL 4
- Add NHS England Hospitals example to wiki 4
- Consider data-package as a format for metadata 4
- add support for ?field__isnull=1 4
- Plugin that adds an authentication layer of some sort 4
- ?_json=foo&_json=bar query string argument 4
- A primary key column that has foreign key restriction associated won't rendering label column 4
- Support WITH query 4
- 500 from missing table name 4
- Ability to bundle metadata and templates inside the SQLite file 4
- Ability to apply sort on mobile in portrait mode 4
- metadata.json support for plugin configuration options 4
- Explore "distinct values for column" in inspect() 4
- Escaping named parameters in canned queries 4
- Mechanism for automatically picking up changes when on-disk .db file changes 4
- Add version number support with Versioneer 4
- Support table names ending with .json or .csv 4
- Explore if SquashFS can be used to shrink size of packaged Docker containers 4
- Installation instructions, including how to use the docker image 4
- Limit text display in cells containing large amounts of text 4
- Datasette on Zeit Now returns http URLs for facet and next links 4
- Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way 4
- Requesting support for query description 4
- render_cell(value) plugin hook 4
- Ability to display facet counts for many-to-many relationships 4
- Integration with JupyterLab 4
- add_column() should support REFERENCES {other_table}({other_column}) 4
- Figure out what to do about table counts in a mutable world 4
- Refactor facets to a class and new plugin, refs #427 4
- Tracing support for seeing what SQL queries were executed 4
- Paginate + search for databases/tables on the homepage 4
- Replace most of `.inspect()` (and `datasette inspect`) with table counting 4
- Decide what to do about /-/inspect 4
- Allow .insert(..., foreign_keys=()) to auto-detect table and primary key 4
- Facets not correctly persisted in hidden form fields 4
- Every datasette plugin on the ecosystem page should have a screenshot 4
- Support opening multiple databases with the same stem 4
- Decide what goes into Datasette 1.0 4
- Fix static mounts using relative paths and prevent traversal exploits 4
- Get tests running on Windows using Travis CI 4
- Support unicode in url 4
- Too many SQL variables 4
- "Too many SQL variables" on large inserts 4
- More advanced connection pooling 4
- Option to fetch only checkins more recent than the current max checkin 4
- Add triggers while enabling FTS 4
- --sql and --attach options for feeding commands from SQL queries 4
- Use better pagination (and implement progress bar) 4
- Command to import home-timeline 4
- retweets-of-me command 4
- `import` command fails on empty files 4
- Failed to import workout points 4
- Datasette should work with Python 3.8 (and drop compatibility with Python 3.5) 4
- Mechanism for register_output_renderer to suggest extension or not 4
- Assets table with downloads 4
- Exception running first command: IndexError: list index out of range 4
- Allow creation of virtual tables at startup 4
- order_by mechanism 4
- Remove .detect_column_types() from table, make it a documented API 4
- Cashe-header missing in http-response 4
- Add documentation on Database introspection methods to internals.rst 4
- Adding a "recreate" flag to the `Database` constructor 4
- Custom pages mechanism, refs #648 4
- escape_fts() does not correctly escape * wildcards 4
- Fall back to authentication via ENV 4
- Directory configuration mode should support metadata.yaml 4
- Cloud Run fails to serve database files larger than 32MB 4
- [Feature Request] Support Repo Name in Search 🥺 4
- Ability to set custom default _size on a per-table basis 4
- Try out ExifReader 4
- add_foreign_key(...., ignore=True) 4
- register_output_renderer can_render mechanism 4
- Error pages not correctly loading CSS 4
- Publish secrets 4
- Example authentication plugin 4
- /-/metadata and so on should respect view-instance permission 4
- Log out mechanism for clearing ds_actor cookie 4
- Take advantage of .coverage being a SQLite database 4
- Skip counting hidden tables 4
- Use white-space: pre-wrap on ALL table cell contents 4
- github-to-sqlite tags command for fetching tags 4
- Output binary columns in "sqlite-utils query" JSON 4
- Security issue: read-only canned queries leak CSRF token in URL 4
- Test failures caused by failed attempts to mock pip 4
- --load-extension option for sqlite-utils query 4
- github-to-sqlite should handle rate limits better 4
- request an "-o" option on "datasette server" to open the default browser at the running url 4
- Idea: conversions= could take Python functions 4
- sqlite-utils transform sub-command 4
- sqlite-utils transform/insert --detect-types 4
- from_json jinja2 filter 4
- column name links broken in 0.50.1 4
- extra_js_urls and extra_css_urls should respect base_url setting 4
- Some workout columns should be float, not text 4
- Include LICENSE in sdist 4
- Add template block prior to extra URL loaders 4
- .blob output renderer 4
- Table/database action menu cut off if too short 4
- Rebrand and redirect config.rst as settings.rst 4
- --load-extension=spatialite not working with datasetteproject/datasette docker image 4
- Fix footer not sticking to bottom in short pages 4
- "_searchmode=raw" throws an index out of range error when combined with "_search_COLUMN" 4
- sqlite-utils should suggest --csv if JSON parsing fails 4
- sqlite-utils analyze-tables command 4
- Searching for "github-to-sqlite" throws an error 4
- Modernize code to Python 3.6+ 4
- Prettier package not actually being cached 4
- reset_counts() method and command 4
- Use structlog for logging 4
- Certain database names results in 404: "Database not found: None" 4
- view_name = "query" for the query page 4
- Tests are very slow. 4
- Possible to deploy as a python app (for Rstudio connect server)? 4
- photo-to-sqlite: command not found 4
- Installing datasette via docker: Path 'fixtures.db' does not exist 4
- Support SSL/TLS directly 4
- Error reading csv files with large column data 4
- --port option should validate port is between 0 and 65535 4
- Escaping FTS search strings 4
- Refresh SpatiaLite documentation 4
- Feature or Documentation Request: Individual table as home page template 4
- Dockerfile: use Ubuntu 20.10 as base 4
- improve table horizontal scroll experience 4
- Document how to send multiple values for "Named parameters" 4
- Avoid error sorting by relationships if related tables are not allowed 4
- Can't use apt-get in Dockerfile when using datasetteproj/datasette as base 4
- Figure out how to publish alpha/beta releases to Docker Hub 4
- Intermittent CI failure: restore_working_directory FileNotFoundError 4
- row.update() or row.pk 4
- db.schema property and sqlite-utils schema command 4
- Cannot set type JSON 4
- Automatic type detection for CSV data 4
- Big performance boost on faceting: skip the inner order by 4
- Command for fetching Hacker News threads from the search API 4
- feature request: document minimum permissions for service account for cloudrun 4
- Ability to default to hiding the SQL for a canned query 4
- Document exceptions that can be raised by db.execute() and friends 4
- Add reference documentation generated from docstrings 4
- xml.etree.ElementTree.ParseError: not well-formed (invalid token) 4
- sqlite-utils memory can't deal with multiple files with the same name 4
- ?_sort=rowid with _next= returns error 4
- `table.lookup()` option to populate additional columns when creating a record 4
- Improve Apache proxy documentation, link to demo 4
- Provide function to generate hash_id from specified columns 4
- Use datasette-table Web Component to guide the design of the JSON API for 1.0 4
- Add `Link: rel="alternate"` header pointing to JSON for a table/query 4
- Maybe return JSON from HTML pages if `Accept: application/json` is sent 4
- `sqlite-utils insert --extract colname` 4
- Allow users to pass a full convert() function definition 4
- Update janus requirement from <0.8,>=0.6.2 to >=0.6.2,<1.1 4
- Confirm if documented nginx proxy config works for row pages with escaped characters in their primary key 4
- Better error message if `--convert` code fails to return a dict 4
- `--fmt` should imply `-t` 4
- Add documentation page with the output of `--help` 4
- Release notes for 0.60 4
- `sqlite-utils bulk --batch-size` option 4
- Document how to add a primary key to a rowid table using `sqlite-utils transform --pk` 4
- Update Dockerfile generated by `datasette publish` 4
- Sensible `cache-control` headers for static assets, including those served by plugins 4
- [feature] immutable mode for a directory, not just individual sqlite file 4
- Automated test for Pyodide compatibility 4
- ?_trace=1 fails with datasette-geojson for some reason 4
- Combining `rows_where()` and `search()` to limit which rows are searched 4
- Utilities for duplicating tables and creating a table with the results of a query 4
- 500 error if sorted by a column not in the ?_col= list 4
- Cross-link CLI to Python docs 4
- Adjust height of textarea for no JS case 4
- Research an upgrade to CodeMirror 6 4
- search_sql add include_rank option 4
- Parts of YAML file do not work when db name is "off" 4
- fails before generating views. ERR: table sqlite_master may not be modified 4
- Featured table(s) on the homepage 4
- Ability to insert multi-line files 4
- Setting to turn off table row counts entirely 4
- devrel/python api: Pylance type hinting 4
- Turn --flatten into a documented utility function 4
- Tests failing due to updated tabulate library 4
- `max_signed_tokens_ttl` setting for a maximum duration on API tokens 4
- Delete a single record from an existing table 4
- API to drop a table 4
- Datasette with many and large databases > Memory use 4
- 1.0a0 release notes 4
- Extract logic for resolving a URL to a database / table / row 4
- Clicking within the CodeMirror area below the SQL (i.e. when there's only a single line) doesn't cause the editor to get focused 4
- `publish heroku` failing due to old Python version 4
- Docs for replace:true and ignore:true options for insert API 4
- Incorrect link from the API explorer to the JSON API documentation 4
- Feature request: output number of ignored/replaced rows for insert command 4
- render_cell plugin hook's row object is not a sqlite.Row 4
- installpython3.com is now a spam website 4
- Reconsider pattern where plugins could break existing template context 4
- Datasette is not compatible with SQLite's strict quoting compilation option 4
- Repeated calls to `Table.convert()` fail 4
- How to redirect from "/" to a specific db/table 4
- Add paths for homebrew on Apple silicon 4
- Custom SQL queries should use new JSON ?_extra= format 4
- Datasette cannot be installed with Rye 4
- `--raw-lines` option, like `--raw` for multiple lines 4
- sphinx.builders.linkcheck build error 4
- feat: Implement a prepare_connection plugin hook 4
- Implement new /content.json?sql=... 4
- Query view shouldn't return `columns` 4
- form label { width: 15% } is a bad default 4
- datasette -s/--setting option for setting nested configuration options 4
- Add new `--internal internal.db` option, deprecate legacy `_internal` database 4
- `datasette.yaml` plugin support 4
- Move `permissions`, `allow` blocks, canned queries and more out of `metadata.yaml` and into `datasette.yaml` 4
- Add more STRICT table support 4
- Implement sensible query pagination 3
- Command line tool for uploading one or more DBs to Now 3
- Ability to plot a simple graph 3
- date, year, month and day querystring lookups 3
- Implement a better database index page 3
- Add more detailed API documentation to the README 3
- UI for editing named parameters 3
- Link to JSON for the list of tables 3
- UI support for running FTS searches 3
- If view is filtered, search should apply within those filtered rows 3
- ?_search=x should work if used directly against a FTS virtual table 3
- Show extra instructions with the interrupted 3
- apsw as alternative sqlite3 binding (for full text search) 3
- _group_count= feature improvements 3
- Datasette CSS should include content hash in the URL 3
- datasette skeleton command for kick-starting database and table metadata 3
- Custom template for named canned query 3
- proposal new option to disable user agents cache 3
- Cleaner mechanism for handling custom errors 3
- Run pks_for_table in inspect, executing once at build time rather than constantly 3
- Hide Spatialite system tables 3
- Support filtering with units and more 3
- Allow plugins to add new cli sub commands 3
- datasette publish --install=name-of-plugin 3
- label_column option in metadata.json 3
- External metadata.json 3
- Add new metadata key persistent_urls which removes the hash from all database urls 3
- Facets should not execute for ?shape=array|object 3
- Documentation for URL hashing, redirects and cache policy 3
- "config" section in metadata.json (root, database and table level) 3
- Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0 3
- Support multiple filters of the same type 3
- ?_ttl= parameter to control caching 3
- Avoid plugins accidentally loading dependencies twice 3
- Per-database and per-table /-/ URL namespace 3
- Ability to configure SQLite cache_size 3
- Ensure --help examples in docs are always up to date 3
- Use pysqlite3 if available 3
- datasette publish digitalocean plugin 3
- Update official datasetteproject/datasette Docker container to SQLite 3.26.0 3
- Ensure downloading a 100+MB SQLite database file works 3
- How to pass configuration to plugins? 3
- Use SQLITE_DBCONFIG_DEFENSIVE plus other recommendations from SQLite security docs 3
- Experiment: run Jinja in async mode 3
- .insert_all() should accept a generator and process it efficiently 3
- Problems handling column names containing spaces or - 3
- Zeit API v1 does not work for new users - need to migrate to v2 3
- How to pass named parameter into spatialite MakePoint() function 3
- Utilities for adding indexes 3
- Add query parameter to hide SQL textarea 3
- Datasette doesn't reload when database file changes 3
- Installing installs the tests package 3
- Fix the "datasette now publish ... --alias=x" option 3
- Make it so Docker build doesn't delay PyPI release 3
- Option to ignore inserts if primary key exists already 3
- Accessibility for non-techie newsies? 3
- Test against Python 3.8-dev using Travis 3
- Exporting sqlite database(s)? 3
- "about" parameter in metadata does not appear when alone 3
- asgi_wrapper plugin hook 3
- Unable to use rank when fts-table generated with csvs-to-sqlite 3
- Mechanism for secrets in plugin configuration 3
- datasette publish option for setting plugin configuration secrets 3
- Potential improvements to facet-by-date 3
- CodeMirror fails to load on database page 3
- .add_column() doesn't match indentation of initial creation 3
- extracts= option for insert/update/etc 3
- Script uses a lot of RAM 3
- Datasette Edit 3
- "twitter-to-sqlite user-timeline" command for pulling tweets by a specific user 3
- Exposing Datasette via Jupyter-server-proxy 3
- Added support for multi arch builds 3
- Extract "source" into a separate lookup table 3
- Track and use the 'since' value 3
- Queries per DB table in metadata.json 3
- Handle spaces in DB names 3
- since_id support for home-timeline 3
- make uvicorn optional dependancy (because not ok on windows python yet) 3
- --since support for various commands for refresh-by-cron 3
- upgrade to uvicorn-0.9 to be Python-3.8 friendly 3
- Offer to format readonly SQL 3
- _where= parameter is not persisted in hidden form fields 3
- /-/plugins shows incorrect name for plugins 3
- Static assets no longer loading for installed plugins 3
- Add this repos_starred view 3
- Publish to Heroku is broken: "WARNING: You must pass the application as an import string to enable 'reload' or 'workers" 3
- rowid is not included in dropdown filter menus 3
- Custom queries with 0 results should say "0 results" 3
- Don't suggest column for faceting if all values are 1 3
- Command for importing events 3
- Make database level information from metadata.json available in the index.html template 3
- Feature request: enable extensions loading 3
- Add a glossary to the documentation 3
- fts5 syntax error when using punctuation 3
- Template debug mode that outputs template context 3
- Copy and paste doesn't work reliably on iPhone for SQL editor 3
- Tests are failing due to missing FTS5 3
- Test failures on openSUSE 15.1: AssertionError: Explicit other_table and other_column 3
- --port option to expose a port other than 8001 in "datasette package" 3
- Tutorial command no longer works 3
- Use inspect-file, if possible, for total row count 3
- prepare_connection() plugin hook should accept optional datasette argument 3
- Ability to customize columns used by extracts= feature 3
- Variables from extra_template_vars() not exposed in _context=1 3
- Search box CSS doesn't look great on OS X Safari 3
- Handle "User not found" error 3
- WIP implementation of writable canned queries 3
- --plugin-secret over-rides existing metadata.json plugin config 3
- Update aiofiles requirement from ~=0.4.0 to >=0.4,<0.6 3
- Pull repository contributors 3
- Mechanism for forcing column-type, over-riding auto-detection 3
- Issue and milestone should have foreign key to repo 3
- Issue comments don't appear to populate issues foreign key 3
- strange behavior using accented characters 3
- …
user 393
- simonw 8,883
- codecov[bot] 240
- fgregg 82
- eyeseast 74
- russss 39
- dependabot[bot] 36
- psychemedia 35
- abdusco 26
- asg017 25
- bgrins 24
- cldellow 24
- mroswell 22
- chapmanjacobd 22
- aborruso 19
- chrismp 18
- brandonrobertz 15
- hydrosquall 15
- RhetTbull 15
- jacobian 14
- carlmjohnson 14
- tballison 13
- wragge 12
- tsibley 11
- rixx 11
- stonebig 11
- frafra 10
- maxhawkins 10
- terrycojones 10
- dracos 10
- rgieseke 10
- rayvoelker 10
- 20after4 9
- clausjuhl 9
- bobwhitelock 9
- UtahDave 8
- tomchristie 8
- bsilverm 8
- 4l1fe 8
- zaneselvans 7
- mhalle 7
- zeluspudding 7
- cobiadigital 7
- amjith 6
- jefftriplett 6
- simonwiles 6
- mcarpenter 6
- khimaros 6
- jaywgraves 6
- CharlesNepote 6
- ocdtrekkie 6
- davidbgk 5
- khusmann 5
- rdmurphy 5
- MarkusH 5
- lovasoa 5
- Mjboothaus 5
- dazzag24 5
- ar-jan 5
- xavdid 5
- davidhaley 5
- SteadBytes 5
- dependabot-preview[bot] 5
- jayvdb 4
- fs111 4
- bollwyvl 4
- ctb 4
- yozlet 4
- Btibert3 4
- dholth 4
- r4vi 4
- jsfenfen 4
- glasnt 4
- jungle-boogie 4
- ColinMaudry 4
- kbaikov 4
- JBPressac 4
- nitinpaultifr 4
- Kabouik 4
- dvizard 4
- henry501 4
- pjamargh 4
- benpickles 3
- frankieroberto 3
- obra 3
- janimo 3
- atomotic 3
- ghing 3
- briandorsey 3
- pkoppstein 3
- yschimke 3
- philroche 3
- macropin 3
- camallen 3
- coldclimate 3
- wsxiaoys 3
- johnfelipe 3
- mdrovdahl 3
- xrotwang 3
- robroc 3
- dmick 3
- betatim 3
- dufferzafar 3
- Florents-Tselai 3
- aki-k 3
- ashishdotme 3
- yejiyang 3
- henrikek 3
- swyxio 3
- Segerberg 3
- blairdrummond 3
- jsancho-gpl 3
- kevindkeogh 3
- gk7279 3
- daniel-butler 3
- learning4life 3
- mattmalcher 3
- FabianHertwig 3
- polyrand 3
- justmars 3
- garethr 2
- danp 2
- nelsonjchen 2
- dsisnero 2
- hubgit 2
- jackowayed 2
- ftrain 2
- chrishas35 2
- tannewt 2
- HaveF 2
- ingenieroariel 2
- pkulchenko 2
- coleifer 2
- gavinband 2
- aviflax 2
- iloveitaly 2
- tholo 2
- mungewell 2
- frankier 2
- lchski 2
- tmaier 2
- hcarter333 2
- gfrmin 2
- amitkoth 2
- mcint 2
- frosencrantz 2
- eads 2
- virtadpt 2
- leafgarland 2
- glyph 2
- rafguns 2
- strada 2
- adipasquale 2
- eelkevdbos 2
- ligurio 2
- n8henrie 2
- soobrosa 2
- nathancahill 2
- mustafa0x 2
- davidleejy 2
- bsmithgall 2
- noslouch 2
- willingc 2
- nattaylor 2
- durkie 2
- raynae 2
- cclauss 2
- wulfmann 2
- philshem 2
- bram2000 2
- zzeleznick 2
- chris48s 2
- plpxsk 2
- jeqo 2
- nickvazz 2
- rprimet 2
- aaronyih1 2
- luxint 2
- jussiarpalahti 2
- tkhattra 2
- sachaj 2
- lagolucas 2
- stevecrawshaw 2
- chekos 2
- ctsrc 2
- ad-si 2
- smithdc1 2
- gsajko 2
- jcmkk3 2
- null92 2
- publicmatt 2
- rachelmarconi 2
- tunguyenatwork 2
- LVerneyPEReN 2
- MichaelTiemannOSC 2
- tmcl-it 2
- anotherjesse 1
- jarib 1
- jokull 1
- fernand0 1
- precipice 1
- llimllib 1
- gijs 1
- blaine 1
- ashanan 1
- gravis 1
- nkirsch 1
- tomdyson 1
- mrchrisadams 1
- dkam 1
- harperreed 1
- nileshtrivedi 1
- chrismytton 1
- nedbat 1
- furilo 1
- kindly 1
- adamwolf 1
- prabhur 1
- palfrey 1
- dmd 1
- pquentin 1
- rubenv 1
- Uninen 1
- rtanglao 1
- carsonyl 1
- nryberg 1
- step21 1
- stefanocudini 1
- rcoup 1
- spookylukey 1
- scoates 1
- hpk42 1
- annapowellsmith 1
- cadeef 1
- aslakr 1
- thorn0 1
- yurivish 1
- pax 1
- lucapette 1
- jmelloy 1
- Krazybug 1
- dvhthomas 1
- dckc 1
- phubbard 1
- sethvincent 1
- andrewdotn 1
- meatcar 1
- aitoehigie 1
- julienma 1
- michaelmcandrew 1
- drewda 1
- stiles 1
- saulpw 1
- adamalton 1
- terinjokes 1
- thadk 1
- robintw 1
- astrojuanlu 1
- ipmb 1
- steren 1
- aidansteele 1
- mikepqr 1
- 0x1997 1
- jonafato 1
- gwk 1
- knutwannheden 1
- davidszotten 1
- chrislkeller 1
- kevboh 1
- eaubin 1
- yunzheng 1
- mhkeller 1
- lfdebrux 1
- karlcow 1
- heyarne 1
- ryanfox 1
- sopel 1
- cephillips 1
- ryascott 1
- sirnacnud 1
- simonrjones 1
- justinpinkney 1
- merwok 1
- mattkiefer 1
- snth 1
- adarshp 1
- joshmgrant 1
- bcongdon 1
- nickdirienzo 1
- adamjonas 1
- hannseman 1
- kaihendry 1
- urbas 1
- metamoof 1
- brimstone 1
- adamchainz 1
- PabloLerma 1
- heussd 1
- RayBB 1
- BryantD 1
- limar 1
- drkane 1
- Gagravarr 1
- radusuciu 1
- esagara 1
- agguser 1
- rclement 1
- dyllan-to-you 1
- justinallen 1
- jordaneremieff 1
- wdccdw 1
- wpears 1
- progpow 1
- DavidPratten 1
- ltrgoddard 1
- costrouc 1
- jratike80 1
- ment4list 1
- ccorcos 1
- choldgraf 1
- Olshansk 1
- qqilihq 1
- jdangerx 1
- fidiego 1
- OverkillGuy 1
- QAInsights 1
- secretGeek 1
- fkuhn 1
- jameslittle230 1
- Profpatsch 1
- dskrad 1
- kwladyka 1
- Carib0u 1
- fatihky 1
- phoenixjun 1
- JesperTreetop 1
- wenhoujx 1
- bapowell 1
- yairlenga 1
- louispotok 1
- ChristopherWilks 1
- Maltazar 1
- hueyy 1
- eumiro 1
- wuhland 1
- eric-burel 1
- foscoj 1
- dvot197007 1
- kokes 1
- RamiAwar 1
- csusanu 1
- metab0t 1
- spdkils 1
- sturzl 1
- jrdmb 1
- robmarkcole 1
- jfeiwell 1
- coisnepe 1
- chmaynard 1
- erlend-aasland 1
- amlestin 1
- tf13 1
- alecstein 1
- bendnorman 1
- noklam 1
- jakewilkins 1
- Thomascountz 1
- eigenfoo 1
- GmGniap 1
- rdtq 1
- AnkitKundariya 1
- LucasElArruda 1
- duarteocarmo 1
- mattiaborsoi 1
- sarcasticadmin 1
- yqlbu 1
- abeyerpath 1
- b0b5h4rp13 1
- Rik-de-Kort 1
- patricktrainer 1
- xmichele 1
- miuku 1
- philipp-heinrich 1
- jimmybutton 1
- thewchan 1
- izzues 1
- thisismyfuckingusername 1
- kirajano 1
- J450n-4-W 1
- mlaparie 1
- Dhyanesh97 1
- knowledgecamp12 1
- McEazy2700 1
- cycle-data 1
id | html_url | issue_url | node_id | user | created_at | updated_at | author_association ▼ | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
344125441 | https://github.com/simonw/datasette/pull/81#issuecomment-344125441 | https://api.github.com/repos/simonw/datasette/issues/81 | MDEyOklzc3VlQ29tbWVudDM0NDEyNTQ0MQ== | jefftriplett 50527 | 2017-11-14T02:24:54Z | 2017-11-14T02:24:54Z | CONTRIBUTOR | Oops, if I jumped the gun. I saw the project in my github activity feed and saw some low hanging fruit :) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
:fire: Removes DS_Store 273595473 | |
344145265 | https://github.com/simonw/datasette/issues/57#issuecomment-344145265 | https://api.github.com/repos/simonw/datasette/issues/57 | MDEyOklzc3VlQ29tbWVudDM0NDE0NTI2NQ== | macropin 247192 | 2017-11-14T04:45:38Z | 2017-11-14T04:45:38Z | CONTRIBUTOR | I'm happy to contribute this. Just let me know if you want a Dockerfile for development or production purposes, or both. If it's prod then we can just pip install the source from pypi, otherwise for dev we'll need a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ship a Docker image of the whole thing 273127694 | |
344147583 | https://github.com/simonw/datasette/issues/57#issuecomment-344147583 | https://api.github.com/repos/simonw/datasette/issues/57 | MDEyOklzc3VlQ29tbWVudDM0NDE0NzU4Mw== | macropin 247192 | 2017-11-14T05:03:47Z | 2017-11-14T05:03:47Z | CONTRIBUTOR | Let me know if you'd like a PR. The image is usable as
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ship a Docker image of the whole thing 273127694 | |
344151223 | https://github.com/simonw/datasette/issues/57#issuecomment-344151223 | https://api.github.com/repos/simonw/datasette/issues/57 | MDEyOklzc3VlQ29tbWVudDM0NDE1MTIyMw== | macropin 247192 | 2017-11-14T05:32:28Z | 2017-11-14T05:33:03Z | CONTRIBUTOR | The pattern is called "multi-stage builds". And the result is a svelte 226MB image (201MB for 3.6-slim) vs 700MB+ for the full image. It's possible to get it even smaller, but that takes a lot more work. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ship a Docker image of the whole thing 273127694 | |
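The "multi-stage builds" pattern mentioned in the comment above separates a heavyweight build stage from a slim runtime stage, which is how the image gets down to ~226MB. A minimal sketch of the idea (base tags and paths here are illustrative, not the actual Dockerfile from that PR):

```dockerfile
# Build stage: full Python image with compilers, used only to build wheels
FROM python:3.6 as build
COPY . /code/
RUN pip wheel --wheel-dir=/wheels /code/

# Final stage: slim base image; only the prebuilt wheels are copied over,
# so compilers and build caches never land in the shipped image
FROM python:3.6-slim
COPY --from=build /wheels /wheels
RUN pip install --no-index --find-links=/wheels datasette
EXPOSE 8001
CMD ["datasette", "serve", "--host", "0.0.0.0"]
```

Only the final `FROM` stage contributes layers to the published image, which is why the result is close to the size of the slim base itself.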
344430689 | https://github.com/simonw/datasette/issues/88#issuecomment-344430689 | https://api.github.com/repos/simonw/datasette/issues/88 | MDEyOklzc3VlQ29tbWVudDM0NDQzMDY4OQ== | tomdyson 15543 | 2017-11-14T23:08:22Z | 2017-11-14T23:08:22Z | CONTRIBUTOR |
Sorry about that - here's a working version on Netlify: |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add NHS England Hospitals example to wiki 273775212 | |
344710204 | https://github.com/simonw/datasette/pull/104#issuecomment-344710204 | https://api.github.com/repos/simonw/datasette/issues/104 | MDEyOklzc3VlQ29tbWVudDM0NDcxMDIwNA== | jacobian 21148 | 2017-11-15T19:57:50Z | 2017-11-15T19:57:50Z | CONTRIBUTOR | A first basic stab at making this work, just to prove the approach. Right now this requires a Heroku CLI plugin, which seems pretty unreasonable. I think this can be replaced with direct API calls, which could clean up a lot of things. But I wanted to prove it worked first, and it does. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[WIP] Add publish to heroku support 274284246 | |
344810525 | https://github.com/simonw/datasette/issues/46#issuecomment-344810525 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDgxMDUyNQ== | ingenieroariel 54999 | 2017-11-16T04:11:25Z | 2017-11-16T04:11:25Z | CONTRIBUTOR | @simonw On the spatialite support, here is some info to make it work and a screenshot. I used the following Dockerfile:

```
FROM prolocutor/python3-sqlite-ext:3.5.1-spatialite as build

RUN mkdir /code
ADD . /code/

RUN pip install /code/

EXPOSE 8001
CMD ["datasette", "serve", "/code/ne.sqlite", "--host", "0.0.0.0"]
```

and added this to |
{ "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344811268 | https://github.com/simonw/datasette/pull/107#issuecomment-344811268 | https://api.github.com/repos/simonw/datasette/issues/107 | MDEyOklzc3VlQ29tbWVudDM0NDgxMTI2OA== | raynae 3433657 | 2017-11-16T04:17:45Z | 2017-11-16T04:17:45Z | CONTRIBUTOR | Thanks for the guidance. I added a unit test and made a slight change to utils.py. I didn't realize this, but evidently string.format only complains if you supply fewer arguments than there are format placeholders, so the original commit worked, but was adding a superfluous named param. I added a conditional that prevents the named param from being created and ensures the correct number of args are passed to string.format. It has the side effect of hiding the SQL query in /templates/table.html when there are no other where clauses--not sure if that's the desired outcome here. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
add support for ?field__isnull=1 274343647 | |
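The `str.format` behaviour described above is easy to verify: extra named arguments are silently ignored, while a missing one raises `KeyError`:

```python
# Extra named arguments to str.format are silently ignored
assert "where {col} is null".format(col="age", unused="x") == "where age is null"

# A missing argument, on the other hand, raises KeyError
try:
    "where {col} is null".format(unused="x")
    raised = False
except KeyError:
    raised = True
assert raised
```

This is why the original commit "worked" despite passing a superfluous named parameter.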
345002908 | https://github.com/simonw/datasette/issues/46#issuecomment-345002908 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NTAwMjkwOA== | ingenieroariel 54999 | 2017-11-16T17:47:49Z | 2017-11-16T17:47:49Z | CONTRIBUTOR | I'll try to find alternatives to the Dockerfile option - I also think we should not use that old one without sources or license. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
345117690 | https://github.com/simonw/datasette/pull/107#issuecomment-345117690 | https://api.github.com/repos/simonw/datasette/issues/107 | MDEyOklzc3VlQ29tbWVudDM0NTExNzY5MA== | raynae 3433657 | 2017-11-17T01:29:41Z | 2017-11-17T01:29:41Z | CONTRIBUTOR | Thanks for bearing with me. I was getting a message about my branch diverging when I tried to push after rebasing, so I merged master into isnull, seems like that did the trick. Let me know if I should make any corrections. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
add support for ?field__isnull=1 274343647 | |
345452669 | https://github.com/simonw/datasette/pull/104#issuecomment-345452669 | https://api.github.com/repos/simonw/datasette/issues/104 | MDEyOklzc3VlQ29tbWVudDM0NTQ1MjY2OQ== | jacobian 21148 | 2017-11-18T16:18:45Z | 2017-11-18T16:18:45Z | CONTRIBUTOR | I'd like to do a bit of cleanup, and some error checking in case heroku/heroku-builds isn't installed. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[WIP] Add publish to heroku support 274284246 | |
345503897 | https://github.com/simonw/datasette/issues/105#issuecomment-345503897 | https://api.github.com/repos/simonw/datasette/issues/105 | MDEyOklzc3VlQ29tbWVudDM0NTUwMzg5Nw== | rgieseke 198537 | 2017-11-19T09:38:08Z | 2017-11-19T09:38:08Z | CONTRIBUTOR | Thanks, I wrote this very simple reader because the default approach as described on the Datahub pages seemed too complicated. I had metadata from the

This could also be useful for getting from Data Package to SQL db: https://github.com/frictionlessdata/tableschema-sql-py

I maintain a few climate science related datasets at https://github.com/openclimatedata/

The Data Retriever (mainly ecological data) by @ethanwhite et al. is also using the Data Package format for metadata and has some tooling for different dbs: https://frictionlessdata.io/articles/the-data-retriever/ https://github.com/weecology/retriever

The Open Power System Data project also has a couple of datasets that show nicely how CSV is great for assembling, and they already make SQLite files available. It's one of the first data sets I tried with Datasette, perfect for the use case of getting an API for putting power stations on a map ... |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Consider data-package as a format for metadata 274314940 | |
345652450 | https://github.com/simonw/datasette/issues/27#issuecomment-345652450 | https://api.github.com/repos/simonw/datasette/issues/27 | MDEyOklzc3VlQ29tbWVudDM0NTY1MjQ1MA== | rgieseke 198537 | 2017-11-20T10:19:39Z | 2017-11-20T10:19:39Z | CONTRIBUTOR | If Data Package metadata gets adopted (#105) the views spec work might also be worth a look: |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ability to plot a simple graph 267886330 | |
346116745 | https://github.com/simonw/datasette/pull/104#issuecomment-346116745 | https://api.github.com/repos/simonw/datasette/issues/104 | MDEyOklzc3VlQ29tbWVudDM0NjExNjc0NQ== | jacobian 21148 | 2017-11-21T18:23:25Z | 2017-11-21T18:23:25Z | CONTRIBUTOR | @simonw ready for a review and merge if you want. There's still some nasty duplicated code in cli.py and utils.py, which is just going to get worse if/when we start adding any other deploy targets (and I want to do one for cloud.gov, at least). I think there's an opportunity for some refactoring here. I'm happy to do that now as part of this PR, or if you merge this first I'll do it in a different one. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[WIP] Add publish to heroku support 274284246 | |
346124073 | https://github.com/simonw/datasette/pull/104#issuecomment-346124073 | https://api.github.com/repos/simonw/datasette/issues/104 | MDEyOklzc3VlQ29tbWVudDM0NjEyNDA3Mw== | jacobian 21148 | 2017-11-21T18:49:55Z | 2017-11-21T18:49:55Z | CONTRIBUTOR | Actually hang on, don't merge - there are some bugs that #141 masked when I tested this out elsewhere. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[WIP] Add publish to heroku support 274284246 | |
346124764 | https://github.com/simonw/datasette/pull/104#issuecomment-346124764 | https://api.github.com/repos/simonw/datasette/issues/104 | MDEyOklzc3VlQ29tbWVudDM0NjEyNDc2NA== | jacobian 21148 | 2017-11-21T18:52:14Z | 2017-11-21T18:52:14Z | CONTRIBUTOR | OK, now this should work. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[WIP] Add publish to heroku support 274284246 | |
346244871 | https://github.com/simonw/datasette/issues/14#issuecomment-346244871 | https://api.github.com/repos/simonw/datasette/issues/14 | MDEyOklzc3VlQ29tbWVudDM0NjI0NDg3MQ== | jacobian 21148 | 2017-11-22T05:06:30Z | 2017-11-22T05:06:30Z | CONTRIBUTOR | I'd also suggest taking a look at stevedore, which has a ton of tools for doing plugin stuff. I've had good luck with it in the past. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette Plugins 267707940 | |
360535979 | https://github.com/simonw/datasette/issues/179#issuecomment-360535979 | https://api.github.com/repos/simonw/datasette/issues/179 | MDEyOklzc3VlQ29tbWVudDM2MDUzNTk3OQ== | psychemedia 82988 | 2018-01-25T17:18:24Z | 2018-01-25T17:18:24Z | CONTRIBUTOR | To summarise that thread:
It could also be useful to allow users to import a python file containing custom functions that can then be loaded into scope and made available to custom templates. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
More metadata options for template authors 288438570 | |
380608372 | https://github.com/simonw/datasette/pull/200#issuecomment-380608372 | https://api.github.com/repos/simonw/datasette/issues/200 | MDEyOklzc3VlQ29tbWVudDM4MDYwODM3Mg== | russss 45057 | 2018-04-11T21:55:46Z | 2018-04-11T21:55:46Z | CONTRIBUTOR |
Or just see if there's a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Hide Spatialite system tables 313494458 | |
380966565 | https://github.com/simonw/datasette/issues/203#issuecomment-380966565 | https://api.github.com/repos/simonw/datasette/issues/203 | MDEyOklzc3VlQ29tbWVudDM4MDk2NjU2NQ== | russss 45057 | 2018-04-12T22:43:08Z | 2018-04-12T22:43:08Z | CONTRIBUTOR | Looks like pint is pretty good at this.

```python
In [1]: import pint

In [2]: ureg = pint.UnitRegistry()

In [3]: q = 3e6 * ureg('Hz')

In [4]: '{:~P}'.format(q.to_compact())
Out[4]: '3.0 MHz'

In [5]: q = 0.3 * ureg('m')

In [5]: '{:~P}'.format(q.to_compact())
Out[5]: '300.0 mm'

In [6]: q = 5 * ureg('')

In [7]: '{:~P}'.format(q.to_compact())
Out[7]: '5'
``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support for units 313837303 | |
381237440 | https://github.com/simonw/datasette/pull/202#issuecomment-381237440 | https://api.github.com/repos/simonw/datasette/issues/202 | MDEyOklzc3VlQ29tbWVudDM4MTIzNzQ0MA== | russss 45057 | 2018-04-13T19:22:53Z | 2018-04-13T19:22:53Z | CONTRIBUTOR | I spotted you'd mentioned that in #184 but only after I'd written the patch! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Raise 404 on nonexistent table URLs 313785206 | |
381315675 | https://github.com/simonw/datasette/issues/203#issuecomment-381315675 | https://api.github.com/repos/simonw/datasette/issues/203 | MDEyOklzc3VlQ29tbWVudDM4MTMxNTY3NQ== | russss 45057 | 2018-04-14T09:14:45Z | 2018-04-14T09:27:30Z | CONTRIBUTOR |
<s>From a machine-readable perspective I'm not sure why it would be useful to decorate the values with units</s>. Edit: Should have had some coffee first. It's clearly useful for stuff like map rendering! I agree that the unit metadata should definitely be exposed in the JSON.
I'm thinking about a couple of approaches here. I think the simplest one is: if the column has a unit attached, optionally accept units in query fields:

```python
column_units = ureg("Hz")  # Create a unit object for the column's unit
query_variable = ureg("4 GHz")  # Supplied query variable

# Now we can convert the query units into column units before querying
supplied_value.to(column_units).magnitude

# If the user doesn't supply units, pint just returns the plain
# number and we can query as usual, assuming it's the base unit
query_variable = ureg("50")
query_variable
isinstance(query_variable, numbers.Number)
```

This also lets us do some nice unit conversion on querying:

```python
column_units = ureg("m")
query_variable = ureg("50 ft")
supplied_value.to(column_units)
```

The alternative would be to provide a dropdown of units next to the query field (so a "Hz" field would give you "kHz", "MHz", "GHz"). Although this would be clearer to the user, it isn't so easy - we'd need to know more about the context of the field to give you sensible SI prefixes (I'm not so interested in nanoHertz, for example). You also lose the bonus of being able to convert - although pint will happily show you all the compatible units, it again suffers from a lack of context:

```python
ureg("m").compatible_units()
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support for units 313837303 | |
381332222 | https://github.com/simonw/datasette/pull/205#issuecomment-381332222 | https://api.github.com/repos/simonw/datasette/issues/205 | MDEyOklzc3VlQ29tbWVudDM4MTMzMjIyMg== | russss 45057 | 2018-04-14T14:16:35Z | 2018-04-14T14:16:35Z | CONTRIBUTOR | I've added some tests and that docs link. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support filtering with units and more 314319372 | |
381361734 | https://github.com/simonw/datasette/issues/125#issuecomment-381361734 | https://api.github.com/repos/simonw/datasette/issues/125 | MDEyOklzc3VlQ29tbWVudDM4MTM2MTczNA== | russss 45057 | 2018-04-14T21:26:30Z | 2018-04-14T21:26:30Z | CONTRIBUTOR | FWIW I am now doing this on my WTR app (instead of silently limiting maps to 1000). Telefonica now has about 4000 markers and good old BT has 22,000 or so. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plot rows on a map with Leaflet and Leaflet.markercluster 275135393 | |
381441392 | https://github.com/simonw/datasette/pull/209#issuecomment-381441392 | https://api.github.com/repos/simonw/datasette/issues/209 | MDEyOklzc3VlQ29tbWVudDM4MTQ0MTM5Mg== | russss 45057 | 2018-04-15T21:59:15Z | 2018-04-15T21:59:15Z | CONTRIBUTOR | I suspected this would cause some test failures, but I'll wait for opinions before attempting to fix them. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Don't duplicate simple primary keys in the link column 314455877 | |
381738137 | https://github.com/simonw/datasette/pull/209#issuecomment-381738137 | https://api.github.com/repos/simonw/datasette/issues/209 | MDEyOklzc3VlQ29tbWVudDM4MTczODEzNw== | russss 45057 | 2018-04-16T20:27:43Z | 2018-04-16T20:27:43Z | CONTRIBUTOR | Tests now fixed, honest. The failing test on Travis looks like an intermittent sqlite failure which should resolve itself on a retry... |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Don't duplicate simple primary keys in the link column 314455877 | |
381763651 | https://github.com/simonw/datasette/issues/203#issuecomment-381763651 | https://api.github.com/repos/simonw/datasette/issues/203 | MDEyOklzc3VlQ29tbWVudDM4MTc2MzY1MQ== | russss 45057 | 2018-04-16T21:59:17Z | 2018-04-16T21:59:17Z | CONTRIBUTOR | Ah, I had no idea you could bind python functions into sqlite! I think the primary purpose of this issue has been served now - I'm going to close this and create a new issue for the only bit of this that hasn't been touched yet, which is (optionally) exposing units in the JSON API. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support for units 313837303 | |
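The technique mentioned above — binding Python functions into sqlite — is a one-liner with the stdlib `sqlite3` module (the function name here is just an illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Register a plain Python callable so it can be invoked from SQL:
# create_function(name, number_of_args, callable)
conn.create_function("reverse_string", 1, lambda s: s[::-1])

result = conn.execute("SELECT reverse_string('datasette')").fetchone()[0]
assert result == "ettesatad"
```

Functions registered this way are per-connection, so they would need to be registered on each connection Datasette opens.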
381905593 | https://github.com/simonw/datasette/pull/209#issuecomment-381905593 | https://api.github.com/repos/simonw/datasette/issues/209 | MDEyOklzc3VlQ29tbWVudDM4MTkwNTU5Mw== | russss 45057 | 2018-04-17T08:50:28Z | 2018-04-17T08:50:28Z | CONTRIBUTOR | I've added another commit which puts a class on each. Unfortunately the tests are still failing on 3.6, which is weird. I can't reproduce locally... |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Don't duplicate simple primary keys in the link column 314455877 | |
390250253 | https://github.com/simonw/datasette/issues/273#issuecomment-390250253 | https://api.github.com/repos/simonw/datasette/issues/273 | MDEyOklzc3VlQ29tbWVudDM5MDI1MDI1Mw== | rgieseke 198537 | 2018-05-18T15:49:52Z | 2018-05-18T15:49:52Z | CONTRIBUTOR | Shouldn't versioneer do that? E.g. 0.21+2.g1076c97 You'd need to install via |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out a way to have /-/version return current git commit hash 324451322 | |
390795067 | https://github.com/simonw/datasette/issues/276#issuecomment-390795067 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDM5MDc5NTA2Nw== | russss 45057 | 2018-05-21T21:55:57Z | 2018-05-21T21:55:57Z | CONTRIBUTOR | Well, we do have the capability to detect spatialite so my intention certainly wasn't to require it. I can see the advantage of having it as a plugin but it does touch a number of points in the code. I think I'm going to attack this by refactoring the necessary bits and seeing where that leads (which was my plan anyway). I think my main concern is - if I add certain plugin hooks for this, is anything else ever going to use them? I'm not sure I have an answer to that question yet, either way. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
391050113 | https://github.com/simonw/datasette/issues/276#issuecomment-391050113 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDM5MTA1MDExMw== | russss 45057 | 2018-05-22T16:13:00Z | 2018-05-22T16:13:00Z | CONTRIBUTOR | Yup, I'll have a think about it. My current thoughts are for spatialite we'll need to hook into the following places:
The rendering and querying hooks could also potentially be used to move the units support into a plugin. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
391059008 | https://github.com/simonw/datasette/pull/280#issuecomment-391059008 | https://api.github.com/repos/simonw/datasette/issues/280 | MDEyOklzc3VlQ29tbWVudDM5MTA1OTAwOA== | r4vi 565628 | 2018-05-22T16:40:27Z | 2018-05-22T16:40:27Z | CONTRIBUTOR | ```python
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Build Dockerfile with recent Sqlite + Spatialite 325373747 | |
391073009 | https://github.com/simonw/datasette/pull/279#issuecomment-391073009 | https://api.github.com/repos/simonw/datasette/issues/279 | MDEyOklzc3VlQ29tbWVudDM5MTA3MzAwOQ== | rgieseke 198537 | 2018-05-22T17:23:26Z | 2018-05-22T17:23:26Z | CONTRIBUTOR |
Yes! That's the default versioneer behaviour.
Should work now - it can be a two (for a tagged version), three or four item tuple.

```
In [2]: datasette.version
Out[2]: '0.12+292.ga70c2a8.dirty'

In [3]: datasette.version_info
Out[3]: ('0', '12+292', 'ga70c2a8', 'dirty')
``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add version number support with Versioneer 325352370 | |
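The `version_info` tuple shown above can be derived by splitting the versioneer string on dots — a sketch of the idea, not necessarily how the PR implements it:

```python
# Versioneer produces strings like "0.12+292.ga70c2a8.dirty":
# tag, commits-since-tag, short git hash, and a dirty-tree marker
version = "0.12+292.ga70c2a8.dirty"
version_info = tuple(version.split("."))
assert version_info == ("0", "12+292", "ga70c2a8", "dirty")

# A plain tagged release yields a two-item tuple
assert tuple("0.12".split(".")) == ("0", "12")
```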
391073267 | https://github.com/simonw/datasette/pull/279#issuecomment-391073267 | https://api.github.com/repos/simonw/datasette/issues/279 | MDEyOklzc3VlQ29tbWVudDM5MTA3MzI2Nw== | rgieseke 198537 | 2018-05-22T17:24:16Z | 2018-05-22T17:24:16Z | CONTRIBUTOR | Sorry, just realised you rely on |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add version number support with Versioneer 325352370 | |
391077700 | https://github.com/simonw/datasette/pull/279#issuecomment-391077700 | https://api.github.com/repos/simonw/datasette/issues/279 | MDEyOklzc3VlQ29tbWVudDM5MTA3NzcwMA== | rgieseke 198537 | 2018-05-22T17:38:17Z | 2018-05-22T17:38:17Z | CONTRIBUTOR | Alright, that should work now -- let me know if you would prefer any different behaviour. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add version number support with Versioneer 325352370 | |
391141391 | https://github.com/simonw/datasette/pull/280#issuecomment-391141391 | https://api.github.com/repos/simonw/datasette/issues/280 | MDEyOklzc3VlQ29tbWVudDM5MTE0MTM5MQ== | r4vi 565628 | 2018-05-22T21:08:39Z | 2018-05-22T21:08:39Z | CONTRIBUTOR | I'm going to clean this up for consistency tomorrow morning so hold off merging until then please On Tue, May 22, 2018 at 6:34 PM, Simon Willison notifications@github.com wrote:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Build Dockerfile with recent Sqlite + Spatialite 325373747 | |
391290271 | https://github.com/simonw/datasette/pull/280#issuecomment-391290271 | https://api.github.com/repos/simonw/datasette/issues/280 | MDEyOklzc3VlQ29tbWVudDM5MTI5MDI3MQ== | r4vi 565628 | 2018-05-23T09:53:38Z | 2018-05-23T09:53:38Z | CONTRIBUTOR | Running:
is now returning FTS5 enabled in the versions output:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Build Dockerfile with recent Sqlite + Spatialite 325373747 | |
391355030 | https://github.com/simonw/datasette/pull/280#issuecomment-391355030 | https://api.github.com/repos/simonw/datasette/issues/280 | MDEyOklzc3VlQ29tbWVudDM5MTM1NTAzMA== | r4vi 565628 | 2018-05-23T13:53:27Z | 2018-05-23T15:22:45Z | CONTRIBUTOR | No objections; It's good to go @simonw On Wed, 23 May 2018, 14:51 Simon Willison, notifications@github.com wrote:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Build Dockerfile with recent Sqlite + Spatialite 325373747 | |
391505930 | https://github.com/simonw/datasette/issues/276#issuecomment-391505930 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDM5MTUwNTkzMA== | russss 45057 | 2018-05-23T21:41:37Z | 2018-05-23T21:41:37Z | CONTRIBUTOR |
Ah I didn't mean that - I meant altering the SELECT query to fetch the data so that it ran a spatialite function to transform that specific column. I think that's less useful as a general-purpose plugin hook though, and it's not that hard to parse the WKB in Python (my default approach would be to use shapely, which is great, but geomet looks like an interesting pure-python alternative). |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
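Parsing a WKB point by hand really is straightforward, as the comment above suggests — a minimal stdlib sketch for standard WKB (note spatialite actually stores geometries in its own blob variant of WKB, so this is illustrative only):

```python
import struct

def parse_wkb_point(wkb: bytes):
    """Decode a standard WKB Point blob into an (x, y) tuple."""
    # Byte 0 is the byte-order flag: 1 = little-endian, 0 = big-endian
    endian = "<" if wkb[0] == 1 else ">"
    # Bytes 1-4 are the geometry type; 1 means Point
    (geom_type,) = struct.unpack(endian + "I", wkb[1:5])
    if geom_type != 1:
        raise ValueError("not a WKB Point")
    # Two 8-byte doubles follow: x (longitude), y (latitude)
    return struct.unpack(endian + "dd", wkb[5:21])

# Round-trip a point through the encoder to check
london = struct.pack("<BIdd", 1, 1, -0.1276, 51.5072)
assert parse_wkb_point(london) == (-0.1276, 51.5072)
```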
392825746 | https://github.com/simonw/datasette/issues/276#issuecomment-392825746 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDM5MjgyNTc0Ng== | russss 45057 | 2018-05-29T15:42:53Z | 2018-05-29T15:42:53Z | CONTRIBUTOR | I haven't had time to look further into this, but if doing this as a plugin results in useful hooks then I think we should do it that way. We could always require the plugin as a standard dependency. I think this is going to result in quite a bit of refactoring anyway so it's a good time to add hooks regardless. On the other hand, if we have to add lots of specialist hooks for it then maybe it's worth integrating into the core. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
393106520 | https://github.com/simonw/datasette/issues/276#issuecomment-393106520 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDM5MzEwNjUyMA== | russss 45057 | 2018-05-30T10:09:25Z | 2018-05-30T10:09:25Z | CONTRIBUTOR | I don't think it's unreasonable to only support spatialite geometries in a coordinate reference system which is at least transformable to WGS84. It would be nice to support different CRSes in the database so conversion to spatialite from the source data is lossless. I think the working CRS for datasette should be WGS84 though (leaflet requires it, for example) - it's just a case of calling |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
401310732 | https://github.com/simonw/datasette/issues/276#issuecomment-401310732 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDQwMTMxMDczMg== | psychemedia 82988 | 2018-06-29T10:05:04Z | 2018-06-29T10:07:25Z | CONTRIBUTOR | @russss Different map projections can presumably be handled on the client side using a leaflet plugin to transform the geometry (eg kartena/Proj4Leaflet), although the leaflet side would need to detect or be informed of the original projection? Another possibility would be to provide an easy way/guidance for users to create an FK'd table containing the WGS84 projection of a non-WGS84 geometry in the original/principal table? This could then act as a proxy for serving GeoJSON to the leaflet map? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
401312981 | https://github.com/simonw/datasette/issues/276#issuecomment-401312981 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDQwMTMxMjk4MQ== | russss 45057 | 2018-06-29T10:14:54Z | 2018-06-29T10:14:54Z | CONTRIBUTOR |
Well, as @simonw mentioned, GeoJSON only supports WGS84, and GeoJSON (and/or TopoJSON) is the standard we probably want to aim for. On-the-fly reprojection in spatialite is not an issue anyway, and in general I think you want to be serving stuff to web maps in WGS84 or Web Mercator. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
405022335 | https://github.com/simonw/datasette/issues/344#issuecomment-405022335 | https://api.github.com/repos/simonw/datasette/issues/344 | MDEyOklzc3VlQ29tbWVudDQwNTAyMjMzNQ== | russss 45057 | 2018-07-14T13:00:48Z | 2018-07-14T13:00:48Z | CONTRIBUTOR | Looks like this was a red herring actually, and heroku had a blip when I was testing it... |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
datasette publish heroku fails without name provided 341229113 | |
405026441 | https://github.com/simonw/datasette/issues/343#issuecomment-405026441 | https://api.github.com/repos/simonw/datasette/issues/343 | MDEyOklzc3VlQ29tbWVudDQwNTAyNjQ0MQ== | russss 45057 | 2018-07-14T14:17:14Z | 2018-07-14T14:17:14Z | CONTRIBUTOR | This probably depends on #294. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Render boolean fields better by default 341228846 | |
405026800 | https://github.com/simonw/datasette/issues/294#issuecomment-405026800 | https://api.github.com/repos/simonw/datasette/issues/294 | MDEyOklzc3VlQ29tbWVudDQwNTAyNjgwMA== | russss 45057 | 2018-07-14T14:24:31Z | 2018-07-14T14:24:31Z | CONTRIBUTOR | I had a quick look at this in relation to #343 and I feel like it might be worth modelling the inspected table metadata internally as an object rather than a dict. (We'd still have to serialise it back to JSON.) There are a few places where we rely on the structure of this metadata dict for various reasons, including in templates (and potentially also in user templates). It would be nice to have a reasonably well defined API for accessing metadata internally so that it's clearer what we're breaking. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
inspect should record column types 327365110 | |
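A sketch of what modelling the inspected table metadata as an object (still serialisable back to JSON) might look like — all field names here are hypothetical, not Datasette's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class TableMetadata:
    """Hypothetical typed wrapper around the inspected-metadata dict."""
    name: str
    columns: list
    primary_keys: list = field(default_factory=list)

    def to_json_dict(self):
        # Serialise back to the dict shape templates currently rely on
        return {
            "name": self.name,
            "columns": self.columns,
            "primary_keys": self.primary_keys,
        }

meta = TableMetadata(
    name="hospitals",
    columns=["id", "name", "lat", "lng"],
    primary_keys=["id"],
)
assert meta.to_json_dict()["columns"] == ["id", "name", "lat", "lng"]
```

The win is that attribute access (`meta.columns`) fails loudly on a typo, so breaking changes to the metadata structure surface immediately instead of via missing template variables.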
422821483 | https://github.com/simonw/datasette/issues/329#issuecomment-422821483 | https://api.github.com/repos/simonw/datasette/issues/329 | MDEyOklzc3VlQ29tbWVudDQyMjgyMTQ4Mw== | jaywgraves 418191 | 2018-09-19T14:17:42Z | 2018-09-19T14:17:42Z | CONTRIBUTOR | I'm using the docker image (0.23.2) and notice some differences/bugs between the docs and the published version with canned queries. (submitted a tiny doc fix also) I was able to build the docker container locally using I would like to run this in our Kubernetes cluster but don't want to publish a version in our internal registry if I don't have to. Thanks! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Travis should push tagged images to Docker Hub for each release 336465018 | |
422915450 | https://github.com/simonw/datasette/issues/329#issuecomment-422915450 | https://api.github.com/repos/simonw/datasette/issues/329 | MDEyOklzc3VlQ29tbWVudDQyMjkxNTQ1MA== | jaywgraves 418191 | 2018-09-19T18:45:02Z | 2018-09-20T10:50:50Z | CONTRIBUTOR | That works for me. Was able to pull the public image and no errors on my canned query. (~although a small rendering bug. I'll create an issue and if I have time today, a PR to fix~ this turned out to be my error.) Thanks for the quick response! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Travis should push tagged images to Docker Hub for each release 336465018 | |
429737929 | https://github.com/simonw/datasette/issues/366#issuecomment-429737929 | https://api.github.com/repos/simonw/datasette/issues/366 | MDEyOklzc3VlQ29tbWVudDQyOTczNzkyOQ== | gfrmin 416374 | 2018-10-15T07:32:57Z | 2018-10-15T07:32:57Z | CONTRIBUTOR | A very hacky solution is to write a now.json file forcing the use of v1 of the Zeit cloud, see https://github.com/slygent/datasette/commit/3ab824793ec6534b6dd87078aa46b11c4fa78ea3 This does work, at least. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Default built image size over Zeit Now 100MiB limit 369716228 | |
435768450 | https://github.com/simonw/datasette/issues/369#issuecomment-435768450 | https://api.github.com/repos/simonw/datasette/issues/369 | MDEyOklzc3VlQ29tbWVudDQzNTc2ODQ1MA== | gfrmin 416374 | 2018-11-05T06:31:59Z | 2018-11-05T06:31:59Z | CONTRIBUTOR | That would be ideal, but you know better than me whether the CSV streaming trick works for custom SQL queries. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Interface should show same JSON shape options for custom SQL queries 374953006 | |
435862009 | https://github.com/simonw/datasette/issues/371#issuecomment-435862009 | https://api.github.com/repos/simonw/datasette/issues/371 | MDEyOklzc3VlQ29tbWVudDQzNTg2MjAwOQ== | psychemedia 82988 | 2018-11-05T12:48:35Z | 2018-11-05T12:48:35Z | CONTRIBUTOR | I think you need to register a domain name you own separately in order to get a non-IP address address? https://www.digitalocean.com/docs/networking/dns/ |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
datasette publish digitalocean plugin 377156339 | |
436037692 | https://github.com/simonw/datasette/issues/370#issuecomment-436037692 | https://api.github.com/repos/simonw/datasette/issues/370 | MDEyOklzc3VlQ29tbWVudDQzNjAzNzY5Mg== | psychemedia 82988 | 2018-11-05T21:15:47Z | 2018-11-05T21:18:37Z | CONTRIBUTOR | In terms of integration with
The |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Integration with JupyterLab 377155320 | |
436042445 | https://github.com/simonw/datasette/issues/370#issuecomment-436042445 | https://api.github.com/repos/simonw/datasette/issues/370 | MDEyOklzc3VlQ29tbWVudDQzNjA0MjQ0NQ== | psychemedia 82988 | 2018-11-05T21:30:42Z | 2018-11-05T21:31:48Z | CONTRIBUTOR | Another route would be something like creating a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Integration with JupyterLab 377155320 | |
459915995 | https://github.com/simonw/datasette/issues/160#issuecomment-459915995 | https://api.github.com/repos/simonw/datasette/issues/160 | MDEyOklzc3VlQ29tbWVudDQ1OTkxNTk5NQ== | psychemedia 82988 | 2019-02-02T00:43:16Z | 2019-02-02T00:58:20Z | CONTRIBUTOR | Do you have any simple working examples of how to use If Use case is here: https://github.com/psychemedia/jupyterserverproxy-datasette-demo Trying to do a really simple |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ability to bundle and serve additional static files 278208011 | |
474280581 | https://github.com/simonw/datasette/issues/417#issuecomment-474280581 | https://api.github.com/repos/simonw/datasette/issues/417 | MDEyOklzc3VlQ29tbWVudDQ3NDI4MDU4MQ== | psychemedia 82988 | 2019-03-19T10:06:42Z | 2019-03-19T10:06:42Z | CONTRIBUTOR | This would be really interesting but several possibilities in use arise, I think? For example:
CSV files may also have messy names compared to the table you want. Or for an update CSV, may have the form |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette Library 421546944 | |
474282321 | https://github.com/simonw/datasette/issues/412#issuecomment-474282321 | https://api.github.com/repos/simonw/datasette/issues/412 | MDEyOklzc3VlQ29tbWVudDQ3NDI4MjMyMQ== | psychemedia 82988 | 2019-03-19T10:09:46Z | 2019-03-19T10:09:46Z | CONTRIBUTOR | Does this also relate to https://github.com/simonw/datasette/issues/283 and the ability to |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Linked Data(sette) 411257981 | |
483017176 | https://github.com/simonw/datasette/issues/431#issuecomment-483017176 | https://api.github.com/repos/simonw/datasette/issues/431 | MDEyOklzc3VlQ29tbWVudDQ4MzAxNzE3Ng== | psychemedia 82988 | 2019-04-14T16:58:37Z | 2019-04-14T16:58:37Z | CONTRIBUTOR | Hmm... nope... I see an updated timestamp from |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette doesn't reload when database file changes 432870248 | |
483202658 | https://github.com/simonw/datasette/issues/429#issuecomment-483202658 | https://api.github.com/repos/simonw/datasette/issues/429 | MDEyOklzc3VlQ29tbWVudDQ4MzIwMjY1OA== | psychemedia 82988 | 2019-04-15T10:48:01Z | 2019-04-15T10:48:01Z | CONTRIBUTOR | Minor UI observation:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
?_where=sql-fragment parameter for table views 432636432 | |
487537452 | https://github.com/simonw/datasette/pull/437#issuecomment-487537452 | https://api.github.com/repos/simonw/datasette/issues/437 | MDEyOklzc3VlQ29tbWVudDQ4NzUzNzQ1Mg== | russss 45057 | 2019-04-29T10:58:49Z | 2019-04-29T10:58:49Z | CONTRIBUTOR | I've just spotted that this implements #215. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add inspect and prepare_sanic hooks 438048318 | |
487542486 | https://github.com/simonw/datasette/pull/439#issuecomment-487542486 | https://api.github.com/repos/simonw/datasette/issues/439 | MDEyOklzc3VlQ29tbWVudDQ4NzU0MjQ4Ng== | russss 45057 | 2019-04-29T11:20:30Z | 2019-04-29T11:20:30Z | CONTRIBUTOR | Actually I think this is not the whole story because of the rowid issue. I'm going to think about this one a bit more. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[WIP] Add primary key to the extra_body_script hook arguments 438240541 | |
487686655 | https://github.com/simonw/datasette/pull/441#issuecomment-487686655 | https://api.github.com/repos/simonw/datasette/issues/441 | MDEyOklzc3VlQ29tbWVudDQ4NzY4NjY1NQ== | russss 45057 | 2019-04-29T18:14:25Z | 2019-04-29T18:14:25Z | CONTRIBUTOR | Subsidiary note which I forgot in the commit message: I've decided to give each view a short string name to aid in differentiating which view a hook is being called from. Since hooks are functions and not subclasses, and can get called from different places in the URL hierarchy, it's sometimes difficult to distinguish what data you're actually operating on. I think this will come in handy for other hooks as well. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add register_output_renderer hook 438437973 | |
487689477 | https://github.com/simonw/datasette/pull/424#issuecomment-487689477 | https://api.github.com/repos/simonw/datasette/issues/424 | MDEyOklzc3VlQ29tbWVudDQ4NzY4OTQ3Nw== | russss 45057 | 2019-04-29T18:22:40Z | 2019-04-29T18:22:40Z | CONTRIBUTOR | This is pretty conflicty because I forgot how to use git fetch. If you're interested in merging this I'll rewrite it against an actual modern checkout... |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Column types in inspected metadata 427429265 | |
487692377 | https://github.com/simonw/datasette/pull/424#issuecomment-487692377 | https://api.github.com/repos/simonw/datasette/issues/424 | MDEyOklzc3VlQ29tbWVudDQ4NzY5MjM3Nw== | russss 45057 | 2019-04-29T18:30:46Z | 2019-04-29T18:30:46Z | CONTRIBUTOR | Actually no, I ended up not using the inspected column types in my plugin, and the binary column issue can be solved a lot more simply, so I'll close this. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Column types in inspected metadata 427429265 | |
487723476 | https://github.com/simonw/datasette/pull/441#issuecomment-487723476 | https://api.github.com/repos/simonw/datasette/issues/441 | MDEyOklzc3VlQ29tbWVudDQ4NzcyMzQ3Ng== | russss 45057 | 2019-04-29T20:05:23Z | 2019-04-29T20:05:23Z | CONTRIBUTOR | This is the minimal example (I also included it in the docs):

```python
from datasette import hookimpl


def render_test(args, data, view_name):
    return {
        'body': 'Hello World',
        'content_type': 'text/plain'
    }


@hookimpl
def register_output_renderer():
    return {
        'extension': 'test',
        'callback': render_test
    }
```

I'm working on the GeoJSON one now and it should be ready soon. (I forgot I was going to run into the same problem as before - that Spatialite's stupid binary format isn't WKB and I have no way of altering the query to change that - but I've just managed to write some code to rearrange the bytes from Spatialite blob-geometry into WKB...) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add register_output_renderer hook 438437973 | |
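The byte-rearranging trick mentioned in the comment above can be sketched roughly as follows. This is an illustrative guess at the approach, not the plugin's actual code: a SpatiaLite geometry blob starts with a 0x00 byte, an endianness flag, a 4-byte SRID and a 32-byte MBR, followed by a 0x7C marker, the geometry body, and a 0xFE terminator, so WKB can be recovered by stitching the endianness flag back onto the geometry body.

```python
import struct


def spatialite_blob_to_wkb(blob: bytes) -> bytes:
    # Assumed SpatiaLite blob layout: 0x00 start byte, endian flag,
    # 4-byte SRID, 32-byte MBR, 0x7C MBR-end marker (39 bytes total),
    # then the geometry class + coordinates, then a 0xFE end byte.
    # WKB is simply: endian flag + geometry class + coordinates.
    assert blob[0] == 0x00 and blob[-1] == 0xFE, "not a SpatiaLite blob"
    return blob[1:2] + blob[39:-1]


# Build a little-endian WKB POINT(1.0 2.0) and wrap it in a fake
# SpatiaLite envelope to demonstrate the round trip:
wkb = struct.pack("<BIdd", 1, 1, 1.0, 2.0)
blob = (
    b"\x00" + wkb[:1] + struct.pack("<i", 4326)
    + struct.pack("<dddd", 1.0, 2.0, 1.0, 2.0)
    + b"\x7c" + wkb[1:] + b"\xfe"
)
print(spatialite_blob_to_wkb(blob) == wkb)  # → True
```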
487724539 | https://github.com/simonw/datasette/pull/441#issuecomment-487724539 | https://api.github.com/repos/simonw/datasette/issues/441 | MDEyOklzc3VlQ29tbWVudDQ4NzcyNDUzOQ== | russss 45057 | 2019-04-29T20:08:32Z | 2019-04-29T20:08:32Z | CONTRIBUTOR | I also just realised that I should be passing the datasette object into the hook function...as I just found I need it. So hold off merging until I've fixed that. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add register_output_renderer hook 438437973 | |
487735247 | https://github.com/simonw/datasette/pull/441#issuecomment-487735247 | https://api.github.com/repos/simonw/datasette/issues/441 | MDEyOklzc3VlQ29tbWVudDQ4NzczNTI0Nw== | russss 45057 | 2019-04-29T20:39:43Z | 2019-04-29T20:39:43Z | CONTRIBUTOR | I updated the hook to pass the datasette object through now. You can see the working GeoJSON render function here - the hook function is here. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add register_output_renderer hook 438437973 | |
487748271 | https://github.com/simonw/datasette/pull/441#issuecomment-487748271 | https://api.github.com/repos/simonw/datasette/issues/441 | MDEyOklzc3VlQ29tbWVudDQ4Nzc0ODI3MQ== | russss 45057 | 2019-04-29T21:20:17Z | 2019-04-29T21:20:17Z | CONTRIBUTOR | Also I just pushed a change to add registered output renderers to the templates: |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add register_output_renderer hook 438437973 | |
487859345 | https://github.com/simonw/datasette/pull/439#issuecomment-487859345 | https://api.github.com/repos/simonw/datasette/issues/439 | MDEyOklzc3VlQ29tbWVudDQ4Nzg1OTM0NQ== | russss 45057 | 2019-04-30T08:21:19Z | 2019-04-30T08:21:19Z | CONTRIBUTOR | I think the best approach to this is to pass through the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[WIP] Add primary key to the extra_body_script hook arguments 438240541 | |
488247617 | https://github.com/simonw/datasette/pull/441#issuecomment-488247617 | https://api.github.com/repos/simonw/datasette/issues/441 | MDEyOklzc3VlQ29tbWVudDQ4ODI0NzYxNw== | russss 45057 | 2019-05-01T09:57:50Z | 2019-05-01T09:57:50Z | CONTRIBUTOR | Just for the record, this PR is now finished and ready to merge from my perspective. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add register_output_renderer hook 438437973 | |
488595724 | https://github.com/simonw/datasette/pull/432#issuecomment-488595724 | https://api.github.com/repos/simonw/datasette/issues/432 | MDEyOklzc3VlQ29tbWVudDQ4ODU5NTcyNA== | russss 45057 | 2019-05-02T08:50:53Z | 2019-05-02T08:50:53Z | CONTRIBUTOR |
I was thinking that it might be handy for datasette to have a request object which wraps the Sanic Request. This could include the datasette-specific querystring decoding and the This would mean that we could expose the request object to plugin hooks without coupling them to Sanic. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Refactor facets to a class and new plugin, refs #427 432893491 | |
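The wrapper idea in the comment above might look something like this sketch (class and attribute names are hypothetical, not Datasette's actual API):

```python
class DatasetteRequest:
    """Thin wrapper that decouples plugin hooks from the underlying
    web framework's request object (Sanic, in this comment's era)."""

    def __init__(self, raw_request):
        self._raw = raw_request

    @property
    def path(self):
        return getattr(self._raw, "path", "/")

    @property
    def args(self):
        # Datasette-specific querystring decoding would live here;
        # for this sketch we just copy the framework's parsed args.
        return dict(getattr(self._raw, "args", None) or {})


# Plugins would receive DatasetteRequest, never the framework request:
class FakeSanicRequest:
    path = "/fixtures/table"
    args = {"_size": "10"}


req = DatasetteRequest(FakeSanicRequest())
print(req.path, req.args)  # → /fixtures/table {'_size': '10'}
```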
489060765 | https://github.com/simonw/datasette/issues/419#issuecomment-489060765 | https://api.github.com/repos/simonw/datasette/issues/419 | MDEyOklzc3VlQ29tbWVudDQ4OTA2MDc2NQ== | russss 45057 | 2019-05-03T11:07:42Z | 2019-05-03T11:07:42Z | CONTRIBUTOR | Are you planning on removing inspect entirely? I didn't spot this work before I started on datasette-geo, but ironically I think it has a use case which really needs the inspect functionality (or some replacement). Datasette-geo uses it to store the bounding box of all the geographic features in the table. This is needed when rendering the map because it avoids having to send loads of tile requests for areas which are empty. Even with relatively small datasets, calculating the bounding box seems to take around 5 seconds, so I don't think it's really feasible to do this on page load. One possible fix would be to do this on startup, and then in a thread which watches the database for changes. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Default to opening files in mutable mode, special option for immutable files 421551434 | |
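The bounding-box pre-computation described in the comment above could, in outline, be a single aggregate query run at startup and re-run when the database changes (a sketch using plain lat/lng columns rather than SpatiaLite geometries):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE features (name TEXT, lat REAL, lng REAL)")
conn.executemany("INSERT INTO features VALUES (?, ?, ?)", [
    ("a", 51.5, -0.1), ("b", 48.9, 2.3), ("c", 40.7, -74.0),
])

# One aggregate query gives the bounding box of every feature; the
# result could be cached at startup and refreshed by a watcher thread,
# as the comment suggests, instead of being computed per page load.
bbox = conn.execute(
    "SELECT min(lat), min(lng), max(lat), max(lng) FROM features"
).fetchone()
print(bbox)  # → (40.7, -74.0, 51.5, 2.3)
```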
489105665 | https://github.com/simonw/datasette/pull/434#issuecomment-489105665 | https://api.github.com/repos/simonw/datasette/issues/434 | MDEyOklzc3VlQ29tbWVudDQ4OTEwNTY2NQ== | eyeseast 25778 | 2019-05-03T14:01:30Z | 2019-05-03T14:01:30Z | CONTRIBUTOR | This is exactly what I needed. Thank you. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette publish cloudrun" command to publish to Google Cloud Run 434321685 | |
489163939 | https://github.com/simonw/datasette/pull/434#issuecomment-489163939 | https://api.github.com/repos/simonw/datasette/issues/434 | MDEyOklzc3VlQ29tbWVudDQ4OTE2MzkzOQ== | rprimet 10352819 | 2019-05-03T16:49:45Z | 2019-05-03T16:50:03Z | CONTRIBUTOR |
Yes, I was able to reproduce this; I used to get prompted for a run region interactively by the Not sure which course of action is best: making |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette publish cloudrun" command to publish to Google Cloud Run 434321685 | |
489221481 | https://github.com/simonw/datasette/issues/446#issuecomment-489221481 | https://api.github.com/repos/simonw/datasette/issues/446 | MDEyOklzc3VlQ29tbWVudDQ4OTIyMTQ4MQ== | russss 45057 | 2019-05-03T19:58:31Z | 2019-05-03T19:58:31Z | CONTRIBUTOR | In this particular case I don't think there's an issue making all those required. However, I suspect we might have to allow optional values at some point - my preferred solution to russss/datasette-geo#2 would need one. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Define mechanism for plugins to return structured data 440134714 | |
489222223 | https://github.com/simonw/datasette/issues/446#issuecomment-489222223 | https://api.github.com/repos/simonw/datasette/issues/446 | MDEyOklzc3VlQ29tbWVudDQ4OTIyMjIyMw== | russss 45057 | 2019-05-03T20:01:19Z | 2019-05-03T20:01:29Z | CONTRIBUTOR | Also I have a slight preference against (ab)using |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Define mechanism for plugins to return structured data 440134714 | |
489342728 | https://github.com/simonw/datasette/pull/450#issuecomment-489342728 | https://api.github.com/repos/simonw/datasette/issues/450 | MDEyOklzc3VlQ29tbWVudDQ4OTM0MjcyOA== | russss 45057 | 2019-05-04T16:37:35Z | 2019-05-04T16:37:35Z | CONTRIBUTOR | For a bit more context: this fixes a crash with |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Coalesce hidden table count to 0 440304714 | |
499320973 | https://github.com/simonw/datasette/issues/394#issuecomment-499320973 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDQ5OTMyMDk3Mw== | kevindkeogh 13896256 | 2019-06-06T02:07:59Z | 2019-06-06T02:07:59Z | CONTRIBUTOR | Hey was this ever merged? Trying to run this behind nginx, and encountering this issue. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
base_url configuration setting 396212021 | |
499923145 | https://github.com/simonw/datasette/issues/394#issuecomment-499923145 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDQ5OTkyMzE0NQ== | kevindkeogh 13896256 | 2019-06-07T15:10:57Z | 2019-06-07T15:11:07Z | CONTRIBUTOR | Putting this here in case anyone else encounters the same issue with nginx, I was able to resolve it by passing the header in the nginx proxy config (i.e., |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
base_url configuration setting 396212021 | |
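The directive cut off in the comment above is presumably set inside an nginx proxy block along these lines (paths, ports, and the specific header are illustrative guesses, not taken from the comment):

```nginx
location /datasette/ {
    proxy_pass http://127.0.0.1:8001/;
    # Forward the original Host header so Datasette generates
    # links that match the proxied URL
    proxy_set_header Host $host;
}
```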
504662904 | https://github.com/simonw/datasette/issues/514#issuecomment-504662904 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwNDY2MjkwNA== | russss 45057 | 2019-06-22T12:45:21Z | 2019-06-22T12:45:39Z | CONTRIBUTOR | On most modern Linux distros, systemd is the easiest answer. Example systemd unit file:

```ini
[Service]
Type=simple
User=<username>
WorkingDirectory=/path/to/data
ExecStart=/path/to/datasette serve -h 0.0.0.0 ./my.db
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Activate it with:
Logs are best viewed using |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
504663766 | https://github.com/simonw/datasette/issues/514#issuecomment-504663766 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwNDY2Mzc2Ng== | russss 45057 | 2019-06-22T12:57:59Z | 2019-06-22T12:57:59Z | CONTRIBUTOR |
I wasn't even aware it was possible to add a systemd service at an arbitrary path, but it seems a little messy to me. Maybe worth noting that systemd does support per-user services which don't require root access. Cool but probably overkill for most people (especially when you're going to need root to listen on port 80 anyway, directly or via a reverse proxy). |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
504684831 | https://github.com/simonw/datasette/issues/514#issuecomment-504684831 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwNDY4NDgzMQ== | russss 45057 | 2019-06-22T17:38:23Z | 2019-06-22T17:38:23Z | CONTRIBUTOR |
It's the working directory (cwd) of the spawned process. In this case if you set it to the directory your data is in, you can use relative paths to the db (and metadata/templates/etc) in the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
504690927 | https://github.com/simonw/datasette/issues/514#issuecomment-504690927 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwNDY5MDkyNw== | russss 45057 | 2019-06-22T19:06:07Z | 2019-06-22T19:06:07Z | CONTRIBUTOR | I'd rather not turn this into a systemd support thread, but you're trying to execute the package directory there. Your datasette executable is probably at |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
504809397 | https://github.com/simonw/datasette/issues/523#issuecomment-504809397 | https://api.github.com/repos/simonw/datasette/issues/523 | MDEyOklzc3VlQ29tbWVudDUwNDgwOTM5Nw== | rixx 2657547 | 2019-06-24T01:38:14Z | 2019-06-24T01:38:14Z | CONTRIBUTOR | Ah, apologies – I had found and read those issues, but I was under the impression that they referred only to the filtered row count, not the unfiltered total row count. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Show total/unfiltered row count when filtering 459627549 | |
509013413 | https://github.com/simonw/datasette/issues/507#issuecomment-509013413 | https://api.github.com/repos/simonw/datasette/issues/507 | MDEyOklzc3VlQ29tbWVudDUwOTAxMzQxMw== | psychemedia 82988 | 2019-07-07T16:31:57Z | 2019-07-07T16:31:57Z | CONTRIBUTOR | Chrome and Firefox both support headless screengrabs from command line, but I don't know how parameterised they can be? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Every datasette plugin on the ecosystem page should have a screenshot 455852801 | |
509618339 | https://github.com/simonw/datasette/pull/554#issuecomment-509618339 | https://api.github.com/repos/simonw/datasette/issues/554 | MDEyOklzc3VlQ29tbWVudDUwOTYxODMzOQ== | abdusco 3243482 | 2019-07-09T12:16:32Z | 2019-07-09T12:16:32Z | CONTRIBUTOR | I've also added another fix for using static mounts with absolute paths on Windows. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Fix static mounts using relative paths and prevent traversal exploits 465728430 | |
509629331 | https://github.com/simonw/datasette/pull/554#issuecomment-509629331 | https://api.github.com/repos/simonw/datasette/issues/554 | MDEyOklzc3VlQ29tbWVudDUwOTYyOTMzMQ== | abdusco 3243482 | 2019-07-09T12:51:35Z | 2019-07-09T12:51:35Z | CONTRIBUTOR | I wanted to add a test for it too, but I've realized it's impossible to test a server process as we cannot get its exit code.

```python
# tests/test_cli.py
import sys

from click.testing import CliRunner

from datasette.cli import cli


def test_static_mounts_on_windows():
    if sys.platform != "win32":
        return
    runner = CliRunner()
    result = runner.invoke(
        cli, ["serve", "--static", "s:C:\\"]
    )
    assert result.exit_code == 0
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Fix static mounts using relative paths and prevent traversal exploits 465728430 | |
510730200 | https://github.com/simonw/datasette/issues/511#issuecomment-510730200 | https://api.github.com/repos/simonw/datasette/issues/511 | MDEyOklzc3VlQ29tbWVudDUxMDczMDIwMA== | abdusco 3243482 | 2019-07-12T03:23:22Z | 2019-07-12T03:23:22Z | CONTRIBUTOR | @simonw yes, it works fine on Windows, but the test suite doesn't run properly; for that I had to use WSL |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Get Datasette tests passing on Windows in GitHub Actions 456578474 | |
527209840 | https://github.com/simonw/sqlite-utils/pull/56#issuecomment-527209840 | https://api.github.com/repos/simonw/sqlite-utils/issues/56 | MDEyOklzc3VlQ29tbWVudDUyNzIwOTg0MA== | amjith 49260 | 2019-09-02T17:23:21Z | 2019-09-02T17:23:21Z | CONTRIBUTOR | I have updated the other PR with the changes from this one and added tests. I have also changed the escaping from double quotes to brackets. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Escape the table name in populate_fts and search. 487847945 | |
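The bracket escaping mentioned in the comment above can be sketched like this (a rough illustration of the idea, not the PR's code; SQLite accepts [name] as a quoted identifier for SQL Server compatibility):

```python
import sqlite3


def escape_identifier(name: str) -> str:
    # Bracket-style quoting; identifiers containing "]" are not
    # handled by this simple sketch.
    return "[{}]".format(name)


conn = sqlite3.connect(":memory:")
table = escape_identifier("my table")  # a name with a space in it
conn.execute("CREATE TABLE {} (value TEXT)".format(table))
conn.execute("INSERT INTO {} VALUES ('hello')".format(table))
print(conn.execute("SELECT value FROM {}".format(table)).fetchone()[0])  # → hello
```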
527211047 | https://github.com/simonw/sqlite-utils/pull/57#issuecomment-527211047 | https://api.github.com/repos/simonw/sqlite-utils/issues/57 | MDEyOklzc3VlQ29tbWVudDUyNzIxMTA0Nw== | amjith 49260 | 2019-09-02T17:30:43Z | 2019-09-02T17:30:43Z | CONTRIBUTOR | I have merged the other PR (#56) into this one. I have incorporated your suggestions. Cheers! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add triggers while enabling FTS 487987958 | |
533818697 | https://github.com/simonw/sqlite-utils/issues/61#issuecomment-533818697 | https://api.github.com/repos/simonw/sqlite-utils/issues/61 | MDEyOklzc3VlQ29tbWVudDUzMzgxODY5Nw== | amjith 49260 | 2019-09-21T18:09:01Z | 2019-09-21T18:09:28Z | CONTRIBUTOR | @witeshadow The library version doesn't have helpers around CSV (at least not from what I can see in the code). But here's a snippet that makes it easy to insert from CSV using the library.

```python
import csv
from sqlite_utils import Database

# CSV Reader
csv_file = open("filename.csv")  # open the csv file
reader = csv.reader(csv_file)  # Create a CSV reader
headers = next(reader)  # First line is the header
docs = (dict(zip(headers, row)) for row in reader)
```

Now you can use the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
importing CSV to SQLite as library 491219910 | |
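The same header-row-to-dicts pattern from the snippet above can be shown end to end without sqlite-utils, using only the standard library (a self-contained sketch, not from the original comment):

```python
import csv
import io
import sqlite3


def import_csv(conn, table, csv_text):
    # Read the header row, turn each remaining row into values,
    # and insert them all into a freshly created table.
    reader = csv.reader(io.StringIO(csv_text))
    headers = next(reader)
    cols = ", ".join('"{}"'.format(h) for h in headers)
    placeholders = ", ".join("?" for _ in headers)
    conn.execute('CREATE TABLE IF NOT EXISTS "{}" ({})'.format(table, cols))
    conn.executemany(
        'INSERT INTO "{}" VALUES ({})'.format(table, placeholders), reader
    )
    conn.commit()


conn = sqlite3.connect(":memory:")
import_csv(conn, "people", "name,age\nalice,30\nbob,25\n")
print(conn.execute("SELECT count(*) FROM people").fetchone()[0])  # → 2
```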
541052329 | https://github.com/simonw/datasette/issues/585#issuecomment-541052329 | https://api.github.com/repos/simonw/datasette/issues/585 | MDEyOklzc3VlQ29tbWVudDU0MTA1MjMyOQ== | rixx 2657547 | 2019-10-11T12:53:51Z | 2019-10-11T12:53:51Z | CONTRIBUTOR | I think this would be good, yeah – currently, databases are explicitly sorted by name in the IndexView, we could just remove that part (and use an |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Databases on index page should display in order they were passed to "datasette serve"? 503217375 | |
541118904 | https://github.com/simonw/datasette/issues/507#issuecomment-541118904 | https://api.github.com/repos/simonw/datasette/issues/507 | MDEyOklzc3VlQ29tbWVudDU0MTExODkwNA== | rixx 2657547 | 2019-10-11T15:48:49Z | 2019-10-11T15:48:49Z | CONTRIBUTOR | Headless Chrome and Firefox via Selenium are a solid choice in my experience. You may be interested in how pretix and pretalx solve this problem: They use pytest to create those screenshots on release to make sure they are up to date. See this writeup and this repo. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Every datasette plugin on the ecosystem page should have a screenshot 455852801 | |
541119038 | https://github.com/simonw/datasette/issues/512#issuecomment-541119038 | https://api.github.com/repos/simonw/datasette/issues/512 | MDEyOklzc3VlQ29tbWVudDU0MTExOTAzOA== | rixx 2657547 | 2019-10-11T15:49:13Z | 2019-10-11T15:49:13Z | CONTRIBUTOR | How open are you to changing the config variable names (with appropriate deprecation, of course)? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"about" parameter in metadata does not appear when alone 457147936 | |
541562581 | https://github.com/simonw/datasette/pull/590#issuecomment-541562581 | https://api.github.com/repos/simonw/datasette/issues/590 | MDEyOklzc3VlQ29tbWVudDU0MTU2MjU4MQ== | rixx 2657547 | 2019-10-14T08:57:46Z | 2019-10-14T08:57:46Z | CONTRIBUTOR | Ah, thank you – I saw the need for unit tests but wasn't sure what the best way to add one would be. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spaces in DB names 505818256 | |
541587823 | https://github.com/simonw/datasette/pull/590#issuecomment-541587823 | https://api.github.com/repos/simonw/datasette/issues/590 | MDEyOklzc3VlQ29tbWVudDU0MTU4NzgyMw== | rixx 2657547 | 2019-10-14T09:58:23Z | 2019-10-14T09:58:23Z | CONTRIBUTOR | Added tests. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spaces in DB names 505818256 | |
544008463 | https://github.com/simonw/datasette/pull/601#issuecomment-544008463 | https://api.github.com/repos/simonw/datasette/issues/601 | MDEyOklzc3VlQ29tbWVudDU0NDAwODQ2Mw== | rixx 2657547 | 2019-10-18T23:39:21Z | 2019-10-18T23:39:21Z | CONTRIBUTOR | That looks right, and I completely agree with the intent. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Don't auto-format SQL on page load 509340359 | |
544008944 | https://github.com/simonw/datasette/pull/601#issuecomment-544008944 | https://api.github.com/repos/simonw/datasette/issues/601 | MDEyOklzc3VlQ29tbWVudDU0NDAwODk0NA== | rixx 2657547 | 2019-10-18T23:40:48Z | 2019-10-18T23:40:48Z | CONTRIBUTOR | The only negative impact that comes to mind is that now you have no way to get the read-only query to be formatted nicely, I think, so maybe a second PR adding the formatting functionality even to the read-only page would be good? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Don't auto-format SQL on page load 509340359 | |
544214418 | https://github.com/simonw/datasette/pull/601#issuecomment-544214418 | https://api.github.com/repos/simonw/datasette/issues/601 | MDEyOklzc3VlQ29tbWVudDU0NDIxNDQxOA== | rixx 2657547 | 2019-10-20T02:29:49Z | 2019-10-20T02:29:49Z | CONTRIBUTOR | Submitted in #602! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Don't auto-format SQL on page load 509340359 | |
549246007 | https://github.com/simonw/datasette/pull/602#issuecomment-549246007 | https://api.github.com/repos/simonw/datasette/issues/602 | MDEyOklzc3VlQ29tbWVudDU0OTI0NjAwNw== | rixx 2657547 | 2019-11-04T07:29:33Z | 2019-11-04T07:29:33Z | CONTRIBUTOR | Not sure – I'm always a bit weirded out when elements that I clicked disappear on me. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Offer to format readonly SQL 509535510 | |
552134876 | https://github.com/dogsheep/twitter-to-sqlite/issues/29#issuecomment-552134876 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29 | MDEyOklzc3VlQ29tbWVudDU1MjEzNDg3Ng== | jacobian 21148 | 2019-11-09T20:33:38Z | 2019-11-09T20:33:38Z | CONTRIBUTOR | ❤️ thanks! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
`import` command fails on empty files 518725064 |
```sql
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
```