issue_comments
996 rows where author_association = "NONE" sorted by updated_at
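A listing like the one above corresponds to a simple filtered, sorted query against the SQLite table. A minimal sketch of the idea, assuming a hypothetical `issue_comments` schema with `author_association` and `updated_at` columns (the real table has many more):

```python
import sqlite3

# Build a tiny stand-in for the issue_comments table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute(
    "create table issue_comments (id integer primary key, "
    "author_association text, updated_at text)"
)
conn.executemany(
    "insert into issue_comments (author_association, updated_at) values (?, ?)",
    [
        ("NONE", "2021-01-02"),
        ("MEMBER", "2021-01-03"),
        ("NONE", "2021-01-01"),
    ],
)

# The query behind: rows where author_association = "NONE" sorted by updated_at.
rows = conn.execute(
    "select id, updated_at from issue_comments "
    "where author_association = ? order by updated_at",
    ("NONE",),
).fetchall()
print(rows)  # [(3, '2021-01-01'), (1, '2021-01-02')]
```

This is only an illustration of the query shape; the actual page is produced by Datasette from URL querystring parameters.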
issue facet (622 values)
- Transformation type `--type DATETIME` 14
- link_or_copy_directory() error - Invalid cross-device link 13
- WIP: Add Gmail takeout mbox import 12
- .json and .csv exports fail to apply base_url 11
- base_url configuration setting 10
- Extract columns cannot create foreign key relation: sqlite3.OperationalError: table sqlite_master may not be modified 10
- Documentation with recommendations on running Datasette in production without using Docker 9
- JavaScript plugin hooks mechanism similar to pluggy 9
- Add GraphQL endpoint 8
- Call for birthday presents: if you're using Datasette, let us know how you're using it here 8
- Full text search of all tables at once? 7
- Populate "endpoint" key in ASGI scope 7
- Figure out some interesting example SQL queries 7
- Add Gmail takeout mbox import (v2) 7
- Incorrect URLs when served behind a proxy with base_url set 6
- publish heroku does not work on Windows 10 6
- Update for Big Sur 6
- Improve the display of facets information 6
- De-tangling Metadata before Datasette 1.0 6
- Metadata should be a nested arbitrary KV store 5
- Windows installation error 5
- Ways to improve fuzzy search speed on larger data sets? 5
- Redesign default .json format 5
- UNIQUE constraint failed: workouts.id 5
- Feature Request: Gmail 5
- Plugin hook for dynamic metadata 5
- i18n support 5
- datasette --root running in Docker doesn't reliably show the magic URL 5
- Datasette serve should accept paths/URLs to CSVs and other file formats 4
- Mechanism for ranking results from SQLite full-text search 4
- Port Datasette to ASGI 4
- Wildcard support in query parameters 4
- Handle really wide tables better 4
- Prototoype for Datasette on PostgreSQL 4
- Support column descriptions in metadata.json 4
- .delete_where() does not auto-commit (unlike .insert() or .upsert()) 4
- "Stream all rows" is not at all obvious 4
- Possible to deploy as a python app (for Rstudio connect server)? 4
- Document how to send multiple values for "Named parameters" 4
- Add support for Jinja2 version 3.0 4
- Win32 "used by another process" error with datasette publish 4
- introduce new option for datasette package to use a slim base image 4
- CLI eats my cursor 4
- datasette package --spatialite throws error during build 4
- How to redirect from "/" to a specific db/table 4
- Package as standalone binary 3
- Plugin that adds an authentication layer of some sort 3
- datasette publish lambda plugin 3
- Explore if SquashFS can be used to shrink size of packaged Docker containers 3
- make uvicorn optional dependancy (because not ok on windows python yet) 3
- bump uvicorn to 0.9.0 to be Python-3.8 friendly 3
- updating metadata.json without recreating the app 3
- upsert_all() throws issue when upserting to empty table 3
- base_url doesn't seem to work when adding criteria and clicking "apply" 3
- Fallback to databases in inspect-data.json when no -i options are passed 3
- Some workout columns should be float, not text 3
- Archive import appears to be broken on recent exports 3
- Use structlog for logging 3
- KeyError: 'Contents' on running upload 3
- photo-to-sqlite: command not found 3
- sqlite-utils extract could handle nested objects 3
- Update Docker Spatialite version to 5.0.1 + add support for Spatialite topology functions 3
- improve table horizontal scroll experience 3
- feature: support "events" 3
- Rename Datasette.__init__(config=) parameter to settings= 3
- [Enhancement] Please allow 'insert-files' to insert content as text. 3
- KeyError: 'created_at' for private accounts? 3
- JSON link on row page is 404 if base_url setting is used 3
- Creating tables with custom datatypes 3
- query result page is using 400mb of browser memory 40x size of html page and 400x size of csv data 3
- SQL query field can't begin by a comment 3
- Feature request: output number of ignored/replaced rows for insert command 3
- Expand foreign key references in row view as well 3
- When reverse proxying datasette with nginx an URL element gets erronously added 3
- Link to JSON for the list of tables 2
- Option to open readonly but not immutable 2
- Support WITH query 2
- I18n and L10n support 2
- add "format sql" button to query page, uses sql-formatter 2
- 500 from missing table name 2
- Ability to sort (and paginate) by column 2
- Figure out how to bundle a more up-to-date SQLite 2
- Escaping named parameters in canned queries 2
- Validate metadata.json on startup 2
- Support cross-database joins 2
- datasette inspect takes a very long time on large dbs 2
- Installation instructions, including how to use the docker image 2
- Problems handling column names containing spaces or - 2
- Zeit API v1 does not work for new users - need to migrate to v2 2
- How to pass named parameter into spatialite MakePoint() function 2
- Datasette Library 2
- Mechanism for turning nested JSON into foreign keys / many-to-many 2
- Too many SQL variables 2
- "Invalid SQL" page should let you edit the SQL 2
- Support Python 3.8, stop supporting Python 3.5 2
- Make database level information from metadata.json available in the index.html template 2
- Mechanism for adding arbitrary pages like /about 2
- Exception running first command: IndexError: list index out of range 2
- Allow creation of virtual tables at startup 2
- Escape_fts5_query-hookimplementation does not work with queries to standard tables 2
- Allow injecting configuration data from plugins 2
- --cp option for datasette publish and datasette package for shipping additional files and directories 2
- ?_searchmode=raw option for running FTS searches without escaping characters 2
- Authentication (and permissions) as a core concept 2
- Update aiofiles requirement from ~=0.4.0 to >=0.4,<0.6 2
- [Feature Request] Support Repo Name in Search 🥺 2
- Consider pagination of canned queries 2
- initial windows ci setup 2
- github-to-sqlite should handle rate limits better 2
- .extract() shouldn't extract null values 2
- Make it possible to download BLOB data from the Datasette UI 2
- changes to allow for compound foreign keys 2
- Support for generated columns 2
- sqlite-utils should suggest --csv if JSON parsing fails 2
- Better error message for *_fts methods against views 2
- Access Denied Error in Windows 2
- Not all quoted statuses get fetched? 2
- SSL Error 2
- Installing datasette via docker: Path 'fixtures.db' does not exist 2
- Share button for copying current URL 2
- Facets timing out but work when filtering 2
- I'm creating a plugin to export a spreadsheet file (.ods or .xlsx) 2
- Update itsdangerous requirement from ~=1.1 to >=1.1,<3.0 2
- bool type not supported 2
- Cannot set type JSON 2
- basic support for events 2
- Serve all db files in a folder 2
- feature request: document minimum permissions for service account for cloudrun 2
- Manage /robots.txt in Datasette core, block robots by default 2
- Deploy a live instance of demos/apache-proxy 2
- Use datasette-table Web Component to guide the design of the JSON API for 1.0 2
- Support for CHECK constraints 2
- Table+query JSON and CSV links broken when using `base_url` setting 2
- Make it easier to insert geometries, with documentation and maybe code 2
- base_url or prefix does not work with _exact match 2
- `deterministic=True` fails on versions of SQLite prior to 3.8.3 2
- [feature] immutable mode for a directory, not just individual sqlite file 2
- `sqlite3.NotSupportedError`: deterministic=True requires SQLite 3.8.3 or higher 2
- Research: demonstrate if parallel SQL queries are worthwhile 2
- Allow making m2m relation of a table to itself 2
- illegal UTF-16 surrogate 2
- Reading rows from a file => AttributeError: '_io.StringIO' object has no attribute 'readinto' 2
- Ability to insert multi-line files 2
- Setting to turn off table row counts entirely 2
- devrel/python api: Pylance type hinting 2
- Reconsider the Datasette first-run experience 2
- don't use immutable=1, only mode=ro 2
- Datasette with many and large databases > Memory use 2
- Cannot enable FTS5 despite it being available 2
- DOC: xml.etree.ElementTree.ParseError due to healthkit version 12 2
- Suggestion: Hiding columns 2
- How to use Datasette with apache webserver on GCP? 2
- Character encoding problem 2
- feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 2
- GitHub Action to lint Python code with ruff 2
- 500 "attempt to write a readonly database" error caused by "PRAGMA schema_version" 2
- photos-to-sql not found? 2
- Permissions in metadata.yml / metadata.json 2
- [feature request]`datasette install plugins.json` options 2
- Plugin hook for database queries that are run 2
- TemplateAssertionError: no filter named 'tojson' 1
- TemplateAssertionError: no filter named 'tojson' 1
- datasette publish can fail if /tmp is on a different device 1
- apsw as alternative sqlite3 binding (for full text search) 1
- Ability to customize presentation of specific columns in HTML view 1
- A primary key column that has foreign key restriction associated won't rendering label column 1
- proposal new option to disable user agents cache 1
- Ability to bundle metadata and templates inside the SQLite file 1
- Cleaner mechanism for handling custom errors 1
- Allow plugins to define additional URL routes and views 1
- prepare_context() plugin hook 1
- SQLite code decoupled from Datasette 1
- Add new metadata key persistent_urls which removes the hash from all database urls 1
- Add links to example Datasette instances to appropiate places in docs 1
- Documentation for URL hashing, redirects and cache policy 1
- Handle spatialite geometry columns better 1
- Support for external database connectors 1
- Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way 1
- render_cell(value) plugin hook 1
- Search all apps during heroku publish 1
- CSV export in "Advanced export" pane doesn't respect query 1
- How to pass configuration to plugins? 1
- How does persistence work? 1
- .insert/.upsert/.insert_all/.upsert_all should add missing columns 1
- Add query parameter to hide SQL textarea 1
- Upgrade to Jinja2==2.10.1 1
- Option to facet by date using month or year 1
- Additional options to gcloud build command in cloudrun - timeout 1
- Accessibility for non-techie newsies? 1
- Exporting sqlite database(s)? 1
- Option to display binary data 1
- Get Datasette tests passing on Windows in GitHub Actions 1
- "about" parameter in metadata does not appear when alone 1
- Is it possible to publish to Heroku despite slug size being too large? 1
- Handle case-insensitive headers in a nicer way 1
- Stream all results for arbitrary SQL and canned queries 1
- Use keyed rows - fixes #521 1
- Support unicode in url 1
- extracts= option for insert/update/etc 1
- Unexpected keyword argument 'hidden' 1
- Datasette Edit 1
- Ability to list views, and to access db["view_name"].rows / rows_where / etc 1
- Added support for multi arch builds 1
- Queries per DB table in metadata.json 1
- upgrade to uvicorn-0.9 to be Python-3.8 friendly 1
- Support queries at the table level 1
- Datasette FTS detection bug 1
- "friends" command (similar to "followers") 1
- Publish to Heroku is broken: "WARNING: You must pass the application as an import string to enable 'reload' or 'workers" 1
- Feature request: enable extensions loading 1
- Implement ON DELETE and ON UPDATE actions for foreign keys 1
- fts5 syntax error when using punctuation 1
- Assets table with downloads 1
- order_by mechanism 1
- How do I use the app.css as style sheet? 1
- --port option to expose a port other than 8001 in "datasette package" 1
- Problem with square bracket in CSV column name 1
- Cashe-header missing in http-response 1
- Ability to customize columns used by extracts= feature 1
- datasette publish cloudrun --memory option 1
- Adding a "recreate" flag to the `Database` constructor 1
- Only set .last_rowid and .last_pk for single update/inserts, not for .insert_all()/.upsert_all() with multiple records 1
- Import EXIF data into SQLite - lens used, ISO, aperture etc 1
- Integrate image content hashing 1
- Error when I click on "View and edit SQL" 1
- strange behavior using accented characters 1
- Replace "datasette publish --extra-options" with "--setting" 1
- Fall back to authentication via ENV 1
- Expose scores from ZCOMPUTEDASSETATTRIBUTES 1
- Question: Access to immutable database-path 1
- fts search on a column doesn't work anymore due to escape_fts 1
- Ability to serve thumbnailed Apple Photo from its place on disk 1
- bpylist.archiver.CircularReference: archive has a cycle with uid(13) 1
- Enable wildcard-searches by default 1
- Invalid SQL no such table: main.uploads 1
- Error pages not correctly loading CSS 1
- Group permission checks by request on /-/permissions debug page 1
- Reload support for config_dir mode. 1
- Fall back to FTS4 if FTS5 is not available 1
- Update pytest-asyncio requirement from <0.13,>=0.10 to >=0.10,<0.15 1
- Magic parameters for canned queries 1
- New pattern for views that return either JSON or HTML, available for plugins 1
- Skip counting hidden tables 1
- Load only python files from plugins-dir. 1
- Use None as a default arg 1
- Don't install tests package 1
- Feature: pull request reviews and comments 1
- Update pytest requirement from <5.5.0,>=5.2.2 to >=5.2.2,<6.1.0 1
- Support reverse pagination (previous page, has-previous-items) 1
- Travis should not build the master branch, only the main branch 1
- 'datasette --get' option, refs #926 1
- Don't hang in db.execute_write_fn() if connection fails 1
- Run CI on GitHub Actions, not Travis 1
- Try out CodeMirror SQL hints 1
- favorites --stop_after=N stops after min(N, 200) 1
- request an "-o" option on "datasette server" to open the default browser at the running url 1
- Idea: transitive closure tables for tree structures 1
- Progress bar for sqlite-utils insert 1
- Update pytest requirement from <6.1.0,>=5.2.2 to >=5.2.2,<6.2.0 1
- Allow facet by primary keys, fixes #985 1
- Redesign application homepage 1
- Run tests against Python 3.9 1
- Document setting Google Cloud SDK properties 1
- datasette.client internal requests mechanism 1
- from_json jinja2 filter 1
- Add json_loads and json_dumps jinja2 filters 1
- Update janus requirement from <0.6,>=0.4 to >=0.4,<0.7 1
- Update asgiref requirement from ~=3.2.10 to >=3.2.10,<3.4.0 1
- Fix table name in spatialite example command 1
- About loading spatialite 1
- export.xml file name varies with different language settings 1
- Make `package` command deal with a configuration directory argument 1
- Bring date parsing into Datasette core 1
- DOC: Fix syntax error 1
- /db/table/-/blob/pk/column.blob download URL 1
- Include LICENSE in sdist 1
- Add minimum supported python 1
- Add template block prior to extra URL loaders 1
- Switch to .blob render extension for BLOB downloads 1
- Radical new colour scheme and base styles, courtesy of @natbat 1
- Update aiofiles requirement from <0.6,>=0.4 to >=0.4,<0.7 1
- New explicit versioning mechanism 1
- .blob output renderer 1
- Nav menu plus menu_links() hook 1
- load_template() plugin hook 1
- DigitalOcean buildpack memory errors for large sqlite db? 1
- Use FTS4 in fixtures 1
- import EX_CANTCREAT means datasette fails to work on Windows 1
- SQLite does not have case sensitive columns 1
- Use f-strings 1
- Discussion: Adding support for fetching only fresh tweets 1
- Fix --metadata doc usage 1
- GENERATED column support 1
- generated_columns table in fixtures.py 1
- Fix misaligned table actions cog 1
- Fix startup error on windows 1
- Fix footer not sticking to bottom in short pages 1
- "_searchmode=raw" throws an index out of range error when combined with "_search_COLUMN" 1
- sqlite3.OperationalError: near "(": syntax error 1
- More flexible CORS support in core, to encourage good security practices 1
- JavaScript to help plugins interact with the fragment part of the URL 1
- Update pytest requirement from <6.2.0,>=5.2.2 to >=5.2.2,<6.3.0 1
- killed by oomkiller on large location-history 1
- Maintain an in-memory SQLite table of connected databases and their tables 1
- --since support for favorites 1
- Modernize code to Python 3.6+ 1
- Mechanism for executing JavaScript unit tests 1
- Adopt Prettier for JavaScript code formatting 1
- Install Prettier via package.json 1
- GitHub Actions workflow to build and sign macOS binary executables 1
- Certain database names results in 404: "Database not found: None" 1
- Add fts offset docs. 1
- XML parse error 1
- WIP: Plugin includes 1
- Release 0.54 1
- Immutable Database w/ Canned Queries 1
- Use context manager instead of plain open 1
- /usr/local/opt/python3/bin/python3.6: bad interpreter: No such file or directory 1
- Add compile option to Dockerfile to fix failing test (fixes #696) 1
- Error reading csv files with large column data 1
- --no-headers option for CSV and TSV 1
- 500 error caused by faceting if a column called `n` exists 1
- ensure immutable databses when starting in configuration directory mode with 1
- Vega charts are plotted only for rows on the visible page, cluster maps only for rows in the remaining pages 1
- --crossdb option for joining across databases 1
- Custom pages don't work with base_url setting 1
- Allow facetting on custom queries 1
- fix small typo 1
- Sticky table column headers would be useful, especially on the query page 1
- Async support 1
- Add back styling to lists within table cells (fixes #1141) 1
- Capture "Ctrl + Enter" or "⌘ + Enter" to send SQL query? 1
- Minor type in IP adress 1
- Allow canned query params to specify default values 1
- Fix: code quality issues 1
- Escaping FTS search strings 1
- Some links aren't properly URL encoded. 1
- FTS quote functionality from datasette 1
- Plugin hook that could support 'order by random()' for table view 1
- Support for HTTP Basic Authentication 1
- support for Apache Arrow / parquet files I/O 1
- Full text search possibly broken? 1
- Use SQLite conn.interrupt() instead of sqlite_timelimit() 1
- Unit tests for the Dockerfile 1
- Invalid SQL: "no such table: pragma_database_list" on database page 1
- Minor Docs Update. Added `--app` to fly install command. 1
- Support to annotate photos on other than macOS OSes 1
- Add testres-db tool 1
- Fix little typo 1
- Better default display of arrays of items 1
- Use pytest-xdist to speed up tests 1
- Update docs: explain allow_download setting 1
- Dockerfile: use Ubuntu 20.10 as base 1
- Update pytest-asyncio requirement from <0.15,>=0.10 to >=0.10,<0.16 1
- Avoid error sorting by relationships if related tables are not allowed 1
- Bump black from 20.8b1 to 21.4b0 1
- Bump black from 20.8b1 to 21.4b1 1
- Bump black from 20.8b1 to 21.4b2 1
- Upgrade to GitHub-native Dependabot 1
- Bump black from 21.4b2 to 21.5b0 1
- Add Docker multi-arch support with Buildx 1
- Bump black from 21.4b2 to 21.5b1 1
- Update click requirement from ~=7.1.1 to >=7.1.1,<8.1.0 1
- Update jinja2 requirement from <2.12.0,>=2.10.3 to >=2.10.3,<3.1.0 1
- Support Unicode characters in metadata.json 1
- Update aiofiles requirement from <0.7,>=0.4 to >=0.4,<0.8 1
- Fix small typo 1
- ?_col=/?_nocol= to show/hide columns on the table page 1
- Re-display user's query with an error message if an error occurs 1
- DRAFT: add test and scan for docker images 1
- Error: Use either --since or --since_id, not both 1
- Using enable_fts before search term 1
- Make custom pages compatible with base_url setting 1
- Consider using CSP to protect against future XSS 1
- Update trustme requirement from <0.8,>=0.7 to >=0.7,<0.9 1
- Bump black from 21.5b2 to 21.6b0 1
- JSON export dumps JSON fields as TEXT 1
- sqlite-utils memory command for directly querying CSV/JSON data 1
- add -h support closes #276 1
- Update pytest-xdist requirement from <2.3,>=2.2.1 to >=2.2.1,<2.4 1
- Mypy fixes for rows_from_file() 1
- Test against Python 3.10-dev 1
- Fix + improve get_metadata plugin hook docs 1
- Update asgiref requirement from <3.4.0,>=3.2.10 to >=3.2.10,<3.5.0 1
- absolute_url() behind a proxy assembles incorrect http://127.0.0.1:8001/ URLs 1
- Option for importing CSV data using the SQLite .import mechanism 1
- Documentation on using Datasette as a library 1
- Bump black from 21.6b0 to 21.7b0 1
- Read lines with JSON object 1
- 403 when getting token 1
- sqlite-utils convert command and db[table].convert(...) method 1
- Spelling corrections plus CI job for codespell 1
- Show count of facet values if ?_facet_size=max 1
- `sqlite-utils insert --flatten` option to flatten nested JSON 1
- Add reference page to documentation using Sphinx autodoc 1
- Column metadata 1
- Update trustme requirement from <0.9,>=0.7 to >=0.7,<0.10 1
- Rethink how .ext formats (v.s. ?_format=) works before 1.0 1
- Add --merged-by flag to pull-requests sub command 1
- Duplicate Column 1
- Make sure that case-insensitive column names are unique 1
- Ability to insert file contents as text, in addition to blob 1
- Update pluggy requirement from ~=0.13.0 to >=0.13,<1.1 1
- Bump black from 21.7b0 to 21.8b0 1
- xml.etree.ElementTree.Parse Error - mismatched tag 1
- Correct naming of tool in readme 1
- Update beautifulsoup4 requirement from <4.10.0,>=4.8.1 to >=4.8.1,<4.11.0 1
- Test against 3.10-dev 1
- Add Authorization header when CORS flag is set 1
- Bump black from 21.7b0 to 21.9b0 1
- Update pytest-xdist requirement from <2.4,>=2.2.1 to >=2.2.1,<2.5 1
- Invalid JSON output when no rows 1
- Fix compatibility with Python 3.10 1
- Update pytest-timeout requirement from <1.5,>=1.4.2 to >=1.4.2,<2.1 1
- Test against Python 3.10 1
- Update pytest-asyncio requirement from <0.16,>=0.10 to >=0.10,<0.17 1
- Publish to Docker Hub failing with "libcrypt.so.1: cannot open shared object file" 1
- Add functionality to read Parquet files. 1
- Bump black from 21.9b0 to 21.10b0 1
- Default values for `--attach` and `--param` options 1
- Datasette should have an option to output CSV with semicolons 1
- Update docutils requirement from <0.18 to <0.19 1
- New pattern for async view classes 1
- Bump black from 21.9b0 to 21.11b0 1
- Bump black from 21.9b0 to 21.11b1 1
- base_url is omitted in JSON and CSV views 1
- Add new `"sql_file"` key to Canned Queries in metadata? 1
- Update janus requirement from <0.7,>=0.6.2 to >=0.6.2,<0.8 1
- Execution on Windows 1
- Update aiofiles requirement from <0.8,>=0.4 to >=0.4,<0.9 1
- Test against pysqlite3 running SQLite 3.37 1
- Bump black from 21.11b1 to 21.12b0 1
- Update pytest-xdist requirement from <2.5,>=2.2.1 to >=2.2.1,<2.6 1
- Data Pull fails for "Essential" level access to the Twitter API (for Documentation) 1
- TableView refactor 1
- filters_from_request plugin hook, now used in TableView 1
- Update janus requirement from <0.8,>=0.6.2 to >=0.6.2,<1.1 1
- --lines and --text and --convert and --import 1
- Initial prototype of .analyze() methods 1
- `sqlite-utils bulk` command 1
- Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.18 1
- Add new spatialite helper methods 1
- Update pytest-timeout requirement from <2.1,>=1.4.2 to >=1.4.2,<2.2 1
- Documentation should clarify /stable/ vs /latest/ 1
- Potential simplified publishing mechanism 1
- Bump black from 21.12b0 to 22.1.0 1
- Ensure template_path always uses "/" to match jinja 1
- Reconsider policy on blocking queries containing the string "pragma" 1
- Test against Python 3.11-dev 1
- Index page `/` has no CORS headers 1
- Try test suite against macOS and Windows 1
- sqlite3.OperationalError: no such table: main.my_activity 1
- Update pytest requirement from <6.3.0,>=5.2.2 to >=5.2.2,<7.1.0 1
- Advanced class-based `conversions=` mechanism 1
- Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.19 1
- Update Dockerfile generated by `datasette publish` 1
- Add SpatiaLite helpers to CLI 1
- Configuration directory mode does not pick up other file extensions than .db 1
- Optional Pandas integration 1
- Use dash encoding for table names and row primary keys in URLs 1
- Add /opt/homebrew to where spatialite extension can be found 1
- Update pytest requirement from <7.1.0,>=5.2.2 to >=5.2.2,<7.2.0 1
- Tilde encoding 1
- Options for how `r.parsedate()` should handle invalid dates 1
- insert fails on JSONL with whitespace 1
- Ignore common generated files 1
- Document how to use a `--convert` function that runs initialization code first 1
- "Error: near "(": syntax error" when using sqlite-utils indexes CLI 1
- Update jinja2 requirement from <3.1.0,>=2.10.3 to >=2.10.3,<3.2.0 1
- Bump black from 22.1.0 to 22.3.0 1
- Update click requirement from <8.1.0,>=7.1.1 to >=7.1.1,<8.2.0 1
- Update beautifulsoup4 requirement from <4.11.0,>=4.8.1 to >=4.8.1,<4.12.0 1
- Datasette feature for publishing snapshots of query results 1
- Add timeout option to Cloudrun build 1
- Custom page variables aren't decoded 1
- Document how to use `PRAGMA temp_store` to avoid errors when running VACUUM against huge databases 1
- When running `auth` command, don't overwrite an existing auth.json file 1
- Misleading progress bar against utf-16-le CSV input 1
- Add scrollbars to table presentation in default layout 1
- Combining `rows_where()` and `search()` to limit which rows are searched 1
- Bump furo from 2022.4.7 to 2022.6.4.1 1
- Extract facet portions of table.html out into included templates 1
- Bump furo from 2022.4.7 to 2022.6.21 1
- Bump black from 22.1.0 to 22.6.0 1
- Keep track of config_dir 1
- Add duplicate table feature 1
- Update pytest-asyncio requirement from <0.19,>=0.17 to >=0.17,<0.20 1
- minor a11y: <select> has no visual indicator when tabbed to 1
- in extract code, check equality with IS instead of = for nulls 1
- feature request: pivot command 1
- Link to installation instructions 1
- Cross-link CLI to Python docs 1
- Discord badge 1
- beanbag-docutils>=2.0 1
- -a option is used for "--auth" and for "--all" 1
- Updating metadata.json on Datasette for MacOS 1
- db[table].create(..., transform=True) and create-table --transform 1
- Test `--load-extension` in GitHub Actions 1
- sqlite-utils query --functions mechanism for registering extra functions 1
- Support entrypoints for `--load-extension` 1
- Add an option for specifying column names when inserting CSV data 1
- Conda Forge 1
- search_sql add include_rank option 1
- Don't use upper bound dependencies, refs #1800 1
- Workaround for test failure: RuntimeError: There is no current event loop 1
- Add organization support to repos command 1
- truncate_cells_html does not work for links? 1
- progressbar for inserts/upserts of all fileformats, closes #485 1
- Specify foreign key against compound key in other table 1
- Database() constructor currently defaults is_mutable to False 1
- `sqlite-utils transform` should set empty strings to null when converting text columns to integer/float 1
- Bump furo from 2022.6.21 to 2022.9.15 1
- [SPIKE] Don't truncate query CSVs 1
- Keyword-only arguments for a bunch of internal methods 1
- Convert &_hide_sql=1 to #_hide_sql 1
- Add documentation for serving via OpenRC 1
- render_cell documentation example doesn't match the method signature 1
- Bump furo from 2022.9.15 to 2022.9.29 1
- use inspect data for hash and file size 1
- Make hash and size a lazy property 1
- Open Datasette link in new tab 1
- fix: enable-fts permanently save triggers 1
- feat: recreate fts triggers after table transform 1
- check_visibility can now take multiple permissions into account 1
- API to insert a single record into an existing table 1
- Default API token authentication mechanism 1
- Allow surrogates in parameters 1
- /db/table/-/upsert API 1
- Errors when using table filters behind a proxy 1
- Merge 1.0-dev branch back to main 1
- Upgrade to CodeMirror 6, add SQL autocomplete 1
- Use DOMContentLoaded instead of load event for CodeMirror initialization 1
- Typo in JSON API `Updating a row` documentation 1
- /db/table/-/upsert 1
- Bump furo from 2022.9.29 to 2022.12.7 1
- "permissions" blocks in metadata.json/yaml 1
- register_permissions() plugin hook 1
- invoke_startup() is not run in some conditions, e.g. gunicorn/uvicorn workers, breaking lots of things 1
- Port as many tests as possible to async def tests against ds_client 1
- Bump sphinx from 5.3.0 to 6.0.0 1
- Bump sphinx from 5.3.0 to 6.1.0 1
- Bump sphinx from 5.3.0 to 6.1.1 1
- Bump blacken-docs from 1.12.1 to 1.13.0 1
- Stuck on loading screen 1
- Document custom json encoder 1
- ?_extra= support (draft) 1
- Datasette is not compatible with SQLite's strict quoting compilation option 1
- Show referring tables and rows when the referring foreign key is compound 1
- use single quotes for string literals, fixes #2001 1
- array facet: don't materialize unnecessary columns 1
- Deploy demo job is failing due to rate limit 1
- Error 500 - not clear the cause 1
- Error: Invalid setting 'hash_urls' in settings.json in 0.64.1 1
- add Python 3.11 classifier 1
- remove an unused `app` var in cli.py 1
- Potential feature: special support for `?a=1&a=2` on the query page 1
- Increase performance using macnotesapp 1
- Add paths for homebrew on Apple silicon 1
- Bump furo from 2022.12.7 to 2023.3.23 1
- Add permalink virtual field to items table 1
- rows: --transpose or psql extended view-like functionality 1
- Make detailed notes on how table, query and row views work right now 1
- Add paths for homebrew on Apple silicon 1
- Support self-referencing FKs in `Table.create` 1
- Litestream-restored db: errors on 3.11 and 3.10.8; but works on py3.10.7 and 3.10.6 1
- `table.upsert_all` fails to write rows when `not_null` is present 1
- [BUG] Cannot insert new data to deployed instance 1
- sphinx.builders.linkcheck build error 1
- Bump sphinx from 6.1.3 to 7.0.1 1
- Analyze tables options: --common-limit, --no-most, --no-least 1
- TUI powered by Trogon 1
- Reformatted CLI examples in docs 1
- Bump furo from 2023.3.27 to 2023.5.20 1
- `IndexError` when doing `.insert(..., pk='id')` after `insert_all` 1
- New View base class 1
- `--settings settings.json` option 1
- Use sqlean if available in environment 1
- Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File 1
- Bump blacken-docs from 1.14.0 to 1.15.0 1
- feat: Implement a prepare_connection plugin hook 1
- cannot use jinja filters in display? 1
- Bump sphinx from 6.1.3 to 7.1.0 1
- Bump furo from 2023.3.27 to 2023.7.26 1
- datasette serve when invoked with --reload interprets the serve command as a file 1
- Bump sphinx from 6.1.3 to 7.1.1 1
- Bump sphinx from 6.1.3 to 7.1.2 1
- Bump blacken-docs, furo, blacken-docs 1
- Bump the python-packages group with 1 update 1
- Bump the python-packages group with 2 updates 1
- .transform() instead of modifying sqlite_master for add_foreign_keys 1
- Bump the python-packages group with 3 updates 1
- If a row has a primary key of `null` various things break 1
- Bump sphinx, furo, blacken-docs dependencies 1
- Start a new `datasette.yaml` configuration file, with settings support 1
- Test Datasette on multiple SQLite versions 1
- Bump the python-packages group with 3 updates 1
- Cascade for restricted token view-table/view-database/view-instance operations 1
- Fix hupper.start_reloader entry point 1
- Bump sphinx, furo, blacken-docs dependencies 1
- -s/--setting x y gets merged into datasette.yml, refs #2143, #2156 1
- Add new `--internal internal.db` option, deprecate legacy `_internal` database 1
- DATASETTE_LOAD_PLUGINS environment variable for loading specific plugins 1
- Bump the python-packages group with 1 update 1
- click-default-group>=1.2.3 1
- Use $DATASETTE_INTERNAL in absence of --internal 1
- Test against Python 3.12 preview 1
- .transform() now preserves rowid values, refs #592 1
- actors_from_ids plugin hook and datasette.actors_from_ids() method 1
- `datasette.yaml` plugin support 1
- Bump the python-packages group with 3 updates 1
- Server hang on parallel execution of queries to named in-memory databases 1
- Raise an exception if a "plugins" block exists in metadata.json 1
- Move `permissions`, `allow` blocks, canned queries and more out of `metadata.yaml` and into `datasette.yaml` 1
- Stop using parallel SQL queries for tables 1
- Cascading DELETE not working with Table.delete(pk) 1
- Discord invite link returns 401 1
- Bump the python-packages group with 1 update 1
- Add spatialite arm64 linux path 1
- Bump the python-packages group with 1 update 1
- Fix query for suggested facets with column named value 1
- Add more STRICT table support 1
- CSV export fails for some `text` foreign key references 1
user 336
- codecov[bot] 240
- aborruso 19
- chrismp 18
- carlmjohnson 14
- tballison 13
- psychemedia 11
- stonebig 11
- frafra 10
- maxhawkins 10
- terrycojones 10
- dracos 10
- rayvoelker 10
- 20after4 9
- clausjuhl 9
- UtahDave 8
- tomchristie 8
- bsilverm 8
- 4l1fe 8
- zaneselvans 7
- mhalle 7
- zeluspudding 7
- cobiadigital 7
- cldellow 6
- khimaros 6
- CharlesNepote 6
- ocdtrekkie 6
- tsibley 5
- khusmann 5
- rdmurphy 5
- MarkusH 5
- lovasoa 5
- Mjboothaus 5
- dazzag24 5
- ar-jan 5
- xavdid 5
- davidhaley 5
- SteadBytes 5
- fs111 4
- yozlet 4
- Btibert3 4
- dholth 4
- jungle-boogie 4
- ColinMaudry 4
- nitinpaultifr 4
- Kabouik 4
- hydrosquall 4
- dvizard 4
- henry501 4
- pjamargh 4
- frankieroberto 3
- obra 3
- janimo 3
- atomotic 3
- briandorsey 3
- pkoppstein 3
- yschimke 3
- philroche 3
- coldclimate 3
- wsxiaoys 3
- johnfelipe 3
- mdrovdahl 3
- xrotwang 3
- robroc 3
- dmick 3
- betatim 3
- dufferzafar 3
- Florents-Tselai 3
- aki-k 3
- ashishdotme 3
- yejiyang 3
- henrikek 3
- swyxio 3
- Segerberg 3
- jsancho-gpl 3
- gk7279 3
- learning4life 3
- mattmalcher 3
- FabianHertwig 3
- polyrand 3
- justmars 3
- garethr 2
- nelsonjchen 2
- dsisnero 2
- hubgit 2
- jayvdb 2
- jackowayed 2
- ftrain 2
- chrishas35 2
- tannewt 2
- HaveF 2
- pkulchenko 2
- coleifer 2
- gavinband 2
- aviflax 2
- iloveitaly 2
- tholo 2
- mungewell 2
- frankier 2
- lchski 2
- tmaier 2
- hcarter333 2
- amitkoth 2
- eads 2
- virtadpt 2
- leafgarland 2
- glyph 2
- rafguns 2
- strada 2
- eelkevdbos 2
- ligurio 2
- n8henrie 2
- soobrosa 2
- nathancahill 2
- mustafa0x 2
- bsmithgall 2
- noslouch 2
- willingc 2
- nattaylor 2
- durkie 2
- cclauss 2
- wulfmann 2
- philshem 2
- bram2000 2
- zzeleznick 2
- plpxsk 2
- jeqo 2
- chapmanjacobd 2
- nickvazz 2
- aaronyih1 2
- luxint 2
- jussiarpalahti 2
- sachaj 2
- lagolucas 2
- stevecrawshaw 2
- chekos 2
- ctsrc 2
- ad-si 2
- smithdc1 2
- gsajko 2
- jcmkk3 2
- null92 2
- publicmatt 2
- rachelmarconi 2
- tunguyenatwork 2
- LVerneyPEReN 2
- tmcl-it 2
- anotherjesse 1
- jarib 1
- jokull 1
- danp 1
- fernand0 1
- precipice 1
- llimllib 1
- gijs 1
- blaine 1
- ashanan 1
- gravis 1
- nkirsch 1
- mrchrisadams 1
- dkam 1
- harperreed 1
- nileshtrivedi 1
- chrismytton 1
- nedbat 1
- furilo 1
- kindly 1
- prabhur 1
- palfrey 1
- dmd 1
- pquentin 1
- Uninen 1
- rtanglao 1
- carsonyl 1
- nryberg 1
- step21 1
- stefanocudini 1
- rcoup 1
- scoates 1
- hpk42 1
- annapowellsmith 1
- cadeef 1
- thorn0 1
- yurivish 1
- pax 1
- lucapette 1
- jmelloy 1
- Krazybug 1
- dvhthomas 1
- dckc 1
- phubbard 1
- sethvincent 1
- andrewdotn 1
- aitoehigie 1
- julienma 1
- michaelmcandrew 1
- drewda 1
- stiles 1
- saulpw 1
- adamalton 1
- terinjokes 1
- thadk 1
- camallen 1
- robintw 1
- astrojuanlu 1
- ipmb 1
- steren 1
- aidansteele 1
- 0x1997 1
- jonafato 1
- gwk 1
- knutwannheden 1
- davidszotten 1
- chrislkeller 1
- kevboh 1
- eaubin 1
- yunzheng 1
- mhkeller 1
- lfdebrux 1
- karlcow 1
- heyarne 1
- ryanfox 1
- sopel 1
- cephillips 1
- ryascott 1
- sirnacnud 1
- simonrjones 1
- justinpinkney 1
- merwok 1
- mattkiefer 1
- snth 1
- adarshp 1
- joshmgrant 1
- bcongdon 1
- nickdirienzo 1
- hannseman 1
- kaihendry 1
- urbas 1
- metamoof 1
- brimstone 1
- adamchainz 1
- PabloLerma 1
- heussd 1
- RayBB 1
- BryantD 1
- limar 1
- drkane 1
- Gagravarr 1
- radusuciu 1
- esagara 1
- agguser 1
- rclement 1
- dyllan-to-you 1
- justinallen 1
- jordaneremieff 1
- wdccdw 1
- wpears 1
- progpow 1
- DavidPratten 1
- ltrgoddard 1
- costrouc 1
- jratike80 1
- ment4list 1
- ccorcos 1
- choldgraf 1
- Olshansk 1
- qqilihq 1
- jdangerx 1
- fidiego 1
- OverkillGuy 1
- QAInsights 1
- secretGeek 1
- fkuhn 1
- jameslittle230 1
- Profpatsch 1
- dskrad 1
- kwladyka 1
- Carib0u 1
- fatihky 1
- phoenixjun 1
- JesperTreetop 1
- wenhoujx 1
- bapowell 1
- yairlenga 1
- chris48s 1
- ChristopherWilks 1
- Maltazar 1
- hueyy 1
- wuhland 1
- eric-burel 1
- foscoj 1
- dvot197007 1
- kokes 1
- RamiAwar 1
- csusanu 1
- rprimet 1
- metab0t 1
- spdkils 1
- sturzl 1
- jrdmb 1
- robmarkcole 1
- jfeiwell 1
- coisnepe 1
- chmaynard 1
- erlend-aasland 1
- amlestin 1
- tf13 1
- alecstein 1
- bendnorman 1
- noklam 1
- jakewilkins 1
- Thomascountz 1
- eigenfoo 1
- GmGniap 1
- rdtq 1
- AnkitKundariya 1
- LucasElArruda 1
- duarteocarmo 1
- sarcasticadmin 1
- yqlbu 1
- Rik-de-Kort 1
- patricktrainer 1
- xmichele 1
- RhetTbull 1
- miuku 1
- philipp-heinrich 1
- jimmybutton 1
- thewchan 1
- izzues 1
- thisismyfuckingusername 1
- kirajano 1
- J450n-4-W 1
- mlaparie 1
- Dhyanesh97 1
- knowledgecamp12 1
- McEazy2700 1
- cycle-data 1
id | html_url | issue_url | node_id | user | created_at | updated_at ▼ | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
344424382 | https://github.com/simonw/datasette/issues/93#issuecomment-344424382 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQyNDM4Mg== | atomotic 67420 | 2017-11-14T22:42:16Z | 2017-11-14T22:42:16Z | NONE | tried quickly, this seems working: ``` ~ pip3 install pyinstaller ~ pyinstaller -F --add-data /usr/local/lib/python3.6/site-packages/datasette/templates:datasette/templates --add-data /usr/local/lib/python3.6/site-packages/datasette/static:datasette/static /usr/local/bin/datasette ~ du -h dist/datasette 6.8M dist/datasette ~ file dist/datasette dist/datasette: Mach-O 64-bit executable x86_64 ``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Package as standalone binary 273944952 | |
344430299 | https://github.com/simonw/datasette/issues/93#issuecomment-344430299 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQzMDI5OQ== | atomotic 67420 | 2017-11-14T23:06:33Z | 2017-11-14T23:06:33Z | NONE | i will look better tomorrow, it's late i surely made some mistake https://asciinema.org/a/ZyAWbetrlriDadwWyVPUWB94H |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Package as standalone binary 273944952 | |
344516406 | https://github.com/simonw/datasette/issues/93#issuecomment-344516406 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDUxNjQwNg== | atomotic 67420 | 2017-11-15T08:09:41Z | 2017-11-15T08:09:41Z | NONE | actually you can use travis to build for linux/macos and appveyor to build for windows. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Package as standalone binary 273944952 | |
344597274 | https://github.com/simonw/datasette/issues/101#issuecomment-344597274 | https://api.github.com/repos/simonw/datasette/issues/101 | MDEyOklzc3VlQ29tbWVudDM0NDU5NzI3NA== | eaubin 450244 | 2017-11-15T13:48:55Z | 2017-11-15T13:48:55Z | NONE | This is a duplicate of https://github.com/simonw/datasette/issues/100 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
TemplateAssertionError: no filter named 'tojson' 274161964 | |
344864254 | https://github.com/simonw/datasette/issues/100#issuecomment-344864254 | https://api.github.com/repos/simonw/datasette/issues/100 | MDEyOklzc3VlQ29tbWVudDM0NDg2NDI1NA== | coisnepe 13304454 | 2017-11-16T09:25:10Z | 2017-11-16T09:25:10Z | NONE | @simonw I see. I upgraded sanic-jinja2 and jinja2: it now works flawlessly. Thank you! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
TemplateAssertionError: no filter named 'tojson' 274160723 | |
345509500 | https://github.com/simonw/datasette/issues/97#issuecomment-345509500 | https://api.github.com/repos/simonw/datasette/issues/97 | MDEyOklzc3VlQ29tbWVudDM0NTUwOTUwMA== | yschimke 231923 | 2017-11-19T11:26:58Z | 2017-11-19T11:26:58Z | NONE | Specifically docs should make it clearer this file exists https://parlgov.datasettes.com/.json And from that you can build https://parlgov.datasettes.com/parlgov-25f9855.json Then https://parlgov.datasettes.com/parlgov-25f9855/cabinet.json |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Link to JSON for the list of tables 274022950 | |
346427794 | https://github.com/simonw/datasette/issues/144#issuecomment-346427794 | https://api.github.com/repos/simonw/datasette/issues/144 | MDEyOklzc3VlQ29tbWVudDM0NjQyNzc5NA== | mhalle 649467 | 2017-11-22T17:55:45Z | 2017-11-22T17:55:45Z | NONE | Thanks. There is a way to use pip to grab apsw, which also lets you configure it (flags to build extensions, use an internal sqlite, etc). Don't know how that works as a dependency for another package, though. On November 22, 2017 11:38:06 AM EST, Simon Willison notifications@github.com wrote:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
apsw as alternative sqlite3 binding (for full text search) 276091279 | |
346974336 | https://github.com/simonw/datasette/issues/141#issuecomment-346974336 | https://api.github.com/repos/simonw/datasette/issues/141 | MDEyOklzc3VlQ29tbWVudDM0Njk3NDMzNg== | janimo 50138 | 2017-11-26T00:00:35Z | 2017-11-26T00:00:35Z | NONE | FWIW I worked around this by setting TMPDIR to ~/tmp before running the command. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
datasette publish can fail if /tmp is on a different device 275814941 | |
346987395 | https://github.com/simonw/datasette/issues/124#issuecomment-346987395 | https://api.github.com/repos/simonw/datasette/issues/124 | MDEyOklzc3VlQ29tbWVudDM0Njk4NzM5NQ== | janimo 50138 | 2017-11-26T06:24:08Z | 2017-11-26T06:24:08Z | NONE | Are there performance gains when using immutable as opposed to read-only? From what I see other processes can still modify the DB when immutable, but there are no change notifications. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Option to open readonly but not immutable 275125805 | |
347123991 | https://github.com/simonw/datasette/issues/124#issuecomment-347123991 | https://api.github.com/repos/simonw/datasette/issues/124 | MDEyOklzc3VlQ29tbWVudDM0NzEyMzk5MQ== | janimo 50138 | 2017-11-27T09:25:15Z | 2017-11-27T09:25:15Z | NONE | That's the only reference to immutable I saw as well, making me think that there may be no perceivable advantages over simply using mode=ro. Since the database is never or seldom updated the change notifications should not impact performance. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Option to open readonly but not immutable 275125805 | |
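The distinction being debated above can be exercised directly from Python's `sqlite3` module, which accepts the same `mode=ro` and `immutable=1` URI parameters. A minimal sketch (the file path is a throwaway temp file, not anything Datasette-specific):

```python
import os
import sqlite3
import tempfile

# Create a throwaway database file to open in both modes.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO t (name) VALUES ('example')")
conn.commit()
conn.close()

# mode=ro: this connection cannot write, but other processes still can,
# so SQLite keeps its locking and change-detection machinery active.
ro = sqlite3.connect(f"file:{path}?mode=ro", uri=True)
write_failed = False
try:
    ro.execute("INSERT INTO t (name) VALUES ('nope')")
except sqlite3.OperationalError:
    write_failed = True  # "attempt to write a readonly database"
ro.close()

# immutable=1: SQLite assumes the file will never change, so it skips
# locking and change detection entirely -- which is where any performance
# gain over plain mode=ro would come from.
imm = sqlite3.connect(f"file:{path}?immutable=1", uri=True)
rows = imm.execute("SELECT name FROM t").fetchall()
imm.close()
```

As the comments note, the trade-off is safety: with `immutable=1`, changes made by another process are silently invisible (or worse), whereas `mode=ro` stays correct at the cost of per-query checks.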
347714314 | https://github.com/simonw/datasette/issues/155#issuecomment-347714314 | https://api.github.com/repos/simonw/datasette/issues/155 | MDEyOklzc3VlQ29tbWVudDM0NzcxNDMxNA== | wsxiaoys 388154 | 2017-11-29T00:46:25Z | 2017-11-29T00:46:25Z | NONE | ``` CREATE TABLE rhs ( id INTEGER PRIMARY KEY, name TEXT ); CREATE TABLE lhs ( symbol INTEGER PRIMARY KEY, FOREIGN KEY (symbol) REFERENCES rhs(id) ); INSERT INTO rhs VALUES (1, "foo"); INSERT INTO rhs VALUES (2, "bar"); INSERT INTO lhs VALUES (1); INSERT INTO lhs VALUES (2); ``` It's expected that in lhs's view, foo / bar should be displayed. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
A primary key column that has foreign key restriction associated won't rendering label column 277589569 | |
348252037 | https://github.com/simonw/datasette/issues/153#issuecomment-348252037 | https://api.github.com/repos/simonw/datasette/issues/153 | MDEyOklzc3VlQ29tbWVudDM0ODI1MjAzNw== | ftrain 20264 | 2017-11-30T16:59:00Z | 2017-11-30T16:59:00Z | NONE | WOW! -- Paul Ford // (646) 369-7128 // @ftrain On Thu, Nov 30, 2017 at 11:47 AM, Simon Willison notifications@github.com wrote:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ability to customize presentation of specific columns in HTML view 276842536 | |
350108113 | https://github.com/simonw/datasette/issues/161#issuecomment-350108113 | https://api.github.com/repos/simonw/datasette/issues/161 | MDEyOklzc3VlQ29tbWVudDM1MDEwODExMw== | wsxiaoys 388154 | 2017-12-07T22:02:24Z | 2017-12-07T22:02:24Z | NONE | It's not throwing the validation error anymore, but I still cannot run the following WITH query:
I got |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support WITH query 278814220 | |
350182904 | https://github.com/simonw/datasette/issues/161#issuecomment-350182904 | https://api.github.com/repos/simonw/datasette/issues/161 | MDEyOklzc3VlQ29tbWVudDM1MDE4MjkwNA== | wsxiaoys 388154 | 2017-12-08T06:18:12Z | 2017-12-08T06:18:12Z | NONE | You're right... got this resolved after upgrading the SQLite version. Thank you! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support WITH query 278814220 | |
355487646 | https://github.com/simonw/datasette/issues/120#issuecomment-355487646 | https://api.github.com/repos/simonw/datasette/issues/120 | MDEyOklzc3VlQ29tbWVudDM1NTQ4NzY0Ng== | nickdirienzo 723567 | 2018-01-05T07:10:12Z | 2018-01-05T07:10:12Z | NONE | Ah, glad I found this issue. I have private data that I'd like to share to a few different people. Personally, a shared username and password would be sufficient for me, more-or-less Basic Auth. Do you have more complex requirements in mind? I'm not sure if "plugin" means "build a plugin" or "find a plugin" or something else entirely. FWIW, I stumbled upon sanic-auth which looks like a new project to bring some interfaces around auth to sanic, similar to Flask. Alternatively, it shouldn't be too bad to add in Basic Auth. If we went down that route, that would probably be best built as a separate package for sanic that What are your thoughts around this? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin that adds an authentication layer of some sort 275087397 | |
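The shared-credentials idea in this comment is essentially HTTP Basic Auth. As a rough illustration of the check such a plugin would perform — every name here is hypothetical, and this is not a Datasette or sanic-auth API:

```python
import base64

# Hypothetical helper sketching a Basic Auth check; Datasette had no such
# plugin hook at the time, so the function and its signature are invented
# purely for illustration.
def check_basic_auth(headers, expected_user, expected_password):
    """Return True when the Authorization header carries the shared credentials."""
    header = headers.get("authorization", "")
    if not header.lower().startswith("basic "):
        return False
    try:
        decoded = base64.b64decode(header.split(" ", 1)[1]).decode("utf-8")
    except Exception:
        return False
    username, _, password = decoded.partition(":")
    return username == expected_user and password == expected_password

token = base64.b64encode(b"alice:s3cret").decode("ascii")
ok = check_basic_auth({"authorization": "Basic " + token}, "alice", "s3cret")
```

A real deployment would compare against hashed credentials and use constant-time comparison; this only sketches the header-parsing mechanics under discussion.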
356115657 | https://github.com/simonw/datasette/issues/176#issuecomment-356115657 | https://api.github.com/repos/simonw/datasette/issues/176 | MDEyOklzc3VlQ29tbWVudDM1NjExNTY1Nw== | wulfmann 4313116 | 2018-01-08T22:22:32Z | 2018-01-08T22:22:32Z | NONE | This project probably would not be the place for that. This is a layer for SQLite specifically. It solves a similar problem as graphql, so adding that here wouldn't make sense. Here's an example I found from Google that uses micro to run a graphql microservice. You'd just then need to connect your db. https://github.com/timneutkens/micro-graphql |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add GraphQL endpoint 285168503 | |
356161672 | https://github.com/simonw/datasette/issues/176#issuecomment-356161672 | https://api.github.com/repos/simonw/datasette/issues/176 | MDEyOklzc3VlQ29tbWVudDM1NjE2MTY3Mg== | yozlet 173848 | 2018-01-09T02:35:35Z | 2018-01-09T02:35:35Z | NONE | @wulfmann I think I disagree, except I'm not entirely sure what you mean by that first paragraph. The JSON API that Datasette currently exposes is quite different to GraphQL. Furthermore, there's no "just" about connecting micro-graphql to a DB; at least, no more "just" than adding any other API. You still need to configure the schema, which is exactly the kind of thing that Datasette does for JSON API. This is why I think that GraphQL's a good fit here. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add GraphQL endpoint 285168503 | |
356175667 | https://github.com/simonw/datasette/issues/176#issuecomment-356175667 | https://api.github.com/repos/simonw/datasette/issues/176 | MDEyOklzc3VlQ29tbWVudDM1NjE3NTY2Nw== | wulfmann 4313116 | 2018-01-09T04:19:03Z | 2018-01-09T04:19:03Z | NONE | @yozlet Yes I think that I was confused when I posted my original comment. I see your main point now and am in agreement. |
{ "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 2, "rocket": 0, "eyes": 0 } |
Add GraphQL endpoint 285168503 | |
359697938 | https://github.com/simonw/datasette/issues/176#issuecomment-359697938 | https://api.github.com/repos/simonw/datasette/issues/176 | MDEyOklzc3VlQ29tbWVudDM1OTY5NzkzOA== | gijs 7193 | 2018-01-23T07:17:56Z | 2018-01-23T07:17:56Z | NONE | 👍 I'd like this too! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add GraphQL endpoint 285168503 | |
368625350 | https://github.com/simonw/datasette/issues/176#issuecomment-368625350 | https://api.github.com/repos/simonw/datasette/issues/176 | MDEyOklzc3VlQ29tbWVudDM2ODYyNTM1MA== | wuhland 7431774 | 2018-02-26T19:44:11Z | 2018-02-26T19:44:11Z | NONE | great idea! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add GraphQL endpoint 285168503 | |
370461231 | https://github.com/simonw/datasette/issues/185#issuecomment-370461231 | https://api.github.com/repos/simonw/datasette/issues/185 | MDEyOklzc3VlQ29tbWVudDM3MDQ2MTIzMQ== | carlmjohnson 222245 | 2018-03-05T15:43:56Z | 2018-03-05T15:44:27Z | NONE | Yes. I think the simplest implementation is to change lines like
to
so that specified inner values overwrite outer values, but only if they exist. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Metadata should be a nested arbitrary KV store 299760684 | |
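The merge behavior proposed above (inner values overwrite outer ones, but only if they exist) can be sketched as a cascading lookup. The function name and the `databases`/`tables` nesting below are illustrative assumptions, not Datasette's actual metadata API:

```python
# Sketch of the cascading lookup: instance-level metadata is overridden by
# database-level, which is overridden by table-level -- but only when the
# inner layer actually defines the key.
def lookup(metadata, key, database=None, table=None):
    db_meta = metadata.get("databases", {}).get(database, {})
    table_meta = db_meta.get("tables", {}).get(table, {})
    result = None
    for layer in (metadata, db_meta, table_meta):  # innermost layer wins
        if key in layer:
            result = layer[key]
    return result

metadata = {
    "license": "CC-BY",
    "databases": {
        "salaries": {"tables": {"2017": {"license": "ODbL"}}},
    },
}
```

With this shape, `lookup(metadata, "license", "salaries", "2017")` yields the table-level override while tables that set nothing fall back to the instance-wide value.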
374872202 | https://github.com/simonw/datasette/issues/186#issuecomment-374872202 | https://api.github.com/repos/simonw/datasette/issues/186 | MDEyOklzc3VlQ29tbWVudDM3NDg3MjIwMg== | stefanocudini 47107 | 2018-03-21T09:07:22Z | 2018-03-21T09:07:22Z | NONE | --debug is perfect tnk |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
proposal new option to disable user agents cache 306811513 | |
376590265 | https://github.com/simonw/datasette/issues/185#issuecomment-376590265 | https://api.github.com/repos/simonw/datasette/issues/185 | MDEyOklzc3VlQ29tbWVudDM3NjU5MDI2NQ== | carlmjohnson 222245 | 2018-03-27T16:32:51Z | 2018-03-27T16:32:51Z | NONE |
Yes, you could have |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Metadata should be a nested arbitrary KV store 299760684 | |
376592044 | https://github.com/simonw/datasette/issues/185#issuecomment-376592044 | https://api.github.com/repos/simonw/datasette/issues/185 | MDEyOklzc3VlQ29tbWVudDM3NjU5MjA0NA== | carlmjohnson 222245 | 2018-03-27T16:38:23Z | 2018-03-27T16:38:23Z | NONE | It would be nice to also allow arbitrary keys (maybe under a parent key called params or something to prevent conflicts). For our datasette project, we just have a bunch of dictionaries defined in the base template for things like site URL and column humanized names: https://github.com/baltimore-sun-data/salaries-datasette/blob/master/templates/base.html It would be cleaner if this were in the metadata.json. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Metadata should be a nested arbitrary KV store 299760684 | |
376614973 | https://github.com/simonw/datasette/issues/185#issuecomment-376614973 | https://api.github.com/repos/simonw/datasette/issues/185 | MDEyOklzc3VlQ29tbWVudDM3NjYxNDk3Mw== | carlmjohnson 222245 | 2018-03-27T17:49:00Z | 2018-03-27T17:49:00Z | NONE | @simonw Other than metadata, the biggest item on wishlist for the salaries project was the ability to reorder by column. Of course, that could be done with a custom SQL query, but we didn't want to have to reimplement all the nav/pagination stuff from scratch. @carolinp, feel free to add your thoughts. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Metadata should be a nested arbitrary KV store 299760684 | |
378297842 | https://github.com/simonw/datasette/pull/181#issuecomment-378297842 | https://api.github.com/repos/simonw/datasette/issues/181 | MDEyOklzc3VlQ29tbWVudDM3ODI5Nzg0Mg== | bsmithgall 1957344 | 2018-04-03T15:47:13Z | 2018-04-03T15:47:13Z | NONE | I can work on that -- would you prefer to inline a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
add "format sql" button to query page, uses sql-formatter 289425975 | |
379142500 | https://github.com/simonw/datasette/issues/193#issuecomment-379142500 | https://api.github.com/repos/simonw/datasette/issues/193 | MDEyOklzc3VlQ29tbWVudDM3OTE0MjUwMA== | carlmjohnson 222245 | 2018-04-06T04:05:58Z | 2018-04-06T04:05:58Z | NONE | You could try pulling out a validate query strings method. If it fails validation build the error object from the message. If it passes, you only need to go down a happy path. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Cleaner mechanism for handling custom errors 310882100 | |
379759875 | https://github.com/simonw/datasette/pull/181#issuecomment-379759875 | https://api.github.com/repos/simonw/datasette/issues/181 | MDEyOklzc3VlQ29tbWVudDM3OTc1OTg3NQ== | bsmithgall 1957344 | 2018-04-09T13:53:14Z | 2018-04-09T13:53:14Z | NONE | I've implemented that approach in 86ac746. It does cause the button to pop in only after Codemirror is finished rendering which is a bit awkward. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
add "format sql" button to query page, uses sql-formatter 289425975 | |
379788103 | https://github.com/simonw/datasette/issues/184#issuecomment-379788103 | https://api.github.com/repos/simonw/datasette/issues/184 | MDEyOklzc3VlQ29tbWVudDM3OTc4ODEwMw== | carlmjohnson 222245 | 2018-04-09T15:15:11Z | 2018-04-09T15:15:11Z | NONE | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
500 from missing table name 292011379 | ||
379791047 | https://github.com/simonw/datasette/issues/189#issuecomment-379791047 | https://api.github.com/repos/simonw/datasette/issues/189 | MDEyOklzc3VlQ29tbWVudDM3OTc5MTA0Nw== | carlmjohnson 222245 | 2018-04-09T15:23:45Z | 2018-04-09T15:23:45Z | NONE | Awesome! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ability to sort (and paginate) by column 309471814 | |
381429213 | https://github.com/simonw/datasette/issues/189#issuecomment-381429213 | https://api.github.com/repos/simonw/datasette/issues/189 | MDEyOklzc3VlQ29tbWVudDM4MTQyOTIxMw== | carlmjohnson 222245 | 2018-04-15T18:54:22Z | 2018-04-15T18:54:22Z | NONE | I think I found a bug. I tried to sort by middle initial in my salaries set, and many middle initials are null. The next_url gets set by Datasette to: http://localhost:8001/salaries-d3a5631/2017+Maryland+state+salaries?_next=None%2C391&_sort=middle_initial But then |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ability to sort (and paginate) by column 309471814 | |
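The failure mode reported here is easy to reproduce: the `_next` keyset token appears to be serialized with `str()`, so a NULL `middle_initial` becomes the literal four-character string `None`, which no longer round-trips to SQL NULL on the next request. A sketch of the symptom (a reconstruction, not Datasette's internal code):

```python
from urllib.parse import parse_qs, urlparse

# Reconstruction of the reported symptom: the NULL sort value seems to have
# been serialized with str(), so the keyset token carries the literal text
# "None" instead of anything that maps back to SQL NULL.
next_url = (
    "http://localhost:8001/salaries-d3a5631/2017+Maryland+state+salaries"
    "?_next=None%2C391&_sort=middle_initial"
)
params = parse_qs(urlparse(next_url).query)
sort_value, row_id = params["_next"][0].split(",")
is_null = sort_value is None  # False: the NULL-ness was lost in transit
```

Any keyset comparison built from `sort_value` then filters on the text `'None'` rather than `IS NULL`, which is why pagination breaks on columns with missing values.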
381602005 | https://github.com/simonw/datasette/issues/191#issuecomment-381602005 | https://api.github.com/repos/simonw/datasette/issues/191 | MDEyOklzc3VlQ29tbWVudDM4MTYwMjAwNQ== | coleifer 119974 | 2018-04-16T13:37:32Z | 2018-04-16T13:37:32Z | NONE | I don't think it should be too difficult... you can look at what @ghaering did with pysqlite (and similarly what I copied for pysqlite3). You would theoretically take an amalgamation build of Sqlite (all code in a single .c and .h file). The |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out how to bundle a more up-to-date SQLite 310533258 | |
388367027 | https://github.com/simonw/datasette/issues/254#issuecomment-388367027 | https://api.github.com/repos/simonw/datasette/issues/254 | MDEyOklzc3VlQ29tbWVudDM4ODM2NzAyNw== | philroche 247131 | 2018-05-11T13:41:46Z | 2018-05-11T13:41:46Z | NONE | An example deployment @ https://datasette-zkcvlwdrhl.now.sh/simplestreams-270f20c/cloudimage?content_id__exact=com.ubuntu.cloud%3Areleased%3Adownload It is not causing errors, more of an inconvenience. I have worked around it using a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Escaping named parameters in canned queries 322283067 | |
390577711 | https://github.com/simonw/datasette/pull/258#issuecomment-390577711 | https://api.github.com/repos/simonw/datasette/issues/258 | MDEyOklzc3VlQ29tbWVudDM5MDU3NzcxMQ== | philroche 247131 | 2018-05-21T07:38:15Z | 2018-05-21T07:38:15Z | NONE | Excellent, I was not aware of the auto redirect to the new hash. My bad This solves my use case. I do agree that your suggested --no-url-hash approach is much neater. I will investigate |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add new metadata key persistent_urls which removes the hash from all database urls 322741659 | |
390689406 | https://github.com/simonw/datasette/issues/247#issuecomment-390689406 | https://api.github.com/repos/simonw/datasette/issues/247 | MDEyOklzc3VlQ29tbWVudDM5MDY4OTQwNg== | jsancho-gpl 11912854 | 2018-05-21T15:29:31Z | 2018-05-21T15:29:31Z | NONE | I've changed my mind about the way to support external connectors aside from SQLite and I'm working in a simpler style that respects the original Datasette, i.e. less refactoring. I present a version of Datasette which supports other database connectors and a Datasette connector for HDF5/PyTables files. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
SQLite code decoupled from Datasette 319449852 | |
392828475 | https://github.com/simonw/datasette/issues/191#issuecomment-392828475 | https://api.github.com/repos/simonw/datasette/issues/191 | MDEyOklzc3VlQ29tbWVudDM5MjgyODQ3NQ== | coleifer 119974 | 2018-05-29T15:50:18Z | 2018-05-29T15:50:18Z | NONE | Python standard-library SQLite dynamically links against the system sqlite3. So presumably you installed a more up-to-date sqlite3 somewhere on your To compile a statically-linked pysqlite you need to include an amalgamation in the project root when building the extension. Read the relevant setup.py. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out how to bundle a more up-to-date SQLite 310533258 | |
392890045 | https://github.com/simonw/datasette/issues/265#issuecomment-392890045 | https://api.github.com/repos/simonw/datasette/issues/265 | MDEyOklzc3VlQ29tbWVudDM5Mjg5MDA0NQ== | yschimke 231923 | 2018-05-29T18:37:49Z | 2018-05-29T18:37:49Z | NONE | Just about to ask for this! Move this page https://github.com/simonw/datasette/wiki/Datasettes into a datasette, with some concept of versioning as well. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add links to example Datasette instances to appropiate places in docs 323677499 | |
392895733 | https://github.com/simonw/datasette/issues/97#issuecomment-392895733 | https://api.github.com/repos/simonw/datasette/issues/97 | MDEyOklzc3VlQ29tbWVudDM5Mjg5NTczMw== | yschimke 231923 | 2018-05-29T18:51:35Z | 2018-05-29T18:51:35Z | NONE | Do you have an existing example with views? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Link to JSON for the list of tables 274022950 | |
398030903 | https://github.com/simonw/datasette/issues/316#issuecomment-398030903 | https://api.github.com/repos/simonw/datasette/issues/316 | MDEyOklzc3VlQ29tbWVudDM5ODAzMDkwMw== | gavinband 132230 | 2018-06-18T12:00:43Z | 2018-06-18T12:00:43Z | NONE | I should add that I'm using datasette version 0.22, Python 2.7.10 on Mac OS X. Happy to send more info if helpful. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
datasette inspect takes a very long time on large dbs 333238932 | |
398109204 | https://github.com/simonw/datasette/issues/316#issuecomment-398109204 | https://api.github.com/repos/simonw/datasette/issues/316 | MDEyOklzc3VlQ29tbWVudDM5ODEwOTIwNA== | gavinband 132230 | 2018-06-18T16:12:45Z | 2018-06-18T16:12:45Z | NONE | Hi Simon,
Thanks for the response. Ok I'll try running |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
datasette inspect takes a very long time on large dbs 333238932 | |
398778485 | https://github.com/simonw/datasette/issues/188#issuecomment-398778485 | https://api.github.com/repos/simonw/datasette/issues/188 | MDEyOklzc3VlQ29tbWVudDM5ODc3ODQ4NQ== | bsilverm 12617395 | 2018-06-20T14:48:39Z | 2018-06-20T14:48:39Z | NONE | This would be a great feature to have! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ability to bundle metadata and templates inside the SQLite file 309047460 | |
399098080 | https://github.com/simonw/datasette/issues/321#issuecomment-399098080 | https://api.github.com/repos/simonw/datasette/issues/321 | MDEyOklzc3VlQ29tbWVudDM5OTA5ODA4MA== | bsilverm 12617395 | 2018-06-21T13:10:48Z | 2018-06-21T13:10:48Z | NONE | Perfect, thank you!! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Wildcard support in query parameters 334190959 | |
399106871 | https://github.com/simonw/datasette/issues/321#issuecomment-399106871 | https://api.github.com/repos/simonw/datasette/issues/321 | MDEyOklzc3VlQ29tbWVudDM5OTEwNjg3MQ== | bsilverm 12617395 | 2018-06-21T13:39:37Z | 2018-06-21T13:39:37Z | NONE | One thing I've noticed with this approach is that the query is executed with no parameters which I do not believe was the case previously. In the case the table contains a lot of data, this adds some time executing the query before the user can enter their input and run it with the parameters they want. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Wildcard support in query parameters 334190959 | |
399129220 | https://github.com/simonw/datasette/issues/321#issuecomment-399129220 | https://api.github.com/repos/simonw/datasette/issues/321 | MDEyOklzc3VlQ29tbWVudDM5OTEyOTIyMA== | bsilverm 12617395 | 2018-06-21T14:45:02Z | 2018-06-21T14:45:02Z | NONE | Those queries look identical. How can this be prevented if the queries are in a metadata.json file? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Wildcard support in query parameters 334190959 | |
399173916 | https://github.com/simonw/datasette/issues/321#issuecomment-399173916 | https://api.github.com/repos/simonw/datasette/issues/321 | MDEyOklzc3VlQ29tbWVudDM5OTE3MzkxNg== | bsilverm 12617395 | 2018-06-21T17:00:10Z | 2018-06-21T17:00:10Z | NONE | Oh I see.. My issue is that the query executes with an empty string prior to the user submitting the parameters. I'll try adding your workaround to some of my queries. Thanks again, |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Wildcard support in query parameters 334190959 | |
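The workaround implied by this thread — putting the SQL wildcards in the query itself rather than asking users to type `%` into the parameter — can be sketched as follows. The table and data here are hypothetical, standing in for a Datasette canned query:

```python
import sqlite3

# Hypothetical in-memory table standing in for a canned query's data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plants (name TEXT)")
conn.executemany("INSERT INTO plants VALUES (?)", [("rose",), ("primrose",), ("fern",)])

# The '%' wildcards are concatenated in SQL, so the user-supplied value
# needs no special characters and stays safely parameterized.
sql = "SELECT name FROM plants WHERE name LIKE '%' || :search || '%' ORDER BY name"
rows = [r[0] for r in conn.execute(sql, {"search": "rose"})]
print(rows)  # ['primrose', 'rose']
```

The same `LIKE '%' || :search || '%'` pattern works verbatim inside a canned query defined in metadata.json.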
400571521 | https://github.com/simonw/datasette/issues/272#issuecomment-400571521 | https://api.github.com/repos/simonw/datasette/issues/272 | MDEyOklzc3VlQ29tbWVudDQwMDU3MTUyMQ== | tomchristie 647359 | 2018-06-27T07:30:07Z | 2018-06-27T07:30:07Z | NONE | I’m up for helping with this. Looks like you’d need static files support, which I’m planning on adding a component for. Anything else obviously missing? For a quick overview it looks very doable - the test client ought to mean your test cases stay roughly the same. Are you using any middleware or other components from the Sanic ecosystem? Do you use cookies or sessions at all? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Port Datasette to ASGI 324188953 | |
404514973 | https://github.com/simonw/datasette/issues/272#issuecomment-404514973 | https://api.github.com/repos/simonw/datasette/issues/272 | MDEyOklzc3VlQ29tbWVudDQwNDUxNDk3Mw== | tomchristie 647359 | 2018-07-12T13:38:24Z | 2018-07-12T13:38:24Z | NONE | Okay. I reckon the latest version should have all the kinds of components you'd need: Recently added ASGI components for Routing and Static Files support, as well as making a few tweaks to make sure requests and responses are instantiated efficiently. Don't have any redirect-to-slash / redirect-to-non-slash stuff out of the box yet, which it looks like you might miss. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Port Datasette to ASGI 324188953 | |
404576136 | https://github.com/simonw/datasette/issues/339#issuecomment-404576136 | https://api.github.com/repos/simonw/datasette/issues/339 | MDEyOklzc3VlQ29tbWVudDQwNDU3NjEzNg== | bsilverm 12617395 | 2018-07-12T16:45:08Z | 2018-07-12T16:45:08Z | NONE | Thanks for the quick reply. Looks like that is working well. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way 340396247 | |
412663658 | https://github.com/simonw/datasette/issues/185#issuecomment-412663658 | https://api.github.com/repos/simonw/datasette/issues/185 | MDEyOklzc3VlQ29tbWVudDQxMjY2MzY1OA== | carlmjohnson 222245 | 2018-08-13T21:04:11Z | 2018-08-13T21:04:11Z | NONE | That seems good to me. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Metadata should be a nested arbitrary KV store 299760684 | |
414860009 | https://github.com/simonw/datasette/issues/267#issuecomment-414860009 | https://api.github.com/repos/simonw/datasette/issues/267 | MDEyOklzc3VlQ29tbWVudDQxNDg2MDAwOQ== | annapowellsmith 78156 | 2018-08-21T23:57:51Z | 2018-08-21T23:57:51Z | NONE | Looks to me like hashing, redirects and caching were documented as part of https://github.com/simonw/datasette/commit/788a542d3c739da5207db7d1fb91789603cdd336#diff-3021b0e065dce289c34c3b49b3952a07 - so perhaps this can be closed? :tada: |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation for URL hashing, redirects and cache policy 323716411 | |
417684877 | https://github.com/simonw/datasette/pull/363#issuecomment-417684877 | https://api.github.com/repos/simonw/datasette/issues/363 | MDEyOklzc3VlQ29tbWVudDQxNzY4NDg3Nw== | kevboh 436032 | 2018-08-31T14:39:45Z | 2018-08-31T14:39:45Z | NONE | It looks like the check passed, not sure why it's showing as running in GH. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Search all apps during heroku publish 355299310 | |
418695115 | https://github.com/simonw/datasette/issues/272#issuecomment-418695115 | https://api.github.com/repos/simonw/datasette/issues/272 | MDEyOklzc3VlQ29tbWVudDQxODY5NTExNQ== | tomchristie 647359 | 2018-09-05T11:21:25Z | 2018-09-05T11:21:25Z | NONE | Some notes:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Port Datasette to ASGI 324188953 | |
420295524 | https://github.com/simonw/datasette/pull/293#issuecomment-420295524 | https://api.github.com/repos/simonw/datasette/issues/293 | MDEyOklzc3VlQ29tbWVudDQyMDI5NTUyNA== | jsancho-gpl 11912854 | 2018-09-11T14:32:45Z | 2018-09-11T14:32:45Z | NONE | I'm closing this PR because it's better to use the new one, #364 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support for external database connectors 326987229 | |
427261369 | https://github.com/simonw/datasette/issues/328#issuecomment-427261369 | https://api.github.com/repos/simonw/datasette/issues/328 | MDEyOklzc3VlQ29tbWVudDQyNzI2MTM2OQ== | chmaynard 13698964 | 2018-10-05T06:37:06Z | 2018-10-05T06:37:06Z | NONE |
```
Error: Invalid value for "files": Path "/mnt/fixtures.db" does not exist.
``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Installation instructions, including how to use the docker image 336464733 | |
427943710 | https://github.com/simonw/datasette/issues/187#issuecomment-427943710 | https://api.github.com/repos/simonw/datasette/issues/187 | MDEyOklzc3VlQ29tbWVudDQyNzk0MzcxMA== | progpow 1583271 | 2018-10-08T18:58:05Z | 2018-10-08T18:58:05Z | NONE | I have the same error:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Windows installation error 309033998 | |
431867885 | https://github.com/simonw/datasette/issues/176#issuecomment-431867885 | https://api.github.com/repos/simonw/datasette/issues/176 | MDEyOklzc3VlQ29tbWVudDQzMTg2Nzg4NQ== | eads 634572 | 2018-10-22T15:24:57Z | 2018-10-22T15:24:57Z | NONE | I'd like this as well. It would let me access Datasette-driven projects from GatsbyJS the same way I can access Postgres DBs via Hasura. While I don't see SQLite replacing Postgres for the 50m row datasets I sometimes have to work with, there's a whole class of smaller datasets that are great with Datasette but currently would have to find another option. |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add GraphQL endpoint 285168503 | |
439194286 | https://github.com/simonw/datasette/issues/227#issuecomment-439194286 | https://api.github.com/repos/simonw/datasette/issues/227 | MDEyOklzc3VlQ29tbWVudDQzOTE5NDI4Ng== | carlmjohnson 222245 | 2018-11-15T21:20:37Z | 2018-11-15T21:20:37Z | NONE | I'm diving back into https://salaries.news.baltimoresun.com and what I really want is the ability to inject the request into my context. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
prepare_context() plugin hook 315960272 | |
439421164 | https://github.com/simonw/datasette/issues/120#issuecomment-439421164 | https://api.github.com/repos/simonw/datasette/issues/120 | MDEyOklzc3VlQ29tbWVudDQzOTQyMTE2NA== | ad-si 36796532 | 2018-11-16T15:05:18Z | 2018-11-16T15:05:18Z | NONE | This would be an awesome feature ❤️ |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin that adds an authentication layer of some sort 275087397 | |
451415063 | https://github.com/simonw/datasette/issues/393#issuecomment-451415063 | https://api.github.com/repos/simonw/datasette/issues/393 | MDEyOklzc3VlQ29tbWVudDQ1MTQxNTA2Mw== | ltrgoddard 1727065 | 2019-01-04T11:04:08Z | 2019-01-04T11:04:08Z | NONE | Awesome - will get myself up and running on 0.26 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
CSV export in "Advanced export" pane doesn't respect query 395236066 | |
455520561 | https://github.com/simonw/datasette/issues/401#issuecomment-455520561 | https://api.github.com/repos/simonw/datasette/issues/401 | MDEyOklzc3VlQ29tbWVudDQ1NTUyMDU2MQ== | dazzag24 1055831 | 2019-01-18T11:48:13Z | 2019-01-18T11:48:13Z | NONE | Thanks. I'll take a look at your changes. I must admit I was struggling to see how to pass info from the python code in `__init__.py` into the javascript document.addEventListener function. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How to pass configuration to plugins? 400229984 | |
455752238 | https://github.com/simonw/datasette/issues/403#issuecomment-455752238 | https://api.github.com/repos/simonw/datasette/issues/403 | MDEyOklzc3VlQ29tbWVudDQ1NTc1MjIzOA== | ccorcos 1794527 | 2019-01-19T05:47:55Z | 2019-01-19T05:47:55Z | NONE | Ah. That makes much more sense. Interesting approach. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How does persistence work? 400511206 | |
463917744 | https://github.com/simonw/datasette/issues/187#issuecomment-463917744 | https://api.github.com/repos/simonw/datasette/issues/187 | MDEyOklzc3VlQ29tbWVudDQ2MzkxNzc0NA== | phoenixjun 4190962 | 2019-02-15T05:58:44Z | 2019-02-15T05:58:44Z | NONE | Is this supported or not? You can comment if it is not supported so that people like me can stop trying. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Windows installation error 309033998 | |
464341721 | https://github.com/simonw/sqlite-utils/issues/8#issuecomment-464341721 | https://api.github.com/repos/simonw/sqlite-utils/issues/8 | MDEyOklzc3VlQ29tbWVudDQ2NDM0MTcyMQ== | psychemedia 82988 | 2019-02-16T12:08:41Z | 2019-02-16T12:08:41Z | NONE | We also get an error if a column name contains a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Problems handling column names containing spaces or - 403922644 | |
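For reference, SQLite itself handles such column names fine when they are quoted as identifiers with double quotes; the difficulty was in the SQL that sqlite-utils generated. A minimal sketch (the table and values are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Double quotes (standard SQL identifier quoting) let SQLite accept column
# names containing spaces, dots, or parentheses.
conn.execute('CREATE TABLE laps ("NO" INTEGER, "TIME OF DAY" TEXT, "AVG. SPEED (MPH)" REAL)')
conn.execute(
    'INSERT INTO laps ("NO", "TIME OF DAY", "AVG. SPEED (MPH)") VALUES (?, ?, ?)',
    (1, "early on", 123.3),
)
row = conn.execute('SELECT "TIME OF DAY", "AVG. SPEED (MPH)" FROM laps').fetchone()
print(row)  # ('early on', 123.3)
```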
466325528 | https://github.com/simonw/datasette/issues/187#issuecomment-466325528 | https://api.github.com/repos/simonw/datasette/issues/187 | MDEyOklzc3VlQ29tbWVudDQ2NjMyNTUyOA== | fkuhn 2892252 | 2019-02-22T09:03:50Z | 2019-02-22T09:03:50Z | NONE | I ran into the same issue when trying to install datasette on windows after successfully using it on linux. Unfortunately, there has not been any progress in implementing uvloop for windows - so I recommend not using it on Windows. You can read about this issue here: https://github.com/MagicStack/uvloop/issues/14 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Windows installation error 309033998 | |
472844001 | https://github.com/simonw/datasette/issues/409#issuecomment-472844001 | https://api.github.com/repos/simonw/datasette/issues/409 | MDEyOklzc3VlQ29tbWVudDQ3Mjg0NDAwMQ== | Uninen 43100 | 2019-03-14T13:04:20Z | 2019-03-14T13:04:42Z | NONE | It seems this affects the Datasette Publish -site as well: https://github.com/simonw/datasette-publish-support/issues/3 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Zeit API v1 does not work for new users - need to migrate to v2 408376825 | |
472875713 | https://github.com/simonw/datasette/issues/409#issuecomment-472875713 | https://api.github.com/repos/simonw/datasette/issues/409 | MDEyOklzc3VlQ29tbWVudDQ3Mjg3NTcxMw== | michaelmcandrew 209967 | 2019-03-14T14:14:39Z | 2019-03-14T14:14:39Z | NONE | also linking this zeit issue in case it is helpful: https://github.com/zeit/now-examples/issues/163#issuecomment-440125769 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Zeit API v1 does not work for new users - need to migrate to v2 408376825 | |
473217334 | https://github.com/simonw/datasette/issues/415#issuecomment-473217334 | https://api.github.com/repos/simonw/datasette/issues/415 | MDEyOklzc3VlQ29tbWVudDQ3MzIxNzMzNA== | ad-si 36796532 | 2019-03-15T09:30:57Z | 2019-03-15T09:30:57Z | NONE | Awesome, thanks! 😁 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add query parameter to hide SQL textarea 418329842 | |
480621924 | https://github.com/simonw/sqlite-utils/issues/18#issuecomment-480621924 | https://api.github.com/repos/simonw/sqlite-utils/issues/18 | MDEyOklzc3VlQ29tbWVudDQ4MDYyMTkyNA== | psychemedia 82988 | 2019-04-07T19:31:42Z | 2019-04-07T19:31:42Z | NONE | I've just noticed that SQLite lets you IGNORE inserts that collide with a pre-existing key. This can be quite handy if you have a dataset that keeps changing in part, and you don't want to upsert and replace pre-existing PK rows but you do want to ignore collisions to existing PK rows. Do |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
.insert/.upsert/.insert_all/.upsert_all should add missing columns 413871266 | |
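Plain SQLite supports the collision-skipping behaviour described above directly, via the `OR IGNORE` conflict clause. A minimal sketch (the table and rows are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO items VALUES (1, 'original')")

# OR IGNORE silently skips rows whose primary key already exists,
# unlike OR REPLACE, which would overwrite the existing row.
conn.execute("INSERT OR IGNORE INTO items VALUES (1, 'changed')")
conn.execute("INSERT OR IGNORE INTO items VALUES (2, 'new')")

final = conn.execute("SELECT id, name FROM items ORDER BY id").fetchall()
print(final)  # [(1, 'original'), (2, 'new')]
```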
482994231 | https://github.com/simonw/sqlite-utils/issues/8#issuecomment-482994231 | https://api.github.com/repos/simonw/sqlite-utils/issues/8 | MDEyOklzc3VlQ29tbWVudDQ4Mjk5NDIzMQ== | psychemedia 82988 | 2019-04-14T15:04:07Z | 2019-04-14T15:29:33Z | NONE | PLEASE IGNORE THE BELOW... I did a package update and rebuilt the kernel I was working in... may just have been an old version of sqlite_utils, seems to be working now. (Too many containers / too many environments!) Has an issue been reintroduced here with FTS? eg I'm getting an error thrown by spaces in column names here:

```
/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order)

    def enable_fts(self, columns, fts_version="FTS5"):
--> 329     "Enables FTS on the specified columns"
    330     sql = """
    331         CREATE VIRTUAL TABLE "{table}_fts" USING {fts_version} (
```

when trying an `insert_all`. Also, if a col has a `.` in it, I get:

```
/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order)
    327     jsonify_if_needed(record.get(key, None)) for key in all_columns
    328 )
--> 329 result = self.db.conn.execute(sql, values)
    330 self.db.conn.commit()
    331 self.last_id = result.lastrowid

OperationalError: near ".": syntax error
```

(Can't post a worked minimal example right now; racing trying to build something against a live timing screen that will stop until next weekend in an hour or two...)

PS Hmmm I did a test and they seem to work; I must be messing up s/where else...

```
import sqlite3
from sqlite_utils import Database

dbname = 'testingDB_sqlite_utils.db'
!rm $dbname

conn = sqlite3.connect(dbname, timeout=10)

# Setup database tables
c = conn.cursor()

setup = '''
CREATE TABLE IF NOT EXISTS "test1" (
    "NO" INTEGER,
    "NAME" TEXT
);

CREATE TABLE IF NOT EXISTS "test2" (
    "NO" INTEGER,
    "TIME OF DAY" TEXT
);

CREATE TABLE IF NOT EXISTS "test3" (
    "NO" INTEGER,
    "AVG. SPEED (MPH)" TEXT
);
'''
c.executescript(setup)

DB = Database(conn)

import pandas as pd

df1 = pd.DataFrame({'NO':[1,2],'NAME':['a','b']})
DB['test1'].insert_all(df1.to_dict(orient='records'))

df2 = pd.DataFrame({'NO':[1,2],'TIME OF DAY':['early on','late']})
DB['test2'].insert_all(df2.to_dict(orient='records'))

df3 = pd.DataFrame({'NO':[1,2],'AVG. SPEED (MPH)':['123.3','123.4']})
DB['test3'].insert_all(df3.to_dict(orient='records'))
```

all seem to work ok. I'm still getting errors in my set up though, which is not too different to the test cases? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Problems handling column names containing spaces or - 403922644 | |
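For what it's worth, FTS5 also accepts quoted column names containing spaces, assuming the SQLite build includes the FTS5 extension; a minimal sketch with made-up data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# FTS5 column definitions accept quoted identifiers too (requires a build
# of SQLite with the FTS5 extension compiled in, as CPython's usually is).
conn.execute('CREATE VIRTUAL TABLE docs_fts USING fts5("TIME OF DAY")')
conn.execute('INSERT INTO docs_fts ("TIME OF DAY") VALUES (?)', ("early on",))
conn.execute('INSERT INTO docs_fts ("TIME OF DAY") VALUES (?)', ("late",))

hits = conn.execute("SELECT * FROM docs_fts WHERE docs_fts MATCH ?", ("early",)).fetchall()
print(hits)  # [('early on',)]
```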
485557574 | https://github.com/simonw/datasette/pull/426#issuecomment-485557574 | https://api.github.com/repos/simonw/datasette/issues/426 | MDEyOklzc3VlQ29tbWVudDQ4NTU1NzU3NA== | carlmjohnson 222245 | 2019-04-22T21:23:22Z | 2019-04-22T21:23:22Z | NONE | Can you cut a new release with this? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Upgrade to Jinja2==2.10.1 431756352 | |
489353316 | https://github.com/simonw/datasette/issues/187#issuecomment-489353316 | https://api.github.com/repos/simonw/datasette/issues/187 | MDEyOklzc3VlQ29tbWVudDQ4OTM1MzMxNg== | carsonyl 46059 | 2019-05-04T18:36:36Z | 2019-05-04T18:36:36Z | NONE | Hi @simonw - I just hit this issue when trying out Datasette after your PyCon talk today. Datasette is pinned to Sanic 0.7.0, but it looks like 0.8.0 added the option to remove the uvloop dependency for Windows by having an environment variable |
{ "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 } |
Windows installation error 309033998 | |
490039343 | https://github.com/simonw/datasette/issues/187#issuecomment-490039343 | https://api.github.com/repos/simonw/datasette/issues/187 | MDEyOklzc3VlQ29tbWVudDQ5MDAzOTM0Mw== | Maltazar 6422964 | 2019-05-07T11:24:42Z | 2019-05-07T11:24:42Z | NONE | I totally agree with carsonyl |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Windows installation error 309033998 | |
494297022 | https://github.com/simonw/datasette/issues/272#issuecomment-494297022 | https://api.github.com/repos/simonw/datasette/issues/272 | MDEyOklzc3VlQ29tbWVudDQ5NDI5NzAyMg== | tomchristie 647359 | 2019-05-21T08:39:17Z | 2019-05-21T08:39:17Z | NONE | Useful context stuff:
That was an issue specifically against the <=3.5.2 minor point releases of Python, now resolved: https://github.com/encode/uvicorn/issues/330 👍
Yeah - the bits that require 3.6 are anywhere with the "async for" syntax. If it wasn't for that I'd downport it, but that one's a pain. It's the one bit of syntax to watch out for if you're looking to bring any bits of implementation across to Datasette. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Port Datasette to ASGI 324188953 | |
494459264 | https://github.com/simonw/datasette/issues/184#issuecomment-494459264 | https://api.github.com/repos/simonw/datasette/issues/184 | MDEyOklzc3VlQ29tbWVudDQ5NDQ1OTI2NA== | carlmjohnson 222245 | 2019-05-21T16:17:29Z | 2019-05-21T16:17:29Z | NONE | Reopening this because it still raises 500 for incorrect table capitalization. Example:
I think because the table name exists but is not in its canonical form, it triggers a dict lookup error. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
500 from missing table name 292011379 | |
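The canonical-form mismatch described above can be reproduced in a few lines: SQL resolves table names case-insensitively, while a Python dict keyed on the stored name does not. The table name here is hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Facts (id INTEGER)")

# SQL itself is case-insensitive about table names...
assert conn.execute("SELECT count(*) FROM FACTS").fetchone() == (0,)

# ...but a Python-side registry keyed on the canonical name is not,
# which is the kind of mismatch that can surface as a 500 instead of a 404.
tables = {row[0]: row for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")}
print("FACTS" in tables, "Facts" in tables)  # False True
```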
495034774 | https://github.com/simonw/datasette/issues/483#issuecomment-495034774 | https://api.github.com/repos/simonw/datasette/issues/483 | MDEyOklzc3VlQ29tbWVudDQ5NTAzNDc3NA== | jcmkk3 45919695 | 2019-05-23T01:38:32Z | 2019-05-23T01:43:04Z | NONE | I think that location information is one of the other common pieces of hierarchical data. At least one that is general enough that extra dimensions could be auto-generated. Also, I think this is an awesome project. Thank you for creating this. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Option to facet by date using month or year 447408527 | |
496966227 | https://github.com/simonw/datasette/issues/120#issuecomment-496966227 | https://api.github.com/repos/simonw/datasette/issues/120 | MDEyOklzc3VlQ29tbWVudDQ5Njk2NjIyNw== | duarteocarmo 26342344 | 2019-05-29T14:40:52Z | 2019-05-29T14:40:52Z | NONE | I would really like this. If you give me some pointers @simonw I'm willing to PR! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin that adds an authentication layer of some sort 275087397 | |
497885590 | https://github.com/simonw/datasette/issues/496#issuecomment-497885590 | https://api.github.com/repos/simonw/datasette/issues/496 | MDEyOklzc3VlQ29tbWVudDQ5Nzg4NTU5MA== | costrouc 1740337 | 2019-05-31T23:05:05Z | 2019-05-31T23:05:05Z | NONE | Upon doing a "fix" which allowed a longer build timeout, the Cloud Run container was too slow when it actually ran. So I would say that if your SQLite database is over 1 GB, Heroku and Cloud Run are not good options. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Additional options to gcloud build command in cloudrun - timeout 450862577 | |
499260727 | https://github.com/simonw/datasette/issues/499#issuecomment-499260727 | https://api.github.com/repos/simonw/datasette/issues/499 | MDEyOklzc3VlQ29tbWVudDQ5OTI2MDcyNw== | chrismp 7936571 | 2019-06-05T21:22:55Z | 2019-06-05T21:22:55Z | NONE | I was thinking of having some kind of GUI in which regular reporters can upload a CSV and choose how to name the tables, columns and whatnot. Maybe it's possible to make such a GUI using the Jinja template language? I ask because I'm unsure how to pursue this but I'd like to try. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Accessibility for non-techie newsies? 451585764 | |
499262397 | https://github.com/simonw/datasette/issues/498#issuecomment-499262397 | https://api.github.com/repos/simonw/datasette/issues/498 | MDEyOklzc3VlQ29tbWVudDQ5OTI2MjM5Nw== | chrismp 7936571 | 2019-06-05T21:28:32Z | 2019-06-05T21:28:32Z | NONE | Thinking about this more, I'd probably have to make a template page to go along with this, right? I'm guessing there's no way to add an all-databases-all-tables search to datasette's "home page" except by copying the "home page" template and editing it? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Full text search of all tables at once? 451513541 | |
500238035 | https://github.com/simonw/datasette/issues/506#issuecomment-500238035 | https://api.github.com/repos/simonw/datasette/issues/506 | MDEyOklzc3VlQ29tbWVudDUwMDIzODAzNQ== | Gagravarr 1059677 | 2019-06-09T19:21:18Z | 2019-06-09T19:21:18Z | NONE | If you don't mind calling out to Java, then Apache Tika is able to tell you what a load of "binary stuff" is, plus render it to XHTML where possible. There's a python wrapper around the Apache Tika server, but for a more typical datasette usecase you'd probably just want to grab the Tika CLI jar, and call it with |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Option to display binary data 453846217 | |
501903071 | https://github.com/simonw/datasette/issues/498#issuecomment-501903071 | https://api.github.com/repos/simonw/datasette/issues/498 | MDEyOklzc3VlQ29tbWVudDUwMTkwMzA3MQ== | chrismp 7936571 | 2019-06-13T22:35:06Z | 2019-06-13T22:35:06Z | NONE | I'd like to start working on this. I've made a custom template for Can I make additional custom Python scripts for this or must I edit datasette's files directly? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Full text search of all tables at once? 451513541 | |
503236800 | https://github.com/simonw/datasette/issues/512#issuecomment-503236800 | https://api.github.com/repos/simonw/datasette/issues/512 | MDEyOklzc3VlQ29tbWVudDUwMzIzNjgwMA== | chrismp 7936571 | 2019-06-18T17:36:37Z | 2019-06-18T17:36:37Z | NONE | Oh I didn't know the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"about" parameter in metadata does not appear when alone 457147936 | |
503237884 | https://github.com/simonw/datasette/issues/502#issuecomment-503237884 | https://api.github.com/repos/simonw/datasette/issues/502 | MDEyOklzc3VlQ29tbWVudDUwMzIzNzg4NA== | chrismp 7936571 | 2019-06-18T17:39:18Z | 2019-06-18T17:46:08Z | NONE | It appears that I cannot reopen this issue but the proposed solution did not solve it. The link is not there. I have full text search enabled for a bunch of tables in my database and even clicking the link to reveal hidden tables did not show the download DB link. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Exporting sqlite database(s)? 453131917 | |
503249999 | https://github.com/simonw/datasette/issues/513#issuecomment-503249999 | https://api.github.com/repos/simonw/datasette/issues/513 | MDEyOklzc3VlQ29tbWVudDUwMzI0OTk5OQ== | chrismp 7936571 | 2019-06-18T18:11:36Z | 2019-06-18T18:11:36Z | NONE | Ah, so basically put the SQLite databases on Linode, for example, and run |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Is it possible to publish to Heroku despite slug size being too large? 457201907 | |
504684709 | https://github.com/simonw/datasette/issues/514#issuecomment-504684709 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwNDY4NDcwOQ== | chrismp 7936571 | 2019-06-22T17:36:25Z | 2019-06-22T17:36:25Z | NONE |
@russss, Which directory does this represent? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
504685187 | https://github.com/simonw/datasette/issues/514#issuecomment-504685187 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwNDY4NTE4Nw== | chrismp 7936571 | 2019-06-22T17:43:24Z | 2019-06-22T17:43:24Z | NONE |
In my case, on a remote server, I set up a virtual environment in My datasette project is in And the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
504785662 | https://github.com/simonw/datasette/issues/498#issuecomment-504785662 | https://api.github.com/repos/simonw/datasette/issues/498 | MDEyOklzc3VlQ29tbWVudDUwNDc4NTY2Mg== | chrismp 7936571 | 2019-06-23T20:47:37Z | 2019-06-23T20:47:37Z | NONE | Very cool, thank you. Using http://search-24ways.herokuapp.com as an example, let's say I want to search all FTS columns in all tables in all databases for the word "web." Here's a link to the query I'd need to run to search "web" on FTS columns in And here's a link to the JSON version of the above result. I'd like to get the JSON result of that query for each FTS table of each database in my datasette project. Is it possible in Javascript to automate the construction of query URLs like the one I linked, but for every FTS table in my datasette project? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Full text search of all tables at once? 451513541 | |
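Constructing those per-table search URLs programmatically is mostly string assembly; a minimal sketch in Python (the base URL and table names are placeholders, not the real deployment):

```python
from urllib.parse import urlencode

# Hypothetical Datasette base URL and table names; in practice these would
# be discovered from each database's index JSON rather than hard-coded.
base = "https://example.com/mydb"
tables = ["articles", "comments"]

# Datasette's table view accepts ?_search= against any table with FTS enabled;
# appending .json returns the results as JSON.
urls = [f"{base}/{table}.json?{urlencode({'_search': 'web'})}" for table in tables]
print(urls[0])  # https://example.com/mydb/articles.json?_search=web
```

The same loop could be written in browser JavaScript with `fetch()` against each generated URL.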
504686266 | https://github.com/simonw/datasette/issues/514#issuecomment-504686266 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwNDY4NjI2Ng== | chrismp 7936571 | 2019-06-22T17:58:50Z | 2019-06-23T21:21:57Z | NONE | @russss Actually, here's what I've got in ``` [Unit] Description=Datasette After=network.target [Service] Type=simple User=chris WorkingDirectory=/home/chris/digital-library ExecStart=/home/chris/Env/datasette/lib/python3.7/site-packages/datasette serve -h 0.0.0.0 databases/*.db --cors --metadata metadata.json Restart=on-failure [Install] WantedBy=multi-user.target ``` I ran:
Got this message.
``` Welcome to nginx! If you see this page, the nginx web server is successfully installed and working. Further configuration is required. For online documentation and support please refer to nginx.org. Commercial support is available at nginx.com. Thank you for using nginx. ``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
504789231 | https://github.com/simonw/datasette/issues/514#issuecomment-504789231 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwNDc4OTIzMQ== | chrismp 7936571 | 2019-06-23T21:35:33Z | 2019-06-23T21:35:33Z | NONE | @russss Thanks, just one more thing. I edited ``` [Unit] Description=Datasette After=network.target [Service] Type=simple User=chris WorkingDirectory=/home/chris/digital-library ExecStart=/home/chris/Env/datasette/bin/datasette serve -h 0.0.0.0 databases/*.db --cors --metadata metadata.json Restart=on-failure [Install] WantedBy=multi-user.target ``` Then ran:
But the logs from
But the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
504998302 | https://github.com/simonw/datasette/issues/514#issuecomment-504998302 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwNDk5ODMwMg== | chrismp 7936571 | 2019-06-24T12:57:19Z | 2019-06-24T12:57:19Z | NONE | Same error when I used the full path. On Sun, Jun 23, 2019 at 18:31 Simon Willison notifications@github.com wrote:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
505228873 | https://github.com/simonw/datasette/issues/498#issuecomment-505228873 | https://api.github.com/repos/simonw/datasette/issues/498 | MDEyOklzc3VlQ29tbWVudDUwNTIyODg3Mw== | chrismp 7936571 | 2019-06-25T00:21:17Z | 2019-06-25T00:21:17Z | NONE | Eh, I'm not concerned with a relevance score right now. I think I'd be fine with a search whose results show links to data tables with at least one result. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Full text search of all tables at once? 451513541 | |
505232675 | https://github.com/simonw/datasette/issues/514#issuecomment-505232675 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwNTIzMjY3NQ== | chrismp 7936571 | 2019-06-25T00:43:12Z | 2019-06-25T00:43:12Z | NONE | Yep, that worked to get the site up and running at

```
#!/bin/bash
/home/chris/Env/datasette/bin/datasette serve -h 0.0.0.0 -p 80 /home/chris/digital-library/databases/*.db --cors --metadata /home/chris/digital-library/metadata.json
```

I got this error.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
505424665 | https://github.com/simonw/datasette/pull/529#issuecomment-505424665 | https://api.github.com/repos/simonw/datasette/issues/529 | MDEyOklzc3VlQ29tbWVudDUwNTQyNDY2NQ== | nathancahill 1383872 | 2019-06-25T12:35:07Z | 2019-06-25T12:35:07Z | NONE | Oops, I wrote this late last night and didn't see you'd already worked on the issue. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Use keyed rows - fixes #521 460396952 | |
506000023 | https://github.com/simonw/datasette/issues/522#issuecomment-506000023 | https://api.github.com/repos/simonw/datasette/issues/522 | MDEyOklzc3VlQ29tbWVudDUwNjAwMDAyMw== | nathancahill 1383872 | 2019-06-26T18:48:53Z | 2019-06-26T18:48:53Z | NONE | Reference implementation from Requests: https://github.com/kennethreitz/requests/blob/3.0/requests/structures.py#L14 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle case-insensitive headers in a nicer way 459622390 | |
506985050 | https://github.com/simonw/datasette/issues/498#issuecomment-506985050 | https://api.github.com/repos/simonw/datasette/issues/498 | MDEyOklzc3VlQ29tbWVudDUwNjk4NTA1MA== | chrismp 7936571 | 2019-06-29T20:28:21Z | 2019-06-29T20:28:21Z | NONE | In my case, I have an ever-growing number of databases and tables within them. Most tables have FTS enabled. I cannot predict the names of future tables and databases, nor can I predict the names of the columns for which I wish to enable FTS. For my purposes, I was thinking of writing up something that sends these two GET requests to each of my databases' tables.
In the resulting JSON strings, I'd check the value of the key. Is this feasible within the Datasette library, or would it require some type of plugin? Or maybe you know of a better way of accomplishing this goal. Maybe I overlooked something. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Full text search of all tables at once? 451513541 | |
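The comment elides the exact GET requests it had in mind, but the underlying goal — discovering which tables in a database have FTS enabled — can also be sketched by inspecting `sqlite_master` directly. This is a hedged alternative, not necessarily the commenter's approach, and the `fts_tables` helper name is invented:

```python
import sqlite3


def fts_tables(db_path):
    """Return names of tables declared with an FTS virtual-table module.

    Virtual tables keep their CREATE statement in sqlite_master, so
    matching on 'USING FTS' (LIKE is case-insensitive for ASCII) finds
    fts3/fts4/fts5 tables while skipping their shadow tables.
    """
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT name FROM sqlite_master "
            "WHERE type = 'table' AND sql LIKE '%USING FTS%'"
        ).fetchall()
        return [name for (name,) in rows]
    finally:
        conn.close()
```

Running this once per database avoids a request per table, and the result could drive exactly the kind of cross-table search UI discussed in the thread.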
508590397 | https://github.com/simonw/datasette/issues/498#issuecomment-508590397 | https://api.github.com/repos/simonw/datasette/issues/498 | MDEyOklzc3VlQ29tbWVudDUwODU5MDM5Nw== | chrismp 7936571 | 2019-07-04T23:34:41Z | 2019-07-04T23:34:41Z | NONE | I'll take your suggestion and do this all in Javascript. Would I need to make a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Full text search of all tables at once? 451513541 | |
509042334 | https://github.com/simonw/datasette/issues/498#issuecomment-509042334 | https://api.github.com/repos/simonw/datasette/issues/498 | MDEyOklzc3VlQ29tbWVudDUwOTA0MjMzNA== | chrismp 7936571 | 2019-07-08T00:18:29Z | 2019-07-08T00:18:29Z | NONE | @simonw I made this primitive search that I've put in my Datasette project's custom templates directory: https://gist.github.com/chrismp/e064b41f08208a6f9a93150a23cf7e03 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Full text search of all tables at once? 451513541 | |
509154312 | https://github.com/simonw/datasette/issues/514#issuecomment-509154312 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwOTE1NDMxMg== | JesperTreetop 4363711 | 2019-07-08T09:36:25Z | 2019-07-08T09:40:33Z | NONE | @chrismp: Ports 1024 and under are privileged and can usually only be bound by a root or supervisor user, so it makes sense that binding fails if you're running as an unprivileged user. See this generic question-and-answer and this systemd question-and-answer for more information about ways to skin this cat. Without knowing your specific circumstances, either extending those privileges to that service/executable/user, proxying them through something like nginx, or indeed looking at what the nginx systemd job has to do to listen at port 80 all sound like good ways to start. At this point, this is more generic systemd/Linux support than a Datasette issue, which is why a complete rando like me is able to contribute anything. |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
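One hedged way to act on this advice under systemd is an `AmbientCapabilities=` directive, which grants a non-root service only the capability needed to bind ports below 1024. The unit below is a sketch: it reuses the paths from the earlier comment, but the service name and `User=` value are hypothetical:

```ini
# /etc/systemd/system/datasette.service -- example unit, adjust paths and user
[Unit]
Description=Datasette
After=network.target

[Service]
User=chris
# Grant just the capability needed to bind ports below 1024 without root
AmbientCapabilities=CAP_NET_BIND_SERVICE
# systemd does not expand shell globs in ExecStart=, so the *.db pattern
# needs a shell wrapper
ExecStart=/bin/bash -c '/home/chris/Env/datasette/bin/datasette serve -h 0.0.0.0 -p 80 /home/chris/digital-library/databases/*.db --cors --metadata /home/chris/digital-library/metadata.json'
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

The alternatives the comment mentions — an nginx reverse proxy on port 80, or running Datasette on a high port — avoid the capability grant entirely.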
509431603 | https://github.com/simonw/datasette/issues/514#issuecomment-509431603 | https://api.github.com/repos/simonw/datasette/issues/514 | MDEyOklzc3VlQ29tbWVudDUwOTQzMTYwMw== | chrismp 7936571 | 2019-07-08T23:39:52Z | 2019-07-08T23:39:52Z | NONE | In
To...
It worked. I can access |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation with recommendations on running Datasette in production without using Docker 459397625 | |
511252718 | https://github.com/simonw/datasette/issues/558#issuecomment-511252718 | https://api.github.com/repos/simonw/datasette/issues/558 | MDEyOklzc3VlQ29tbWVudDUxMTI1MjcxOA== | 0x1997 380586 | 2019-07-15T01:29:29Z | 2019-07-15T01:29:29Z | NONE | Thanks, the latest version works. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support unicode in url 467218270 |
```sql
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
```