issue_comments
996 rows where author_association = "NONE" sorted by user
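The listing below is the result of a simple filtered, sorted query over the `issue_comments` table. A minimal sketch of the equivalent SQL, run from Python against an illustrative stand-in table (the schema and sample rows here are assumptions, not the real data):

```python
import sqlite3

# Tiny stand-in for the issue_comments table (illustrative schema only).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE issue_comments (id INTEGER PRIMARY KEY, "
    "user TEXT, author_association TEXT, body TEXT)"
)
conn.executemany(
    "INSERT INTO issue_comments (user, author_association, body) VALUES (?, ?, ?)",
    [
        ("aborruso", "NONE", "first comment"),
        ("codecov[bot]", "NONE", "coverage report"),
        ("simonw", "OWNER", "maintainer reply"),
    ],
)

# The page's filter and sort: author_association = "NONE", ordered by user.
rows = conn.execute(
    "SELECT user, body FROM issue_comments "
    "WHERE author_association = ? ORDER BY user",
    ("NONE",),
).fetchall()
print(rows)
```

Rows from maintainers (`author_association` other than `"NONE"`) are excluded, which is why the 996 rows on this page come only from outside contributors and bots.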
issue (622 values)
- Transformation type `--type DATETIME` 14
- link_or_copy_directory() error - Invalid cross-device link 13
- WIP: Add Gmail takeout mbox import 12
- .json and .csv exports fail to apply base_url 11
- base_url configuration setting 10
- Extract columns cannot create foreign key relation: sqlite3.OperationalError: table sqlite_master may not be modified 10
- Documentation with recommendations on running Datasette in production without using Docker 9
- JavaScript plugin hooks mechanism similar to pluggy 9
- Add GraphQL endpoint 8
- Call for birthday presents: if you're using Datasette, let us know how you're using it here 8
- Full text search of all tables at once? 7
- Populate "endpoint" key in ASGI scope 7
- Figure out some interesting example SQL queries 7
- Add Gmail takeout mbox import (v2) 7
- Incorrect URLs when served behind a proxy with base_url set 6
- publish heroku does not work on Windows 10 6
- Update for Big Sur 6
- Improve the display of facets information 6
- De-tangling Metadata before Datasette 1.0 6
- Metadata should be a nested arbitrary KV store 5
- Windows installation error 5
- Ways to improve fuzzy search speed on larger data sets? 5
- Redesign default .json format 5
- UNIQUE constraint failed: workouts.id 5
- Feature Request: Gmail 5
- Plugin hook for dynamic metadata 5
- i18n support 5
- datasette --root running in Docker doesn't reliably show the magic URL 5
- Datasette serve should accept paths/URLs to CSVs and other file formats 4
- Mechanism for ranking results from SQLite full-text search 4
- Port Datasette to ASGI 4
- Wildcard support in query parameters 4
- Handle really wide tables better 4
- Prototoype for Datasette on PostgreSQL 4
- Support column descriptions in metadata.json 4
- .delete_where() does not auto-commit (unlike .insert() or .upsert()) 4
- "Stream all rows" is not at all obvious 4
- Possible to deploy as a python app (for Rstudio connect server)? 4
- Document how to send multiple values for "Named parameters" 4
- Add support for Jinja2 version 3.0 4
- Win32 "used by another process" error with datasette publish 4
- introduce new option for datasette package to use a slim base image 4
- CLI eats my cursor 4
- datasette package --spatialite throws error during build 4
- How to redirect from "/" to a specific db/table 4
- Package as standalone binary 3
- Plugin that adds an authentication layer of some sort 3
- datasette publish lambda plugin 3
- Explore if SquashFS can be used to shrink size of packaged Docker containers 3
- make uvicorn optional dependancy (because not ok on windows python yet) 3
- bump uvicorn to 0.9.0 to be Python-3.8 friendly 3
- updating metadata.json without recreating the app 3
- upsert_all() throws issue when upserting to empty table 3
- base_url doesn't seem to work when adding criteria and clicking "apply" 3
- Fallback to databases in inspect-data.json when no -i options are passed 3
- Some workout columns should be float, not text 3
- Archive import appears to be broken on recent exports 3
- Use structlog for logging 3
- KeyError: 'Contents' on running upload 3
- photo-to-sqlite: command not found 3
- sqlite-utils extract could handle nested objects 3
- Update Docker Spatialite version to 5.0.1 + add support for Spatialite topology functions 3
- improve table horizontal scroll experience 3
- feature: support "events" 3
- Rename Datasette.__init__(config=) parameter to settings= 3
- [Enhancement] Please allow 'insert-files' to insert content as text. 3
- KeyError: 'created_at' for private accounts? 3
- JSON link on row page is 404 if base_url setting is used 3
- Creating tables with custom datatypes 3
- query result page is using 400mb of browser memory 40x size of html page and 400x size of csv data 3
- SQL query field can't begin by a comment 3
- Feature request: output number of ignored/replaced rows for insert command 3
- Expand foreign key references in row view as well 3
- When reverse proxying datasette with nginx an URL element gets erronously added 3
- Link to JSON for the list of tables 2
- Option to open readonly but not immutable 2
- Support WITH query 2
- I18n and L10n support 2
- add "format sql" button to query page, uses sql-formatter 2
- 500 from missing table name 2
- Ability to sort (and paginate) by column 2
- Figure out how to bundle a more up-to-date SQLite 2
- Escaping named parameters in canned queries 2
- Validate metadata.json on startup 2
- Support cross-database joins 2
- datasette inspect takes a very long time on large dbs 2
- Installation instructions, including how to use the docker image 2
- Problems handling column names containing spaces or - 2
- Zeit API v1 does not work for new users - need to migrate to v2 2
- How to pass named parameter into spatialite MakePoint() function 2
- Datasette Library 2
- Mechanism for turning nested JSON into foreign keys / many-to-many 2
- Too many SQL variables 2
- "Invalid SQL" page should let you edit the SQL 2
- Support Python 3.8, stop supporting Python 3.5 2
- Make database level information from metadata.json available in the index.html template 2
- Mechanism for adding arbitrary pages like /about 2
- Exception running first command: IndexError: list index out of range 2
- Allow creation of virtual tables at startup 2
- Escape_fts5_query-hookimplementation does not work with queries to standard tables 2
- Allow injecting configuration data from plugins 2
- --cp option for datasette publish and datasette package for shipping additional files and directories 2
- ?_searchmode=raw option for running FTS searches without escaping characters 2
- Authentication (and permissions) as a core concept 2
- Update aiofiles requirement from ~=0.4.0 to >=0.4,<0.6 2
- [Feature Request] Support Repo Name in Search 🥺 2
- Consider pagination of canned queries 2
- initial windows ci setup 2
- github-to-sqlite should handle rate limits better 2
- .extract() shouldn't extract null values 2
- Make it possible to download BLOB data from the Datasette UI 2
- changes to allow for compound foreign keys 2
- Support for generated columns 2
- sqlite-utils should suggest --csv if JSON parsing fails 2
- Better error message for *_fts methods against views 2
- Access Denied Error in Windows 2
- Not all quoted statuses get fetched? 2
- SSL Error 2
- Installing datasette via docker: Path 'fixtures.db' does not exist 2
- Share button for copying current URL 2
- Facets timing out but work when filtering 2
- I'm creating a plugin to export a spreadsheet file (.ods or .xlsx) 2
- Update itsdangerous requirement from ~=1.1 to >=1.1,<3.0 2
- bool type not supported 2
- Cannot set type JSON 2
- basic support for events 2
- Serve all db files in a folder 2
- feature request: document minimum permissions for service account for cloudrun 2
- Manage /robots.txt in Datasette core, block robots by default 2
- Deploy a live instance of demos/apache-proxy 2
- Use datasette-table Web Component to guide the design of the JSON API for 1.0 2
- Support for CHECK constraints 2
- Table+query JSON and CSV links broken when using `base_url` setting 2
- Make it easier to insert geometries, with documentation and maybe code 2
- base_url or prefix does not work with _exact match 2
- `deterministic=True` fails on versions of SQLite prior to 3.8.3 2
- [feature] immutable mode for a directory, not just individual sqlite file 2
- `sqlite3.NotSupportedError`: deterministic=True requires SQLite 3.8.3 or higher 2
- Research: demonstrate if parallel SQL queries are worthwhile 2
- Allow making m2m relation of a table to itself 2
- illegal UTF-16 surrogate 2
- Reading rows from a file => AttributeError: '_io.StringIO' object has no attribute 'readinto' 2
- Ability to insert multi-line files 2
- Setting to turn off table row counts entirely 2
- devrel/python api: Pylance type hinting 2
- Reconsider the Datasette first-run experience 2
- don't use immutable=1, only mode=ro 2
- Datasette with many and large databases > Memory use 2
- Cannot enable FTS5 despite it being available 2
- DOC: xml.etree.ElementTree.ParseError due to healthkit version 12 2
- Suggestion: Hiding columns 2
- How to use Datasette with apache webserver on GCP? 2
- Character encoding problem 2
- feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 2
- GitHub Action to lint Python code with ruff 2
- 500 "attempt to write a readonly database" error caused by "PRAGMA schema_version" 2
- photos-to-sql not found? 2
- Permissions in metadata.yml / metadata.json 2
- [feature request]`datasette install plugins.json` options 2
- Plugin hook for database queries that are run 2
- TemplateAssertionError: no filter named 'tojson' 1
- TemplateAssertionError: no filter named 'tojson' 1
- datasette publish can fail if /tmp is on a different device 1
- apsw as alternative sqlite3 binding (for full text search) 1
- Ability to customize presentation of specific columns in HTML view 1
- A primary key column that has foreign key restriction associated won't rendering label column 1
- proposal new option to disable user agents cache 1
- Ability to bundle metadata and templates inside the SQLite file 1
- Cleaner mechanism for handling custom errors 1
- Allow plugins to define additional URL routes and views 1
- prepare_context() plugin hook 1
- SQLite code decoupled from Datasette 1
- Add new metadata key persistent_urls which removes the hash from all database urls 1
- Add links to example Datasette instances to appropiate places in docs 1
- Documentation for URL hashing, redirects and cache policy 1
- Handle spatialite geometry columns better 1
- Support for external database connectors 1
- Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way 1
- render_cell(value) plugin hook 1
- Search all apps during heroku publish 1
- CSV export in "Advanced export" pane doesn't respect query 1
- How to pass configuration to plugins? 1
- How does persistence work? 1
- .insert/.upsert/.insert_all/.upsert_all should add missing columns 1
- Add query parameter to hide SQL textarea 1
- Upgrade to Jinja2==2.10.1 1
- Option to facet by date using month or year 1
- Additional options to gcloud build command in cloudrun - timeout 1
- Accessibility for non-techie newsies? 1
- Exporting sqlite database(s)? 1
- Option to display binary data 1
- Get Datasette tests passing on Windows in GitHub Actions 1
- "about" parameter in metadata does not appear when alone 1
- Is it possible to publish to Heroku despite slug size being too large? 1
- Handle case-insensitive headers in a nicer way 1
- Stream all results for arbitrary SQL and canned queries 1
- Use keyed rows - fixes #521 1
- Support unicode in url 1
- extracts= option for insert/update/etc 1
- Unexpected keyword argument 'hidden' 1
- Datasette Edit 1
- Ability to list views, and to access db["view_name"].rows / rows_where / etc 1
- Added support for multi arch builds 1
- Queries per DB table in metadata.json 1
- upgrade to uvicorn-0.9 to be Python-3.8 friendly 1
- Support queries at the table level 1
- Datasette FTS detection bug 1
- "friends" command (similar to "followers") 1
- Publish to Heroku is broken: "WARNING: You must pass the application as an import string to enable 'reload' or 'workers" 1
- Feature request: enable extensions loading 1
- Implement ON DELETE and ON UPDATE actions for foreign keys 1
- fts5 syntax error when using punctuation 1
- Assets table with downloads 1
- order_by mechanism 1
- How do I use the app.css as style sheet? 1
- --port option to expose a port other than 8001 in "datasette package" 1
- Problem with square bracket in CSV column name 1
- Cashe-header missing in http-response 1
- Ability to customize columns used by extracts= feature 1
- datasette publish cloudrun --memory option 1
- Adding a "recreate" flag to the `Database` constructor 1
- Only set .last_rowid and .last_pk for single update/inserts, not for .insert_all()/.upsert_all() with multiple records 1
- Import EXIF data into SQLite - lens used, ISO, aperture etc 1
- Integrate image content hashing 1
- Error when I click on "View and edit SQL" 1
- strange behavior using accented characters 1
- Replace "datasette publish --extra-options" with "--setting" 1
- Fall back to authentication via ENV 1
- Expose scores from ZCOMPUTEDASSETATTRIBUTES 1
- Question: Access to immutable database-path 1
- fts search on a column doesn't work anymore due to escape_fts 1
- Ability to serve thumbnailed Apple Photo from its place on disk 1
- bpylist.archiver.CircularReference: archive has a cycle with uid(13) 1
- Enable wildcard-searches by default 1
- Invalid SQL no such table: main.uploads 1
- Error pages not correctly loading CSS 1
- Group permission checks by request on /-/permissions debug page 1
- Reload support for config_dir mode. 1
- Fall back to FTS4 if FTS5 is not available 1
- Update pytest-asyncio requirement from <0.13,>=0.10 to >=0.10,<0.15 1
- Magic parameters for canned queries 1
- New pattern for views that return either JSON or HTML, available for plugins 1
- Skip counting hidden tables 1
- Load only python files from plugins-dir. 1
- Use None as a default arg 1
- Don't install tests package 1
- Feature: pull request reviews and comments 1
- Update pytest requirement from <5.5.0,>=5.2.2 to >=5.2.2,<6.1.0 1
- Support reverse pagination (previous page, has-previous-items) 1
- Travis should not build the master branch, only the main branch 1
- 'datasette --get' option, refs #926 1
- Don't hang in db.execute_write_fn() if connection fails 1
- Run CI on GitHub Actions, not Travis 1
- Try out CodeMirror SQL hints 1
- favorites --stop_after=N stops after min(N, 200) 1
- request an "-o" option on "datasette server" to open the default browser at the running url 1
- Idea: transitive closure tables for tree structures 1
- Progress bar for sqlite-utils insert 1
- Update pytest requirement from <6.1.0,>=5.2.2 to >=5.2.2,<6.2.0 1
- Allow facet by primary keys, fixes #985 1
- Redesign application homepage 1
- Run tests against Python 3.9 1
- Document setting Google Cloud SDK properties 1
- datasette.client internal requests mechanism 1
- from_json jinja2 filter 1
- Add json_loads and json_dumps jinja2 filters 1
- Update janus requirement from <0.6,>=0.4 to >=0.4,<0.7 1
- Update asgiref requirement from ~=3.2.10 to >=3.2.10,<3.4.0 1
- Fix table name in spatialite example command 1
- About loading spatialite 1
- export.xml file name varies with different language settings 1
- Make `package` command deal with a configuration directory argument 1
- Bring date parsing into Datasette core 1
- DOC: Fix syntax error 1
- /db/table/-/blob/pk/column.blob download URL 1
- Include LICENSE in sdist 1
- Add minimum supported python 1
- Add template block prior to extra URL loaders 1
- Switch to .blob render extension for BLOB downloads 1
- Radical new colour scheme and base styles, courtesy of @natbat 1
- Update aiofiles requirement from <0.6,>=0.4 to >=0.4,<0.7 1
- New explicit versioning mechanism 1
- .blob output renderer 1
- Nav menu plus menu_links() hook 1
- load_template() plugin hook 1
- DigitalOcean buildpack memory errors for large sqlite db? 1
- Use FTS4 in fixtures 1
- import EX_CANTCREAT means datasette fails to work on Windows 1
- SQLite does not have case sensitive columns 1
- Use f-strings 1
- Discussion: Adding support for fetching only fresh tweets 1
- Fix --metadata doc usage 1
- GENERATED column support 1
- generated_columns table in fixtures.py 1
- Fix misaligned table actions cog 1
- Fix startup error on windows 1
- Fix footer not sticking to bottom in short pages 1
- "_searchmode=raw" throws an index out of range error when combined with "_search_COLUMN" 1
- sqlite3.OperationalError: near "(": syntax error 1
- More flexible CORS support in core, to encourage good security practices 1
- JavaScript to help plugins interact with the fragment part of the URL 1
- Update pytest requirement from <6.2.0,>=5.2.2 to >=5.2.2,<6.3.0 1
- killed by oomkiller on large location-history 1
- Maintain an in-memory SQLite table of connected databases and their tables 1
- --since support for favorites 1
- Modernize code to Python 3.6+ 1
- Mechanism for executing JavaScript unit tests 1
- Adopt Prettier for JavaScript code formatting 1
- Install Prettier via package.json 1
- GitHub Actions workflow to build and sign macOS binary executables 1
- Certain database names results in 404: "Database not found: None" 1
- Add fts offset docs. 1
- XML parse error 1
- WIP: Plugin includes 1
- Release 0.54 1
- Immutable Database w/ Canned Queries 1
- Use context manager instead of plain open 1
- /usr/local/opt/python3/bin/python3.6: bad interpreter: No such file or directory 1
- Add compile option to Dockerfile to fix failing test (fixes #696) 1
- Error reading csv files with large column data 1
- --no-headers option for CSV and TSV 1
- 500 error caused by faceting if a column called `n` exists 1
- ensure immutable databses when starting in configuration directory mode with 1
- Vega charts are plotted only for rows on the visible page, cluster maps only for rows in the remaining pages 1
- --crossdb option for joining across databases 1
- Custom pages don't work with base_url setting 1
- Allow facetting on custom queries 1
- fix small typo 1
- Sticky table column headers would be useful, especially on the query page 1
- Async support 1
- Add back styling to lists within table cells (fixes #1141) 1
- Capture "Ctrl + Enter" or "⌘ + Enter" to send SQL query? 1
- Minor type in IP adress 1
- Allow canned query params to specify default values 1
- Fix: code quality issues 1
- Escaping FTS search strings 1
- Some links aren't properly URL encoded. 1
- FTS quote functionality from datasette 1
- Plugin hook that could support 'order by random()' for table view 1
- Support for HTTP Basic Authentication 1
- support for Apache Arrow / parquet files I/O 1
- Full text search possibly broken? 1
- Use SQLite conn.interrupt() instead of sqlite_timelimit() 1
- Unit tests for the Dockerfile 1
- Invalid SQL: "no such table: pragma_database_list" on database page 1
- Minor Docs Update. Added `--app` to fly install command. 1
- Support to annotate photos on other than macOS OSes 1
- Add testres-db tool 1
- Fix little typo 1
- Better default display of arrays of items 1
- Use pytest-xdist to speed up tests 1
- Update docs: explain allow_download setting 1
- Dockerfile: use Ubuntu 20.10 as base 1
- Update pytest-asyncio requirement from <0.15,>=0.10 to >=0.10,<0.16 1
- Avoid error sorting by relationships if related tables are not allowed 1
- Bump black from 20.8b1 to 21.4b0 1
- Bump black from 20.8b1 to 21.4b1 1
- Bump black from 20.8b1 to 21.4b2 1
- Upgrade to GitHub-native Dependabot 1
- Bump black from 21.4b2 to 21.5b0 1
- Add Docker multi-arch support with Buildx 1
- Bump black from 21.4b2 to 21.5b1 1
- Update click requirement from ~=7.1.1 to >=7.1.1,<8.1.0 1
- Update jinja2 requirement from <2.12.0,>=2.10.3 to >=2.10.3,<3.1.0 1
- Support Unicode characters in metadata.json 1
- Update aiofiles requirement from <0.7,>=0.4 to >=0.4,<0.8 1
- Fix small typo 1
- ?_col=/?_nocol= to show/hide columns on the table page 1
- Re-display user's query with an error message if an error occurs 1
- DRAFT: add test and scan for docker images 1
- Error: Use either --since or --since_id, not both 1
- Using enable_fts before search term 1
- Make custom pages compatible with base_url setting 1
- Consider using CSP to protect against future XSS 1
- Update trustme requirement from <0.8,>=0.7 to >=0.7,<0.9 1
- Bump black from 21.5b2 to 21.6b0 1
- JSON export dumps JSON fields as TEXT 1
- sqlite-utils memory command for directly querying CSV/JSON data 1
- add -h support closes #276 1
- Update pytest-xdist requirement from <2.3,>=2.2.1 to >=2.2.1,<2.4 1
- Mypy fixes for rows_from_file() 1
- Test against Python 3.10-dev 1
- Fix + improve get_metadata plugin hook docs 1
- Update asgiref requirement from <3.4.0,>=3.2.10 to >=3.2.10,<3.5.0 1
- absolute_url() behind a proxy assembles incorrect http://127.0.0.1:8001/ URLs 1
- Option for importing CSV data using the SQLite .import mechanism 1
- Documentation on using Datasette as a library 1
- Bump black from 21.6b0 to 21.7b0 1
- Read lines with JSON object 1
- 403 when getting token 1
- sqlite-utils convert command and db[table].convert(...) method 1
- Spelling corrections plus CI job for codespell 1
- Show count of facet values if ?_facet_size=max 1
- `sqlite-utils insert --flatten` option to flatten nested JSON 1
- Add reference page to documentation using Sphinx autodoc 1
- Column metadata 1
- Update trustme requirement from <0.9,>=0.7 to >=0.7,<0.10 1
- Rethink how .ext formats (v.s. ?_format=) works before 1.0 1
- Add --merged-by flag to pull-requests sub command 1
- Duplicate Column 1
- Make sure that case-insensitive column names are unique 1
- Ability to insert file contents as text, in addition to blob 1
- Update pluggy requirement from ~=0.13.0 to >=0.13,<1.1 1
- Bump black from 21.7b0 to 21.8b0 1
- xml.etree.ElementTree.Parse Error - mismatched tag 1
- Correct naming of tool in readme 1
- Update beautifulsoup4 requirement from <4.10.0,>=4.8.1 to >=4.8.1,<4.11.0 1
- Test against 3.10-dev 1
- Add Authorization header when CORS flag is set 1
- Bump black from 21.7b0 to 21.9b0 1
- Update pytest-xdist requirement from <2.4,>=2.2.1 to >=2.2.1,<2.5 1
- Invalid JSON output when no rows 1
- Fix compatibility with Python 3.10 1
- Update pytest-timeout requirement from <1.5,>=1.4.2 to >=1.4.2,<2.1 1
- Test against Python 3.10 1
- Update pytest-asyncio requirement from <0.16,>=0.10 to >=0.10,<0.17 1
- Publish to Docker Hub failing with "libcrypt.so.1: cannot open shared object file" 1
- Add functionality to read Parquet files. 1
- Bump black from 21.9b0 to 21.10b0 1
- Default values for `--attach` and `--param` options 1
- Datasette should have an option to output CSV with semicolons 1
- Update docutils requirement from <0.18 to <0.19 1
- New pattern for async view classes 1
- Bump black from 21.9b0 to 21.11b0 1
- Bump black from 21.9b0 to 21.11b1 1
- base_url is omitted in JSON and CSV views 1
- Add new `"sql_file"` key to Canned Queries in metadata? 1
- Update janus requirement from <0.7,>=0.6.2 to >=0.6.2,<0.8 1
- Execution on Windows 1
- Update aiofiles requirement from <0.8,>=0.4 to >=0.4,<0.9 1
- Test against pysqlite3 running SQLite 3.37 1
- Bump black from 21.11b1 to 21.12b0 1
- Update pytest-xdist requirement from <2.5,>=2.2.1 to >=2.2.1,<2.6 1
- Data Pull fails for "Essential" level access to the Twitter API (for Documentation) 1
- TableView refactor 1
- filters_from_request plugin hook, now used in TableView 1
- Update janus requirement from <0.8,>=0.6.2 to >=0.6.2,<1.1 1
- --lines and --text and --convert and --import 1
- Initial prototype of .analyze() methods 1
- `sqlite-utils bulk` command 1
- Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.18 1
- Add new spatialite helper methods 1
- Update pytest-timeout requirement from <2.1,>=1.4.2 to >=1.4.2,<2.2 1
- Documentation should clarify /stable/ vs /latest/ 1
- Potential simplified publishing mechanism 1
- Bump black from 21.12b0 to 22.1.0 1
- Ensure template_path always uses "/" to match jinja 1
- Reconsider policy on blocking queries containing the string "pragma" 1
- Test against Python 3.11-dev 1
- Index page `/` has no CORS headers 1
- Try test suite against macOS and Windows 1
- sqlite3.OperationalError: no such table: main.my_activity 1
- Update pytest requirement from <6.3.0,>=5.2.2 to >=5.2.2,<7.1.0 1
- Advanced class-based `conversions=` mechanism 1
- Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.19 1
- Update Dockerfile generated by `datasette publish` 1
- Add SpatiaLite helpers to CLI 1
- Configuration directory mode does not pick up other file extensions than .db 1
- Optional Pandas integration 1
- Use dash encoding for table names and row primary keys in URLs 1
- Add /opt/homebrew to where spatialite extension can be found 1
- Update pytest requirement from <7.1.0,>=5.2.2 to >=5.2.2,<7.2.0 1
- Tilde encoding 1
- Options for how `r.parsedate()` should handle invalid dates 1
- insert fails on JSONL with whitespace 1
- Ignore common generated files 1
- Document how to use a `--convert` function that runs initialization code first 1
- "Error: near "(": syntax error" when using sqlite-utils indexes CLI 1
- Update jinja2 requirement from <3.1.0,>=2.10.3 to >=2.10.3,<3.2.0 1
- Bump black from 22.1.0 to 22.3.0 1
- Update click requirement from <8.1.0,>=7.1.1 to >=7.1.1,<8.2.0 1
- Update beautifulsoup4 requirement from <4.11.0,>=4.8.1 to >=4.8.1,<4.12.0 1
- Datasette feature for publishing snapshots of query results 1
- Add timeout option to Cloudrun build 1
- Custom page variables aren't decoded 1
- Document how to use `PRAGMA temp_store` to avoid errors when running VACUUM against huge databases 1
- When running `auth` command, don't overwrite an existing auth.json file 1
- Misleading progress bar against utf-16-le CSV input 1
- Add scrollbars to table presentation in default layout 1
- Combining `rows_where()` and `search()` to limit which rows are searched 1
- Bump furo from 2022.4.7 to 2022.6.4.1 1
- Extract facet portions of table.html out into included templates 1
- Bump furo from 2022.4.7 to 2022.6.21 1
- Bump black from 22.1.0 to 22.6.0 1
- Keep track of config_dir 1
- Add duplicate table feature 1
- Update pytest-asyncio requirement from <0.19,>=0.17 to >=0.17,<0.20 1
- minor a11y: <select> has no visual indicator when tabbed to 1
- in extract code, check equality with IS instead of = for nulls 1
- feature request: pivot command 1
- Link to installation instructions 1
- Cross-link CLI to Python docs 1
- Discord badge 1
- beanbag-docutils>=2.0 1
- -a option is used for "--auth" and for "--all" 1
- Updating metadata.json on Datasette for MacOS 1
- db[table].create(..., transform=True) and create-table --transform 1
- Test `--load-extension` in GitHub Actions 1
- sqlite-utils query --functions mechanism for registering extra functions 1
- Support entrypoints for `--load-extension` 1
- Add an option for specifying column names when inserting CSV data 1
- Conda Forge 1
- search_sql add include_rank option 1
- Don't use upper bound dependencies, refs #1800 1
- Workaround for test failure: RuntimeError: There is no current event loop 1
- Add organization support to repos command 1
- truncate_cells_html does not work for links? 1
- progressbar for inserts/upserts of all fileformats, closes #485 1
- Specify foreign key against compound key in other table 1
- Database() constructor currently defaults is_mutable to False 1
- `sqlite-utils transform` should set empty strings to null when converting text columns to integer/float 1
- Bump furo from 2022.6.21 to 2022.9.15 1
- [SPIKE] Don't truncate query CSVs 1
- Keyword-only arguments for a bunch of internal methods 1
- Convert &_hide_sql=1 to #_hide_sql 1
- Add documentation for serving via OpenRC 1
- render_cell documentation example doesn't match the method signature 1
- Bump furo from 2022.9.15 to 2022.9.29 1
- use inspect data for hash and file size 1
- Make hash and size a lazy property 1
- Open Datasette link in new tab 1
- fix: enable-fts permanently save triggers 1
- feat: recreate fts triggers after table transform 1
- check_visibility can now take multiple permissions into account 1
- API to insert a single record into an existing table 1
- Default API token authentication mechanism 1
- Allow surrogates in parameters 1
- /db/table/-/upsert API 1
- Errors when using table filters behind a proxy 1
- Merge 1.0-dev branch back to main 1
- Upgrade to CodeMirror 6, add SQL autocomplete 1
- Use DOMContentLoaded instead of load event for CodeMirror initialization 1
- Typo in JSON API `Updating a row` documentation 1
- /db/table/-/upsert 1
- Bump furo from 2022.9.29 to 2022.12.7 1
- "permissions" blocks in metadata.json/yaml 1
- register_permissions() plugin hook 1
- invoke_startup() is not run in some conditions, e.g. gunicorn/uvicorn workers, breaking lots of things 1
- Port as many tests as possible to async def tests against ds_client 1
- Bump sphinx from 5.3.0 to 6.0.0 1
- Bump sphinx from 5.3.0 to 6.1.0 1
- Bump sphinx from 5.3.0 to 6.1.1 1
- Bump blacken-docs from 1.12.1 to 1.13.0 1
- Stuck on loading screen 1
- Document custom json encoder 1
- ?_extra= support (draft) 1
- Datasette is not compatible with SQLite's strict quoting compilation option 1
- Show referring tables and rows when the referring foreign key is compound 1
- use single quotes for string literals, fixes #2001 1
- array facet: don't materialize unnecessary columns 1
- Deploy demo job is failing due to rate limit 1
- Error 500 - not clear the cause 1
- Error: Invalid setting 'hash_urls' in settings.json in 0.64.1 1
- add Python 3.11 classifier 1
- remove an unused `app` var in cli.py 1
- Potential feature: special support for `?a=1&a=2` on the query page 1
- Increase performance using macnotesapp 1
- Add paths for homebrew on Apple silicon 1
- Bump furo from 2022.12.7 to 2023.3.23 1
- Add permalink virtual field to items table 1
- rows: --transpose or psql extended view-like functionality 1
- Make detailed notes on how table, query and row views work right now 1
- Add paths for homebrew on Apple silicon 1
- Support self-referencing FKs in `Table.create` 1
- Litestream-restored db: errors on 3.11 and 3.10.8; but works on py3.10.7 and 3.10.6 1
- `table.upsert_all` fails to write rows when `not_null` is present 1
- [BUG] Cannot insert new data to deployed instance 1
- sphinx.builders.linkcheck build error 1
- Bump sphinx from 6.1.3 to 7.0.1 1
- Analyze tables options: --common-limit, --no-most, --no-least 1
- TUI powered by Trogon 1
- Reformatted CLI examples in docs 1
- Bump furo from 2023.3.27 to 2023.5.20 1
- `IndexError` when doing `.insert(..., pk='id')` after `insert_all` 1
- New View base class 1
- `--settings settings.json` option 1
- Use sqlean if available in environment 1
- Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File 1
- Bump blacken-docs from 1.14.0 to 1.15.0 1
- feat: Implement a prepare_connection plugin hook 1
- cannot use jinja filters in display? 1
- Bump sphinx from 6.1.3 to 7.1.0 1
- Bump furo from 2023.3.27 to 2023.7.26 1
- datasette serve when invoked with --reload interprets the serve command as a file 1
- Bump sphinx from 6.1.3 to 7.1.1 1
- Bump sphinx from 6.1.3 to 7.1.2 1
- Bump blacken-docs, furo, blacken-docs 1
- Bump the python-packages group with 1 update 1
- Bump the python-packages group with 2 updates 1
- .transform() instead of modifying sqlite_master for add_foreign_keys 1
- Bump the python-packages group with 3 updates 1
- If a row has a primary key of `null` various things break 1
- Bump sphinx, furo, blacken-docs dependencies 1
- Start a new `datasette.yaml` configuration file, with settings support 1
- Test Datasette on multiple SQLite versions 1
- Bump the python-packages group with 3 updates 1
- Cascade for restricted token view-table/view-database/view-instance operations 1
- Fix hupper.start_reloader entry point 1
- Bump sphinx, furo, blacken-docs dependencies 1
- -s/--setting x y gets merged into datasette.yml, refs #2143, #2156 1
- Add new `--internal internal.db` option, deprecate legacy `_internal` database 1
- DATASETTE_LOAD_PLUGINS environment variable for loading specific plugins 1
- Bump the python-packages group with 1 update 1
- click-default-group>=1.2.3 1
- Use $DATASETTE_INTERNAL in absence of --internal 1
- Test against Python 3.12 preview 1
- .transform() now preserves rowid values, refs #592 1
- actors_from_ids plugin hook and datasette.actors_from_ids() method 1
- `datasette.yaml` plugin support 1
- Bump the python-packages group with 3 updates 1
- Server hang on parallel execution of queries to named in-memory databases 1
- Raise an exception if a "plugins" block exists in metadata.json 1
- Move `permissions`, `allow` blocks, canned queries and more out of `metadata.yaml` and into `datasette.yaml` 1
- Stop using parallel SQL queries for tables 1
- Cascading DELETE not working with Table.delete(pk) 1
- Discord invite link returns 401 1
- Bump the python-packages group with 1 update 1
- Add spatialite arm64 linux path 1
- Bump the python-packages group with 1 update 1
- Fix query for suggested facets with column named value 1
- Add more STRICT table support 1
- CSV export fails for some `text` foreign key references 1
user 336
- codecov[bot] 240
- aborruso 19
- chrismp 18
- carlmjohnson 14
- tballison 13
- psychemedia 11
- stonebig 11
- frafra 10
- maxhawkins 10
- terrycojones 10
- dracos 10
- rayvoelker 10
- 20after4 9
- clausjuhl 9
- UtahDave 8
- tomchristie 8
- bsilverm 8
- 4l1fe 8
- zaneselvans 7
- mhalle 7
- zeluspudding 7
- cobiadigital 7
- cldellow 6
- khimaros 6
- CharlesNepote 6
- ocdtrekkie 6
- tsibley 5
- khusmann 5
- rdmurphy 5
- MarkusH 5
- lovasoa 5
- Mjboothaus 5
- dazzag24 5
- ar-jan 5
- xavdid 5
- davidhaley 5
- SteadBytes 5
- fs111 4
- yozlet 4
- Btibert3 4
- dholth 4
- jungle-boogie 4
- ColinMaudry 4
- nitinpaultifr 4
- Kabouik 4
- hydrosquall 4
- dvizard 4
- henry501 4
- pjamargh 4
- frankieroberto 3
- obra 3
- janimo 3
- atomotic 3
- briandorsey 3
- pkoppstein 3
- yschimke 3
- philroche 3
- coldclimate 3
- wsxiaoys 3
- johnfelipe 3
- mdrovdahl 3
- xrotwang 3
- robroc 3
- dmick 3
- betatim 3
- dufferzafar 3
- Florents-Tselai 3
- aki-k 3
- ashishdotme 3
- yejiyang 3
- henrikek 3
- swyxio 3
- Segerberg 3
- jsancho-gpl 3
- gk7279 3
- learning4life 3
- mattmalcher 3
- FabianHertwig 3
- polyrand 3
- justmars 3
- garethr 2
- nelsonjchen 2
- dsisnero 2
- hubgit 2
- jayvdb 2
- jackowayed 2
- ftrain 2
- chrishas35 2
- tannewt 2
- HaveF 2
- pkulchenko 2
- coleifer 2
- gavinband 2
- aviflax 2
- iloveitaly 2
- tholo 2
- mungewell 2
- frankier 2
- lchski 2
- tmaier 2
- hcarter333 2
- amitkoth 2
- eads 2
- virtadpt 2
- leafgarland 2
- glyph 2
- rafguns 2
- strada 2
- eelkevdbos 2
- ligurio 2
- n8henrie 2
- soobrosa 2
- nathancahill 2
- mustafa0x 2
- bsmithgall 2
- noslouch 2
- willingc 2
- nattaylor 2
- durkie 2
- cclauss 2
- wulfmann 2
- philshem 2
- bram2000 2
- zzeleznick 2
- plpxsk 2
- jeqo 2
- chapmanjacobd 2
- nickvazz 2
- aaronyih1 2
- luxint 2
- jussiarpalahti 2
- sachaj 2
- lagolucas 2
- stevecrawshaw 2
- chekos 2
- ctsrc 2
- ad-si 2
- smithdc1 2
- gsajko 2
- jcmkk3 2
- null92 2
- publicmatt 2
- rachelmarconi 2
- tunguyenatwork 2
- LVerneyPEReN 2
- tmcl-it 2
- anotherjesse 1
- jarib 1
- jokull 1
- danp 1
- fernand0 1
- precipice 1
- llimllib 1
- gijs 1
- blaine 1
- ashanan 1
- gravis 1
- nkirsch 1
- mrchrisadams 1
- dkam 1
- harperreed 1
- nileshtrivedi 1
- chrismytton 1
- nedbat 1
- furilo 1
- kindly 1
- prabhur 1
- palfrey 1
- dmd 1
- pquentin 1
- Uninen 1
- rtanglao 1
- carsonyl 1
- nryberg 1
- step21 1
- stefanocudini 1
- rcoup 1
- scoates 1
- hpk42 1
- annapowellsmith 1
- cadeef 1
- thorn0 1
- yurivish 1
- pax 1
- lucapette 1
- jmelloy 1
- Krazybug 1
- dvhthomas 1
- dckc 1
- phubbard 1
- sethvincent 1
- andrewdotn 1
- aitoehigie 1
- julienma 1
- michaelmcandrew 1
- drewda 1
- stiles 1
- saulpw 1
- adamalton 1
- terinjokes 1
- thadk 1
- camallen 1
- robintw 1
- astrojuanlu 1
- ipmb 1
- steren 1
- aidansteele 1
- 0x1997 1
- jonafato 1
- gwk 1
- knutwannheden 1
- davidszotten 1
- chrislkeller 1
- kevboh 1
- eaubin 1
- yunzheng 1
- mhkeller 1
- lfdebrux 1
- karlcow 1
- heyarne 1
- ryanfox 1
- sopel 1
- cephillips 1
- ryascott 1
- sirnacnud 1
- simonrjones 1
- justinpinkney 1
- merwok 1
- mattkiefer 1
- snth 1
- adarshp 1
- joshmgrant 1
- bcongdon 1
- nickdirienzo 1
- hannseman 1
- kaihendry 1
- urbas 1
- metamoof 1
- brimstone 1
- adamchainz 1
- PabloLerma 1
- heussd 1
- RayBB 1
- BryantD 1
- limar 1
- drkane 1
- Gagravarr 1
- radusuciu 1
- esagara 1
- agguser 1
- rclement 1
- dyllan-to-you 1
- justinallen 1
- jordaneremieff 1
- wdccdw 1
- wpears 1
- progpow 1
- DavidPratten 1
- ltrgoddard 1
- costrouc 1
- jratike80 1
- ment4list 1
- ccorcos 1
- choldgraf 1
- Olshansk 1
- qqilihq 1
- jdangerx 1
- fidiego 1
- OverkillGuy 1
- QAInsights 1
- secretGeek 1
- fkuhn 1
- jameslittle230 1
- Profpatsch 1
- dskrad 1
- kwladyka 1
- Carib0u 1
- fatihky 1
- phoenixjun 1
- JesperTreetop 1
- wenhoujx 1
- bapowell 1
- yairlenga 1
- chris48s 1
- ChristopherWilks 1
- Maltazar 1
- hueyy 1
- wuhland 1
- eric-burel 1
- foscoj 1
- dvot197007 1
- kokes 1
- RamiAwar 1
- csusanu 1
- rprimet 1
- metab0t 1
- spdkils 1
- sturzl 1
- jrdmb 1
- robmarkcole 1
- jfeiwell 1
- coisnepe 1
- chmaynard 1
- erlend-aasland 1
- amlestin 1
- tf13 1
- alecstein 1
- bendnorman 1
- noklam 1
- jakewilkins 1
- Thomascountz 1
- eigenfoo 1
- GmGniap 1
- rdtq 1
- AnkitKundariya 1
- LucasElArruda 1
- duarteocarmo 1
- sarcasticadmin 1
- yqlbu 1
- Rik-de-Kort 1
- patricktrainer 1
- xmichele 1
- RhetTbull 1
- miuku 1
- philipp-heinrich 1
- jimmybutton 1
- thewchan 1
- izzues 1
- thisismyfuckingusername 1
- kirajano 1
- J450n-4-W 1
- mlaparie 1
- Dhyanesh97 1
- knowledgecamp12 1
- McEazy2700 1
- cycle-data 1
id | html_url | issue_url | node_id | user ▼ | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
748436453 | https://github.com/dogsheep/twitter-to-sqlite/issues/53#issuecomment-748436453 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/53 | MDEyOklzc3VlQ29tbWVudDc0ODQzNjQ1Mw== | anotherjesse 27 | 2020-12-19T07:47:01Z | 2020-12-19T07:47:01Z | NONE | I think this should probably be closed as won't fix. Attempting to make a patch for this I realized that the since_id would limit to tweets posted since that since_id, not when it was favorited. So favoriting something in the older would be missed if you used Better to just use |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
--since support for favorites 771324837 | |
711083698 | https://github.com/dogsheep/healthkit-to-sqlite/issues/11#issuecomment-711083698 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/11 | MDEyOklzc3VlQ29tbWVudDcxMTA4MzY5OA== | jarib 572 | 2020-10-17T21:39:15Z | 2020-10-17T21:39:15Z | NONE | Nice! Works perfectly. Thanks for the quick response and great tooling in general. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
export.xml file name varies with different language settings 723838331 | |
810943882 | https://github.com/simonw/datasette/issues/526#issuecomment-810943882 | https://api.github.com/repos/simonw/datasette/issues/526 | MDEyOklzc3VlQ29tbWVudDgxMDk0Mzg4Mg== | jokull 701 | 2021-03-31T10:03:55Z | 2021-03-31T10:03:55Z | NONE | +1 on using nested queries to achieve this! Would be great as streaming CSV is an amazing feature. Some UX/DX details: I was expecting it to work to simply add
After a bit of testing back and forth I realized streaming only works for full tables. Would love this feature because I'm using |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Stream all results for arbitrary SQL and canned queries 459882902 | |
605439685 | https://github.com/dogsheep/github-to-sqlite/issues/15#issuecomment-605439685 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/15 | MDEyOklzc3VlQ29tbWVudDYwNTQzOTY4NQ== | garethr 2029 | 2020-03-28T12:17:01Z | 2020-03-28T12:17:01Z | NONE | That looks great, thanks! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Assets table with downloads 544571092 | |
622279374 | https://github.com/dogsheep/github-to-sqlite/issues/33#issuecomment-622279374 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/33 | MDEyOklzc3VlQ29tbWVudDYyMjI3OTM3NA== | garethr 2029 | 2020-05-01T07:12:47Z | 2020-05-01T07:12:47Z | NONE | I also go it working with:
|
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Fall back to authentication via ENV 609950090 | |
927312650 | https://github.com/dogsheep/twitter-to-sqlite/issues/54#issuecomment-927312650 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54 | IC_kwDODEm0Qs43RasK | danp 2182 | 2021-09-26T14:09:51Z | 2021-09-26T14:09:51Z | NONE | Similar trouble with ageinfo using 0.22. Here's what my ageinfo.js file looks like:
Commenting out the registration for ageinfo in archive.py gets my archive to import. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Archive import appears to be broken on recent exports 779088071 | |
1221521377 | https://github.com/dogsheep/pocket-to-sqlite/issues/11#issuecomment-1221521377 | https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11 | IC_kwDODLZ_YM5Izu_h | fernand0 2467 | 2022-08-21T10:51:37Z | 2022-08-21T10:51:37Z | NONE | I didn't see there is a PR about this: https://github.com/dogsheep/pocket-to-sqlite/pull/7 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
-a option is used for "--auth" and for "--all" 1345452427 | |
1844819002 | https://github.com/simonw/datasette/issues/2214#issuecomment-1844819002 | https://api.github.com/repos/simonw/datasette/issues/2214 | IC_kwDOBm6k_c5t9bQ6 | precipice 2874 | 2023-12-07T07:36:33Z | 2023-12-07T07:36:33Z | NONE | If I uncheck |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
CSV export fails for some `text` foreign key references 2029908157 | |
859940977 | https://github.com/simonw/sqlite-utils/issues/269#issuecomment-859940977 | https://api.github.com/repos/simonw/sqlite-utils/issues/269 | MDEyOklzc3VlQ29tbWVudDg1OTk0MDk3Nw== | frafra 4068 | 2021-06-11T22:33:08Z | 2021-06-11T22:33:08Z | NONE |
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
bool type not supported 919250621 | |
860031071 | https://github.com/simonw/sqlite-utils/issues/270#issuecomment-860031071 | https://api.github.com/repos/simonw/sqlite-utils/issues/270 | MDEyOklzc3VlQ29tbWVudDg2MDAzMTA3MQ== | frafra 4068 | 2021-06-12T10:00:24Z | 2021-06-12T10:00:24Z | NONE | Sure, I am sorry if my message hasn't been clear enough. I am also a new user :) At the beginning, I just call ``` sqlite-utils transform species.sqlite species --type criteria json Usage: sqlite-utils transform [OPTIONS] PATH TABLE Try 'sqlite-utils transform --help' for help. Error: Invalid value for '--type': 'json' is not one of 'INTEGER', 'TEXT', 'FLOAT', 'BLOB'. ``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Cannot set type JSON 919314806 | |
860031217 | https://github.com/simonw/sqlite-utils/issues/269#issuecomment-860031217 | https://api.github.com/repos/simonw/sqlite-utils/issues/269 | MDEyOklzc3VlQ29tbWVudDg2MDAzMTIxNw== | frafra 4068 | 2021-06-12T10:01:53Z | 2021-06-12T10:01:53Z | NONE |
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
bool type not supported 919250621 | |
860047794 | https://github.com/simonw/datasette/issues/1286#issuecomment-860047794 | https://api.github.com/repos/simonw/datasette/issues/1286 | MDEyOklzc3VlQ29tbWVudDg2MDA0Nzc5NA== | frafra 4068 | 2021-06-12T12:36:15Z | 2021-06-12T12:36:15Z | NONE | @mroswell That is a very nice solution. I wonder if custom classes, like |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Better default display of arrays of items 849220154 | |
860548546 | https://github.com/simonw/datasette/issues/1375#issuecomment-860548546 | https://api.github.com/repos/simonw/datasette/issues/1375 | MDEyOklzc3VlQ29tbWVudDg2MDU0ODU0Ng== | frafra 4068 | 2021-06-14T09:41:59Z | 2021-06-14T09:41:59Z | NONE |
Thanks :)
If a developer is not sure if the JSON fields are valid, but then retrieves and parse them, it should handle errors too. Handling inconsistent data is necessary due to the nature of SQLite. A global or dataset option to render the data as they have been defined (JSON, boolean, etc.) when requesting JSON could allow the user to download a regular JSON from the browser without having to rely on APIs. I would guess someone could just make a custom template with an extra JSON-parsed download button otherwise :) |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
JSON export dumps JSON fields as TEXT 919508498 | |
862574390 | https://github.com/simonw/sqlite-utils/issues/270#issuecomment-862574390 | https://api.github.com/repos/simonw/sqlite-utils/issues/270 | MDEyOklzc3VlQ29tbWVudDg2MjU3NDM5MA== | frafra 4068 | 2021-06-16T17:34:49Z | 2021-06-16T17:34:49Z | NONE | Sorry, I got confused because SQLite has a JSON column type, even if it is treated as TEXT, and I though automatic facets were available for JSON arrays stored as JSON only :) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Cannot set type JSON 919314806 | |
1139379923 | https://github.com/simonw/sqlite-utils/issues/438#issuecomment-1139379923 | https://api.github.com/repos/simonw/sqlite-utils/issues/438 | IC_kwDOCGYnMM5D6Y7T | frafra 4068 | 2022-05-27T08:05:01Z | 2022-05-27T08:05:01Z | NONE | I tried to debug it using |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
illegal UTF-16 surrogate 1250161887 | |
1139392769 | https://github.com/simonw/sqlite-utils/issues/438#issuecomment-1139392769 | https://api.github.com/repos/simonw/sqlite-utils/issues/438 | IC_kwDOCGYnMM5D6cEB | frafra 4068 | 2022-05-27T08:21:53Z | 2022-05-27T08:21:53Z | NONE | Argument were specified in the wrong order. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
illegal UTF-16 surrogate 1250161887 | |
1139426398 | https://github.com/simonw/sqlite-utils/issues/439#issuecomment-1139426398 | https://api.github.com/repos/simonw/sqlite-utils/issues/439 | IC_kwDOCGYnMM5D6kRe | frafra 4068 | 2022-05-27T09:04:05Z | 2022-05-27T10:44:54Z | NONE | This code works:
I used
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Misleading progress bar against utf-16-le CSV input 1250495688 | |
1139484453 | https://github.com/simonw/sqlite-utils/issues/433#issuecomment-1139484453 | https://api.github.com/repos/simonw/sqlite-utils/issues/433 | IC_kwDOCGYnMM5D6ycl | frafra 4068 | 2022-05-27T10:20:08Z | 2022-05-27T10:20:08Z | NONE | I can confirm. This only happens with sqlite-utils. I am using gnome-terminal with bash. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
CLI eats my cursor 1239034903 | |
1257290709 | https://github.com/simonw/datasette/issues/1818#issuecomment-1257290709 | https://api.github.com/repos/simonw/datasette/issues/1818 | IC_kwDOBm6k_c5K8LvV | nelsonjchen 5363 | 2022-09-25T22:17:06Z | 2022-09-25T22:17:06Z | NONE | I wonder if having an option for displaying the max row id might help too. Not accurate especially if something was deleted, but useful for DBs as a dump. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Setting to turn off table row counts entirely 1384549993 | |
1258738740 | https://github.com/simonw/datasette/issues/1818#issuecomment-1258738740 | https://api.github.com/repos/simonw/datasette/issues/1818 | IC_kwDOBm6k_c5LBtQ0 | nelsonjchen 5363 | 2022-09-26T22:52:45Z | 2022-09-26T22:55:57Z | NONE | thoughts on order of precedence to use:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Setting to turn off table row counts entirely 1384549993 | |
791053721 | https://github.com/dogsheep/dogsheep-photos/issues/32#issuecomment-791053721 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/32 | MDEyOklzc3VlQ29tbWVudDc5MTA1MzcyMQ== | dsisnero 6213 | 2021-03-05T00:31:27Z | 2021-03-05T00:31:27Z | NONE | I am getting the same thing for US West (N. California) us-west-1 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
KeyError: 'Contents' on running upload 803333769 | |
1499797384 | https://github.com/simonw/datasette/issues/2054#issuecomment-1499797384 | https://api.github.com/repos/simonw/datasette/issues/2054 | IC_kwDOBm6k_c5ZZReI | dsisnero 6213 | 2023-04-07T00:46:50Z | 2023-04-07T00:46:50Z | NONE | you should have a look at Roda written in ruby . |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Make detailed notes on how table, query and row views work right now 1657861026 | |
1125083348 | https://github.com/simonw/datasette/issues/1298#issuecomment-1125083348 | https://api.github.com/repos/simonw/datasette/issues/1298 | IC_kwDOBm6k_c5DD2jU | llimllib 7150 | 2022-05-12T14:43:51Z | 2022-05-12T14:43:51Z | NONE | user report: I found this issue because the first time I tried to use datasette for real, I displayed a large table, and thought there was no horizontal scroll bar at all. I didn't even consider that I had to scroll all the way to the end of the page to find it. Just chipping in to say that this confused me, and I didn't even find the scroll bar until after I saw this issue. I don't know what the right answer is, but IMO the UI should suggest to the user that there is a way to view the data that's hidden to the right. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
improve table horizontal scroll experience 855476501 | |
359697938 | https://github.com/simonw/datasette/issues/176#issuecomment-359697938 | https://api.github.com/repos/simonw/datasette/issues/176 | MDEyOklzc3VlQ29tbWVudDM1OTY5NzkzOA== | gijs 7193 | 2018-01-23T07:17:56Z | 2018-01-23T07:17:56Z | NONE | 👍 I'd like this too! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add GraphQL endpoint 285168503 | |
1074256603 | https://github.com/simonw/sqlite-utils/issues/417#issuecomment-1074256603 | https://api.github.com/repos/simonw/sqlite-utils/issues/417 | IC_kwDOCGYnMM5AB9rb | blaine 9954 | 2022-03-21T18:19:41Z | 2022-03-21T18:19:41Z | NONE | That makes sense; just a little hint that points folks towards doing the right thing might be helpful! fwiw, the reason I was using jq in the first place was just a quick way to extract one attribute from an actual JSON array. When I initially imported it, I got a table with a bunch of embedded JSON values, rather than a native table, because each array entry had two attributes, one with the data I actually wanted. Not sure how common a use-case this is, though (and easily fixed, aside from the jq weirdness!) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
insert fails on JSONL with whitespace 1175744654 | |
1239516561 | https://github.com/dogsheep/pocket-to-sqlite/issues/10#issuecomment-1239516561 | https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/10 | IC_kwDODLZ_YM5J4YWR | ashanan 11887 | 2022-09-07T15:07:38Z | 2022-09-07T15:07:38Z | NONE | Thanks! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
When running `auth` command, don't overwrite an existing auth.json file 1246826792 | |
925300720 | https://github.com/simonw/sqlite-utils/issues/328#issuecomment-925300720 | https://api.github.com/repos/simonw/sqlite-utils/issues/328 | IC_kwDOCGYnMM43Jvfw | gravis 12752 | 2021-09-22T20:21:33Z | 2021-09-22T20:21:33Z | NONE | Wow, that was fast! Thank you! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Invalid JSON output when no rows 1004613267 | |
617208503 | https://github.com/simonw/datasette/issues/176#issuecomment-617208503 | https://api.github.com/repos/simonw/datasette/issues/176 | MDEyOklzc3VlQ29tbWVudDYxNzIwODUwMw== | nkirsch 12976 | 2020-04-21T14:16:24Z | 2020-04-21T14:16:24Z | NONE | @eads I'm interested in helping, if there's still a need... |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add GraphQL endpoint 285168503 | |
1229449018 | https://github.com/simonw/sqlite-utils/issues/474#issuecomment-1229449018 | https://api.github.com/repos/simonw/sqlite-utils/issues/474 | IC_kwDOCGYnMM5JR-c6 | hubgit 14294 | 2022-08-28T12:40:13Z | 2022-08-28T12:40:13Z | NONE | Creating the table before inserting is a useful workaround, thanks. It does require figuring out the I was expecting to find an option like |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add an option for specifying column names when inserting CSV data 1353074021 | |
1236200834 | https://github.com/simonw/sqlite-utils/issues/239#issuecomment-1236200834 | https://api.github.com/repos/simonw/sqlite-utils/issues/239 | IC_kwDOCGYnMM5Jru2C | hubgit 14294 | 2022-09-03T21:26:32Z | 2022-09-03T21:26:32Z | NONE | I was looking for something like this today, for extracting columns containing objects (and arrays of objects) into separate tables. Would it make sense (especially for the fields containing arrays of objects) to create a one-to-many relationship, where each row of the newly created table would contain the id of the row that originally contained it? If the extracted objects have a unique id and are repeated, it could even create a many-to-many relationship, with a third table for the joins. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
sqlite-utils extract could handle nested objects 816526538 | |
571412923 | https://github.com/dogsheep/github-to-sqlite/issues/16#issuecomment-571412923 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/16 | MDEyOklzc3VlQ29tbWVudDU3MTQxMjkyMw== | jayvdb 15092 | 2020-01-07T03:06:46Z | 2020-01-07T03:06:46Z | NONE | I re-tried after doing |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Exception running first command: IndexError: list index out of range 546051181 | |
602136481 | https://github.com/dogsheep/github-to-sqlite/issues/16#issuecomment-602136481 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/16 | MDEyOklzc3VlQ29tbWVudDYwMjEzNjQ4MQ== | jayvdb 15092 | 2020-03-22T02:08:57Z | 2020-03-22T02:08:57Z | NONE | I'd love to be using your library as a better cached gh layer for a new library I have built, replacing large parts of the very ugly https://github.com/jayvdb/pypidb/blob/master/pypidb/_github.py , and then probably being able to rebuild the setuppy chunk as a feature here at a later stage. I would also need tokenless and netrc support, but I would be happy to add those bits. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Exception running first command: IndexError: list index out of range 546051181 | |
974607456 | https://github.com/simonw/datasette/issues/1522#issuecomment-974607456 | https://api.github.com/repos/simonw/datasette/issues/1522 | IC_kwDOBm6k_c46F1Rg | mrchrisadams 17906 | 2021-11-20T07:10:11Z | 2021-11-20T07:10:11Z | NONE | As a a sanity check, would it be worth looking at trying to push the multi-process container on another provider of a knative / cloud run / tekton ? I have a somewhat similar use case for a future proejct, so i'm been very grateful to you sharing all the progress in this issue. As I understand it, Scaleway also offer a very similar offering using what appear to be many similar components that might at least see if it's an issue with more than one knative based FaaS provider https://www.scaleway.com/en/serverless-containers/ https://developers.scaleway.com/en/products/containers/api/#main-features |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Deploy a live instance of demos/apache-proxy 1058896236 | |
906015471 | https://github.com/dogsheep/dogsheep-photos/issues/7#issuecomment-906015471 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/7 | IC_kwDOD079W842ALLv | dkam 18232 | 2021-08-26T02:01:01Z | 2021-08-26T02:01:01Z | NONE | Perceptual hashes might be what you're after : http://phash.org |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Integrate image content hashing 602585497 | |
1035717429 | https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-1035717429 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31 | IC_kwDOD079W849u8s1 | harperreed 18504 | 2022-02-11T01:55:38Z | 2022-02-11T01:55:38Z | NONE | I would love this merged! |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update for Big Sur 771511344 | |
1687433388 | https://github.com/simonw/datasette/issues/2147#issuecomment-1687433388 | https://api.github.com/repos/simonw/datasette/issues/2147 | IC_kwDOBm6k_c5klDCs | jackowayed 18899 | 2023-08-22T05:05:33Z | 2023-08-22T05:05:33Z | NONE | Thanks for all this! You're totally right that the ASGI option is doable, if a bit low level and coupled to the current URI design. I'm totally fine with that being the final answer. process_view is interesting and in the general direction of what I had in mind. A somewhat less powerful idea: Is there value in giving a hook for just the query that's about to be run? Maybe I'm thinking a little narrowly about this problem I decided I wanted to solve, but I could see other uses for a hook of the sketch below:
Maybe it's too narrowly useful and some of the other pieces of datasette obviate some of these ideas, but off the cuff I could imagine using it to: * Require a LIMIT. Either fail the query or add the limit if it's not there. * Do logging, like my usecase. * Do other analysis on whether you want to allow the query to run; a linter? query complexity? Definitely feel free to say no, or not now. This is all me just playing around with what datasette and its plugin architecture can do with toy ideas, so don't let me push you to commit to a hook you don't feel confident fits well in the design. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for database queries that are run 1858228057 | |
1690955706 | https://github.com/simonw/datasette/issues/2147#issuecomment-1690955706 | https://api.github.com/repos/simonw/datasette/issues/2147 | IC_kwDOBm6k_c5kye-6 | jackowayed 18899 | 2023-08-24T03:54:35Z | 2023-08-24T03:54:35Z | NONE | That's fair. The best idea I can think of is that if a plugin wanted to limit intensive queries, it could add LIMITs or something. A hook that gives you visibility of queries and maybe the option to reject felt a little more limited than the existing plugin hooks, so I was trying to think of what else one might want to do while looking at to-be-run queries. But without a real motivating example, I see why you don't want to add that. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for database queries that are run 1858228057 | |
1141711418 | https://github.com/simonw/sqlite-utils/issues/26#issuecomment-1141711418 | https://api.github.com/repos/simonw/sqlite-utils/issues/26 | IC_kwDOCGYnMM5EDSI6 | nileshtrivedi 19304 | 2022-05-31T06:21:15Z | 2022-05-31T06:21:15Z | NONE | I ran into this. My use case has a JSON file with array of I think the right way to declare the relationship while inserting a JSON might be to describe the relationship:
This is relying on the assumption that foreign keys can point to |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for turning nested JSON into foreign keys / many-to-many 455486286 | |
348252037 | https://github.com/simonw/datasette/issues/153#issuecomment-348252037 | https://api.github.com/repos/simonw/datasette/issues/153 | MDEyOklzc3VlQ29tbWVudDM0ODI1MjAzNw== | ftrain 20264 | 2017-11-30T16:59:00Z | 2017-11-30T16:59:00Z | NONE | WOW! -- Paul Ford // (646) 369-7128 // @ftrain On Thu, Nov 30, 2017 at 11:47 AM, Simon Willison notifications@github.com wrote:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ability to customize presentation of specific columns in HTML view 276842536 | |
524300388 | https://github.com/simonw/sqlite-utils/issues/54#issuecomment-524300388 | https://api.github.com/repos/simonw/sqlite-utils/issues/54 | MDEyOklzc3VlQ29tbWVudDUyNDMwMDM4OA== | ftrain 20264 | 2019-08-23T12:41:09Z | 2019-08-23T12:41:09Z | NONE | Extremely cool and easy to understand. Thank you! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ability to list views, and to access db["view_name"].rows / rows_where / etc 480961330 | |
1669877769 | https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-1669877769 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31 | IC_kwDOD079W85jiFAJ | chrismytton 22996 | 2023-08-08T15:52:52Z | 2023-08-08T15:52:52Z | NONE | You can also install this with pip using this oneliner:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update for Big Sur 771511344 | |
1847317568 | https://github.com/dogsheep/github-to-sqlite/issues/79#issuecomment-1847317568 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/79 | IC_kwDODFdgUs5uG9RA | nedbat 23789 | 2023-12-08T14:50:13Z | 2023-12-08T14:50:13Z | NONE | Adding |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Deploy demo job is failing due to rate limit 1570375808 | |
712855389 | https://github.com/simonw/datasette/issues/991#issuecomment-712855389 | https://api.github.com/repos/simonw/datasette/issues/991 | MDEyOklzc3VlQ29tbWVudDcxMjg1NTM4OQ== | furilo 24740 | 2020-10-20T13:36:41Z | 2020-10-20T13:36:41Z | NONE | Here is one quick sketch (done in Figma :P) for an idea: a possible filter to switch between showing all tables from all databases, or grouping tables by database. (the switch is interactive) When only 1 database: https://www.figma.com/proto/BjFrMroEtmVx6EeRjvSrox/Datasette-test?node-id=1%3A162&viewport=536%2C348%2C0.5&scaling=min-zoom Is this is useful, I can send some more suggestions/sketches. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Redesign application homepage 714377268 | |
791089881 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-791089881 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDc5MTA4OTg4MQ== | maxhawkins 28565 | 2021-03-05T02:03:19Z | 2021-03-05T02:03:19Z | NONE | I just tried to run this on a small VPS instance with 2GB of memory and it crashed out of memory while processing a 12GB mbox from Takeout. Is it possible to stream the emails to sqlite instead of loading it all into memory and upserting at once? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
849708617 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-849708617 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDg0OTcwODYxNw== | maxhawkins 28565 | 2021-05-27T15:01:42Z | 2021-05-27T15:01:42Z | NONE | Any updates? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
884672647 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-884672647 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | IC_kwDODFE5qs40uwiH | maxhawkins 28565 | 2021-07-22T05:56:31Z | 2021-07-22T14:03:08Z | NONE | How does this commit look? https://github.com/maxhawkins/google-takeout-to-sqlite/commit/72802a83fee282eb5d02d388567731ba4301050d It seems that Takeout's mbox format is pretty simple, so we can get away with just splitting the file on lines beginning with I was able to load a 12GB takeout mbox without the program using more than a couple hundred MB of memory during the import process. It does make us lose the progress bar, but maybe I can add that back in a later commit. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
885022230 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-885022230 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | IC_kwDODFE5qs40wF4W | maxhawkins 28565 | 2021-07-22T15:51:46Z | 2021-07-22T15:51:46Z | NONE | One thing I noticed is this importer doesn't save attachments along with the body of the emails. It would be nice if those got stored as blobs in a separate attachments table so attachments can be included while fetching search results. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
885094284 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-885094284 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | IC_kwDODFE5qs40wXeM | maxhawkins 28565 | 2021-07-22T17:41:32Z | 2021-07-22T17:41:32Z | NONE | I added a follow-up commit that deals with emails that don't have a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
888075098 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-888075098 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | IC_kwDODFE5qs407vNa | maxhawkins 28565 | 2021-07-28T07:18:56Z | 2021-07-28T07:18:56Z | NONE |
I did some investigation into this issue and made a fix here. The problem was that some messages (like gchat logs) don't have a @simonw While looking into this I found something unexpected about how sqlite_utils handles upserts if the pkey column is |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
894581223 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8#issuecomment-894581223 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8 | IC_kwDODFE5qs41Ujnn | maxhawkins 28565 | 2021-08-07T00:57:48Z | 2021-08-07T00:57:48Z | NONE | Just added two more fixes:
I was able to run this on my Takeout export and everything seems to work fine. @simonw let me know if this looks good to merge. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add Gmail takeout mbox import (v2) 954546309 | |
896378525 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8#issuecomment-896378525 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8 | IC_kwDODFE5qs41baad | maxhawkins 28565 | 2021-08-10T23:28:45Z | 2021-08-10T23:28:45Z | NONE | I added parsing of text/html emails using BeautifulSoup. Around half of the emails in my archive don't include a text/plain payload so adding html parsing makes a good chunk of them searchable. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add Gmail takeout mbox import (v2) 954546309 | |
1003437288 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8#issuecomment-1003437288 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8 | IC_kwDODFE5qs47zzzo | maxhawkins 28565 | 2021-12-31T19:06:20Z | 2021-12-31T19:06:20Z | NONE |
Shouldn't be hard. The easiest way is probably to remove the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add Gmail takeout mbox import (v2) 954546309 | |
1710380941 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8#issuecomment-1710380941 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8 | IC_kwDODFE5qs5l8leN | maxhawkins 28565 | 2023-09-07T15:39:59Z | 2023-09-07T15:39:59Z | NONE |
Mailbox parses the entire mbox into memory. Using the lower-level library lets us stream the emails one at a time to support larger archives. Both libraries are in the stdlib. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add Gmail takeout mbox import (v2) 954546309 | |
620401172 | https://github.com/simonw/datasette/issues/736#issuecomment-620401172 | https://api.github.com/repos/simonw/datasette/issues/736 | MDEyOklzc3VlQ29tbWVudDYyMDQwMTE3Mg== | aborruso 30607 | 2020-04-28T06:09:28Z | 2020-04-28T06:09:28Z | NONE |
It works in Heroku, so it might be a bug with datasette-publish-now. Thank you |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
strange behavior using accented characters 606720674 | |
620401443 | https://github.com/simonw/datasette/issues/735#issuecomment-620401443 | https://api.github.com/repos/simonw/datasette/issues/735 | MDEyOklzc3VlQ29tbWVudDYyMDQwMTQ0Mw== | aborruso 30607 | 2020-04-28T06:10:20Z | 2020-04-28T06:10:20Z | NONE | It works in Heroku, so it might be a bug with datasette-publish-now. Thank you |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Error when I click on "View and edit SQL" 605806386 | |
621008152 | https://github.com/simonw/datasette/issues/744#issuecomment-621008152 | https://api.github.com/repos/simonw/datasette/issues/744 | MDEyOklzc3VlQ29tbWVudDYyMTAwODE1Mg== | aborruso 30607 | 2020-04-29T06:05:02Z | 2020-04-29T06:05:02Z | NONE | Hi @simonw, I have installed it and I get the errors below.
No, the /tmp folder is in the same volume. Thank you

```
Traceback (most recent call last):
  File "/home/aborruso/.local/lib/python3.7/site-packages/datasette/utils/__init__.py", line 607, in link_or_copy_directory
    shutil.copytree(src, dst, copy_function=os.link)
  File "/usr/lib/python3.7/shutil.py", line 365, in copytree
    raise Error(errors)
shutil.Error: [('/var/youtubeComunePalermo/processing/./template/base.html', '/tmp/tmpcqv_1i5d/templates/base.html', "[Errno 18] Invalid cross-device link: '/var/youtubeComunePalermo/processing/./template/base.html' -> '/tmp/tmpcqv_1i5d/templates/base.html'"), ('/var/youtubeComunePalermo/processing/./template/index.html', '/tmp/tmpcqv_1i5d/templates/index.html', "[Errno 18] Invalid cross-device link: '/var/youtubeComunePalermo/processing/./template/index.html' -> '/tmp/tmpcqv_1i5d/templates/index.html'")]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/aborruso/.local/bin/datasette", line 8, in <module>
    sys.exit(cli())
  File "/home/aborruso/.local/lib/python3.7/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/aborruso/.local/lib/python3.7/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/aborruso/.local/lib/python3.7/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/aborruso/.local/lib/python3.7/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/aborruso/.local/lib/python3.7/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/aborruso/.local/lib/python3.7/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/aborruso/.local/lib/python3.7/site-packages/datasette/publish/heroku.py", line 103, in heroku
    extra_metadata,
  File "/usr/lib/python3.7/contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "/home/aborruso/.local/lib/python3.7/site-packages/datasette/publish/heroku.py", line 191, in temporary_heroku_directory
    os.path.join(tmp.name, "templates"),
  File "/home/aborruso/.local/lib/python3.7/site-packages/datasette/utils/__init__.py", line 609, in link_or_copy_directory
    shutil.copytree(src, dst)
  File "/usr/lib/python3.7/shutil.py", line 321, in copytree
    os.makedirs(dst)
  File "/usr/lib/python3.7/os.py", line 221, in makedirs
    mkdir(name, mode)
FileExistsError: [Errno 17] File exists: '/tmp/tmpcqv_1i5d/templates'
``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
link_or_copy_directory() error - Invalid cross-device link 608058890 | |
621011554 | https://github.com/simonw/datasette/issues/744#issuecomment-621011554 | https://api.github.com/repos/simonw/datasette/issues/744 | MDEyOklzc3VlQ29tbWVudDYyMTAxMTU1NA== | aborruso 30607 | 2020-04-29T06:17:26Z | 2020-04-29T06:17:26Z | NONE | A stupid note: I have no It seems to me that it does not create any |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
link_or_copy_directory() error - Invalid cross-device link 608058890 | |
621030783 | https://github.com/simonw/datasette/issues/744#issuecomment-621030783 | https://api.github.com/repos/simonw/datasette/issues/744 | MDEyOklzc3VlQ29tbWVudDYyMTAzMDc4Mw== | aborruso 30607 | 2020-04-29T07:16:27Z | 2020-04-29T07:16:27Z | NONE | Hi @simonw, it's Debian running as Windows Subsystem for Linux
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
link_or_copy_directory() error - Invalid cross-device link 608058890 | |
625060561 | https://github.com/simonw/datasette/issues/744#issuecomment-625060561 | https://api.github.com/repos/simonw/datasette/issues/744 | MDEyOklzc3VlQ29tbWVudDYyNTA2MDU2MQ== | aborruso 30607 | 2020-05-07T06:38:24Z | 2020-05-07T06:38:24Z | NONE | Hi @simonw, probably I could try to do it in Python for Windows, though I don't like doing these things in a Windows environment. The WSL Linux env (in which I do a lot of great things) is probably not an environment that Datasette will be tested against. On plain Windows I shouldn't have any problems. Am I right? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
link_or_copy_directory() error - Invalid cross-device link 608058890 | |
625066073 | https://github.com/simonw/datasette/issues/744#issuecomment-625066073 | https://api.github.com/repos/simonw/datasette/issues/744 | MDEyOklzc3VlQ29tbWVudDYyNTA2NjA3Mw== | aborruso 30607 | 2020-05-07T06:53:09Z | 2020-05-07T06:53:09Z | NONE | @simonw another error starting from Windows. I run
And I have
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
link_or_copy_directory() error - Invalid cross-device link 608058890 | |
625083715 | https://github.com/simonw/datasette/issues/744#issuecomment-625083715 | https://api.github.com/repos/simonw/datasette/issues/744 | MDEyOklzc3VlQ29tbWVudDYyNTA4MzcxNQ== | aborruso 30607 | 2020-05-07T07:34:18Z | 2020-05-07T07:34:18Z | NONE | I'm not very strong in Windows; I use Debian (inside WSL). However, these are the possible steps:
It's a very basic Python env that I do not normally use, this time only to reach my goal: trying to publish using a custom template |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
link_or_copy_directory() error - Invalid cross-device link 608058890 | |
625091976 | https://github.com/simonw/datasette/issues/744#issuecomment-625091976 | https://api.github.com/repos/simonw/datasette/issues/744 | MDEyOklzc3VlQ29tbWVudDYyNTA5MTk3Ng== | aborruso 30607 | 2020-05-07T07:51:25Z | 2020-05-07T07:51:25Z | NONE | I have installed Then I have removed from
And now I have

```
Traceback (most recent call last):
  File "C:\Users\aborr\AppData\Roaming\Python\Python37\site-packages\datasette\publish\heroku.py", line 210, in temporary_heroku_directory
    yield
  File "C:\Users\aborr\AppData\Roaming\Python\Python37\site-packages\datasette\publish\heroku.py", line 96, in heroku
    list_output = check_output(["heroku", "apps:list", "--json"]).decode(
  File "c:\python37\lib\subprocess.py", line 395, in check_output
    **kwargs).stdout
  File "c:\python37\lib\subprocess.py", line 472, in run
    with Popen(*popenargs, **kwargs) as process:
  File "c:\python37\lib\subprocess.py", line 775, in __init__
    restore_signals, start_new_session)
  File "c:\python37\lib\subprocess.py", line 1178, in _execute_child
    startupinfo)
FileNotFoundError: [WinError 2] The specified file could not be found

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\python37\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\python37\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\aborr\AppData\Roaming\Python\Python37\Scripts\datasette.exe\__main__.py", line 9, in <module>
  File "C:\Users\aborr\AppData\Roaming\Python\Python37\site-packages\click\core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "C:\Users\aborr\AppData\Roaming\Python\Python37\site-packages\click\core.py", line 782, in main
    rv = self.invoke(ctx)
  File "C:\Users\aborr\AppData\Roaming\Python\Python37\site-packages\click\core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "C:\Users\aborr\AppData\Roaming\Python\Python37\site-packages\click\core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "C:\Users\aborr\AppData\Roaming\Python\Python37\site-packages\click\core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "C:\Users\aborr\AppData\Roaming\Python\Python37\site-packages\click\core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "C:\Users\aborr\AppData\Roaming\Python\Python37\site-packages\datasette\publish\heroku.py", line 120, in heroku
    call(["heroku", "builds:create", "-a", app_name, "--include-vcs-ignore"])
  File "c:\python37\lib\contextlib.py", line 130, in __exit__
    self.gen.throw(type, value, traceback)
  File "C:\Users\aborr\AppData\Roaming\Python\Python37\site-packages\datasette\publish\heroku.py", line 213, in temporary_heroku_directory
    tmp.cleanup()
  File "c:\python37\lib\tempfile.py", line 809, in cleanup
    _shutil.rmtree(self.name)
  File "c:\python37\lib\shutil.py", line 513, in rmtree
    return _rmtree_unsafe(path, onerror)
  File "c:\python37\lib\shutil.py", line 401, in _rmtree_unsafe
    onerror(os.rmdir, path, sys.exc_info())
  File "c:\python37\lib\shutil.py", line 399, in _rmtree_unsafe
    os.rmdir(path)
PermissionError: [WinError 32] Unable to access file. The file is being used by another process: 'C:\Users\aborr\AppData\Local\Temp\tmpkcxy8i_q'
``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
link_or_copy_directory() error - Invalid cross-device link 608058890 | |
632249565 | https://github.com/simonw/datasette/issues/744#issuecomment-632249565 | https://api.github.com/repos/simonw/datasette/issues/744 | MDEyOklzc3VlQ29tbWVudDYzMjI0OTU2NQ== | aborruso 30607 | 2020-05-21T17:47:40Z | 2020-05-21T17:47:40Z | NONE | @simonw can I test it now? What must I do to update it? Thank you |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
link_or_copy_directory() error - Invalid cross-device link 608058890 | |
632255088 | https://github.com/simonw/datasette/issues/744#issuecomment-632255088 | https://api.github.com/repos/simonw/datasette/issues/744 | MDEyOklzc3VlQ29tbWVudDYzMjI1NTA4OA== | aborruso 30607 | 2020-05-21T17:58:51Z | 2020-05-21T17:58:51Z | NONE | Thank you very much!! I will try it and write back here |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
link_or_copy_directory() error - Invalid cross-device link 608058890 | |
632305868 | https://github.com/simonw/datasette/issues/744#issuecomment-632305868 | https://api.github.com/repos/simonw/datasette/issues/744 | MDEyOklzc3VlQ29tbWVudDYzMjMwNTg2OA== | aborruso 30607 | 2020-05-21T19:43:23Z | 2020-05-21T19:43:23Z | NONE | @simonw now I have
Must I open a new issue? Thank you |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
link_or_copy_directory() error - Invalid cross-device link 608058890 | |
634283355 | https://github.com/simonw/datasette/issues/744#issuecomment-634283355 | https://api.github.com/repos/simonw/datasette/issues/744 | MDEyOklzc3VlQ29tbWVudDYzNDI4MzM1NQ== | aborruso 30607 | 2020-05-26T21:15:34Z | 2020-05-26T21:15:34Z | NONE |
Thank you very much |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
link_or_copy_directory() error - Invalid cross-device link 608058890 | |
634446887 | https://github.com/simonw/datasette/issues/744#issuecomment-634446887 | https://api.github.com/repos/simonw/datasette/issues/744 | MDEyOklzc3VlQ29tbWVudDYzNDQ0Njg4Nw== | aborruso 30607 | 2020-05-27T06:01:28Z | 2020-05-27T06:01:28Z | NONE | Dear @simonw thank you for your time, now IT WORKS!!! I hope that this edit to datasette code is not for an exceptional case (my PC configuration) and that it will be useful to other users. Thank you again!! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
link_or_copy_directory() error - Invalid cross-device link 608058890 | |
635386935 | https://github.com/simonw/datasette/issues/744#issuecomment-635386935 | https://api.github.com/repos/simonw/datasette/issues/744 | MDEyOklzc3VlQ29tbWVudDYzNTM4NjkzNQ== | aborruso 30607 | 2020-05-28T14:32:53Z | 2020-05-28T14:32:53Z | NONE | Wow, I'm in some way very proud! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
link_or_copy_directory() error - Invalid cross-device link 608058890 | |
710768396 | https://github.com/simonw/sqlite-utils/issues/69#issuecomment-710768396 | https://api.github.com/repos/simonw/sqlite-utils/issues/69 | MDEyOklzc3VlQ29tbWVudDcxMDc2ODM5Ng== | aborruso 30607 | 2020-10-17T07:46:59Z | 2020-10-17T07:46:59Z | NONE | Great @simonw thank you very much |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Feature request: enable extensions loading 534507142 | |
710778368 | https://github.com/simonw/sqlite-utils/issues/188#issuecomment-710778368 | https://api.github.com/repos/simonw/sqlite-utils/issues/188 | MDEyOklzc3VlQ29tbWVudDcxMDc3ODM2OA== | aborruso 30607 | 2020-10-17T08:52:58Z | 2020-10-17T08:52:58Z | NONE | I have asked a stupid question. If I run
I have Thank you for this great tool |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
About loading spatialite 723708310 | |
778008752 | https://github.com/simonw/datasette/issues/1220#issuecomment-778008752 | https://api.github.com/repos/simonw/datasette/issues/1220 | MDEyOklzc3VlQ29tbWVudDc3ODAwODc1Mg== | aborruso 30607 | 2021-02-12T06:37:34Z | 2021-02-12T06:37:34Z | NONE | I have used my path; I'm running it from the folder in which I have the db. Must I use an absolute path? Must I create exactly that folder? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Installing datasette via docker: Path 'fixtures.db' does not exist 806743116 | |
778467759 | https://github.com/simonw/datasette/issues/1220#issuecomment-778467759 | https://api.github.com/repos/simonw/datasette/issues/1220 | MDEyOklzc3VlQ29tbWVudDc3ODQ2Nzc1OQ== | aborruso 30607 | 2021-02-12T21:35:17Z | 2021-02-12T21:35:17Z | NONE | Thank you |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Installing datasette via docker: Path 'fixtures.db' does not exist 806743116 | |
1279924827 | https://github.com/simonw/datasette/issues/1845#issuecomment-1279924827 | https://api.github.com/repos/simonw/datasette/issues/1845 | IC_kwDOBm6k_c5MShpb | kindly 30636 | 2022-10-16T08:54:53Z | 2022-10-16T08:54:53Z | NONE |
This would be great. My organization deals with very nested JSON open data and I have been wanting to find a way to hook into datasette so that the analysts do not have to convert to SQLite first. This can kind of be done with datasette-lite. From this random nested JSON API: https://api.nobelprize.org/v1/prize.json You can use the API of https://flatterer.herokuapp.com to return a multi-table sqlite database: This is great and fun, but it would be great if there were some plugin mechanism through which you could feed a local datasette a nested JSON file directly, possibly hooking into other flattening tools for this. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Reconsider the Datasette first-run experience 1410305897 | |
782745199 | https://github.com/simonw/datasette/issues/782#issuecomment-782745199 | https://api.github.com/repos/simonw/datasette/issues/782 | MDEyOklzc3VlQ29tbWVudDc4Mjc0NTE5OQ== | frankieroberto 30665 | 2021-02-20T20:32:03Z | 2021-02-20T20:32:03Z | NONE | I think it’s a good idea if the top level item of the response JSON is always an object, rather than an array, at least as the default. Mainly because it allows you to add extra keys in a backwards-compatible way. Also just seems more expected somehow. The API design guidance for the UK government also recommends this: https://www.gov.uk/guidance/gds-api-technical-and-data-standards#use-json I also strongly dislike having versioned APIs (eg with a |
{ "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 } |
Redesign default .json format 627794879 | |
782746755 | https://github.com/simonw/datasette/issues/782#issuecomment-782746755 | https://api.github.com/repos/simonw/datasette/issues/782 | MDEyOklzc3VlQ29tbWVudDc4Mjc0Njc1NQ== | frankieroberto 30665 | 2021-02-20T20:44:05Z | 2021-02-20T20:44:05Z | NONE | Minor suggestion: rename I like the idea of specifying a limit of 0 if you don’t want any rows data - and returning an empty array under the Have you given any thought as to whether to pretty print (format with spaces) the output or not? Can be useful for debugging/exploring in a browser or other basic tools which don’t parse the JSON. Could be default (can’t be much bigger with gzip?) or opt-in. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Redesign default .json format 627794879 | |
783265830 | https://github.com/simonw/datasette/issues/782#issuecomment-783265830 | https://api.github.com/repos/simonw/datasette/issues/782 | MDEyOklzc3VlQ29tbWVudDc4MzI2NTgzMA== | frankieroberto 30665 | 2021-02-22T10:21:14Z | 2021-02-22T10:21:14Z | NONE | @simonw:
Interesting! Although I don't think it matters too much what the underlying implementation is - I more meant that |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Redesign default .json format 627794879 | |
951731255 | https://github.com/simonw/datasette/pull/1204#issuecomment-951731255 | https://api.github.com/repos/simonw/datasette/issues/1204 | IC_kwDOBm6k_c44ukQ3 | 20after4 30934 | 2021-10-26T09:01:28Z | 2021-10-26T09:01:28Z | NONE |
Why not return a data structure instead of just a template name? I've already done some custom hacking to modify datasette but the plugin mechanism you are building here would be much cleaner than what I've built. I'd be happy to help with testing this PR and fleshing it out further if you are still considering merging this. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Plugin includes 793002853 | |
951740637 | https://github.com/simonw/datasette/issues/878#issuecomment-951740637 | https://api.github.com/repos/simonw/datasette/issues/878 | IC_kwDOBm6k_c44umjd | 20after4 30934 | 2021-10-26T09:12:15Z | 2021-10-26T09:12:15Z | NONE | This sounds really ambitious but also really awesome. I like the idea that basically any piece of a page could be selectively replaced. It sort of sounds like a python asyncio version of https://github.com/observablehq/runtime |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
New pattern for views that return either JSON or HTML, available for plugins 648435885 | |
981966693 | https://github.com/simonw/datasette/issues/1532#issuecomment-981966693 | https://api.github.com/repos/simonw/datasette/issues/1532 | IC_kwDOBm6k_c46h59l | 20after4 30934 | 2021-11-29T19:56:52Z | 2021-11-29T19:56:52Z | NONE | FWIW I've written some web components that consume the json api and I think it's a really nice way to work with datasette. I like the combination with datasette+sqlite as a back-end feeding data to a front-end that's entirely javascript + html. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Use datasette-table Web Component to guide the design of the JSON API for 1.0 1065429936 | |
981980048 | https://github.com/simonw/datasette/issues/1304#issuecomment-981980048 | https://api.github.com/repos/simonw/datasette/issues/1304 | IC_kwDOBm6k_c46h9OQ | 20after4 30934 | 2021-11-29T20:13:53Z | 2021-11-29T20:14:11Z | NONE | There isn't any way to do this with sqlite as far as I know. The only option is to insert the right number of ? placeholders into the sql template and then provide an array of values. |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Document how to send multiple values for "Named parameters" 863884805 | |
982745406 | https://github.com/simonw/datasette/issues/1532#issuecomment-982745406 | https://api.github.com/repos/simonw/datasette/issues/1532 | IC_kwDOBm6k_c46k4E- | 20after4 30934 | 2021-11-30T15:28:57Z | 2021-11-30T15:28:57Z | NONE | It's a really great API and the documentation is really great too. Honestly, in more than 20 years of professional experience, I haven't worked with any software API that was more of a joy to use. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Use datasette-table Web Component to guide the design of the JSON API for 1.0 1065429936 | |
988461884 | https://github.com/simonw/datasette/issues/1304#issuecomment-988461884 | https://api.github.com/repos/simonw/datasette/issues/1304 | IC_kwDOBm6k_c466rs8 | 20after4 30934 | 2021-12-08T03:20:26Z | 2021-12-08T03:20:26Z | NONE | The easiest or most straightforward thing to do is to use named parameters like:
And simply construct the list of placeholders dynamically based on the number of values. Doing this is possible with datasette if you forgo "canned queries" and just use the raw query endpoint and pass the query sql, along with p1, p2 ... in the request. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Document how to send multiple values for "Named parameters" 863884805 | |
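The dynamic-placeholder technique described in these two comments can be sketched with the stdlib `sqlite3` module. Function and table names are illustrative; note that only values can be bound as parameters, so the table and column identifiers must come from a trusted allowlist, never from user input:

```python
import sqlite3


def select_in(db, table, column, values):
    # SQLite has no array parameter type, so build one named placeholder
    # per value (:p0, :p1, ...) and bind the values as a dict.
    params = {f"p{i}": value for i, value in enumerate(values)}
    placeholders = ", ".join(f":{name}" for name in params)
    sql = f"SELECT * FROM [{table}] WHERE [{column}] IN ({placeholders})"
    return db.execute(sql, params).fetchall()


db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE t (id INTEGER, name TEXT)")
db.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "b"), (3, "c")])
rows = select_in(db, "t", "id", [1, 3])
```

Against Datasette's raw query endpoint the same idea means sending the generated SQL plus `p0`, `p1`, ... as query-string parameters, as the comment describes.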
988463455 | https://github.com/simonw/datasette/issues/1304#issuecomment-988463455 | https://api.github.com/repos/simonw/datasette/issues/1304 | IC_kwDOBm6k_c466sFf | 20after4 30934 | 2021-12-08T03:23:14Z | 2021-12-08T03:23:14Z | NONE | I actually think it would be a useful thing to add support for in datasette. It wouldn't be difficult to unwind an array of params and add the placeholders automatically. |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Document how to send multiple values for "Named parameters" 863884805 | |
988468238 | https://github.com/simonw/datasette/issues/1528#issuecomment-988468238 | https://api.github.com/repos/simonw/datasette/issues/1528 | IC_kwDOBm6k_c466tQO | 20after4 30934 | 2021-12-08T03:35:45Z | 2021-12-08T03:35:45Z | NONE | FWIW I implemented something similar with a bit of plugin code:

```python
@hookimpl
def canned_queries(datasette: Datasette, database: str) -> Mapping[str, str]:
    # load "canned queries" from the filesystem under
    # www/sql/db/query_name.sql
    queries = {}
```
 |
{ "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 } |
Add new `"sql_file"` key to Canned Queries in metadata? 1060631257 | |
1722943484 | https://github.com/simonw/datasette/pull/2052#issuecomment-1722943484 | https://api.github.com/repos/simonw/datasette/issues/2052 | IC_kwDOBm6k_c5msgf8 | 20after4 30934 | 2023-09-18T08:14:47Z | 2023-09-18T08:14:47Z | NONE | This is such a well thought out contribution. I don't think I've seen such a thoroughly considered PR on any project in recent memory. |
{ "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 } |
feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 1651082214 | |
941274088 | https://github.com/dogsheep/swarm-to-sqlite/issues/12#issuecomment-941274088 | https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/12 | IC_kwDODD6af844GrPo | fs111 33631 | 2021-10-12T18:31:57Z | 2021-10-12T18:31:57Z | NONE | I am running into the same problem. Is there any workaround? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
403 when getting token 951817328 | |
1008279307 | https://github.com/simonw/datasette/pull/1574#issuecomment-1008279307 | https://api.github.com/repos/simonw/datasette/issues/1574 | IC_kwDOBm6k_c48GR8L | fs111 33631 | 2022-01-09T11:26:06Z | 2022-01-09T11:26:06Z | NONE | @fgregg my thinking was backwards compatibility. I don't know what people do to their builds, I just wanted a smaller image for my use case. @simonw any chance to take a look at this? If there is no interest, feel free to close the PR |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
introduce new option for datasette package to use a slim base image 1084193403 | |
1084216224 | https://github.com/simonw/datasette/pull/1574#issuecomment-1084216224 | https://api.github.com/repos/simonw/datasette/issues/1574 | IC_kwDOBm6k_c5An9Og | fs111 33631 | 2022-03-31T07:45:25Z | 2022-03-31T07:45:25Z | NONE | @simonw I like that you want to go "slim by default". Do you want another PR for that or should I just wait? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
introduce new option for datasette package to use a slim base image 1084193403 | |
1214765672 | https://github.com/simonw/datasette/pull/1574#issuecomment-1214765672 | https://api.github.com/repos/simonw/datasette/issues/1574 | IC_kwDOBm6k_c5IZ9po | fs111 33631 | 2022-08-15T08:49:31Z | 2022-08-15T08:49:31Z | NONE | closing as this is now the default |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
introduce new option for datasette package to use a slim base image 1084193403 | |
592999503 | https://github.com/simonw/sqlite-utils/issues/46#issuecomment-592999503 | https://api.github.com/repos/simonw/sqlite-utils/issues/46 | MDEyOklzc3VlQ29tbWVudDU5Mjk5OTUwMw== | chrishas35 35075 | 2020-02-29T22:08:20Z | 2020-02-29T22:08:20Z | NONE | @simonw any thoughts on allowing extracts to specify the lookup column name? If I'm understanding the documentation right, that can't currently be customized. Initial thought on how to do this would be to allow the dictionary value to be a (table name, column name) tuple.
I haven't dug too much into the existing code yet, but does this make sense? Worth doing? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
extracts= option for insert/update/etc 471780443 | |
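The proposal above — letting the `extracts=` value name both the lookup table and its value column — can be sketched with the standard library alone. The `extract_column` helper below and its table/column names are purely illustrative; this is not sqlite-utils' actual API:

```python
import sqlite3

def extract_column(db, table, column, lookup_table, lookup_column):
    # Create the lookup table with a caller-chosen value column name.
    db.execute(
        f"CREATE TABLE IF NOT EXISTS [{lookup_table}] "
        f"(id INTEGER PRIMARY KEY, [{lookup_column}] TEXT UNIQUE)"
    )
    # Copy each distinct value into the lookup table once.
    db.execute(
        f"INSERT OR IGNORE INTO [{lookup_table}] ([{lookup_column}]) "
        f"SELECT DISTINCT [{column}] FROM [{table}]"
    )
    # Replace the original text values with integer foreign keys.
    db.execute(
        f"UPDATE [{table}] SET [{column}] = ("
        f"SELECT id FROM [{lookup_table}] "
        f"WHERE [{lookup_column}] = [{table}].[{column}])"
    )

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE trees (id INTEGER PRIMARY KEY, species TEXT)")
db.executemany(
    "INSERT INTO trees VALUES (?, ?)",
    [(1, "Oak"), (2, "Pine"), (3, "Oak")],
)
# Extract into a "Species" table whose value column is "name", not "value"
extract_column(db, "trees", "species", "Species", "name")
```

After running this, `trees.species` holds integer IDs and `Species.name` holds the distinct text values — the same shape sqlite-utils produces, but with a configurable lookup column name.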
593122605 | https://github.com/simonw/sqlite-utils/issues/89#issuecomment-593122605 | https://api.github.com/repos/simonw/sqlite-utils/issues/89 | MDEyOklzc3VlQ29tbWVudDU5MzEyMjYwNQ== | chrishas35 35075 | 2020-03-01T17:33:11Z | 2020-03-01T17:33:11Z | NONE | If you're happy with the proposed implementation, I have code & tests written that I'll get ready for a PR. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ability to customize columns used by extracts= feature 573578548 | |
803502424 | https://github.com/simonw/sqlite-utils/issues/249#issuecomment-803502424 | https://api.github.com/repos/simonw/sqlite-utils/issues/249 | MDEyOklzc3VlQ29tbWVudDgwMzUwMjQyNA== | prabhur 36287 | 2021-03-21T02:43:32Z | 2021-03-21T02:43:32Z | NONE |
Wow. Wasn't expecting a response this quick, especially during a weekend. :-) Sincerely appreciate it.
I tried the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Full text search possibly broken? 836963850 | |
1261194164 | https://github.com/simonw/datasette/issues/1624#issuecomment-1261194164 | https://api.github.com/repos/simonw/datasette/issues/1624 | IC_kwDOBm6k_c5LLEu0 | palfrey 38532 | 2022-09-28T16:54:22Z | 2022-09-28T16:54:22Z | NONE | https://github.com/simonw/datasette-cors seems to work around this |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Index page `/` has no CORS headers 1122427321 | |
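For reference, wiring up the datasette-cors workaround mentioned above would look roughly like this in `metadata.json`. This is a sketch assuming the plugin accepts a `hosts` list of allowed origins; check the plugin's README for the exact option names:

```json
{
    "plugins": {
        "datasette-cors": {
            "hosts": ["https://www.example.com"]
        }
    }
}
```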
633234781 | https://github.com/dogsheep/dogsheep-photos/issues/20#issuecomment-633234781 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/20 | MDEyOklzc3VlQ29tbWVudDYzMzIzNDc4MQ== | dmd 41439 | 2020-05-24T13:56:13Z | 2020-05-24T13:56:13Z | NONE | As that seems to be closed, can you give a hint on how to make this work? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ability to serve thumbnailed Apple Photo from its place on disk 613006393 | |
1537744000 | https://github.com/simonw/sqlite-utils/issues/540#issuecomment-1537744000 | https://api.github.com/repos/simonw/sqlite-utils/issues/540 | IC_kwDOCGYnMM5bqByA | pquentin 42327 | 2023-05-08T04:56:12Z | 2023-05-08T04:56:12Z | NONE | Hey @simonw, urllib3 maintainer here :wave: Sorry for breaking your CI. I understand you may prefer to pin the Python version, but note that specifying just I can open PRs to sqlite-utils / datasette if you're interested |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
sphinx.builders.linkcheck build error 1699184583 | |
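Pinning the interpreter version, as the comment above suggests, might look like this in a GitHub Actions workflow. An illustrative sketch only — the job layout and version numbers are assumptions, not taken from this project's actual CI:

```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          # Pin an exact interpreter rather than relying on
          # dependencies' requires-python metadata to resolve correctly
          python-version: "3.8"
      - run: pip install -e '.[test]'
      - run: pytest
```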
472844001 | https://github.com/simonw/datasette/issues/409#issuecomment-472844001 | https://api.github.com/repos/simonw/datasette/issues/409 | MDEyOklzc3VlQ29tbWVudDQ3Mjg0NDAwMQ== | Uninen 43100 | 2019-03-14T13:04:20Z | 2019-03-14T13:04:42Z | NONE | It seems this affects the Datasette Publish -site as well: https://github.com/simonw/datasette-publish-support/issues/3 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Zeit API v1 does not work for new users - need to migrate to v2 408376825 | |
1316289392 | https://github.com/simonw/datasette/issues/1886#issuecomment-1316289392 | https://api.github.com/repos/simonw/datasette/issues/1886 | IC_kwDOBm6k_c5OdPtw | rtanglao 45195 | 2022-11-16T03:54:17Z | 2022-11-16T03:58:56Z | NONE | Happy Birthday Datasette! Thanks Simon!! I use Datasette on everything, most notably my Flickr metadata SQLite DB, to make art. Datasette Lite on my 2019 Flickr metadata is super helpful too: https://lite.datasette.io/?csv=https%3A%2F%2Fraw.githubusercontent.com%2Frtanglao%2Frt-flickr-sqlite-csv%2Fmain%2F2019-roland-flickr-metadata.csv Even better, Datasette Lite on all Firefox support questions from 2021: https://lite.datasette.io/?url=https%3A%2F%2Fraw.githubusercontent.com%2Frtanglao%2Frt-kits-api3%2Fmain%2FYEARLY_CSV_FILES%2F2021-firefox-sumo-questions.db Thanks again Simon! So great! What a gift to the world!!!!!! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Call for birthday presents: if you're using Datasette, let us know how you're using it here 1447050738 | |
697973420 | https://github.com/simonw/datasette/issues/619#issuecomment-697973420 | https://api.github.com/repos/simonw/datasette/issues/619 | MDEyOklzc3VlQ29tbWVudDY5Nzk3MzQyMA== | obra 45416 | 2020-09-23T21:07:58Z | 2020-09-23T21:07:58Z | NONE | I've just run into this after crafting a complex query and discovered that hitting back loses my query. Even showing me the whole bad query would be a huge improvement over the current status quo. |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"Invalid SQL" page should let you edit the SQL 520655983 | |
698110186 | https://github.com/simonw/datasette/issues/123#issuecomment-698110186 | https://api.github.com/repos/simonw/datasette/issues/123 | MDEyOklzc3VlQ29tbWVudDY5ODExMDE4Ng== | obra 45416 | 2020-09-24T04:49:51Z | 2020-09-24T04:49:51Z | NONE | As a half-measure, I'd get value out of being able to upload a CSV and have datasette run csv-to-sqlite on it. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette serve should accept paths/URLs to CSVs and other file formats 275125561 | |
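The half-measure described above — accept a CSV upload and load it into SQLite — can be sketched with the standard library alone. The `load_csv` helper, table name, and sample data below are made up for illustration; this is not csvs-to-sqlite's actual implementation:

```python
import csv
import io
import sqlite3

def load_csv(db, table, fp):
    # Read the header row and create a table with all-TEXT columns.
    reader = csv.reader(fp)
    headers = next(reader)
    cols = ", ".join(f"[{h}] TEXT" for h in headers)
    db.execute(f"CREATE TABLE [{table}] ({cols})")
    # Bulk-insert the remaining rows with parameter substitution.
    placeholders = ", ".join("?" for _ in headers)
    db.executemany(f"INSERT INTO [{table}] VALUES ({placeholders})", reader)

db = sqlite3.connect(":memory:")
load_csv(db, "people", io.StringIO("name,age\nAda,36\nGrace,45\n"))
```

A real implementation (as in csvs-to-sqlite) would also detect column types and handle quoting edge cases, but the core CSV-to-table step is this small.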
698174957 | https://github.com/simonw/datasette/issues/123#issuecomment-698174957 | https://api.github.com/repos/simonw/datasette/issues/123 | MDEyOklzc3VlQ29tbWVudDY5ODE3NDk1Nw== | obra 45416 | 2020-09-24T07:42:05Z | 2020-09-24T07:42:05Z | NONE | Oh. Awesome. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette serve should accept paths/URLs to CSVs and other file formats 275125561 |
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);