issue_comments
8,883 rows where user = 9599 sorted by issue_url
- Show column metadata plus links for foreign keys on arbitrary query results 51
- Redesign default .json format 50
- ?_extra= support (draft) 48
- Rethink how .ext formats (vs. ?_format=) work before 1.0 47
- Updated Dockerfile with SpatiaLite version 5.0 45
- Complete refactor of TableView and table.html template 45
- Port Datasette to ASGI 38
- Authentication (and permissions) as a core concept 38
- JavaScript plugin hooks mechanism similar to pluggy 38
- invoke_startup() is not run in some conditions, e.g. gunicorn/uvicorn workers, breaking lots of things 35
- await datasette.client.get(path) mechanism for executing internal requests 33
- Maintain an in-memory SQLite table of connected databases and their tables 31
- Deploy a live instance of demos/apache-proxy 31
- Server hang on parallel execution of queries to named in-memory databases 30
- Ability to sort (and paginate) by column 29
- Research: demonstrate if parallel SQL queries are worthwhile 29
- Default API token authentication mechanism 29
- Port as many tests as possible to async def tests against ds_client 28
- Add ?_extra= mechanism for requesting extra properties in JSON 27
- Export to CSV 27
- Optimize all those calls to index_list and foreign_key_list 27
- Ability for a canned query to write to the database 26
- table.transform() method for advanced alter table 26
- Upgrade to CodeMirror 6, add SQL autocomplete 26
- Proof of concept for Datasette on AWS Lambda with EFS 25
- New pattern for views that return either JSON or HTML, available for plugins 25
- DeprecationWarning: pkg_resources is deprecated as an API 25
- Support cross-database joins 24
- Redesign register_output_renderer callback 24
- "datasette insert" command and plugin hook 23
- API explorer tool 23
- Option for importing CSV data using the SQLite .import mechanism 22
- UI to create reduced scope tokens from the `/-/create-token` page 22
- Datasette Plugins 21
- table.extract(...) method and "sqlite-utils extract" command 21
- ?sort=colname~numeric to sort by column cast to real 21
- Use YAML examples in documentation by default, not JSON 21
- Idea: import CSV to memory, run SQL, export in a single command 21
- base_url is omitted in JSON and CSV views 21
- Switch documentation theme to Furo 21
- If a row has a primary key of `null` various things break 21
- "flash messages" mechanism 20
- Move CI to GitHub Issues 20
- load_template hook doesn't work for include/extends 20
- Mechanism for storing metadata in _metadata tables 20
- Introduce concept of a database `route`, separate from its name 20
- CSV files with too many values in a row cause errors 20
- register_permissions(datasette) plugin hook 20
- API tokens with view-table but not view-database/view-instance cannot access the table 20
- Better way of representing binary data in .csv output 19
- Introspect if table is FTS4 or FTS5 19
- A proper favicon 19
- Make it easier to insert geometries, with documentation and maybe code 19
- `datasette create-token` ability to create tokens with a reduced set of permissions 19
- Ability to ship alpha and beta releases 18
- Magic parameters for canned queries 18
- Figure out why SpatiaLite 5.0 hangs the database page on Linux 18
- Update screenshots in documentation to match latest designs 18
- datasette.client internal requests mechanism 17
- Publish to Docker Hub failing with "libcrypt.so.1: cannot open shared object file" 17
- API to insert a single record into an existing table 17
- Facets 16
- ?_col= and ?_nocol= support for toggling columns on table view 16
- Support "allow" block on root, databases and tables, not just queries 16
- Action menu for table columns 16
- Consider using CSP to protect against future XSS 16
- `--batch-size 1` doesn't seem to commit for every item 16
- Intermittent "Too many open files" error running tests 16
- Update a single record in an existing table 16
- Resolve the difference between `wrap_view()` and `BaseView` 16
- Package as standalone binary 15
- Bug: Sort by column with NULL in next_page URL 15
- Mechanism for customizing the SQL used to select specific columns in the table view 15
- The ".upsert()" method is misnamed 15
- --dirs option for scanning directories for SQLite databases 15
- Document (and reconsider design of) Database.execute() and Database.execute_against_connection_in_thread() 15
- latest.datasette.io is no longer updating 15
- link_or_copy_directory() error - Invalid cross-device link 15
- "sqlite-utils convert" command to replace the separate "sqlite-transform" tool 15
- Tests reliably failing on Python 3.7 15
- Autocomplete text entry for filter values that correspond to facets 15
- De-tangling Metadata before Datasette 1.0 15
- Documentation with recommendations on running Datasette in production without using Docker 14
- .execute_write() and .execute_write_fn() methods on Database 14
- Upload all my photos to a secure S3 bucket 14
- Canned query permissions mechanism 14
- "datasette -p 0 --root" gives the wrong URL 14
- Make it possible to download BLOB data from the Datasette UI 14
- Plugin hook for loading templates 14
- --lines and --text and --convert and --import 14
- Documentation should clarify /stable/ vs /latest/ 14
- "permissions" property in metadata for configuring arbitrary permissions 14
- Design plugin hook for extras 14
- `handle_exception` plugin hook for custom error handling 14
- Refactor out the keyset pagination code 14
- Ability to customize presentation of specific columns in HTML view 13
- Allow plugins to define additional URL routes and views 13
- Handle spatialite geometry columns better 13
- Fix all the places that currently use .inspect() data 13
- Plugin hook: filters_from_request 13
- If you apply ?_facet_array=tags then &_facet=tags does nothing 13
- Mechanism for skipping CSRF checks on API posts 13
- Support column descriptions in metadata.json 13
- table.transform() method 13
- Policy on documenting "public" datasette.utils functions 13
- WIP: Add Gmail takeout mbox import 13
- sqlite-utils extract could handle nested objects 13
- `register_commands()` plugin hook to register extra CLI commands 13
- Support STRICT tables 13
- Refactor TableView to use asyncinject 13
- Write API in Datasette core 13
- Make sure CORS works for write APIs 13
- Potential feature: special support for `?a=1&a=2` on the query page 13
- Add “updated” to metadata 12
- Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1 12
- Package datasette for installation using homebrew 12
- _facet_array should work against views 12
- Port Datasette from Sanic to ASGI + Uvicorn 12
- Stream all results for arbitrary SQL and canned queries 12
- "Invalid SQL" page should let you edit the SQL 12
- Import machine-learning detected labels (dog, llama etc) from Apple Photos 12
- Having view-table permission but NOT view-database should still grant access to /db/table 12
- Efficiently calculate list of databases/tables a user can view 12
- Support creating descending order indexes 12
- Serve using UNIX domain socket 12
- Rethink approach to [ and ] in column names (currently throws error) 12
- Fix compatibility with Python 3.10 12
- Research: CTEs and union all to calculate facets AND query at the same time 12
- Traces should include SQL executed by subtasks created with `asyncio.gather` 12
- Ensure "pip install datasette" still works with Python 3.6 12
- Tilde encoding: use ~ instead of - for dash-encoding 12
- Code examples in the documentation should be formatted with Black 12
- Implement ?_extra and new API design for TableView 12
- Mechanism for ensuring a table has all the columns 12
- API for bulk inserting records into a table 12
- `/db/-/create` API for creating tables 12
- Errors when using table filters behind a proxy 12
- WIP new JSON for queries 12
- Make detailed notes on how table, query and row views work right now 12
- .transform() instead of modifying sqlite_master for add_foreign_keys 12
- Implement command-line tool interface 11
- Dockerfile should build more recent SQLite with FTS5 and spatialite support 11
- Option to expose expanded foreign keys in JSON/CSV 11
- Get Datasette tests passing on Windows in GitHub Actions 11
- Mechanism for adding arbitrary pages like /about 11
- Prototype for Datasette on PostgreSQL 11
- Mechanism for checking if a SQLite database file is safe to open 11
- Expand plugins documentation to multiple pages 11
- Mechanism for plugins to add action menu items for various things 11
- --since feature can be confused by retweets 11
- Datasette secret mechanism - initially for signed cookies 11
- Writable canned queries live demo on Glitch 11
- POST to /db/canned-query that returns JSON should be supported (for API clients) 11
- datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates 11
- Writable canned queries with magic parameters fail if POST body is empty 11
- .json and .csv exports fail to apply base_url 11
- Database class mechanism for cross-connection in-memory databases 11
- Race condition errors in new refresh_schemas() mechanism 11
- Plugin hook for dynamic metadata 11
- "Query parameters" form shows wrong input fields if query contains "03:31" style times 11
- sqlite-utils index-foreign-keys fails due to pre-existing index 11
- `sqlite-utils insert --convert` option 11
- Research how much of a difference analyze / sqlite_stat1 makes 11
- Optional Pandas integration 11
- Research: how much overhead does the n=1 time limit have? 11
- Document how to use a `--convert` function that runs initialization code first 11
- Misleading progress bar against utf-16-le CSV input 11
- google cloudrun updated their limits on maxscale based on memory and cpu count 11
- sqlite-utils query --functions mechanism for registering extra functions 11
- Expose convert recipes to `sqlite-utils --functions` 11
- `prepare_jinja2_environment()` hook should take `datasette` argument 11
- Ensure insert API has good tests for rowid and compound primary key tables 11
- New JSON design for query views 11
- Set up some example datasets on a Cloudflare-backed domain 10
- Filter UI on table page 10
- Table view should support filtering via many-to-many relationships 10
- base_url configuration setting 10
- New design for facet abstraction, including querystring and metadata.json 10
- Improvements to table label detection 10
- Syntactic sugar for creating m2m records 10
- Mechanism for turning nested JSON into foreign keys / many-to-many 10
- extracts= should support multiple-column extracts 10
- Documented internals API for use in plugins 10
- --cp option for datasette publish and datasette package for shipping additional files and directories 10
- Mechanism for writing to database via a queue 10
- base_url doesn't entirely work for running Datasette inside Binder 10
- See if I can get Datasette working on Zeit Now v2 10
- Release Datasette 0.44 10
- Rename master branch to main 10
- Plugin hook for instance/database/table metadata 10
- Refactor default views to use register_routes 10
- CLI utility for inserting binary files into SQLite 10
- FTS table with 7 rows has _fts_docsize table with 9,141 rows 10
- Navigation menu plus plugin hook 10
- register_output_renderer() should support streaming data 10
- Ability for plugins to collaborate when adding extra HTML to blocks in default templates 10
- Async support 10
- Test Datasette Docker images built for different architectures 10
- `default_allow_sql` setting (a re-imagining of the old `allow_sql` setting) 10
- render_cell() hook should support returning an awaitable 10
- Docker configuration for exercising Datasette behind Apache mod_proxy 10
- Python library methods for calling ANALYZE 10
- Remove Hashed URL mode 10
- Options for how `r.parsedate()` should handle invalid dates 10
- If user can see table but NOT database/instance nav links should not display 10
- test_recreate failing on Windows Python 3.11 10
- `.json` errors should be returned as JSON 10
- Failing test: httpx.InvalidURL: URL too long 10
- Config file with support for defining canned queries 9
- Default to opening files in mutable mode, special option for immutable files 9
- Option to display binary data 9
- Refactor TableView.data() method 9
- Set up a live demo Datasette instance 9
- Move hashed URL mode out to a plugin 9
- Ability to serve thumbnailed Apple Photo from its place on disk 9
- New WIP writable canned queries 9
- Example permissions plugin 9
- Research feasibility of 100% test coverage 9
- canned_queries() plugin hook 9
- Improve performance of extract operations 9
- Figure out how to run an environment that exercises the base_url proxy setting 9
- Switch to .blob render extension for BLOB downloads 9
- sqlite-utils search command 9
- Datasette on Amazon Linux on ARM returns 404 for static assets 9
- Better internal database_name for _internal database 9
- Mechanism for minifying JavaScript that ships with Datasette 9
- Adopt Prettier for JavaScript code formatting 9
- Use _counts to speed up counts 9
- Use force_https_urls on when deploying with Cloud Run 9
- --no-headers option for CSV and TSV 9
- CSV ?_stream=on redundantly calculates facets for every page 9
- Research: syntactic sugar for using --get with SQL queries, maybe "datasette query" 9
- Add reference page to documentation using Sphinx autodoc 9
- create-index should run analyze after creating index 9
- Table+query JSON and CSV links broken when using `base_url` setting 9
- Advanced class-based `conversions=` mechanism 9
- Writable canned queries fail with useless non-error against immutable databases 9
- Get Datasette compatible with Pyodide 9
- Add --ignore option to more commands 9
- Ability to set a custom favicon 9
- Ability to load JSON records held in a file with a single top level key that is a list of objects 9
- SQL query field can't begin by a comment 9
- Tool for simulating permission checks against actors 9
- Release Datasette 1.0a0 9
- Refactor test suite to use mostly `async def` tests 9
- Use sqlean if available in environment 9
- Get `add_foreign_keys()` to work without modifying `sqlite_master` 9
- `datasette publish` needs support for the new config/metadata split 9
- Make URLs immutable 8
- datasette publish heroku 8
- Mechanism for ranking results from SQLite full-text search 8
- URL hashing now optional: turn on with --config hash_urls:1 (#418) 8
- sqlite-utils create-table command 8
- Enforce import sort order with isort 8
- Add a universal navigation bar which can be modified by plugins 8
- Command to fetch stargazers for one or more repos 8
- Commits in GitHub API can have null author 8
- extra_template_vars() sending wrong view_name for index 8
- Import photo metadata from Apple Photos into SQLite 8
- Visually distinguish integer and text columns 8
- Allow-list pragma_table_info(tablename) and similar 8
- Rename project to dogsheep-photos 8
- Consolidate request.raw_args and request.args 8
- Database page loads too slowly with many large tables (due to table counts) 8
- base_url doesn't seem to work when adding criteria and clicking "apply" 8
- Upgrade CodeMirror 8
- Mechanism for defining custom display of results 8
- the JSON object must be str, bytes or bytearray, not 'Undefined' 8
- OPTIONS requests return a 500 error 8
- GENERATED column support 8
- Establish pattern for release branches to support bug fixes 8
- Mechanism for executing JavaScript unit tests 8
- Make original path available to render hooks 8
- --sniff option for sniffing delimiters 8
- Extract columns cannot create foreign key relation: sqlite3.OperationalError: table sqlite_master may not be modified 8
- Ability to increase size of the SQL editor window 8
- "invalid reference format" publishing Docker image 8
- Tests failing with FileNotFoundError in runner.isolated_filesystem 8
- Show count of facet values if ?_facet_size=max 8
- Test against pysqlite3 running SQLite 3.37 8
- Documented JavaScript variables on different templates made available for plugins 8
- Add new spatialite helper methods 8
- Get rid of the no-longer necessary ?_format=json hack for tables called x.json 8
- Refactor and simplify Datasette routing and views 8
- Filters fail to work correctly against calculated numeric columns returned by SQL views because type affinity rules do not apply 8
- Table/database that is private due to inherited permissions does not show padlock 8
- Serve schema JSON to the SQL editor to enable autocomplete 8
- Some plugins show "home" breadcrumbs twice in the top left 8
- `table.upsert_all` fails to write rows when `not_null` is present 8
- Datasette should serve Access-Control-Max-Age 8
- Deploy failing with "plugins/alternative_route.py: Not a directory" 8
- ?_group_count=country - return counts by specific column(s) 7
- Ability to bundle and serve additional static files 7
- Metadata should be a nested arbitrary KV store 7
- Keyset pagination doesn't work correctly for compound primary keys 7
- Support for units 7
- prepare_context() plugin hook 7
- Improve and document foreign_keys=... argument to insert/create/etc 7
- Datasette Library 7
- Utility mechanism for plugins to render templates 7
- Syntax for ?_through= that works as a form field 7
- ?_searchmode=raw option for running FTS searches without escaping characters 7
- datasette publish cloudrun --memory option 7
- Update SQLite bundled with Docker container 7
- index.html is not reliably loaded from a plugin 7
- .columns_dict doesn't work for all possible column types 7
- Option to automatically configure based on directory layout 7
- Replace "datasette publish --extra-options" with "--setting" 7
- sqlite3.OperationalError: too many SQL variables in insert_all when using rows with varying numbers of columns 7
- Group permission checks by request on /-/permissions debug page 7
- Demo is failing to deploy 7
- Docker container is no longer being pushed (it's stuck on 0.45) 7
- Push to Docker Hub failed - but it shouldn't run for alpha releases anyway 7
- Simplify imports of common classes 7
- SQLITE_MAX_VARS maybe hard-coded too low 7
- Commands for making authenticated API calls 7
- Pagination 7
- Support the dbstat table 7
- Much, much faster extract() implementation 7
- Documented HTML hooks for JavaScript plugin authors 7
- Wide tables should scroll horizontally within the page 7
- Fix last remaining links to "/" that do not respect base_url 7
- Bring date parsing into Datasette core 7
- Documentation and unit tests for urls.row() urls.row_blob() methods 7
- "View all" option for facets, to provide a (paginated) list of ALL of the facet counts plus a link to view them 7
- GitHub Actions workflow to build and sign macOS binary executables 7
- --crossdb option for joining across databases 7
- Custom pages don't work with base_url setting 7
- table.pks_and_rows_where() method returning primary keys along with the rows 7
- Latest Datasette tags missing from Docker Hub 7
- "More" link for facets that shows _facet_size=max results 7
- ?_nocol= does not interact well with default facets 7
- sqlite-utils memory command for directly querying CSV/JSON data 7
- sqlite-utils memory should handle TSV and JSON in addition to CSV 7
- Introspection property for telling if a table is a rowid table 7
- absolute_url() behind a proxy assembles incorrect http://127.0.0.1:8001/ URLs 7
- Manage /robots.txt in Datasette core, block robots by default 7
- Query page .csv and .json links are not correctly URL-encoded on Vercel under unknown specific conditions 7
- [Enhancement] Please allow 'insert-files' to insert content as text. 7
- Extra options to `lookup()` which get passed to `insert()` 7
- Columns starting with an underscore behave poorly in filters 7
- Allow passing a file of code to "sqlite-utils convert" 7
- Test failure in test_rebuild_fts 7
- Allow to set `facets_array` in metadata (like current `facets`) 7
- `.execute_write(... block=True)` should be the default behaviour 7
- Link to stable docs from older versions 7
- Add SpatiaLite helpers to CLI 7
- Support for generated columns 7
- I forgot to include the changelog in the 3.25.1 release 7
- Remove hashed URL mode 7
- Extract out `check_permissions()` from `BaseView` 7
- "Error: near "(": syntax error" when using sqlite-utils indexes CLI 7
- `--nolock` feature for opening locked databases 7
- Add new entrypoint option to `--load-extension` 7
- Upgrade Datasette Docker to Python 3.11 7
- Figure out design for JSON errors (consider RFC 7807) 7
- /db/table/-/upsert API 7
- datasette package --spatialite throws error during build 7
- Hacker News Datasette write demo 7
- First working version 7
- 500 "attempt to write a readonly database" error caused by "PRAGMA schema_version" 7
- table.create(..., replace=True) 7
- [feature request]`datasette install plugins.json` options 7
- CLI equivalents to `transform(add_foreign_keys=)` 7
- Cascade for restricted token view-table/view-database/view-instance operations 7
- Addressable pages for every row in a table 6
- Default HTML/CSS needs to look reasonable and be responsive 6
- Support Django-style filters in querystring arguments 6
- Detect foreign keys and use them to link HTML pages together 6
- Nasty bug: last column not being correctly displayed 6
- Load plugins from a `--plugins-dir=plugins/` directory 6
- Ability for plugins to define extra JavaScript and CSS 6
- inspect() should detect many-to-many relationships 6
- Build Dockerfile with recent Sqlite + Spatialite 6
- inspect should record column types 6
- Deploy demo of Datasette on every commit that passes tests 6
- Plugin hook for loading metadata.json 6
- Faceted browse against a JSON list of tags 6
- ?_where=sql-fragment parameter for table views 6
- "datasette publish cloudrun" command to publish to Google Cloud Run 6
- Additional Column Constraints? 6
- Easier way of creating custom row templates 6
- Experiment with type hints 6
- Command for running a search and saving tweets for that search 6
- bump uvicorn to 0.9.0 to be Python-3.8 friendly 6
- Improve UI of "datasette publish cloudrun" to reduce chances of accidentally over-writing a service 6
- Mechanism for indicating foreign key relationships in the table and query page URLs 6
- allow leading comments in SQL input field 6
- Problem with square bracket in CSV column name 6
- "Templates considered" comment broken in >=0.35 6
- Documentation for the "request" object 6
- Support YAML in metadata - metadata.yaml 6
- Only set .last_rowid and .last_pk for single update/inserts, not for .insert_all()/.upsert_all() with multiple records 6
- Command for retrieving dependents for a repo 6
- Support decimal.Decimal type 6
- bpylist.archiver.CircularReference: archive has a cycle with uid(13) 6
- allow_by_query setting for configuring permissions with a SQL statement 6
- python tests/fixtures.py command has a bug 6
- Mechanism for specifying allow_sql permission in metadata.json 6
- Way to enable a default=False permission for anonymous users 6
- Ability to set ds_actor cookie such that it expires 6
- startup() plugin hook 6
- Incorrect URLs when served behind a proxy with base_url set 6
- "Too many open files" error running tests 6
- datasette.add_message() doesn't work inside plugins 6
- Consider dropping explicit CSRF protection entirely? 6
- Support reverse pagination (previous page, has-previous-items) 6
- Datasette sdist is missing templates (hence broken when installing from Homebrew) 6
- End-user documentation 6
- extra_ plugin hooks should take the same arguments 6
- Private/secret databases: database files that are only visible to plugins 6
- Mechanism for differentiating between "by me" and "liked by me" 6
- Rendering glitch with column headings on mobile 6
- Redesign application homepage 6
- Change "--config foo:bar" to "--setting foo bar" 6
- Add Link: pagination HTTP headers 6
- Figure out how to display images from <en-media> tags inline in Datasette 6
- "Edit SQL" button on canned queries 6
- Method for datasette.client() to forward on authentication 6
- export.xml file name varies with different language settings 6
- Better display of binary data on arbitrary query results page 6
- Table actions menu on view pages, not on query pages 6
- PrefixedUrlString mechanism broke everything 6
- Support order by relevance against FTS4 6
- sqlite-utils analyze-tables command and table.analyze_column() method 6
- Invalid SQL: "no such table: pragma_database_list" on database page 6
- Add support for Jinja2 version 3.0 6
- `sqlite-utils indexes` command 6
- `db.query()` method (renamed `db.execute_returning_dicts()`) 6
- "searchmode": "raw" in table metadata 6
- `table.search(..., quote=True)` parameter and `sqlite-utils search --quote` option 6
- sqlite-utils insert errors should show SQL and parameters, if possible 6
- Mechanism to cause specific branches to deploy their own demos 6
- ReadTheDocs build failed for 0.59.2 release 6
- New pattern for async view classes 6
- Idea: hover to reveal details of linked row 6
- Release Datasette 0.60 6
- Drop support for Python 3.6 6
- Support mutating row in `--convert` without returning it 6
- Maybe let plugins define custom serve options? 6
- datasette one.db one.db opens database twice, as one and one_2 6
- Use dash encoding for table names and row primary keys in URLs 6
- Ship Datasette 0.61 6
- .db downloads should be served with an ETag 6
- Upgrade `--load-extension` to accept entrypoints like Datasette 6
- Ability to set a custom facet_size per table 6
- truncate_cells_html does not work for links? 6
- progressbar for inserts/upserts of all fileformats, closes #485 6
- Expose `sql` and `params` arguments to various plugin hooks 6
- Interactive demo of Datasette 1.0 write APIs 6
- /db/table/-/upsert 6
- `datasette.create_token(...)` method for creating signed API tokens 6
- datasette --root running in Docker doesn't reliably show the magic URL 6
- `publish cloudrun` reuses image tags, which can lead to very surprising deploy problems 6
- Folder support 6
- Try out Trogon for a tui interface 6
- Make as many examples in the CLI docs as possible copy-and-pastable 6
- Table renaming: db.rename_table() and sqlite-utils rename-table 6
- Plugin system 6
- Bump sphinx, furo, blacken-docs dependencies 6
- Consider a request/response wrapping hook slightly higher level than asgi_wrapper() 6
- `table.transform()` should preserve `rowid` values 6
- Plugin hook: `actors_from_ids()` 6
- "Test DATASETTE_LOAD_PLUGINS" test shows errors but did not fail the CI run 6
- Detailed upgrade instructions for metadata.yaml -> datasette.yaml 6
- Experiment with patterns for concurrent long running queries 5
- Create neat example database 5
- Redesign JSON output, ditch jsono, offer variants controlled by parameter instead 5
- Datasette serve should accept paths/URLs to CSVs and other file formats 5
- add "format sql" button to query page, uses sql-formatter 5
- Refactor views 5
- Validate metadata.json on startup 5
- Ability to enable/disable specific features via --config 5
- Custom URL routing with independent tests 5
- Travis should push tagged images to Docker Hub for each release 5
- Get Datasette working with Zeit Now v2's 100MB image size limit 5
- CSV export in "Advanced export" pane doesn't respect query 5
- Hashed URLs should be optional 5
- Define mechanism for plugins to return structured data 5
- Plugin for allowing CORS from specified hosts 5
- Design changes to homepage to support mutable files 5
- Rename metadata.json to config.json 5
- Full text search of all tables at once? 5
- Populate "endpoint" key in ASGI scope 5
- extra_template_vars plugin hook 5
- Rethink progress bars for various commands 5
- [enhancement] Method to delete a row in python 5
- Testing utilities should be available to plugins 5
- Handle really wide tables better 5
- If you have databases called foo.db and foo-bar.db you cannot visit /foo-bar 5
- stargazers command, refs #4 5
- Add this view for seeing new releases 5
- Provide a cookiecutter template for creating new plugins 5
- on_create mechanism for after table creation 5
- Datasette.render_template() method 5
- Rethink how sanity checks work 5
- Release automation: automate the bit that posts the GitHub release 5
- table.disable_fts() method and "sqlite-utils disable-fts ..." command 5
- twitter-to-sqlite user-timeline [screen_names] --sql / --attach 5
- Option in metadata.json to set default sort order for a table 5
- Feature: record history of follower counts 5
- Custom CSS class on body for styling canned queries 5
- Repos have a big blob of JSON in the organization column 5
- Annotate photos using the Google Cloud Vision API 5
- Question: Access to immutable database-path 5
- Create a public demo 5
- Unit test that checks that all plugin hooks have corresponding unit tests 5
- Ability to sign in to Datasette as a root account 5
- CSRF protection 5
- Add insert --truncate option 5
- Consider using enable_callback_tracebacks(True) 5
- Fix the demo - it breaks because of the tags table change 5
- Feature: pull request reviews and comments 5
- Mechanism for passing additional options to `datasette my.db` that affect plugins 5
- Features for enabling and disabling WAL mode 5
- Add homebrew installation to documentation 5
- Path parameters for custom pages 5
- insert_all(..., alter=True) should work for new columns introduced after the first 100 records 5
- .delete_where() does not auto-commit (unlike .insert() or .upsert()) 5
- Progress bar for sqlite-utils insert 5
- Better handling of encodings other than utf-8 for "sqlite-utils insert" 5
- How should datasette.client interact with base_url 5
- Add documentation on serving Datasette behind a proxy using base_url 5
- .extract() shouldn't extract null values 5
- Add search highlighting snippets 5
- Default menu links should check a real permission 5
- load_template() plugin hook 5
- Rethink how table.search() method works 5
- Foreign key links break for compound foreign keys 5
- Rename datasette.config() method to datasette.setting() 5
- Show pysqlite3 version on /-/versions 5
- "Stream all rows" is not at all obvious 5
- More flexible CORS support in core, to encourage good security practices 5
- Release notes for Datasette 0.54 5
- Research using CTEs for faster facet counts 5
- Upgrade to Python 3.9.4 5
- ?_facet_size=X to increase number of facets results on the page 5
- `table.xindexes` using `PRAGMA index_xinfo(table)` 5
- Error: Use either --since or --since_id, not both 5
- .transform(types=) turns rowid into a concrete column 5
- Stop using generated columns in fixtures.db 5
- `datasette publish cloudrun --cpu X` option 5
- Ability to search for text across all columns in a table 5
- Upgrade to httpx 0.20.0 (request() got an unexpected keyword argument 'allow_redirects') 5
- Way to test SQLite 3.37 (and potentially other versions) in CI 5
- Command for creating an empty database 5
- Support for CHECK constraints 5
- filters_from_request plugin hook, now used in TableView 5
- Scripted exports 5
- Improvements to help make Datasette a better tool for learning SQL 5
- Reconsider policy on blocking queries containing the string "pragma" 5
- Test failures with SQLite 3.37.0+ due to column affinity case 5
- Implement redirects from old % encoding to new dash encoding 5
- Adopt a code of conduct 5
- Display autodoc type information more legibly 5
- Research running SQL in table view in parallel using `asyncio.gather()` 5
- Support `rows_where()`, `delete_where()` etc for attached alias databases 5
- CSV `extras_key=` and `ignore_extras=` equivalents for CLI tool 5
- Upgrade to 3.10.6-slim-bullseye Docker base image 5
- 500 error in github-to-sqlite demo 5
- Link from documentation to source code 5
- Move "datasette --get" from Getting Started to CLI Reference 5
- db[table].create(..., transform=True) and create-table --transform 5
- NoneType' object has no attribute 'actor' 5
- Create a new table from one or more records, `sqlite-utils` style 5
- Design URLs for the write API 5
- Make it easier to fix URL proxy problems 5
- upsert of new row with check constraints fails 5
- ignore:true/replace:true options for /db/-/create API 5
- register_permissions() plugin hook 5
- More useful error message if enable_load_extension is not available 5
- codespell test failure 5
- Plan for getting the new JSON format query views working 5
- Build HTML version of /content?sql=... 5
- Add writable canned query demo to latest.datasette.io 5
- Datasette --get --actor option 5
- DATASETTE_LOAD_PLUGINS environment variable for loading specific plugins 5
- Don't show foreign key links to tables the user cannot access 5
- Protect against malicious SQL that causes damage even though our DB is immutable 4
- Homepage UI for editing metadata file 4
- Switch to ujson 4
- Pick a name 4
- Ship a Docker image of the whole thing 4
- datasette publish hyper 4
- Support for title/source/license metadata 4
- Enforce pagination (or at least limits) for arbitrary custom SQL 4
- ?_json=foo&_json=bar query string argument 4
- datasette publish can fail if /tmp is on a different device 4
- Figure out how to bundle a more up-to-date SQLite 4
- Ability to apply sort on mobile in portrait mode 4
- metadata.json support for plugin configuration options 4
- datasette publish lambda plugin 4
- Explore "distinct values for column" in inspect() 4
- Add links to example Datasette instances to appropiate places in docs 4
- Mechanism for automatically picking up changes when on-disk .db file changes 4
- Support table names ending with .json or .csv 4
- Wildcard support in query parameters 4
- Limit text display in cells containing large amounts of text 4
- Datasette on Zeit Now returns http URLs for facet and next links 4
- Requesting support for query description 4
- Ability to display facet counts for many-to-many relationships 4
- add_column() should support REFERENCES {other_table}({other_column}) 4
- Figure out what to do about table counts in a mutable world 4
- Tracing support for seeing what SQL queries were executed 4
- Paginate + search for databases/tables on the homepage 4
- Replace most of `.inspect()` (and `datasette inspect`) with table counting 4
- Decide what to do about /-/inspect 4
- Option to facet by date using month or year 4
- Allow .insert(..., foreign_keys=()) to auto-detect table and primary key 4
- Facets not correctly persisted in hidden form fields 4
- Support opening multiple databases with the same stem 4
- Decide what goes into Datasette 1.0 4
- Get tests running on Windows using Travis CI 4
- Ability to list views, and to access db["view_name"].rows / rows_where / etc 4
- More advanced connection pooling 4
- Option to fetch only checkins more recent than the current max checkin 4
- --sql and --attach options for feeding commands from SQL queries 4
- Use better pagination (and implement progress bar) 4
- Command to import home-timeline 4
- retweets-of-me command 4
- Failed to import workout points 4
- Datasette should work with Python 3.8 (and drop compatibility with Python 3.5) 4
- Mechanism for register_output_renderer to suggest extension or not 4
- Remove .detect_column_types() from table, make it a documented API 4
- Add documentation on Database introspection methods to internals.rst 4
- Custom pages mechanism, refs #648 4
- escape_fts() does not correctly escape * wildcards 4
- Directory configuration mode should support metadata.yaml 4
- Cloud Run fails to serve database files larger than 32MB 4
- Ability to set custom default _size on a per-table basis 4
- Expose scores from ZCOMPUTEDASSETATTRIBUTES 4
- add_foreign_key(...., ignore=True) 4
- register_output_renderer can_render mechanism 4
- Publish secrets 4
- Example authentication plugin 4
- /-/metadata and so on should respect view-instance permission 4
- Log out mechanism for clearing ds_actor cookie 4
- Take advantage of .coverage being a SQLite database 4
- Use white-space: pre-wrap on ALL table cell contents 4
- github-to-sqlite tags command for fetching tags 4
- Output binary columns in "sqlite-utils query" JSON 4
- Security issue: read-only canned queries leak CSRF token in URL 4
- sqlite-utils insert: options for column types 4
- 'datasette --get' option, refs #926 4
- Test failures caused by failed attempts to mock pip 4
- --load-extension option for sqlite-utils query 4
- Try out CodeMirror SQL hints 4
- Idea: conversions= could take Python functions 4
- sqlite-utils transform sub-command 4
- sqlite-utils transform/insert --detect-types 4
- column name links broken in 0.50.1 4
- extra_js_urls and extra_css_urls should respect base_url setting 4
- Table/database action menu cut off if too short 4
- changes to allow for compound foreign keys 4
- Rebrand and redirect config.rst as settings.rst 4
- Support for generated columns 4
- sqlite-utils analyze-tables command 4
- Searching for "github-to-sqlite" throws an error 4
- reset_counts() method and command 4
- view_name = "query" for the query page 4
- Support SSL/TLS directly 4
- --port option should validate port is between 0 and 65535 4
- 500 error caused by faceting if a column called `n` exists 4
- Share button for copying current URL 4
- Refresh SpatiaLite documentation 4
- Add Docker multi-arch support with Buildx 4
- Can't use apt-get in Dockerfile when using datasetteproj/datasette as base 4
- Figure out how to publish alpha/beta releases to Docker Hub 4
- Intermittent CI failure: restore_working_directory FileNotFoundError 4
- row.update() or row.pk 4
- db.schema property and sqlite-utils schema command 4
- Automatic type detection for CSV data 4
- Big performance boost on faceting: skip the inner order by 4
- Command for fetching Hacker News threads from the search API 4
- Ability to default to hiding the SQL for a canned query 4
- Document exceptions that can be raised by db.execute() and friends 4
- Add reference documentation generated from docstrings 4
- Ability to insert file contents as text, in addition to blob 4
- xml.etree.ElementTree.ParseError: not well-formed (invalid token) 4
- sqlite-utils memory can't deal with multiple files with the same name 4
- ?_sort=rowid with _next= returns error 4
- `table.lookup()` option to populate additional columns when creating a record 4
- Improve Apache proxy documentation, link to demo 4
- Provide function to generate hash_id from specified columns 4
- Add `Link: rel="alternate"` header pointing to JSON for a table/query 4
- Maybe return JSON from HTML pages if `Accept: application/json` is sent 4
- `sqlite-utils insert --extract colname` 4
- Writable canned queries fail to load custom templates 4
- Allow users to pass a full convert() function definition 4
- Confirm if documented nginx proxy config works for row pages with escaped characters in their primary key 4
- Better error message if `--convert` code fails to return a dict 4
- `--fmt` should imply `-t` 4
- Add documentation page with the output of `--help` 4
- Release notes for 0.60 4
- Add KNN and data_licenses to hidden tables list 4
- Move canned queries closer to the SQL input area 4
- `sqlite-utils bulk --batch-size` option 4
- Add SpatiaLite helpers to CLI 4
- `deterministic=True` fails on versions of SQLite prior to 3.8.3 4
- Sensible `cache-control` headers for static assets, including those served by plugins 4
- Automated test for Pyodide compatibility 4
- minor a11y: <select> has no visual indicator when tabbed to 4
- 500 error if sorted by a column not in the ?_col= list 4
- i18n support 4
- Adjust height of textarea for no JS case 4
- Parts of YAML file do not work when db name is "off" 4
- Database() constructor currently defaults is_mutable to False 4
- fails before generating views. ERR: table sqlite_master may not be modified 4
- `sqlite-utils transform` should set empty strings to null when converting text columns to integer/float 4
- Turn --flatten into a documented utility function 4
- Tests failing due to updated tabulate library 4
- `max_signed_tokens_ttl` setting for a maximum duration on API tokens 4
- Delete a single record from an existing table 4
- API to drop a table 4
- 1.0a0 release notes 4
- Extract logic for resolving a URL to a database / table / row 4
- `publish heroku` failing due to old Python version 4
- Docs for replace:true and ignore:true options for insert API 4
- installpython3.com is now a spam website 4
- Reconsider pattern where plugins could break existing template context 4
- `Table.convert()` skips falsey values 4
- Custom SQL queries should use new JSON ?_extra= format 4
- feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 4
- GitHub Action to lint Python code with ruff 4
- Datasette cannot be installed with Rye 4
- `--raw-lines` option, like `--raw` for multiple lines 4
- Implement new /content.json?sql=... 4
- Query view shouldn't return `columns` 4
- Plugin hook for database queries that are run 4
- form label { width: 15% } is a bad default 4
- datasette -s/--setting option for setting nested configuration options 4
- Bump sphinx, furo, blacken-docs dependencies 4
- Add spatialite arm64 linux path 4
- Implement sensible query pagination 3
- Command line tool for uploading one or more DBs to Now 3
- date, year, month and day querystring lookups 3
- Implement a better database index page 3
- Add more detailed API documentation to the README 3
- UI for editing named parameters 3
- Consider data-package as a format for metadata 3
- Option to open readonly but not immutable 3
- UI support for running FTS searches 3
- If view is filtered, search should apply within those filtered rows 3
- ?_search=x should work if used directly against a FTS virtual table 3
- Show extra instructions with the interrupted 3
- _group_count= feature improvements 3
- Datasette CSS should include content hash in the URL 3
- A primary key column that has foreign key restriction associated won't rendering label column 3
- Custom template for named canned query 3
- Ability to bundle metadata and templates inside the SQLite file 3
- Run pks_for_table in inspect, executing once at build time rather than constantly 3
- Don't duplicate simple primary keys in the link column 3
- Allow plugins to add new cli sub commands 3
- datasette publish --install=name-of-plugin 3
- label_column option in metadata.json 3
- External metadata.json 3
- Facets should not execute for ?shape=array|object 3
- "config" section in metadata.json (root, database and table level) 3
- Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0 3
- Support multiple filters of the same type 3
- ?_ttl= parameter to control caching 3
- Avoid plugins accidentally loading dependencies twice 3
- Per-database and per-table /-/ URL namespace 3
- Ability to configure SQLite cache_size 3
- datasette inspect takes a very long time on large dbs 3
- Ensure --help examples in docs are always up to date 3
- Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way 3
- render_cell(value) plugin hook 3
- Use pysqlite3 if available 3
- Update official datasetteproject/datasette Docker container to SQLite 3.26.0 3
- Ensure downloading a 100+MB SQLite database file works 3
- Use SQLITE_DBCONFIG_DEFENSIVE plus other recommendations from SQLite security docs 3
- Experiment: run Jinja in async mode 3
- .insert_all() should accept a generator and process it efficiently 3
- Utilities for adding indexes 3
- Refactor facets to a class and new plugin, refs #427 3
- Fix the "datasette now publish ... --alias=x" option 3
- Make it so Docker build doesn't delay PyPI release 3
- Option to ignore inserts if primary key exists already 3
- Test against Python 3.8-dev using Travis 3
- asgi_wrapper plugin hook 3
- Unable to use rank when fts-table generated with csvs-to-sqlite 3
- Mechanism for secrets in plugin configuration 3
- datasette publish option for setting plugin configuration secrets 3
- Potential improvements to facet-by-date 3
- Support unicode in url 3
- CodeMirror fails to load on database page 3
- .add_column() doesn't match indentation of initial creation 3
- Script uses a lot of RAM 3
- "Too many SQL variables" on large inserts 3
- Add triggers while enabling FTS 3
- "twitter-to-sqlite user-timeline" command for pulling tweets by a specific user 3
- Extract "source" into a separate lookup table 3
- Track and use the 'since' value 3
- since_id support for home-timeline 3
- --since support for various commands for refresh-by-cron 3
- _where= parameter is not persisted in hidden form fields 3
- /-/plugins shows incorrect name for plugins 3
- Static assets no longer loading for installed plugins 3
- Add this repos_starred view 3
- `import` command fails on empty files 3
- rowid is not included in dropdown filter menus 3
- Custom queries with 0 results should say "0 results" 3
- Don't suggest column for faceting if all values are 1 3
- Command for importing events 3
- Add a glossary to the documentation 3
- Template debug mode that outputs template context 3
- Copy and paste doesn't work reliably on iPhone for SQL editor 3
- Tests are failing due to missing FTS5 3
- Assets table with downloads 3
- upsert_all() throws issue when upserting to empty table 3
- order_by mechanism 3
- Escape_fts5_query-hookimplementation does not work with queries to standard tables 3
- Tutorial command no longer works 3
- prepare_connection() plugin hook should accept optional datasette argument 3
- Cashe-header missing in http-response 3
- Variables from extra_template_vars() not exposed in _context=1 3
- Search box CSS doesn't look great on OS X Safari 3
- Handle "User not found" error 3
- WIP implementation of writable canned queries 3
- Adding a "recreate" flag to the `Database` constructor 3
- --plugin-secret over-rides existing metadata.json plugin config 3
- Pull repository contributors 3
- Mechanism for forcing column-type, over-riding auto-detection 3
- Issue and milestone should have foreign key to repo 3
- Issue comments don't appear to populate issues foreign key 3
- Configuration directory mode 3
- Fall back to authentication via ENV 3
- Create index on issue_comments(user) and other foreign keys 3
- Mechanism for creating views if they don't yet exist 3
- Add notlike table filter 3
- Question: Any fixed date for the release with the uft8-encoding fix? 3
- Way of seeing full schema for a database 3
- Add PyPI project urls to setup.py 3
- Error pages not correctly loading CSS 3
- request.url and request.scheme should obey force_https_urls config setting 3
- CSRF protection for /-/messages tool and writable canned queries 3
- Documentation for new "params" setting for canned queries 3
- Ability to customize what happens when a view permission fails 3
- Documentation is inconsistent about "id" as required field on actor 3
- Document the ds_actor signed cookie 3
- Horizontal scrollbar on changelog page on mobile 3
- Script to generate larger SQLite test files 3
- Support for compound (composite) foreign keys 3
- "Logged in as: XXX - logout" navigation item 3
- Canned query page should show the name of the canned query 3
- Ability to remove a foreign key 3
- Some links don't honor base_url 3
- Add a table of contents to the README 3
- "allow": true for anyone, "allow": false for nobody 3
- Interactive debugging tool for "allow" blocks 3
- Ability to insert files piped to insert-files stdin 3
- Support tokenize option for FTS 3
- Refactor TableView class so things like datasette-graphql can reuse the logic 3
- "datasette install" and "datasette uninstall" commands 3
- db.execute_write_fn(create_tables, block=True) hangs a thread if connection fails 3
- Pass columns to extra CSS/JS/etc plugin hooks 3
- Code for finding SpatiaLite in the usual locations 3
- --load-extension=spatialite shortcut option 3
- insert_all(..., alter=True) should work for new columns introduced after the first 100 records 3
- Datasette plugin to provide custom page for running faceted, ranked searches 3
- Timeline view 3
- table.optimize() should delete junk rows from *_fts_docsize 3
- Documentation for 404.html, 500.html templates 3
- Add --tar option to "datasette publish heroku" 3
- request an "-o" option on "datasette server" to open the default browser at the running url 3
- Add docs for .transform(column_order=) 3
- Default table view JSON should include CREATE TABLE 3
- Better handling of multiple matching template wildcard paths 3
- Documentation covering buildpack deployment 3
- Datasette should default to running Uvicorn with workers=1 3
- from_json jinja2 filter 3
- Remove xfail tests when new httpx is released 3
- json / CSV links are broken in Datasette 0.50 3
- Add a "delete" icon next to filters (in addition to "remove filter") 3
- Fix issues relating to base_url 3
- Fallback to databases in inspect-data.json when no -i options are passed 3
- datasette.urls.static_plugins(...) method 3
- datasette.urls.table(..., format="json") argument 3
- Add horizontal scrollbar to tables 3
- .blob output renderer 3
- Refactor .csv to be an output renderer - and teach register_output_renderer to stream all rows 3
- .csv should link to .blob downloads 3
- Table actions menu plus plugin hook 3
- latest.datasette.io should include plugins from fixtures 3
- database_actions plugin hook 3
- 3.0 release with some minor breaking changes 3
- table.search() improvements plus sqlite-utils search command 3
- Foreign keys with blank titles result in non-clickable links 3
- OperationalError('interrupted') can 500 on row page 3
- Custom widgets for canned query forms 3
- Support linking to compound foreign keys 3
- --load-extension=spatialite not working with datasetteproject/datasette docker image 3
- github-to-sqlite workflows command 3
- "datasette inspect" outputs invalid JSON if an error is logged 3
- "_searchmode=raw" throws an index out of range error when combined with "_search_COLUMN" 3
- Prettier package not actually being cached 3
- Certain database names results in 404: "Database not found: None" 3
- Retire "Ecosystem" page in favour of datasette.io/plugins and /tools 3
- "Statement may not contain PRAGMA" error is not strictly true 3
- `datasette publish upload` mechanism for uploading databases to an existing Datasette instance 3
- ?_size= argument is not persisted by hidden form fields in the table filters 3
- Rename /:memory: to /_memory 3
- gzip support for HTML (and JSON) responses 3
- Re-submitting filter form duplicates _x querystring arguments 3
- Error reading csv files with large column data 3
- Hitting `_csv.Error: field larger than field limit (131072)` 3
- db["my_table"].drop(ignore=True) parameter, plus sqlite-utils drop-table --ignore and drop-view --ignore 3
- Suggest for ArrayFacet possibly confused by blank values 3
- Update Docker Spatialite version to 5.0.1 + add support for Spatialite topology functions 3
- Allow canned query params to specify default values 3
- Escaping FTS search strings 3
- Try implementing SQLite timeouts using .interrupt() instead of using .set_progress_handler() 3
- Handle byte order marks (BOMs) in CSV files 3
- Speed up tests with pytest-xdist 3
- Avoid error sorting by relationships if related tables are not allowed 3
- Columns named "link" display in bold 3
- Improve `path_with_replaced_args()` and friends and document them 3
- Supporting additional output formats, like GeoJSON 3
- Release Datasette 0.57 3
- Add some types, enforce with mypy 3
- DRAFT: A new plugin hook for dynamic metadata 3
- Official Datasette Docker image should use SQLite >= 3.31.0 (for generated columns) 3
- Mechanism for plugins to exclude certain paths from CSRF checks 3
- Use HN algolia endpoint to retrieve trees 3
- utils.parse_metadata() should be a documented internal function 3
- `table.convert(..., where=)` and `sqlite-utils convert ... --where=` 3
- Rename Datasette.__init__(config=) parameter to settings= 3
- Modify base.html template to support optional sticky footer 3
- Try blacken-docs 3
- Win32 "used by another process" error with datasette publish 3
- Datasette 1.0 JSON API (and documentation) 3
- Datasette 1.0 documented template context (maybe via API docs) 3
- "Links from other tables" broken for columns starting with underscore 3
- Research pattern for re-registering existing Click tools with register_commands 3
- A way of creating indexes on newly created tables 3
- Optional caching mechanism for table.lookup() 3
- Custom pages don't work on windows 3
- Redesign CSV export to improve usability 3
- `keep_blank_values=True` when parsing `request.args` 3
- Redesign `facet_results` JSON structure prior to Datasette 1.0 3
- Update janus requirement from <0.8,>=0.6.2 to >=0.6.2,<1.1 3
- Offer `python -m sqlite_utils` as an alternative to `sqlite-utils` 3
- `explain query plan select` is too strict about whitespace 3
- List `--fmt` options in the docs 3
- `sqlite-utils bulk` command 3
- Add a CLI reference page to the docs, inspired by sqlite-utils 3
- Tests failing against Python 3.6 3
- Link: rel="alternate" to JSON for queries too 3
- Support IF NOT EXISTS for table creation 3
- Update Dockerfile generated by `datasette publish` 3
- Refactor URL routing to enable testing 3
- Make route matched pattern groups more consistent 3
- Reconsider ensure_permissions() logic, can it be less confusing? 3
- Make "<Binary: 2427344 bytes>" easier to read 3
- `sqlite3.NotSupportedError`: deterministic=True requires SQLite 3.8.3 or higher 3
- Refactor `RowView` and remove `RowTableShared` 3
- ?_trace=1 doesn't work on Global Power Plants demo 3
- Remove python-baseconv dependency 3
- CLI eats my cursor 3
- `detect_fts()` identifies the wrong table if tables have names that are subsets of each other 3
- Combining `rows_where()` and `search()` to limit which rows are searched 3
- `sqlite_utils.utils.TypeTracker` should be a documented API 3
- Incorrect syntax highlighting in docs CLI reference 3
- Cross-link CLI to Python docs 3
- Research an upgrade to CodeMirror 6 3
- Remove upper bound dependencies as a default policy 3
- Featured table(s) on the homepage 3
- Ability to merge databases and tables 3
- Preserve query on timeout 3
- Switch to keyword-only arguments for a bunch of internal methods 3
- Support JSON values returned from .convert() functions 3
- docker image is duplicating db files somehow 3
- Private database page should show padlock on every table 3
- Flaky test: test_serve_localhost_http 3
- allow_signed_tokens setting for disabling API signed token mechanism 3
- datasette create-token CLI command 3
- Release 0.63 3
- Make `cursor.rowcount` accessible (wontfix) 3
- mypy failures in CI 3
- latest.datasette.io Cloud Run deploys failing 3
- Incorrect link from the API explorer to the JSON API documentation 3
- Upgrade for Sphinx 6.0 (once Furo has support for it) 3
- array facet: don't materialize unnecessary columns 3
- Test failure: FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError 3
- Initial proof of concept with ChatGPT 3
- Implement a SQL view to make it easier to query files in a nested folder 3
- sphinx.builders.linkcheck build error 3
- AttributeError: 'EntryPoints' object has no attribute 'get' for flake8 on Python 3.7 3
- Drop support for Python 3.7 3
- Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File 3
- register_command plugin hook 3
- `datasette install -e` option 3
- feat: Implement a prepare_connection plugin hook 3
- `prepare_connection()` plugin hook 3
- Plugin hook for adding new output formats 3
- Implement and document extras for the new query view page 3
- Implement canned queries against new query JSON work 3
- Turn DatabaseDownload into an async view function 3
- database color shows only on index page, not other pages 3
- …
user 1
- simonw · 8,883
id | html_url | issue_url ▼ | node_id | user | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
1461226402 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/1#issuecomment-1461226402 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1 | IC_kwDOJHON9s5XGIui | simonw 9599 | 2023-03-09T03:47:11Z | 2023-03-09T03:47:11Z | MEMBER | Me:
ChatGPT:

```zsh
#!/usr/bin/zsh
osascript -e 'tell application "Notes"
    repeat with eachNote in every note
        set noteTitle to the name of eachNote
        set noteBody to the body of eachNote
        set output to noteTitle & "\n" & noteBody & "\n"
        display dialog output
        log output
    end repeat
end tell'
```

Me:
ChatGPT:

```zsh
#!/bin/zsh
osascript -e 'tell application "Notes"
    repeat with eachNote in every note
        set noteTitle to the name of eachNote
        set noteBody to the body of eachNote
        set output to noteTitle & "\n" & noteBody & "\n"
        log output
    end repeat
end tell'
```

This really helps! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Initial proof of concept with ChatGPT 1616347574 | |
1461230197 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/1#issuecomment-1461230197 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1 | IC_kwDOJHON9s5XGJp1 | simonw 9599 | 2023-03-09T03:51:36Z | 2023-03-09T03:51:36Z | MEMBER | After a few more rounds I got to this script, which outputs them to a text file:

```zsh
#!/bin/zsh
osascript -e '
set notesFile to "/tmp/notes.txt"
set fileRef to open for access notesFile with write permission
tell application "Notes"
    repeat with eachNote in every note
        set noteId to the id of eachNote
        set noteTitle to the name of eachNote
        set noteBody to the body of eachNote
        write "------------------------" & "\n" to fileRef
        write noteId & "\n" to fileRef
        write noteTitle & "\n\n" to fileRef
        write noteBody & "\n" to fileRef
    end repeat
end tell
close access fileRef'
```

```python
cleaned_notes = [
    {
        "id": n.split("\n")[0],
        "title": n.split("\n")[1],
        "body": "\n".join(n.split("\n")[2:]).strip(),
    }
    for n in notes
]
db = sqlite_utils.Database("/tmp/notes.db")
db["notes"].insert_all(cleaned_notes)
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Initial proof of concept with ChatGPT 1616347574 | |
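The cleanup step in the comment above splits the dump on the `------------------------` separator lines and inserts the resulting dicts into SQLite. A minimal self-contained sketch of that parsing — the sample text here is made up to stand in for the real `/tmp/notes.txt`, and the standard library `sqlite3` stands in for `sqlite_utils`:

```python
import sqlite3

SEPARATOR = "------------------------"

# Fake dump in the same shape the AppleScript writes:
# separator line, note id, title, blank line, body
raw = """------------------------
x-coredata://123/ICNote/p1
Shopping list

milk
eggs
------------------------
x-coredata://123/ICNote/p2
Ideas

write a notes exporter
"""

# Split on separator lines, dropping the empty leading chunk
notes = [chunk.strip("\n") for chunk in raw.split(SEPARATOR + "\n") if chunk.strip()]

cleaned_notes = [
    {
        "id": n.split("\n")[0],
        "title": n.split("\n")[1],
        "body": "\n".join(n.split("\n")[2:]).strip(),
    }
    for n in notes
]

db = sqlite3.connect(":memory:")
db.execute("create table notes (id text primary key, title text, body text)")
db.executemany("insert into notes values (:id, :title, :body)", cleaned_notes)
rows = db.execute("select id, title from notes order by id").fetchall()
```

The blank line after each title comes from the `write noteTitle & "\n\n"` call in the script, which is why the body join needs a `.strip()`.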
1461230436 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/1#issuecomment-1461230436 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1 | IC_kwDOJHON9s5XGJtk | simonw 9599 | 2023-03-09T03:51:52Z | 2023-03-09T03:51:52Z | MEMBER | This did the job! Next step is to turn that into a Python script. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Initial proof of concept with ChatGPT 1616347574 | |
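Turning the script into Python mostly means shelling out to `osascript` and capturing stderr, which is where AppleScript `log` output goes. A minimal sketch under those assumptions — not the actual apple-notes-to-sqlite implementation; the `runner` parameter is only there so the subprocess call can be faked in a test:

```python
import subprocess

# AppleScript loop from the comments above; `log` writes to stderr
APPLESCRIPT = """tell application "Notes"
    repeat with eachNote in every note
        set noteTitle to the name of eachNote
        set noteBody to the body of eachNote
        log noteTitle & "\\n" & noteBody & "\\n"
    end repeat
end tell"""


def dump_notes(runner=subprocess.run):
    """Run the AppleScript via osascript and return its log output."""
    result = runner(
        ["osascript", "-e", APPLESCRIPT], capture_output=True, text=True
    )
    # osascript sends `log` statements to stderr, not stdout
    return result.stderr
```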
1462962682 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/11#issuecomment-1462962682 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11 | IC_kwDOJHON9s5XMwn6 | simonw 9599 | 2023-03-09T23:20:35Z | 2023-03-09T23:22:41Z | MEMBER | Here's a query that returns all notes in folder 1, including notes in descendant folders:

```
SQLite schema:

CREATE TABLE [folders] (
    [id] INTEGER PRIMARY KEY,
    [long_id] TEXT,
    [name] TEXT,
    [parent] INTEGER,
    FOREIGN KEY([parent]) REFERENCES folders
);

Write a recursive CTE that returns the following:

folder_id | descendant_folder_id

With a row for every nested child of every folder - so the top level folder has lots of rows

Convert all SQL keywords to lower case, and re-indent
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Implement a SQL view to make it easier to query files in a nested folder 1618130434 | |
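The prompt above asks for one `(folder_id, descendant_folder_id)` row per nested child. The recursive CTE the comments refer to isn't preserved here, but a sketch of what such a query can look like, run against a toy `folders` table (the folder names and hierarchy are made up for illustration):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript(
    """
create table folders (
    id integer primary key,
    long_id text,
    name text,
    parent integer,
    foreign key (parent) references folders
);
insert into folders (id, name, parent) values
    (1, 'Notes', null),
    (2, 'Work', 1),
    (3, 'Projects', 2),
    (4, 'Personal', 1);
"""
)

sql = """
with recursive nested_folders (folder_id, descendant_folder_id) as (
    -- base case: every direct parent/child pair
    select parent, id from folders where parent is not null
    union all
    -- recursive step: children of already-found descendants
    select nested_folders.folder_id, folders.id
    from folders
    join nested_folders
        on folders.parent = nested_folders.descendant_folder_id
)
select folder_id, descendant_folder_id
from nested_folders
order by folder_id, descendant_folder_id
"""
pairs = db.execute(sql).fetchall()
```

With this data, folder 1 gets three descendant rows (2, 3 and 4) and folder 2 gets one (3), so the top-level folder does indeed have the most rows.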
1462965256 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/11#issuecomment-1462965256 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11 | IC_kwDOJHON9s5XMxQI | simonw 9599 | 2023-03-09T23:22:12Z | 2023-03-09T23:22:12Z | MEMBER | Here's what the CTE from that looks like: |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Implement a SQL view to make it easier to query files in a nested folder 1618130434 | |
1462968053 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/11#issuecomment-1462968053 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11 | IC_kwDOJHON9s5XMx71 | simonw 9599 | 2023-03-09T23:24:01Z | 2023-03-09T23:24:01Z | MEMBER | I improved the readability by removing some unnecessary table aliases:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Implement a SQL view to make it easier to query files in a nested folder 1618130434 | |
1461232709 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461232709 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2 | IC_kwDOJHON9s5XGKRF | simonw 9599 | 2023-03-09T03:54:28Z | 2023-03-09T03:54:28Z | MEMBER | I think the AppleScript I want to pass to |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
First working version 1616354999 | |
1461234311 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461234311 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2 | IC_kwDOJHON9s5XGKqH | simonw 9599 | 2023-03-09T03:56:24Z | 2023-03-09T03:56:24Z | MEMBER | I opened the "Script Editor" app on my computer, used Window -> Library to open the Library panel, then clicked on the Notes app there. I got this: So the notes object has these properties:
I'm going to ignore the concept of attachments for the moment. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
First working version 1616354999 | |
1461234591 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461234591 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2 | IC_kwDOJHON9s5XGKuf | simonw 9599 | 2023-03-09T03:56:45Z | 2023-03-09T03:56:45Z | MEMBER | My prototype showed that images embedded in notes come out in the HTML export as base64 image URLs, which is neat. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
First working version 1616354999 | |
1461259490 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461259490 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2 | IC_kwDOJHON9s5XGQzi | simonw 9599 | 2023-03-09T04:24:27Z | 2023-03-09T04:24:27Z | MEMBER | Converting AppleScript date strings to ISO format is hard! https://forum.latenightsw.com/t/formatting-dates/841 has a recipe I'll try:
Not clear to me how timezones work here. I'm going to ignore them for the moment. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
First working version 1616354999 | |
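On the Python side, one way to handle that conversion: AppleScript's default date rendering looks something like `Thursday, March 9, 2023 at 4:24:27 AM` (that exact shape is an assumption — it varies with the machine's locale settings), and if the string arrives in that form, `strptime` can do the ISO conversion in one line:

```python
from datetime import datetime

# Assumed AppleScript date rendering for a US English locale
APPLESCRIPT_FORMAT = "%A, %B %d, %Y at %I:%M:%S %p"


def applescript_date_to_iso(value):
    # Ignores timezones entirely, as the comment above does
    return datetime.strptime(value, APPLESCRIPT_FORMAT).isoformat()


iso = applescript_date_to_iso("Thursday, March 9, 2023 at 4:24:27 AM")
```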
1461260978 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461260978 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2 | IC_kwDOJHON9s5XGRKy | simonw 9599 | 2023-03-09T04:27:18Z | 2023-03-09T04:27:18Z | MEMBER | Before that conversion:
After:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
First working version 1616354999 | |
1461262577 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461262577 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2 | IC_kwDOJHON9s5XGRjx | simonw 9599 | 2023-03-09T04:30:00Z | 2023-03-09T04:30:00Z | MEMBER | It doesn't have tests yet. I guess I'll need to mock |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
First working version 1616354999 | |
1461285545 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461285545 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2 | IC_kwDOJHON9s5XGXKp | simonw 9599 | 2023-03-09T05:06:24Z | 2023-03-09T05:06:24Z | MEMBER | OK, this works! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
First working version 1616354999 | |
1462554175 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/4#issuecomment-1462554175 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/4 | IC_kwDOJHON9s5XLM4_ | simonw 9599 | 2023-03-09T18:19:34Z | 2023-03-09T18:19:34Z | MEMBER | It looks like the iteration order is most-recently-modified-first - I tried editing a note a bit further back in my notes app and it was the first one output by |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support incremental updates 1616429236 | |
1462556829 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/4#issuecomment-1462556829 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/4 | IC_kwDOJHON9s5XLNid | simonw 9599 | 2023-03-09T18:20:56Z | 2023-03-09T18:20:56Z | MEMBER | In terms of the UI: I'm tempted to say that the default behaviour is for it to run until it sees a note that it already knows about AND that has matching update/created dates, and then stop. You can do a full import again ignoring that logic with |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support incremental updates 1616429236 | |
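That proposed default behaviour can be sketched as a generator. This is a hypothetical helper — `extract_notes` and the `known` lookup stand in for the real extraction code:

```python
def incremental_notes(extract_notes, known):
    """Yield notes until we hit one already imported with unchanged dates.

    known maps note id -> (created, updated) from the previous run.
    A full-import option (not shown) would simply skip the early exit.
    """
    for note in extract_notes():
        if known.get(note["id"]) == (note["created"], note["updated"]):
            break  # already imported and unchanged: stop here
        yield note

# Notes iterate most-recently-modified first:
notes = [
    {"id": "p2", "created": "2023-01-02", "updated": "2023-03-09"},
    {"id": "p1", "created": "2023-01-01", "updated": "2023-01-01"},
]
known = {"p1": ("2023-01-01", "2023-01-01")}
print([n["id"] for n in incremental_notes(lambda: notes, known)])
# -> ['p2']
```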
1462562735 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/7#issuecomment-1462562735 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7 | IC_kwDOJHON9s5XLO-v | simonw 9599 | 2023-03-09T18:23:56Z | 2023-03-09T18:25:22Z | MEMBER | From the Script Editor library docs: A note has a:
Here's what a folder looks like:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Folder support 1617769847 | |
1462564717 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/7#issuecomment-1462564717 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7 | IC_kwDOJHON9s5XLPdt | simonw 9599 | 2023-03-09T18:25:39Z | 2023-03-09T18:25:39Z | MEMBER | So it looks like folders can be hierarchical? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Folder support 1617769847 | |
1462570187 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/7#issuecomment-1462570187 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7 | IC_kwDOJHON9s5XLQzL | simonw 9599 | 2023-03-09T18:30:24Z | 2023-03-09T18:30:24Z | MEMBER | I used ChatGPT to write this:
Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p698
Folder Name: JSK
Folder Container: iCloud
Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p7995
Folder Name: Nested inside blog posts
Folder Container: Blog posts
Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p3526
Folder Name: New Folder
Folder Container: iCloud
Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p3839
Folder Name: New Folder 1
Folder Container: iCloud
Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p2
Folder Name: Notes
Folder Container: iCloud
Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p6059
Folder Name: Quick Notes
Folder Container: iCloud
Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p7283
Folder Name: UK Christmas 2022
Folder Container: iCloud
```
So I think the correct approach here is to run code at the start to list all of the folders (no need to do fancy recursion though, just a flat list with the parent containers is enough) and create a model of that hierarchy in SQLite. Then when I import notes I can foreign key reference them back to their containing folder. I'm tempted to use |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Folder support 1617769847 | |
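A minimal sketch of that flat folders model in SQLite — hypothetical table name and abbreviated ids, not the actual apple-notes-to-sqlite schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE folders (
        id TEXT PRIMARY KEY,   -- the x-coredata:// identifier
        name TEXT,
        parent TEXT REFERENCES folders(id)  -- NULL for top-level folders
    )
    """
)
conn.executemany(
    "INSERT INTO folders VALUES (?, ?, ?)",
    [
        ("p6113", "Blog posts", None),
        ("p7995", "Nested inside blog posts", "p6113"),
    ],
)
# Resolve each folder's parent name with a self-join:
rows = conn.execute(
    """
    SELECT child.name, parent.name
    FROM folders AS child
    LEFT JOIN folders AS parent ON child.parent = parent.id
    ORDER BY child.name
    """
).fetchall()
print(rows)  # -> [('Blog posts', None), ('Nested inside blog posts', 'Blog posts')]
```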
1462682795 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/7#issuecomment-1462682795 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7 | IC_kwDOJHON9s5XLsSr | simonw 9599 | 2023-03-09T19:52:20Z | 2023-03-09T19:52:44Z | MEMBER | Created through several rounds with ChatGPT (including hints like "rewrite that using setdefault()"):
```python
def topological_sort(nodes):
    children = {}
    for node in nodes:
        parent_id = node["parent"]
        if parent_id is not None:
            children.setdefault(parent_id, []).append(node)

    # The remainder of the function is truncated in this export; a
    # plausible completion walks the root folders depth-first:
    def iterate(node_list):
        for node in node_list:
            yield node
            yield from iterate(children.get(node["id"], []))

    return list(iterate([n for n in nodes if n["parent"] is None]))
``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Folder support 1617769847 | |
1462691466 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/7#issuecomment-1462691466 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7 | IC_kwDOJHON9s5XLuaK | simonw 9599 | 2023-03-09T19:59:52Z | 2023-03-09T19:59:52Z | MEMBER | Improved script:
ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p698
Name: JSK
Container:
ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p7995
Name: Nested inside blog posts
Container: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p6113
ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p3526
Name: New Folder
Container:
ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p3839
Name: New Folder 1
Container:
ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p2
Name: Notes
Container:
ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p6059
Name: Quick Notes
Container:
ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p7283
Name: UK Christmas 2022
Container:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Folder support 1617769847 | |
1462693867 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/7#issuecomment-1462693867 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7 | IC_kwDOJHON9s5XLu_r | simonw 9599 | 2023-03-09T20:01:39Z | 2023-03-09T20:02:11Z | MEMBER | My
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Folder support 1617769847 | |
686238498 | https://github.com/dogsheep/dogsheep-beta/issues/10#issuecomment-686238498 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/10 | MDEyOklzc3VlQ29tbWVudDY4NjIzODQ5OA== | simonw 9599 | 2020-09-03T04:05:05Z | 2020-09-03T04:05:05Z | MEMBER | Since the first two categories are |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Category 3: received 691557547 | |
686618669 | https://github.com/dogsheep/dogsheep-beta/issues/11#issuecomment-686618669 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/11 | MDEyOklzc3VlQ29tbWVudDY4NjYxODY2OQ== | simonw 9599 | 2020-09-03T16:47:34Z | 2020-09-03T16:53:25Z | MEMBER | I think a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Public / Private mechanism 692125110 | |
686774592 | https://github.com/dogsheep/dogsheep-beta/issues/13#issuecomment-686774592 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/13 | MDEyOklzc3VlQ29tbWVudDY4Njc3NDU5Mg== | simonw 9599 | 2020-09-03T21:30:21Z | 2020-09-03T21:30:21Z | MEMBER | This is partially supported: the custom search SQL we run doesn't escape them, but the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support advanced FTS queries 692386625 | |
695124698 | https://github.com/dogsheep/dogsheep-beta/issues/15#issuecomment-695124698 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/15 | MDEyOklzc3VlQ29tbWVudDY5NTEyNDY5OA== | simonw 9599 | 2020-09-18T23:17:38Z | 2020-09-18T23:17:38Z | MEMBER | This can be part of the demo instance in #6. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add a bunch of config examples 694136490 | |
694548909 | https://github.com/dogsheep/dogsheep-beta/issues/16#issuecomment-694548909 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/16 | MDEyOklzc3VlQ29tbWVudDY5NDU0ODkwOQ== | simonw 9599 | 2020-09-17T23:15:09Z | 2020-09-17T23:15:09Z | MEMBER | I have sort by date now, #21. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Timeline view 694493566 | |
695851036 | https://github.com/dogsheep/dogsheep-beta/issues/16#issuecomment-695851036 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/16 | MDEyOklzc3VlQ29tbWVudDY5NTg1MTAzNg== | simonw 9599 | 2020-09-20T23:34:57Z | 2020-09-20T23:34:57Z | MEMBER | Really basic starting point is to add facet by date. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Timeline view 694493566 | |
695877627 | https://github.com/dogsheep/dogsheep-beta/issues/16#issuecomment-695877627 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/16 | MDEyOklzc3VlQ29tbWVudDY5NTg3NzYyNw== | simonw 9599 | 2020-09-21T02:42:29Z | 2020-09-21T02:42:29Z | MEMBER | Fun twist: assuming |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Timeline view 694493566 | |
687880459 | https://github.com/dogsheep/dogsheep-beta/issues/17#issuecomment-687880459 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/17 | MDEyOklzc3VlQ29tbWVudDY4Nzg4MDQ1OQ== | simonw 9599 | 2020-09-06T19:36:32Z | 2020-09-06T19:36:32Z | MEMBER | At some point I may even want to support search types which are indexed from (and inflated from) more than one database file. I'm going to ignore that for the moment though. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Rename "table" to "type" 694500679 | |
689226390 | https://github.com/dogsheep/dogsheep-beta/issues/17#issuecomment-689226390 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/17 | MDEyOklzc3VlQ29tbWVudDY4OTIyNjM5MA== | simonw 9599 | 2020-09-09T00:36:07Z | 2020-09-09T00:36:07Z | MEMBER | Alternative names:
I think |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Rename "table" to "type" 694500679 | |
688622995 | https://github.com/dogsheep/dogsheep-beta/issues/18#issuecomment-688622995 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/18 | MDEyOklzc3VlQ29tbWVudDY4ODYyMjk5NQ== | simonw 9599 | 2020-09-08T05:15:21Z | 2020-09-08T05:15:21Z | MEMBER | Alternatively it could run as it does now but add a I'm not sure which would be more efficient. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Deleted records stay in the search index 695553522 | |
688623097 | https://github.com/dogsheep/dogsheep-beta/issues/18#issuecomment-688623097 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/18 | MDEyOklzc3VlQ29tbWVudDY4ODYyMzA5Nw== | simonw 9599 | 2020-09-08T05:15:51Z | 2020-09-08T05:15:51Z | MEMBER | I'm inclined to go with the first, simpler option. I have longer term plans for efficient incremental index updates based on clever trickery with triggers. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Deleted records stay in the search index 695553522 | |
688625430 | https://github.com/dogsheep/dogsheep-beta/issues/19#issuecomment-688625430 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/19 | MDEyOklzc3VlQ29tbWVudDY4ODYyNTQzMA== | simonw 9599 | 2020-09-08T05:24:50Z | 2020-09-08T05:24:50Z | MEMBER | I thought about allowing tables to define an incremental indexing SQL query - maybe something that can return just records touched in the past hour, or records since a recorded "last indexed record" value. The problem with this is deletes - if you delete a record, how does the indexer know to remove it? See #18 - that's already caused problems. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out incremental re-indexing 695556681 | |
688626037 | https://github.com/dogsheep/dogsheep-beta/issues/19#issuecomment-688626037 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/19 | MDEyOklzc3VlQ29tbWVudDY4ODYyNjAzNw== | simonw 9599 | 2020-09-08T05:27:07Z | 2020-09-08T05:27:07Z | MEMBER | A really clever way to do this would be with triggers. The indexer script would add triggers to each of the database tables that it is indexing - each in their own database. Those triggers would then maintain a This would add a small amount of overhead to insert/update/delete queries run against the table. My hunch is that the overhead would be minuscule, but I could still allow people to opt out for tables that are so high traffic that this would matter. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out incremental re-indexing 695556681 | |
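The trigger idea could look something like this — a hypothetical sketch (the `_index_queue` table and trigger names are invented for illustration), not the eventual dogsheep-beta implementation:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT);

    -- Hypothetical bookkeeping table the indexer would read and then clear:
    CREATE TABLE _index_queue (table_name TEXT, row_id INTEGER, op TEXT);

    CREATE TRIGGER notes_ai AFTER INSERT ON notes BEGIN
        INSERT INTO _index_queue VALUES ('notes', new.id, 'upsert');
    END;
    CREATE TRIGGER notes_au AFTER UPDATE ON notes BEGIN
        INSERT INTO _index_queue VALUES ('notes', new.id, 'upsert');
    END;
    CREATE TRIGGER notes_ad AFTER DELETE ON notes BEGIN
        INSERT INTO _index_queue VALUES ('notes', old.id, 'delete');
    END;
    """
)
conn.execute("INSERT INTO notes (body) VALUES ('hello')")
conn.execute("DELETE FROM notes WHERE id = 1")
print(conn.execute("SELECT * FROM _index_queue").fetchall())
# -> [('notes', 1, 'upsert'), ('notes', 1, 'delete')]
```

Because deletes land in the queue too, this would also address the stale-records problem from #18.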
685115519 | https://github.com/dogsheep/dogsheep-beta/issues/2#issuecomment-685115519 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/2 | MDEyOklzc3VlQ29tbWVudDY4NTExNTUxOQ== | simonw 9599 | 2020-09-01T20:31:57Z | 2020-09-01T20:31:57Z | MEMBER | Actually this doesn't work: you can't turn on stemming for specific tables, because all of the content goes into a single So stemming needs to be a global option. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Apply porter stemming 689809225 | |
685121074 | https://github.com/dogsheep/dogsheep-beta/issues/2#issuecomment-685121074 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/2 | MDEyOklzc3VlQ29tbWVudDY4NTEyMTA3NA== | simonw 9599 | 2020-09-01T20:42:00Z | 2020-09-01T20:42:00Z | MEMBER | Documentation at the bottom of the Usage section here: https://github.com/dogsheep/dogsheep-beta/blob/0.2/README.md#usage |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Apply porter stemming 689809225 | |
694551406 | https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694551406 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24 | MDEyOklzc3VlQ29tbWVudDY5NDU1MTQwNg== | simonw 9599 | 2020-09-17T23:22:07Z | 2020-09-17T23:22:07Z | MEMBER | Neat, I can debug this with the new
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
the JSON object must be str, bytes or bytearray, not 'Undefined' 703970814 | |
694551646 | https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694551646 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24 | MDEyOklzc3VlQ29tbWVudDY5NDU1MTY0Ng== | simonw 9599 | 2020-09-17T23:22:48Z | 2020-09-17T23:22:48Z | MEMBER | Looks like its happening in a Jinja fragment template for one of the results: ``` /Users/simon/Dropbox/Development/dogsheep-beta/dogsheep_beta/init.py(169)process_results() -> output = compiled.render({result, {"json": json}}) /Users/simon/.local/share/virtualenvs/dogsheep-beta-u_po4Rpj/lib/python3.8/site-packages/jinja2/asyncsupport.py(71)render() -> return original_render(self, args, kwargs) /Users/simon/.local/share/virtualenvs/dogsheep-beta-u_po4Rpj/lib/python3.8/site-packages/jinja2/environment.py(1090)render() -> self.environment.handle_exception() /Users/simon/.local/share/virtualenvs/dogsheep-beta-u_po4Rpj/lib/python3.8/site-packages/jinja2/environment.py(832)handle_exception() -> reraise(rewrite_traceback_stack(source=source)) /Users/simon/.local/share/virtualenvs/dogsheep-beta-u_po4Rpj/lib/python3.8/site-packages/jinja2/_compat.py(28)reraise() -> raise value.with_traceback(tb) <template>(5)top-level template code()
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
the JSON object must be str, bytes or bytearray, not 'Undefined' 703970814 | |
694552393 | https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694552393 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24 | MDEyOklzc3VlQ29tbWVudDY5NDU1MjM5Mw== | simonw 9599 | 2020-09-17T23:25:01Z | 2020-09-17T23:25:17Z | MEMBER | Ran |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
the JSON object must be str, bytes or bytearray, not 'Undefined' 703970814 | |
694552681 | https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694552681 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24 | MDEyOklzc3VlQ29tbWVudDY5NDU1MjY4MQ== | simonw 9599 | 2020-09-17T23:25:54Z | 2020-09-17T23:25:54Z | MEMBER | This is the template fragment it's rendering:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
the JSON object must be str, bytes or bytearray, not 'Undefined' 703970814 | |
694553579 | https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694553579 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24 | MDEyOklzc3VlQ29tbWVudDY5NDU1MzU3OQ== | simonw 9599 | 2020-09-17T23:28:37Z | 2020-09-17T23:28:37Z | MEMBER | More investigation in pdb: ``` (dogsheep-beta) dogsheep-beta % datasette . --get '/-/beta?q=pycon&sort=oldest' --pdb
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
the JSON object must be str, bytes or bytearray, not 'Undefined' 703970814 | |
694554584 | https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694554584 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24 | MDEyOklzc3VlQ29tbWVudDY5NDU1NDU4NA== | simonw 9599 | 2020-09-17T23:31:25Z | 2020-09-17T23:31:25Z | MEMBER | I'd prefer it if errors in these template fragments were displayed as errors inline where the fragment should have been inserted, rather than 500ing the whole page - especially since the template fragments are user-provided and could have all kinds of odd errors in them which should be as easy to debug as possible. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
the JSON object must be str, bytes or bytearray, not 'Undefined' 703970814 | |
694557425 | https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694557425 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24 | MDEyOklzc3VlQ29tbWVudDY5NDU1NzQyNQ== | simonw 9599 | 2020-09-17T23:41:01Z | 2020-09-17T23:41:01Z | MEMBER | I removed all of the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
the JSON object must be str, bytes or bytearray, not 'Undefined' 703970814 | |
695113871 | https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-695113871 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24 | MDEyOklzc3VlQ29tbWVudDY5NTExMzg3MQ== | simonw 9599 | 2020-09-18T22:30:17Z | 2020-09-18T22:30:17Z | MEMBER | I think I know what's going on here: https://github.com/dogsheep/dogsheep-beta/blob/0f1b951c5131d16f3c8559a8e4d79ed5c559e3cb/dogsheep_beta/__init__.py#L166-L171 This is a logic bug - the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
the JSON object must be str, bytes or bytearray, not 'Undefined' 703970814 | |
695108895 | https://github.com/dogsheep/dogsheep-beta/issues/25#issuecomment-695108895 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/25 | MDEyOklzc3VlQ29tbWVudDY5NTEwODg5NQ== | simonw 9599 | 2020-09-18T22:11:32Z | 2020-09-18T22:11:32Z | MEMBER | I'm going to make this a new plugin configuration setting, |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
template_debug mechanism 704685890 | |
695109140 | https://github.com/dogsheep/dogsheep-beta/issues/25#issuecomment-695109140 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/25 | MDEyOklzc3VlQ29tbWVudDY5NTEwOTE0MA== | simonw 9599 | 2020-09-18T22:12:20Z | 2020-09-18T22:12:20Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
template_debug mechanism 704685890 | ||
695855646 | https://github.com/dogsheep/dogsheep-beta/issues/26#issuecomment-695855646 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/26 | MDEyOklzc3VlQ29tbWVudDY5NTg1NTY0Ng== | simonw 9599 | 2020-09-21T00:16:11Z | 2020-09-21T00:16:11Z | MEMBER | Should I do this with offset/limit or should I do proper keyset pagination? I think keyset because then it will work well for the full search interface with no filters or search string. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Pagination 705215230 | |
695855723 | https://github.com/dogsheep/dogsheep-beta/issues/26#issuecomment-695855723 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/26 | MDEyOklzc3VlQ29tbWVudDY5NTg1NTcyMw== | simonw 9599 | 2020-09-21T00:16:52Z | 2020-09-21T00:17:53Z | MEMBER | It feels a bit weird to implement keyset pagination against results sorted by I may just ignore that though. If you want reliable pagination you can get it by sorting by date. Maybe it doesn't even make sense to offer pagination if you sort by relevance? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Pagination 705215230 | |
695856398 | https://github.com/dogsheep/dogsheep-beta/issues/26#issuecomment-695856398 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/26 | MDEyOklzc3VlQ29tbWVudDY5NTg1NjM5OA== | simonw 9599 | 2020-09-21T00:22:20Z | 2020-09-21T00:22:20Z | MEMBER | I'm going to try for keyset pagination sorted by relevance just as a learning exercise. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Pagination 705215230 | |
695856967 | https://github.com/dogsheep/dogsheep-beta/issues/26#issuecomment-695856967 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/26 | MDEyOklzc3VlQ29tbWVudDY5NTg1Njk2Nw== | simonw 9599 | 2020-09-21T00:26:59Z | 2020-09-21T00:26:59Z | MEMBER | It's a shame Datasette doesn't currently have an easy way to implement sorted-by-rank keyset-paginated using a TableView or QueryView. I'll have to do this using the custom SQL query constructed in the plugin: https://github.com/dogsheep/dogsheep-beta/blob/bed9df2b3ef68189e2e445427721a28f4e9b4887/dogsheep_beta/__init__.py#L8-L43 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Pagination 705215230 | |
695875274 | https://github.com/dogsheep/dogsheep-beta/issues/26#issuecomment-695875274 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/26 | MDEyOklzc3VlQ29tbWVudDY5NTg3NTI3NA== | simonw 9599 | 2020-09-21T02:28:58Z | 2020-09-21T02:28:58Z | MEMBER | Datasette's implementation is complex because it has to support compound primary keys: https://github.com/simonw/datasette/blob/a258339a935d8d29a95940ef1db01e98bb85ae63/datasette/utils/__init__.py#L88-L114 - but that's not something that's needed for dogsheep-beta. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Pagination 705215230 | |
695879237 | https://github.com/dogsheep/dogsheep-beta/issues/26#issuecomment-695879237 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/26 | MDEyOklzc3VlQ29tbWVudDY5NTg3OTIzNw== | simonw 9599 | 2020-09-21T02:53:29Z | 2020-09-21T02:53:29Z | MEMBER | If previous page ended at |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Pagination 705215230 | |
695879531 | https://github.com/dogsheep/dogsheep-beta/issues/26#issuecomment-695879531 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/26 | MDEyOklzc3VlQ29tbWVudDY5NTg3OTUzMQ== | simonw 9599 | 2020-09-21T02:55:28Z | 2020-09-21T02:55:54Z | MEMBER | Actually for the tie-breaker it should be something like https://latest.datasette.io/fixtures?sql=select+pk%2C+created%2C+planet_int%2C+on_earth%2C+state%2C+city_id%2C+neighborhood%2C+tags%2C+complex_array%2C+distinct_some_null+from+facetable+where+%28created+%3E+%3Ap1+or+%28created+%3D+%3Ap1+and+%28%28pk+%3E+%3Ap0%29%29%29%29+order+by+created%2C+pk+limit+11&p0=10&p1=2019-01-16+08%3A00%3A00
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Pagination 705215230 | |
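The compound tie-breaker in that linked fixtures query boils down to a small pattern, sketched here with invented sample data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE items (pk INTEGER PRIMARY KEY, created TEXT);
    INSERT INTO items VALUES
        (1, '2019-01-01'), (2, '2019-01-02'),
        (3, '2019-01-02'), (4, '2019-01-03');
    """
)
# Keyset pagination sorted by created, with pk as the tie-breaker:
# resume strictly after the row where created = :p1 and pk = :p0.
rows = conn.execute(
    """
    SELECT pk, created FROM items
    WHERE created > :p1 OR (created = :p1 AND pk > :p0)
    ORDER BY created, pk LIMIT 2
    """,
    {"p1": "2019-01-02", "p0": 2},
).fetchall()
print(rows)  # -> [(3, '2019-01-02'), (4, '2019-01-03')]
```

Without the tie-breaker clause, rows sharing the same `created` value as the last row of the previous page would be skipped or duplicated.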
711089647 | https://github.com/dogsheep/dogsheep-beta/issues/28#issuecomment-711089647 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/28 | MDEyOklzc3VlQ29tbWVudDcxMTA4OTY0Nw== | simonw 9599 | 2020-10-17T22:43:13Z | 2020-10-17T22:43:13Z | MEMBER | Since my personal Dogsheep uses Datasette authentication, I'm going to need to pass through cookies. https://github.com/simonw/datasette/issues/1020 will solve that in the future but for now I need to solve it explicitly. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Switch to using datasette.client 723861683 | |
712266834 | https://github.com/dogsheep/dogsheep-beta/issues/29#issuecomment-712266834 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/29 | MDEyOklzc3VlQ29tbWVudDcxMjI2NjgzNA== | simonw 9599 | 2020-10-19T16:01:23Z | 2020-10-19T16:01:23Z | MEMBER | Might just be a documented pattern for how to configure this in YAML templates. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add search highlighting snippets 724759588 | |
747029636 | https://github.com/dogsheep/dogsheep-beta/issues/29#issuecomment-747029636 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/29 | MDEyOklzc3VlQ29tbWVudDc0NzAyOTYzNg== | simonw 9599 | 2020-12-16T21:14:03Z | 2020-12-16T21:14:03Z | MEMBER | I think I can do this as a cunning trick in
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add search highlighting snippets 724759588 | |
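SQLite's FTS5 ships a built-in `snippet()` auxiliary function that could power highlighting; a minimal sketch, with an invented table and markers:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE VIRTUAL TABLE docs USING fts5(body);
    INSERT INTO docs VALUES ('I went to PyCon and gave a talk about Datasette');
    """
)
# snippet(table, column, before, after, ellipsis, max_tokens) returns the
# best-matching fragment with the search terms wrapped in the markers:
row = conn.execute(
    "SELECT snippet(docs, 0, '<b>', '</b>', '...', 8) FROM docs WHERE docs MATCH ?",
    ["pycon"],
).fetchone()
print(row[0])
```

The fragment comes back with `<b>PyCon</b>` highlighted, which is why the search term would need to reach the `display_sql` query.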
747030964 | https://github.com/dogsheep/dogsheep-beta/issues/29#issuecomment-747030964 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/29 | MDEyOklzc3VlQ29tbWVudDc0NzAzMDk2NA== | simonw 9599 | 2020-12-16T21:14:54Z | 2020-12-16T21:14:54Z | MEMBER | To do this I'll need the search term to be passed to the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add search highlighting snippets 724759588 | |
747031608 | https://github.com/dogsheep/dogsheep-beta/issues/29#issuecomment-747031608 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/29 | MDEyOklzc3VlQ29tbWVudDc0NzAzMTYwOA== | simonw 9599 | 2020-12-16T21:15:18Z | 2020-12-16T21:15:18Z | MEMBER | Should I pass any other details to the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add search highlighting snippets 724759588 | |
747034481 | https://github.com/dogsheep/dogsheep-beta/issues/29#issuecomment-747034481 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/29 | MDEyOklzc3VlQ29tbWVudDc0NzAzNDQ4MQ== | simonw 9599 | 2020-12-16T21:17:05Z | 2020-12-16T21:17:05Z | MEMBER | I'm just going to add |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add search highlighting snippets 724759588 | |
684250044 | https://github.com/dogsheep/dogsheep-beta/issues/3#issuecomment-684250044 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/3 | MDEyOklzc3VlQ29tbWVudDY4NDI1MDA0NA== | simonw 9599 | 2020-09-01T05:01:09Z | 2020-09-01T05:01:23Z | MEMBER | Maybe this starts out as a custom templated canned query. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette plugin to provide custom page for running faceted, ranked searches 689810340 | |
685961809 | https://github.com/dogsheep/dogsheep-beta/issues/3#issuecomment-685961809 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/3 | MDEyOklzc3VlQ29tbWVudDY4NTk2MTgwOQ== | simonw 9599 | 2020-09-02T19:54:24Z | 2020-09-02T19:54:24Z | MEMBER | This should implement search highlighting too, as seen on https://til.simonwillison.net/til/search?q=cloud |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette plugin to provide custom page for running faceted, ranked searches 689810340 | |
686689612 | https://github.com/dogsheep/dogsheep-beta/issues/3#issuecomment-686689612 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/3 | MDEyOklzc3VlQ29tbWVudDY4NjY4OTYxMg== | simonw 9599 | 2020-09-03T18:44:20Z | 2020-09-03T18:44:20Z | MEMBER | Facets are now displayed but selecting them doesn't work yet. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette plugin to provide custom page for running faceted, ranked searches 689810340 | |
748426501 | https://github.com/dogsheep/dogsheep-beta/issues/31#issuecomment-748426501 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/31 | MDEyOklzc3VlQ29tbWVudDc0ODQyNjUwMQ== | simonw 9599 | 2020-12-19T06:12:22Z | 2020-12-19T06:12:22Z | MEMBER | I deliberately added support for advanced FTS in https://github.com/dogsheep/dogsheep-beta/commit/cbb2491b85d7ff416d6d429b60109e6c2d6d50b9 for #13 but that's the cause of this bug. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Searching for "github-to-sqlite" throws an error 771316301 | |
748426581 | https://github.com/dogsheep/dogsheep-beta/issues/31#issuecomment-748426581 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/31 | MDEyOklzc3VlQ29tbWVudDc0ODQyNjU4MQ== | simonw 9599 | 2020-12-19T06:13:17Z | 2020-12-19T06:13:17Z | MEMBER | One fix for this could be to try running the raw query, but if it throws an error run it again with the query escaped. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Searching for "github-to-sqlite" throws an error 771316301 | |
748426663 | https://github.com/dogsheep/dogsheep-beta/issues/31#issuecomment-748426663 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/31 | MDEyOklzc3VlQ29tbWVudDc0ODQyNjY2Mw== | simonw 9599 | 2020-12-19T06:14:06Z | 2020-12-19T06:14:06Z | MEMBER | Looks like I already do that here: https://github.com/dogsheep/dogsheep-beta/blob/9ba4401017ac24ffa3bc1db38e0910ea49de7616/dogsheep_beta/__init__.py#L141-L146 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Searching for "github-to-sqlite" throws an error 771316301 | |
748426877 | https://github.com/dogsheep/dogsheep-beta/issues/31#issuecomment-748426877 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/31 | MDEyOklzc3VlQ29tbWVudDc0ODQyNjg3Nw== | simonw 9599 | 2020-12-19T06:16:11Z | 2020-12-19T06:16:11Z | MEMBER | Here's why:
But the error being raised here is:
I'm going to attempt the escaped version on every error. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Searching for "github-to-sqlite" throws an error 771316301 | |
684395444 | https://github.com/dogsheep/dogsheep-beta/issues/4#issuecomment-684395444 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/4 | MDEyOklzc3VlQ29tbWVudDY4NDM5NTQ0NA== | simonw 9599 | 2020-09-01T06:00:03Z | 2020-09-01T06:00:03Z | MEMBER | I ran |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Optimize the FTS table 689839399 | |
686689366 | https://github.com/dogsheep/dogsheep-beta/issues/5#issuecomment-686689366 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/5 | MDEyOklzc3VlQ29tbWVudDY4NjY4OTM2Ng== | simonw 9599 | 2020-09-03T18:43:50Z | 2020-09-03T18:43:50Z | MEMBER | No longer needed thanks to #9 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add a context column that's not searchable 689847361 | |
685895540 | https://github.com/dogsheep/dogsheep-beta/issues/7#issuecomment-685895540 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7 | MDEyOklzc3VlQ29tbWVudDY4NTg5NTU0MA== | simonw 9599 | 2020-09-02T17:46:44Z | 2020-09-02T17:46:44Z | MEMBER | Some open questions about this:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for differentiating between "by me" and "liked by me" 691265198 | |
685962280 | https://github.com/dogsheep/dogsheep-beta/issues/7#issuecomment-685962280 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7 | MDEyOklzc3VlQ29tbWVudDY4NTk2MjI4MA== | simonw 9599 | 2020-09-02T19:55:26Z | 2020-09-02T19:59:58Z | MEMBER | Relevant: https://charlesleifer.com/blog/a-tour-of-tagging-schemas-many-to-many-bitmaps-and-more/ SQLite supports bitwise operators Binary AND (&) and Binary OR (|) - I could try those. Not sure how they interact with indexes though. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for differentiating between "by me" and "liked by me" 691265198 | |
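The bitmask idea from the comment above can be sketched as follows. The `flags` column and the flag values are hypothetical, not the actual dogsheep-beta schema; and as the comment suspects, the bitwise expression in the WHERE clause cannot use an index on the column.

```python
import sqlite3

# Hypothetical single-integer bitmask column for per-item flags
BY_ME = 1
LIKED_BY_ME = 2

conn = sqlite3.connect(":memory:")
conn.execute("create table items (id integer primary key, flags integer)")
conn.executemany(
    "insert into items (flags) values (?)",
    [(BY_ME,), (LIKED_BY_ME,), (BY_ME | LIKED_BY_ME,)],
)
# Binary AND (&) in SQL picks out rows with a given flag set
by_me_ids = [r[0] for r in conn.execute(
    "select id from items where flags & ? != 0 order by id", [BY_ME]
)]
# by_me_ids is [1, 3]
```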
685965516 | https://github.com/dogsheep/dogsheep-beta/issues/7#issuecomment-685965516 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7 | MDEyOklzc3VlQ29tbWVudDY4NTk2NTUxNg== | simonw 9599 | 2020-09-02T20:01:54Z | 2020-09-02T20:01:54Z | MEMBER | Relevant post: https://sqlite.org/forum/forumpost/9f06fedaa5 - drh says:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for differentiating between "by me" and "liked by me" 691265198 | |
685966361 | https://github.com/dogsheep/dogsheep-beta/issues/7#issuecomment-685966361 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7 | MDEyOklzc3VlQ29tbWVudDY4NTk2NjM2MQ== | simonw 9599 | 2020-09-02T20:03:29Z | 2020-09-02T20:03:41Z | MEMBER | I'm going to implement the first version of this as an indexed integer. I'll think about a full tagging system separately. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for differentiating between "by me" and "liked by me" 691265198 | |
685966707 | https://github.com/dogsheep/dogsheep-beta/issues/7#issuecomment-685966707 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7 | MDEyOklzc3VlQ29tbWVudDY4NTk2NjcwNw== | simonw 9599 | 2020-09-02T20:04:08Z | 2020-09-02T20:04:08Z | MEMBER | I'll make |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for differentiating between "by me" and "liked by me" 691265198 | |
685970384 | https://github.com/dogsheep/dogsheep-beta/issues/7#issuecomment-685970384 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7 | MDEyOklzc3VlQ29tbWVudDY4NTk3MDM4NA== | simonw 9599 | 2020-09-02T20:11:41Z | 2020-09-02T20:11:59Z | MEMBER | Default categories:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for differentiating between "by me" and "liked by me" 691265198 | |
685960072 | https://github.com/dogsheep/dogsheep-beta/issues/8#issuecomment-685960072 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/8 | MDEyOklzc3VlQ29tbWVudDY4NTk2MDA3Mg== | simonw 9599 | 2020-09-02T19:50:47Z | 2020-09-02T19:50:47Z | MEMBER | This doesn't actually help, because the Datasette table view page doesn't then support adding the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Create a view for running faceted searches 691369691 | |
686153967 | https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686153967 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9 | MDEyOklzc3VlQ29tbWVudDY4NjE1Mzk2Nw== | simonw 9599 | 2020-09-03T00:17:16Z | 2020-09-03T00:17:55Z | MEMBER | Maybe I can take advantage of https://sqlite.org/np1queryprob.html here - I could define a SQL query for fetching the "display" version of each item, and include a Jinja template fragment in the configuration as well. Maybe something like this:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for defining custom display of results 691521965 | |
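The mechanism described above (a per-type SQL query to fetch display data, plus a Jinja template fragment in the configuration) might look something like this. This is a hypothetical YAML fragment: the `display_sql`/`display` keys and the `:key` parameter are assumptions based on the discussion, not confirmed configuration syntax.

```yaml
photos.db:
    photos:
        sql: |-
            select uuid as key, 'Photo' as title, date as timestamp from photos
        display_sql: |-
            select place, date from photos where uuid = :key
        display: |-
            <p>Taken in {{ display.place }} on {{ display.date }}</p>
```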
686154486 | https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686154486 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9 | MDEyOklzc3VlQ29tbWVudDY4NjE1NDQ4Ng== | simonw 9599 | 2020-09-03T00:18:54Z | 2020-09-03T00:18:54Z | MEMBER |
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for defining custom display of results 691521965 | |
686154627 | https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686154627 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9 | MDEyOklzc3VlQ29tbWVudDY4NjE1NDYyNw== | simonw 9599 | 2020-09-03T00:19:22Z | 2020-09-03T00:19:22Z | MEMBER | If this performs well enough (100 displayed items will be 100 extra |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for defining custom display of results 691521965 | |
686158454 | https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686158454 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9 | MDEyOklzc3VlQ29tbWVudDY4NjE1ODQ1NA== | simonw 9599 | 2020-09-03T00:32:42Z | 2020-09-03T00:32:42Z | MEMBER | If this turns out to be too inefficient I could add a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for defining custom display of results 691521965 | |
686163754 | https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686163754 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9 | MDEyOklzc3VlQ29tbWVudDY4NjE2Mzc1NA== | simonw 9599 | 2020-09-03T00:46:21Z | 2020-09-03T00:46:21Z | MEMBER | Challenge: the Let's say it can either be duplicated in the
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for defining custom display of results 691521965 | |
686688963 | https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686688963 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9 | MDEyOklzc3VlQ29tbWVudDY4NjY4ODk2Mw== | simonw 9599 | 2020-09-03T18:42:59Z | 2020-09-03T18:42:59Z | MEMBER | I'm pleased with how this works now. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for defining custom display of results 691521965 | |
686689122 | https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686689122 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9 | MDEyOklzc3VlQ29tbWVudDY4NjY4OTEyMg== | simonw 9599 | 2020-09-03T18:43:20Z | 2020-09-03T18:43:20Z | MEMBER | Needs documentation. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for defining custom display of results 691521965 | |
686767208 | https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686767208 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9 | MDEyOklzc3VlQ29tbWVudDY4Njc2NzIwOA== | simonw 9599 | 2020-09-03T21:12:14Z | 2020-09-03T21:12:14Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for defining custom display of results 691521965 | ||
623193947 | https://github.com/dogsheep/dogsheep-photos/issues/1#issuecomment-623193947 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/1 | MDEyOklzc3VlQ29tbWVudDYyMzE5Mzk0Nw== | simonw 9599 | 2020-05-03T22:36:17Z | 2020-05-03T22:36:17Z | MEMBER | I'm going to use osxphotos for this. Since I've already got code to upload photos and insert them into a table based on their |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Import photo metadata from Apple Photos into SQLite 602533300 | |
623195197 | https://github.com/dogsheep/dogsheep-photos/issues/1#issuecomment-623195197 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/1 | MDEyOklzc3VlQ29tbWVudDYyMzE5NTE5Nw== | simonw 9599 | 2020-05-03T22:44:33Z | 2020-05-03T22:44:33Z | MEMBER | Command will be this:
This will populate a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Import photo metadata from Apple Photos into SQLite 602533300 | |
623198653 | https://github.com/dogsheep/dogsheep-photos/issues/1#issuecomment-623198653 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/1 | MDEyOklzc3VlQ29tbWVudDYyMzE5ODY1Mw== | simonw 9599 | 2020-05-03T23:09:57Z | 2020-05-03T23:09:57Z | MEMBER | For locations: I'll add |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Import photo metadata from Apple Photos into SQLite 602533300 | |
623198986 | https://github.com/dogsheep/dogsheep-photos/issues/1#issuecomment-623198986 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/1 | MDEyOklzc3VlQ29tbWVudDYyMzE5ODk4Ng== | simonw 9599 | 2020-05-03T23:12:31Z | 2020-05-03T23:12:46Z | MEMBER | To get the taken date in UTC:
```
(Pdb) from datetime import timezone
(Pdb) photo.date.astimezone(timezone.utc).isoformat()
'2018-02-13T20:21:31.620000+00:00'
(Pdb) photo.date.astimezone(timezone.utc).isoformat().split(".")
['2018-02-13T20:21:31', '620000+00:00']
(Pdb) photo.date.astimezone(timezone.utc).isoformat().split(".")[0]
'2018-02-13T20:21:31'
(Pdb) photo.date.astimezone(timezone.utc).isoformat().split(".")[0] + "+00:00"
'2018-02-13T20:21:31+00:00'
``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Import photo metadata from Apple Photos into SQLite 602533300 | |
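The pdb session above boils down to a small helper. Here is a runnable version; the `to_utc_isoformat` name is mine, and note the `.split(".")` trick assumes the datetime carries a microseconds component.

```python
from datetime import datetime, timezone, timedelta

def to_utc_isoformat(dt):
    # Convert an aware datetime to UTC and drop the fractional seconds,
    # keeping the explicit +00:00 offset
    return dt.astimezone(timezone.utc).isoformat().split(".")[0] + "+00:00"

# Example: a photo taken at 12:21:31 in UTC-8
pst = timezone(timedelta(hours=-8))
taken = datetime(2018, 2, 13, 12, 21, 31, 620000, tzinfo=pst)
utc_string = to_utc_isoformat(taken)  # '2018-02-13T20:21:31+00:00'
```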
623199214 | https://github.com/dogsheep/dogsheep-photos/issues/1#issuecomment-623199214 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/1 | MDEyOklzc3VlQ29tbWVudDYyMzE5OTIxNA== | simonw 9599 | 2020-05-03T23:14:08Z | 2020-05-03T23:14:08Z | MEMBER | Albums have UUIDs:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Import photo metadata from Apple Photos into SQLite 602533300 | |
623199701 | https://github.com/dogsheep/dogsheep-photos/issues/1#issuecomment-623199701 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/1 | MDEyOklzc3VlQ29tbWVudDYyMzE5OTcwMQ== | simonw 9599 | 2020-05-03T23:17:38Z | 2020-05-03T23:17:38Z | MEMBER | Record burst_uuid as a column:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Import photo metadata from Apple Photos into SQLite 602533300 | |
623199750 | https://github.com/dogsheep/dogsheep-photos/issues/1#issuecomment-623199750 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/1 | MDEyOklzc3VlQ29tbWVudDYyMzE5OTc1MA== | simonw 9599 | 2020-05-03T23:17:58Z | 2020-05-03T23:17:58Z | MEMBER | Reading this source code is really useful for figuring out how to store a photo in a DB table: https://github.com/RhetTbull/osxphotos/blob/7444b6d173918a3ad2a07aefce5ecf054786c787/osxphotos/photoinfo.py |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Import photo metadata from Apple Photos into SQLite 602533300 | |
623232984 | https://github.com/dogsheep/dogsheep-photos/issues/1#issuecomment-623232984 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/1 | MDEyOklzc3VlQ29tbWVudDYyMzIzMjk4NA== | simonw 9599 | 2020-05-04T02:41:32Z | 2020-05-04T02:41:32Z | MEMBER | Needs documentation. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Import photo metadata from Apple Photos into SQLite 602533300 | |
618796564 | https://github.com/dogsheep/dogsheep-photos/issues/12#issuecomment-618796564 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/12 | MDEyOklzc3VlQ29tbWVudDYxODc5NjU2NA== | simonw 9599 | 2020-04-24T04:35:25Z | 2020-04-24T04:35:25Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
If less than 500MB, show size in MB not GB 606033104 | ||
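The behaviour named in the issue title ("If less than 500MB, show size in MB not GB") could be implemented with a helper along these lines. This is a sketch: `format_size` is a hypothetical name, not the project's actual code.

```python
def format_size(num_bytes):
    # Show MB below 500 MB, otherwise GB
    mb = num_bytes / (1024 * 1024)
    if mb < 500:
        return "{:.1f} MB".format(mb)
    return "{:.1f} GB".format(mb / 1024)

format_size(100 * 1024 * 1024)  # '100.0 MB'
format_size(2 * 1024 ** 3)      # '2.0 GB'
```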
620273692 | https://github.com/dogsheep/dogsheep-photos/issues/13#issuecomment-620273692 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/13 | MDEyOklzc3VlQ29tbWVudDYyMDI3MzY5Mg== | simonw 9599 | 2020-04-27T22:42:50Z | 2020-04-27T22:42:50Z | MEMBER | ```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Also upload movie files 607888367 | |
620309185 | https://github.com/dogsheep/dogsheep-photos/issues/13#issuecomment-620309185 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/13 | MDEyOklzc3VlQ29tbWVudDYyMDMwOTE4NQ== | simonw 9599 | 2020-04-28T00:39:45Z | 2020-04-28T00:39:45Z | MEMBER | I'm going to leave this until I have the mechanism for associating a live photo video with the photo. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Also upload movie files 607888367 | |
620769348 | https://github.com/dogsheep/dogsheep-photos/issues/14#issuecomment-620769348 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/14 | MDEyOklzc3VlQ29tbWVudDYyMDc2OTM0OA== | simonw 9599 | 2020-04-28T18:09:21Z | 2020-04-28T18:09:21Z | MEMBER | Pricing is pretty good: free for first 1,000 calls per month, then $1.50 per thousand after that. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Annotate photos using the Google Cloud Vision API 608512747 | |
620771067 | https://github.com/dogsheep/dogsheep-photos/issues/14#issuecomment-620771067 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/14 | MDEyOklzc3VlQ29tbWVudDYyMDc3MTA2Nw== | simonw 9599 | 2020-04-28T18:12:34Z | 2020-04-28T18:15:38Z | MEMBER | Python library docs: https://googleapis.dev/python/vision/latest/index.html I'm creating a new project for this called simonwillison-photos: https://console.cloud.google.com/projectcreate https://console.cloud.google.com/home/dashboard?project=simonwillison-photos Then I enabled the Vision API. The direct link to https://console.cloud.google.com/flows/enableapi?apiid=vision-json.googleapis.com which they provided in the docs didn't work - it gave me a "You don't have sufficient permissions to use the requested API" error - but starting at the "Enable APIs" page and searching for it worked fine. I created a new service account as an "owner" of that project: https://console.cloud.google.com/apis/credentials/serviceaccountkey (and complained about it on Twitter and through their feedback form)
```python
from google.cloud import vision

client = vision.ImageAnnotatorClient.from_service_account_file(
    "simonwillison-photos-18c570b301fe.json"
)

# Photo of a lemur
response = client.annotate_image(
    {
        "image": {
            "source": {
                "image_uri": "https://photos.simonwillison.net/i/1b3414ee9ade67ce04ade9042e6d4b433d1e523c9a16af17f490e2c0a619755b.jpeg"
            }
        },
        "features": [
            {"type": vision.enums.Feature.Type.IMAGE_PROPERTIES},
            {"type": vision.enums.Feature.Type.OBJECT_LOCALIZATION},
            {"type": vision.enums.Feature.Type.LABEL_DETECTION},
        ],
    }
)
response
``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Annotate photos using the Google Cloud Vision API 608512747 | |
620771698 | https://github.com/dogsheep/dogsheep-photos/issues/14#issuecomment-620771698 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/14 | MDEyOklzc3VlQ29tbWVudDYyMDc3MTY5OA== | simonw 9599 | 2020-04-28T18:13:48Z | 2020-04-28T18:13:48Z | MEMBER | For face detection:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Annotate photos using the Google Cloud Vision API 608512747 | |
620772190 | https://github.com/dogsheep/dogsheep-photos/issues/14#issuecomment-620772190 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/14 | MDEyOklzc3VlQ29tbWVudDYyMDc3MjE5MA== | simonw 9599 | 2020-04-28T18:14:43Z | 2020-04-28T18:14:43Z | MEMBER | Database schema for this will require some thought. Just dumping the output into a JSON column isn't going to be flexible enough - I want to be able to FTS against labels and OCR text, and potentially query against other characteristics too. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Annotate photos using the Google Cloud Vision API 608512747 | |
620774507 | https://github.com/dogsheep/dogsheep-photos/issues/14#issuecomment-620774507 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/14 | MDEyOklzc3VlQ29tbWVudDYyMDc3NDUwNw== | simonw 9599 | 2020-04-28T18:19:06Z | 2020-04-28T18:19:06Z | MEMBER | The default timeout is a bit aggressive and sometimes failed for me if my resizing proxy took too long to fetch and resize the image.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Annotate photos using the Google Cloud Vision API 608512747 | |
623723026 | https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-623723026 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15 | MDEyOklzc3VlQ29tbWVudDYyMzcyMzAyNg== | simonw 9599 | 2020-05-04T21:41:30Z | 2020-05-04T21:41:30Z | MEMBER | I'm going to put these in a table called
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Expose scores from ZCOMPUTEDASSETATTRIBUTES 612151767 |
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);