issue_comments
8,883 rows where user = 9599, sorted by author_association
issue facet (more than 1,000 distinct values); top values with comment counts:
- Show column metadata plus links for foreign keys on arbitrary query results 51
- Redesign default .json format 50
- ?_extra= support (draft) 48
- Rethink how .ext formats (vs. ?_format=) work before 1.0 47
- Updated Dockerfile with SpatiaLite version 5.0 45
- Complete refactor of TableView and table.html template 45
- Port Datasette to ASGI 38
- Authentication (and permissions) as a core concept 38
- JavaScript plugin hooks mechanism similar to pluggy 38
- invoke_startup() is not run in some conditions, e.g. gunicorn/uvicorn workers, breaking lots of things 35
- await datasette.client.get(path) mechanism for executing internal requests 33
- Maintain an in-memory SQLite table of connected databases and their tables 31
- Deploy a live instance of demos/apache-proxy 31
- Server hang on parallel execution of queries to named in-memory databases 30
- Ability to sort (and paginate) by column 29
- Research: demonstrate if parallel SQL queries are worthwhile 29
- Default API token authentication mechanism 29
- Port as many tests as possible to async def tests against ds_client 28
- Add ?_extra= mechanism for requesting extra properties in JSON 27
- Export to CSV 27
- Optimize all those calls to index_list and foreign_key_list 27
- Ability for a canned query to write to the database 26
- table.transform() method for advanced alter table 26
- Upgrade to CodeMirror 6, add SQL autocomplete 26
- Proof of concept for Datasette on AWS Lambda with EFS 25
- New pattern for views that return either JSON or HTML, available for plugins 25
- DeprecationWarning: pkg_resources is deprecated as an API 25
- Support cross-database joins 24
- Redesign register_output_renderer callback 24
- "datasette insert" command and plugin hook 23
- API explorer tool 23
- Option for importing CSV data using the SQLite .import mechanism 22
- UI to create reduced scope tokens from the `/-/create-token` page 22
- Datasette Plugins 21
- table.extract(...) method and "sqlite-utils extract" command 21
- ?sort=colname~numeric to sort by column cast to real 21
- Use YAML examples in documentation by default, not JSON 21
- Idea: import CSV to memory, run SQL, export in a single command 21
- base_url is omitted in JSON and CSV views 21
- Switch documentation theme to Furo 21
- If a row has a primary key of `null` various things break 21
- "flash messages" mechanism 20
- Move CI to GitHub Issues 20
- load_template hook doesn't work for include/extends 20
- Mechanism for storing metadata in _metadata tables 20
- Introduce concept of a database `route`, separate from its name 20
- CSV files with too many values in a row cause errors 20
- register_permissions(datasette) plugin hook 20
- API tokens with view-table but not view-database/view-instance cannot access the table 20
- Better way of representing binary data in .csv output 19
- Introspect if table is FTS4 or FTS5 19
- A proper favicon 19
- Make it easier to insert geometries, with documentation and maybe code 19
- `datasette create-token` ability to create tokens with a reduced set of permissions 19
- Ability to ship alpha and beta releases 18
- Magic parameters for canned queries 18
- Figure out why SpatiaLite 5.0 hangs the database page on Linux 18
- Update screenshots in documentation to match latest designs 18
- datasette.client internal requests mechanism 17
- Publish to Docker Hub failing with "libcrypt.so.1: cannot open shared object file" 17
- API to insert a single record into an existing table 17
- Facets 16
- ?_col= and ?_nocol= support for toggling columns on table view 16
- Support "allow" block on root, databases and tables, not just queries 16
- Action menu for table columns 16
- Consider using CSP to protect against future XSS 16
- `--batch-size 1` doesn't seem to commit for every item 16
- Intermittent "Too many open files" error running tests 16
- Update a single record in an existing table 16
- Resolve the difference between `wrap_view()` and `BaseView` 16
- Package as standalone binary 15
- Bug: Sort by column with NULL in next_page URL 15
- Mechanism for customizing the SQL used to select specific columns in the table view 15
- The ".upsert()" method is misnamed 15
- --dirs option for scanning directories for SQLite databases 15
- Document (and reconsider design of) Database.execute() and Database.execute_against_connection_in_thread() 15
- latest.datasette.io is no longer updating 15
- link_or_copy_directory() error - Invalid cross-device link 15
- "sqlite-utils convert" command to replace the separate "sqlite-transform" tool 15
- Tests reliably failing on Python 3.7 15
- Autocomplete text entry for filter values that correspond to facets 15
- De-tangling Metadata before Datasette 1.0 15
- Documentation with recommendations on running Datasette in production without using Docker 14
- .execute_write() and .execute_write_fn() methods on Database 14
- Upload all my photos to a secure S3 bucket 14
- Canned query permissions mechanism 14
- "datasette -p 0 --root" gives the wrong URL 14
- Make it possible to download BLOB data from the Datasette UI 14
- Plugin hook for loading templates 14
- --lines and --text and --convert and --import 14
- Documentation should clarify /stable/ vs /latest/ 14
- "permissions" property in metadata for configuring arbitrary permissions 14
- Design plugin hook for extras 14
- `handle_exception` plugin hook for custom error handling 14
- Refactor out the keyset pagination code 14
- Ability to customize presentation of specific columns in HTML view 13
- Allow plugins to define additional URL routes and views 13
- Handle spatialite geometry columns better 13
- Fix all the places that currently use .inspect() data 13
- Plugin hook: filters_from_request 13
- If you apply ?_facet_array=tags then &_facet=tags does nothing 13
- Mechanism for skipping CSRF checks on API posts 13
- Support column descriptions in metadata.json 13
- table.transform() method 13
- Policy on documenting "public" datasette.utils functions 13
- WIP: Add Gmail takeout mbox import 13
- sqlite-utils extract could handle nested objects 13
- `register_commands()` plugin hook to register extra CLI commands 13
- Support STRICT tables 13
- Refactor TableView to use asyncinject 13
- Write API in Datasette core 13
- Make sure CORS works for write APIs 13
- Potential feature: special support for `?a=1&a=2` on the query page 13
- Add “updated” to metadata 12
- Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1 12
- Package datasette for installation using homebrew 12
- _facet_array should work against views 12
- Port Datasette from Sanic to ASGI + Uvicorn 12
- Stream all results for arbitrary SQL and canned queries 12
- "Invalid SQL" page should let you edit the SQL 12
- Import machine-learning detected labels (dog, llama etc) from Apple Photos 12
- Having view-table permission but NOT view-database should still grant access to /db/table 12
- Efficiently calculate list of databases/tables a user can view 12
- Support creating descending order indexes 12
- Serve using UNIX domain socket 12
- Rethink approach to [ and ] in column names (currently throws error) 12
- Fix compatibility with Python 3.10 12
- Research: CTEs and union all to calculate facets AND query at the same time 12
- Traces should include SQL executed by subtasks created with `asyncio.gather` 12
- Ensure "pip install datasette" still works with Python 3.6 12
- Tilde encoding: use ~ instead of - for dash-encoding 12
- Code examples in the documentation should be formatted with Black 12
- Implement ?_extra and new API design for TableView 12
- Mechanism for ensuring a table has all the columns 12
- API for bulk inserting records into a table 12
- `/db/-/create` API for creating tables 12
- Errors when using table filters behind a proxy 12
- WIP new JSON for queries 12
- Make detailed notes on how table, query and row views work right now 12
- .transform() instead of modifying sqlite_master for add_foreign_keys 12
- Implement command-line tool interface 11
- Dockerfile should build more recent SQLite with FTS5 and spatialite support 11
- Option to expose expanded foreign keys in JSON/CSV 11
- Get Datasette tests passing on Windows in GitHub Actions 11
- Mechanism for adding arbitrary pages like /about 11
- Prototype for Datasette on PostgreSQL 11
- Mechanism for checking if a SQLite database file is safe to open 11
- Expand plugins documentation to multiple pages 11
- Mechanism for plugins to add action menu items for various things 11
- --since feature can be confused by retweets 11
- Datasette secret mechanism - initially for signed cookies 11
- Writable canned queries live demo on Glitch 11
- POST to /db/canned-query that returns JSON should be supported (for API clients) 11
- datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates 11
- Writable canned queries with magic parameters fail if POST body is empty 11
- .json and .csv exports fail to apply base_url 11
- Database class mechanism for cross-connection in-memory databases 11
- Race condition errors in new refresh_schemas() mechanism 11
- Plugin hook for dynamic metadata 11
- "Query parameters" form shows wrong input fields if query contains "03:31" style times 11
- sqlite-utils index-foreign-keys fails due to pre-existing index 11
- `sqlite-utils insert --convert` option 11
- Research how much of a difference analyze / sqlite_stat1 makes 11
- Optional Pandas integration 11
- Research: how much overhead does the n=1 time limit have? 11
- Document how to use a `--convert` function that runs initialization code first 11
- Misleading progress bar against utf-16-le CSV input 11
- google cloudrun updated their limits on maxscale based on memory and cpu count 11
- sqlite-utils query --functions mechanism for registering extra functions 11
- Expose convert recipes to `sqlite-utils --functions` 11
- `prepare_jinja2_environment()` hook should take `datasette` argument 11
- Ensure insert API has good tests for rowid and compound primary key tables 11
- New JSON design for query views 11
- Set up some example datasets on a Cloudflare-backed domain 10
- Filter UI on table page 10
- Table view should support filtering via many-to-many relationships 10
- base_url configuration setting 10
- New design for facet abstraction, including querystring and metadata.json 10
- Improvements to table label detection 10
- Syntactic sugar for creating m2m records 10
- Mechanism for turning nested JSON into foreign keys / many-to-many 10
- extracts= should support multiple-column extracts 10
- Documented internals API for use in plugins 10
- --cp option for datasette publish and datasette package for shipping additional files and directories 10
- Mechanism for writing to database via a queue 10
- base_url doesn't entirely work for running Datasette inside Binder 10
- See if I can get Datasette working on Zeit Now v2 10
- Release Datasette 0.44 10
- Rename master branch to main 10
- Plugin hook for instance/database/table metadata 10
- Refactor default views to use register_routes 10
- CLI utility for inserting binary files into SQLite 10
- FTS table with 7 rows has _fts_docsize table with 9,141 rows 10
- Navigation menu plus plugin hook 10
- register_output_renderer() should support streaming data 10
- Ability for plugins to collaborate when adding extra HTML to blocks in default templates 10
- Async support 10
- Test Datasette Docker images built for different architectures 10
- `default_allow_sql` setting (a re-imagining of the old `allow_sql` setting) 10
- render_cell() hook should support returning an awaitable 10
- Docker configuration for exercising Datasette behind Apache mod_proxy 10
- Python library methods for calling ANALYZE 10
- Remove Hashed URL mode 10
- Options for how `r.parsedate()` should handle invalid dates 10
- If user can see table but NOT database/instance nav links should not display 10
- test_recreate failing on Windows Python 3.11 10
- `.json` errors should be returned as JSON 10
- Failing test: httpx.InvalidURL: URL too long 10
- Config file with support for defining canned queries 9
- Default to opening files in mutable mode, special option for immutable files 9
- Option to display binary data 9
- Refactor TableView.data() method 9
- Set up a live demo Datasette instance 9
- Move hashed URL mode out to a plugin 9
- Ability to serve thumbnailed Apple Photo from its place on disk 9
- New WIP writable canned queries 9
- Example permissions plugin 9
- Research feasibility of 100% test coverage 9
- canned_queries() plugin hook 9
- Improve performance of extract operations 9
- Figure out how to run an environment that exercises the base_url proxy setting 9
- Switch to .blob render extension for BLOB downloads 9
- sqlite-utils search command 9
- Datasette on Amazon Linux on ARM returns 404 for static assets 9
- Better internal database_name for _internal database 9
- Mechanism for minifying JavaScript that ships with Datasette 9
- Adopt Prettier for JavaScript code formatting 9
- Use _counts to speed up counts 9
- Use force_https_urls on when deploying with Cloud Run 9
- --no-headers option for CSV and TSV 9
- CSV ?_stream=on redundantly calculates facets for every page 9
- Research: syntactic sugar for using --get with SQL queries, maybe "datasette query" 9
- Add reference page to documentation using Sphinx autodoc 9
- create-index should run analyze after creating index 9
- Table+query JSON and CSV links broken when using `base_url` setting 9
- Advanced class-based `conversions=` mechanism 9
- Writable canned queries fail with useless non-error against immutable databases 9
- Get Datasette compatible with Pyodide 9
- Add --ignore option to more commands 9
- Ability to set a custom favicon 9
- Ability to load JSON records held in a file with a single top level key that is a list of objects 9
- SQL query field can't begin by a comment 9
- Tool for simulating permission checks against actors 9
- Release Datasette 1.0a0 9
- Refactor test suite to use mostly `async def` tests 9
- Use sqlean if available in environment 9
- Get `add_foreign_keys()` to work without modifying `sqlite_master` 9
- `datasette publish` needs support for the new config/metadata split 9
- Make URLs immutable 8
- datasette publish heroku 8
- Mechanism for ranking results from SQLite full-text search 8
- URL hashing now optional: turn on with --config hash_urls:1 (#418) 8
- sqlite-utils create-table command 8
- Enforce import sort order with isort 8
- Add a universal navigation bar which can be modified by plugins 8
- Command to fetch stargazers for one or more repos 8
- Commits in GitHub API can have null author 8
- extra_template_vars() sending wrong view_name for index 8
- Import photo metadata from Apple Photos into SQLite 8
- Visually distinguish integer and text columns 8
- Allow-list pragma_table_info(tablename) and similar 8
- Rename project to dogsheep-photos 8
- Consolidate request.raw_args and request.args 8
- Database page loads too slowly with many large tables (due to table counts) 8
- base_url doesn't seem to work when adding criteria and clicking "apply" 8
- Upgrade CodeMirror 8
- Mechanism for defining custom display of results 8
- the JSON object must be str, bytes or bytearray, not 'Undefined' 8
- OPTIONS requests return a 500 error 8
- GENERATED column support 8
- Establish pattern for release branches to support bug fixes 8
- Mechanism for executing JavaScript unit tests 8
- Make original path available to render hooks 8
- --sniff option for sniffing delimiters 8
- Extract columns cannot create foreign key relation: sqlite3.OperationalError: table sqlite_master may not be modified 8
- Ability to increase size of the SQL editor window 8
- "invalid reference format" publishing Docker image 8
- Tests failing with FileNotFoundError in runner.isolated_filesystem 8
- Show count of facet values if ?_facet_size=max 8
- Test against pysqlite3 running SQLite 3.37 8
- Documented JavaScript variables on different templates made available for plugins 8
- Add new spatialite helper methods 8
- Get rid of the no-longer necessary ?_format=json hack for tables called x.json 8
- Refactor and simplify Datasette routing and views 8
- Filters fail to work correctly against calculated numeric columns returned by SQL views because type affinity rules do not apply 8
- Table/database that is private due to inherited permissions does not show padlock 8
- Serve schema JSON to the SQL editor to enable autocomplete 8
- Some plugins show "home" breadcrumbs twice in the top left 8
- `table.upsert_all` fails to write rows when `not_null` is present 8
- Datasette should serve Access-Control-Max-Age 8
- Deploy failing with "plugins/alternative_route.py: Not a directory" 8
- ?_group_count=country - return counts by specific column(s) 7
- Ability to bundle and serve additional static files 7
- Metadata should be a nested arbitrary KV store 7
- Keyset pagination doesn't work correctly for compound primary keys 7
- Support for units 7
- prepare_context() plugin hook 7
- Improve and document foreign_keys=... argument to insert/create/etc 7
- Datasette Library 7
- Utility mechanism for plugins to render templates 7
- Syntax for ?_through= that works as a form field 7
- ?_searchmode=raw option for running FTS searches without escaping characters 7
- datasette publish cloudrun --memory option 7
- Update SQLite bundled with Docker container 7
- index.html is not reliably loaded from a plugin 7
- .columns_dict doesn't work for all possible column types 7
- Option to automatically configure based on directory layout 7
- Replace "datasette publish --extra-options" with "--setting" 7
- sqlite3.OperationalError: too many SQL variables in insert_all when using rows with varying numbers of columns 7
- Group permission checks by request on /-/permissions debug page 7
- Demo is failing to deploy 7
- Docker container is no longer being pushed (it's stuck on 0.45) 7
- Push to Docker Hub failed - but it shouldn't run for alpha releases anyway 7
- Simplify imports of common classes 7
- SQLITE_MAX_VARS maybe hard-coded too low 7
- Commands for making authenticated API calls 7
- Pagination 7
- Support the dbstat table 7
- Much, much faster extract() implementation 7
- Documented HTML hooks for JavaScript plugin authors 7
- Wide tables should scroll horizontally within the page 7
- Fix last remaining links to "/" that do not respect base_url 7
- Bring date parsing into Datasette core 7
- Documentation and unit tests for urls.row() urls.row_blob() methods 7
- "View all" option for facets, to provide a (paginated) list of ALL of the facet counts plus a link to view them 7
- GitHub Actions workflow to build and sign macOS binary executables 7
- --crossdb option for joining across databases 7
- Custom pages don't work with base_url setting 7
- table.pks_and_rows_where() method returning primary keys along with the rows 7
- Latest Datasette tags missing from Docker Hub 7
- "More" link for facets that shows _facet_size=max results 7
- ?_nocol= does not interact well with default facets 7
- sqlite-utils memory command for directly querying CSV/JSON data 7
- sqlite-utils memory should handle TSV and JSON in addition to CSV 7
- Introspection property for telling if a table is a rowid table 7
- absolute_url() behind a proxy assembles incorrect http://127.0.0.1:8001/ URLs 7
- Manage /robots.txt in Datasette core, block robots by default 7
- Query page .csv and .json links are not correctly URL-encoded on Vercel under unknown specific conditions 7
- [Enhancement] Please allow 'insert-files' to insert content as text. 7
- Extra options to `lookup()` which get passed to `insert()` 7
- Columns starting with an underscore behave poorly in filters 7
- Allow passing a file of code to "sqlite-utils convert" 7
- Test failure in test_rebuild_fts 7
- Allow to set `facets_array` in metadata (like current `facets`) 7
- `.execute_write(... block=True)` should be the default behaviour 7
- Link to stable docs from older versions 7
- Add SpatiaLite helpers to CLI 7
- Support for generated columns 7
- I forgot to include the changelog in the 3.25.1 release 7
- Remove hashed URL mode 7
- Extract out `check_permissions()` from `BaseView` 7
- "Error: near "(": syntax error" when using sqlite-utils indexes CLI 7
- `--nolock` feature for opening locked databases 7
- Add new entrypoint option to `--load-extension` 7
- Upgrade Datasette Docker to Python 3.11 7
- Figure out design for JSON errors (consider RFC 7807) 7
- /db/table/-/upsert API 7
- datasette package --spatialite throws error during build 7
- Hacker News Datasette write demo 7
- First working version 7
- 500 "attempt to write a readonly database" error caused by "PRAGMA schema_version" 7
- table.create(..., replace=True) 7
- [feature request]`datasette install plugins.json` options 7
- CLI equivalents to `transform(add_foreign_keys=)` 7
- Cascade for restricted token view-table/view-database/view-instance operations 7
- Addressable pages for every row in a table 6
- Default HTML/CSS needs to look reasonable and be responsive 6
- Support Django-style filters in querystring arguments 6
- Detect foreign keys and use them to link HTML pages together 6
- Nasty bug: last column not being correctly displayed 6
- Load plugins from a `--plugins-dir=plugins/` directory 6
- Ability for plugins to define extra JavaScript and CSS 6
- inspect() should detect many-to-many relationships 6
- Build Dockerfile with recent Sqlite + Spatialite 6
- inspect should record column types 6
- Deploy demo of Datasette on every commit that passes tests 6
- Plugin hook for loading metadata.json 6
- Faceted browse against a JSON list of tags 6
- ?_where=sql-fragment parameter for table views 6
- "datasette publish cloudrun" command to publish to Google Cloud Run 6
- Additional Column Constraints? 6
- Easier way of creating custom row templates 6
- Experiment with type hints 6
- Command for running a search and saving tweets for that search 6
- bump uvicorn to 0.9.0 to be Python-3.8 friendly 6
- Improve UI of "datasette publish cloudrun" to reduce chances of accidentally over-writing a service 6
- Mechanism for indicating foreign key relationships in the table and query page URLs 6
- allow leading comments in SQL input field 6
- Problem with square bracket in CSV column name 6
- "Templates considered" comment broken in >=0.35 6
- Documentation for the "request" object 6
- Support YAML in metadata - metadata.yaml 6
- Only set .last_rowid and .last_pk for single update/inserts, not for .insert_all()/.upsert_all() with multiple records 6
- Command for retrieving dependents for a repo 6
- Support decimal.Decimal type 6
- bpylist.archiver.CircularReference: archive has a cycle with uid(13) 6
- allow_by_query setting for configuring permissions with a SQL statement 6
- python tests/fixtures.py command has a bug 6
- Mechanism for specifying allow_sql permission in metadata.json 6
- Way to enable a default=False permission for anonymous users 6
- Ability to set ds_actor cookie such that it expires 6
- startup() plugin hook 6
- Incorrect URLs when served behind a proxy with base_url set 6
- "Too many open files" error running tests 6
- datasette.add_message() doesn't work inside plugins 6
- Consider dropping explicit CSRF protection entirely? 6
- Support reverse pagination (previous page, has-previous-items) 6
- Datasette sdist is missing templates (hence broken when installing from Homebrew) 6
- End-user documentation 6
- extra_ plugin hooks should take the same arguments 6
- Private/secret databases: database files that are only visible to plugins 6
- Mechanism for differentiating between "by me" and "liked by me" 6
- Rendering glitch with column headings on mobile 6
- Redesign application homepage 6
- Change "--config foo:bar" to "--setting foo bar" 6
- Add Link: pagination HTTP headers 6
- Figure out how to display images from <en-media> tags inline in Datasette 6
- "Edit SQL" button on canned queries 6
- Method for datasette.client() to forward on authentication 6
- export.xml file name varies with different language settings 6
- Better display of binary data on arbitrary query results page 6
- Table actions menu on view pages, not on query pages 6
- PrefixedUrlString mechanism broke everything 6
- Support order by relevance against FTS4 6
- sqlite-utils analyze-tables command and table.analyze_column() method 6
- Invalid SQL: "no such table: pragma_database_list" on database page 6
- Add support for Jinja2 version 3.0 6
- `sqlite-utils indexes` command 6
- `db.query()` method (renamed `db.execute_returning_dicts()`) 6
- "searchmode": "raw" in table metadata 6
- `table.search(..., quote=True)` parameter and `sqlite-utils search --quote` option 6
- sqlite-utils insert errors should show SQL and parameters, if possible 6
- Mechanism to cause specific branches to deploy their own demos 6
- ReadTheDocs build failed for 0.59.2 release 6
- New pattern for async view classes 6
- Idea: hover to reveal details of linked row 6
- Release Datasette 0.60 6
- Drop support for Python 3.6 6
- Support mutating row in `--convert` without returning it 6
- Maybe let plugins define custom serve options? 6
- datasette one.db one.db opens database twice, as one and one_2 6
- Use dash encoding for table names and row primary keys in URLs 6
- Ship Datasette 0.61 6
- .db downloads should be served with an ETag 6
- Upgrade `--load-extension` to accept entrypoints like Datasette 6
- Ability to set a custom facet_size per table 6
- truncate_cells_html does not work for links? 6
- progressbar for inserts/upserts of all fileformats, closes #485 6
- Expose `sql` and `params` arguments to various plugin hooks 6
- Interactive demo of Datasette 1.0 write APIs 6
- /db/table/-/upsert 6
- `datasette.create_token(...)` method for creating signed API tokens 6
- datasette --root running in Docker doesn't reliably show the magic URL 6
- `publish cloudrun` reuses image tags, which can lead to very surprising deploy problems 6
- Folder support 6
- Try out Trogon for a tui interface 6
- Make as many examples in the CLI docs as possible copy-and-pastable 6
- Table renaming: db.rename_table() and sqlite-utils rename-table 6
- Plugin system 6
- Bump sphinx, furo, blacken-docs dependencies 6
- Consider a request/response wrapping hook slightly higher level than asgi_wrapper() 6
- `table.transform()` should preserve `rowid` values 6
- Plugin hook: `actors_from_ids()` 6
- "Test DATASETTE_LOAD_PLUGINS" test shows errors but did not fail the CI run 6
- Detailed upgrade instructions for metadata.yaml -> datasette.yaml 6
- Experiment with patterns for concurrent long running queries 5
- Create neat example database 5
- Redesign JSON output, ditch jsono, offer variants controlled by parameter instead 5
- Datasette serve should accept paths/URLs to CSVs and other file formats 5
- add "format sql" button to query page, uses sql-formatter 5
- Refactor views 5
- Validate metadata.json on startup 5
- Ability to enable/disable specific features via --config 5
- Custom URL routing with independent tests 5
- Travis should push tagged images to Docker Hub for each release 5
- Get Datasette working with Zeit Now v2's 100MB image size limit 5
- CSV export in "Advanced export" pane doesn't respect query 5
- Hashed URLs should be optional 5
- Define mechanism for plugins to return structured data 5
- Plugin for allowing CORS from specified hosts 5
- Design changes to homepage to support mutable files 5
- Rename metadata.json to config.json 5
- Full text search of all tables at once? 5
- Populate "endpoint" key in ASGI scope 5
- extra_template_vars plugin hook 5
- Rethink progress bars for various commands 5
- [enhancement] Method to delete a row in python 5
- Testing utilities should be available to plugins 5
- Handle really wide tables better 5
- If you have databases called foo.db and foo-bar.db you cannot visit /foo-bar 5
- stargazers command, refs #4 5
- Add this view for seeing new releases 5
- Provide a cookiecutter template for creating new plugins 5
- on_create mechanism for after table creation 5
- Datasette.render_template() method 5
- Rethink how sanity checks work 5
- Release automation: automate the bit that posts the GitHub release 5
- table.disable_fts() method and "sqlite-utils disable-fts ..." command 5
- twitter-to-sqlite user-timeline [screen_names] --sql / --attach 5
- Option in metadata.json to set default sort order for a table 5
- Feature: record history of follower counts 5
- Custom CSS class on body for styling canned queries 5
- Repos have a big blob of JSON in the organization column 5
- Annotate photos using the Google Cloud Vision API 5
- Question: Access to immutable database-path 5
- Create a public demo 5
- Unit test that checks that all plugin hooks have corresponding unit tests 5
- Ability to sign in to Datasette as a root account 5
- CSRF protection 5
- Add insert --truncate option 5
- Consider using enable_callback_tracebacks(True) 5
- Fix the demo - it breaks because of the tags table change 5
- Feature: pull request reviews and comments 5
- Mechanism for passing additional options to `datasette my.db` that affect plugins 5
- Features for enabling and disabling WAL mode 5
- Add homebrew installation to documentation 5
- Path parameters for custom pages 5
- insert_all(..., alter=True) should work for new columns introduced after the first 100 records 5
- .delete_where() does not auto-commit (unlike .insert() or .upsert()) 5
- Progress bar for sqlite-utils insert 5
- Better handling of encodings other than utf-8 for "sqlite-utils insert" 5
- How should datasette.client interact with base_url 5
- Add documentation on serving Datasette behind a proxy using base_url 5
- .extract() shouldn't extract null values 5
- Add search highlighting snippets 5
- Default menu links should check a real permission 5
- load_template() plugin hook 5
- Rethink how table.search() method works 5
- Foreign key links break for compound foreign keys 5
- Rename datasette.config() method to datasette.setting() 5
- Show pysqlite3 version on /-/versions 5
- "Stream all rows" is not at all obvious 5
- More flexible CORS support in core, to encourage good security practices 5
- Release notes for Datasette 0.54 5
- Research using CTEs for faster facet counts 5
- Upgrade to Python 3.9.4 5
- ?_facet_size=X to increase number of facets results on the page 5
- `table.xindexes` using `PRAGMA index_xinfo(table)` 5
- Error: Use either --since or --since_id, not both 5
- .transform(types=) turns rowid into a concrete column 5
- Stop using generated columns in fixtures.db 5
- `datasette publish cloudrun --cpu X` option 5
- Ability to search for text across all columns in a table 5
- Upgrade to httpx 0.20.0 (request() got an unexpected keyword argument 'allow_redirects') 5
- Way to test SQLite 3.37 (and potentially other versions) in CI 5
- Command for creating an empty database 5
- Support for CHECK constraints 5
- filters_from_request plugin hook, now used in TableView 5
- Scripted exports 5
- Improvements to help make Datasette a better tool for learning SQL 5
- Reconsider policy on blocking queries containing the string "pragma" 5
- Test failures with SQLite 3.37.0+ due to column affinity case 5
- Implement redirects from old % encoding to new dash encoding 5
- Adopt a code of conduct 5
- Display autodoc type information more legibly 5
- Research running SQL in table view in parallel using `asyncio.gather()` 5
- Support `rows_where()`, `delete_where()` etc for attached alias databases 5
- CSV `extras_key=` and `ignore_extras=` equivalents for CLI tool 5
- Upgrade to 3.10.6-slim-bullseye Docker base image 5
- 500 error in github-to-sqlite demo 5
- Link from documentation to source code 5
- Move "datasette --get" from Getting Started to CLI Reference 5
- db[table].create(..., transform=True) and create-table --transform 5
- NoneType' object has no attribute 'actor' 5
- Create a new table from one or more records, `sqlite-utils` style 5
- Design URLs for the write API 5
- Make it easier to fix URL proxy problems 5
- upsert of new row with check constraints fails 5
- ignore:true/replace:true options for /db/-/create API 5
- register_permissions() plugin hook 5
- More useful error message if enable_load_extension is not available 5
- codespell test failure 5
- Plan for getting the new JSON format query views working 5
- Build HTML version of /content?sql=... 5
- Add writable canned query demo to latest.datasette.io 5
- Datasette --get --actor option 5
- DATASETTE_LOAD_PLUGINS environment variable for loading specific plugins 5
- Don't show foreign key links to tables the user cannot access 5
- Protect against malicious SQL that causes damage even though our DB is immutable 4
- Homepage UI for editing metadata file 4
- Switch to ujson 4
- Pick a name 4
- Ship a Docker image of the whole thing 4
- datasette publish hyper 4
- Support for title/source/license metadata 4
- Enforce pagination (or at least limits) for arbitrary custom SQL 4
- ?_json=foo&_json=bar query string argument 4
- datasette publish can fail if /tmp is on a different device 4
- Figure out how to bundle a more up-to-date SQLite 4
- Ability to apply sort on mobile in portrait mode 4
- metadata.json support for plugin configuration options 4
- datasette publish lambda plugin 4
- Explore "distinct values for column" in inspect() 4
- Add links to example Datasette instances to appropiate places in docs 4
- Mechanism for automatically picking up changes when on-disk .db file changes 4
- Support table names ending with .json or .csv 4
- Wildcard support in query parameters 4
- Limit text display in cells containing large amounts of text 4
- Datasette on Zeit Now returns http URLs for facet and next links 4
- Requesting support for query description 4
- Ability to display facet counts for many-to-many relationships 4
- add_column() should support REFERENCES {other_table}({other_column}) 4
- Figure out what to do about table counts in a mutable world 4
- Tracing support for seeing what SQL queries were executed 4
- Paginate + search for databases/tables on the homepage 4
- Replace most of `.inspect()` (and `datasette inspect`) with table counting 4
- Decide what to do about /-/inspect 4
- Option to facet by date using month or year 4
- Allow .insert(..., foreign_keys=()) to auto-detect table and primary key 4
- Facets not correctly persisted in hidden form fields 4
- Support opening multiple databases with the same stem 4
- Decide what goes into Datasette 1.0 4
- Get tests running on Windows using Travis CI 4
- Ability to list views, and to access db["view_name"].rows / rows_where / etc 4
- More advanced connection pooling 4
- Option to fetch only checkins more recent than the current max checkin 4
- --sql and --attach options for feeding commands from SQL queries 4
- Use better pagination (and implement progress bar) 4
- Command to import home-timeline 4
- retweets-of-me command 4
- Failed to import workout points 4
- Datasette should work with Python 3.8 (and drop compatibility with Python 3.5) 4
- Mechanism for register_output_renderer to suggest extension or not 4
- Remove .detect_column_types() from table, make it a documented API 4
- Add documentation on Database introspection methods to internals.rst 4
- Custom pages mechanism, refs #648 4
- escape_fts() does not correctly escape * wildcards 4
- Directory configuration mode should support metadata.yaml 4
- Cloud Run fails to serve database files larger than 32MB 4
- Ability to set custom default _size on a per-table basis 4
- Expose scores from ZCOMPUTEDASSETATTRIBUTES 4
- add_foreign_key(...., ignore=True) 4
- register_output_renderer can_render mechanism 4
- Publish secrets 4
- Example authentication plugin 4
- /-/metadata and so on should respect view-instance permission 4
- Log out mechanism for clearing ds_actor cookie 4
- Take advantage of .coverage being a SQLite database 4
- Use white-space: pre-wrap on ALL table cell contents 4
- github-to-sqlite tags command for fetching tags 4
- Output binary columns in "sqlite-utils query" JSON 4
- Security issue: read-only canned queries leak CSRF token in URL 4
- sqlite-utils insert: options for column types 4
- 'datasette --get' option, refs #926 4
- Test failures caused by failed attempts to mock pip 4
- --load-extension option for sqlite-utils query 4
- Try out CodeMirror SQL hints 4
- Idea: conversions= could take Python functions 4
- sqlite-utils transform sub-command 4
- sqlite-utils transform/insert --detect-types 4
- column name links broken in 0.50.1 4
- extra_js_urls and extra_css_urls should respect base_url setting 4
- Table/database action menu cut off if too short 4
- changes to allow for compound foreign keys 4
- Rebrand and redirect config.rst as settings.rst 4
- Support for generated columns 4
- sqlite-utils analyze-tables command 4
- Searching for "github-to-sqlite" throws an error 4
- reset_counts() method and command 4
- view_name = "query" for the query page 4
- Support SSL/TLS directly 4
- --port option should validate port is between 0 and 65535 4
- 500 error caused by faceting if a column called `n` exists 4
- Share button for copying current URL 4
- Refresh SpatiaLite documentation 4
- Add Docker multi-arch support with Buildx 4
- Can't use apt-get in Dockerfile when using datasetteproj/datasette as base 4
- Figure out how to publish alpha/beta releases to Docker Hub 4
- Intermittent CI failure: restore_working_directory FileNotFoundError 4
- row.update() or row.pk 4
- db.schema property and sqlite-utils schema command 4
- Automatic type detection for CSV data 4
- Big performance boost on faceting: skip the inner order by 4
- Command for fetching Hacker News threads from the search API 4
- Ability to default to hiding the SQL for a canned query 4
- Document exceptions that can be raised by db.execute() and friends 4
- Add reference documentation generated from docstrings 4
- Ability to insert file contents as text, in addition to blob 4
- xml.etree.ElementTree.ParseError: not well-formed (invalid token) 4
- sqlite-utils memory can't deal with multiple files with the same name 4
- ?_sort=rowid with _next= returns error 4
- `table.lookup()` option to populate additional columns when creating a record 4
- Improve Apache proxy documentation, link to demo 4
- Provide function to generate hash_id from specified columns 4
- Add `Link: rel="alternate"` header pointing to JSON for a table/query 4
- Maybe return JSON from HTML pages if `Accept: application/json` is sent 4
- `sqlite-utils insert --extract colname` 4
- Writable canned queries fail to load custom templates 4
- Allow users to pass a full convert() function definition 4
- Confirm if documented nginx proxy config works for row pages with escaped characters in their primary key 4
- Better error message if `--convert` code fails to return a dict 4
- `--fmt` should imply `-t` 4
- Add documentation page with the output of `--help` 4
- Release notes for 0.60 4
- Add KNN and data_licenses to hidden tables list 4
- Move canned queries closer to the SQL input area 4
- `sqlite-utils bulk --batch-size` option 4
- Add SpatiaLite helpers to CLI 4
- `deterministic=True` fails on versions of SQLite prior to 3.8.3 4
- Sensible `cache-control` headers for static assets, including those served by plugins 4
- Automated test for Pyodide compatibility 4
- minor a11y: <select> has no visual indicator when tabbed to 4
- 500 error if sorted by a column not in the ?_col= list 4
- i18n support 4
- Adjust height of textarea for no JS case 4
- Parts of YAML file do not work when db name is "off" 4
- Database() constructor currently defaults is_mutable to False 4
- fails before generating views. ERR: table sqlite_master may not be modified 4
- `sqlite-utils transform` should set empty strings to null when converting text columns to integer/float 4
- Turn --flatten into a documented utility function 4
- Tests failing due to updated tabulate library 4
- `max_signed_tokens_ttl` setting for a maximum duration on API tokens 4
- Delete a single record from an existing table 4
- API to drop a table 4
- 1.0a0 release notes 4
- Extract logic for resolving a URL to a database / table / row 4
- `publish heroku` failing due to old Python version 4
- Docs for replace:true and ignore:true options for insert API 4
- installpython3.com is now a spam website 4
- Reconsider pattern where plugins could break existing template context 4
- `Table.convert()` skips falsey values 4
- Custom SQL queries should use new JSON ?_extra= format 4
- feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 4
- GitHub Action to lint Python code with ruff 4
- Datasette cannot be installed with Rye 4
- `--raw-lines` option, like `--raw` for multiple lines 4
- Implement new /content.json?sql=... 4
- Query view shouldn't return `columns` 4
- Plugin hook for database queries that are run 4
- form label { width: 15% } is a bad default 4
- datasette -s/--setting option for setting nested configuration options 4
- Bump sphinx, furo, blacken-docs dependencies 4
- Add spatialite arm64 linux path 4
- Implement sensible query pagination 3
- Command line tool for uploading one or more DBs to Now 3
- date, year, month and day querystring lookups 3
- Implement a better database index page 3
- Add more detailed API documentation to the README 3
- UI for editing named parameters 3
- Consider data-package as a format for metadata 3
- Option to open readonly but not immutable 3
- UI support for running FTS searches 3
- If view is filtered, search should apply within those filtered rows 3
- ?_search=x should work if used directly against a FTS virtual table 3
- Show extra instructions with the interrupted 3
- _group_count= feature improvements 3
- Datasette CSS should include content hash in the URL 3
- A primary key column that has foreign key restriction associated won't rendering label column 3
- Custom template for named canned query 3
- Ability to bundle metadata and templates inside the SQLite file 3
- Run pks_for_table in inspect, executing once at build time rather than constantly 3
- Don't duplicate simple primary keys in the link column 3
- Allow plugins to add new cli sub commands 3
- datasette publish --install=name-of-plugin 3
- label_column option in metadata.json 3
- External metadata.json 3
- Facets should not execute for ?shape=array|object 3
- "config" section in metadata.json (root, database and table level) 3
- Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0 3
- Support multiple filters of the same type 3
- ?_ttl= parameter to control caching 3
- Avoid plugins accidentally loading dependencies twice 3
- Per-database and per-table /-/ URL namespace 3
- Ability to configure SQLite cache_size 3
- datasette inspect takes a very long time on large dbs 3
- Ensure --help examples in docs are always up to date 3
- Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way 3
- render_cell(value) plugin hook 3
- Use pysqlite3 if available 3
- Update official datasetteproject/datasette Docker container to SQLite 3.26.0 3
- Ensure downloading a 100+MB SQLite database file works 3
- Use SQLITE_DBCONFIG_DEFENSIVE plus other recommendations from SQLite security docs 3
- Experiment: run Jinja in async mode 3
- .insert_all() should accept a generator and process it efficiently 3
- Utilities for adding indexes 3
- Refactor facets to a class and new plugin, refs #427 3
- Fix the "datasette now publish ... --alias=x" option 3
- Make it so Docker build doesn't delay PyPI release 3
- Option to ignore inserts if primary key exists already 3
- Test against Python 3.8-dev using Travis 3
- asgi_wrapper plugin hook 3
- Unable to use rank when fts-table generated with csvs-to-sqlite 3
- Mechanism for secrets in plugin configuration 3
- datasette publish option for setting plugin configuration secrets 3
- Potential improvements to facet-by-date 3
- Support unicode in url 3
- CodeMirror fails to load on database page 3
- .add_column() doesn't match indentation of initial creation 3
- Script uses a lot of RAM 3
- "Too many SQL variables" on large inserts 3
- Add triggers while enabling FTS 3
- "twitter-to-sqlite user-timeline" command for pulling tweets by a specific user 3
- Extract "source" into a separate lookup table 3
- Track and use the 'since' value 3
- since_id support for home-timeline 3
- --since support for various commands for refresh-by-cron 3
- _where= parameter is not persisted in hidden form fields 3
- /-/plugins shows incorrect name for plugins 3
- Static assets no longer loading for installed plugins 3
- Add this repos_starred view 3
- `import` command fails on empty files 3
- rowid is not included in dropdown filter menus 3
- Custom queries with 0 results should say "0 results" 3
- Don't suggest column for faceting if all values are 1 3
- Command for importing events 3
- Add a glossary to the documentation 3
- Template debug mode that outputs template context 3
- Copy and paste doesn't work reliably on iPhone for SQL editor 3
- Tests are failing due to missing FTS5 3
- Assets table with downloads 3
- upsert_all() throws issue when upserting to empty table 3
- order_by mechanism 3
- Escape_fts5_query-hookimplementation does not work with queries to standard tables 3
- Tutorial command no longer works 3
- prepare_connection() plugin hook should accept optional datasette argument 3
- Cashe-header missing in http-response 3
- Variables from extra_template_vars() not exposed in _context=1 3
- Search box CSS doesn't look great on OS X Safari 3
- Handle "User not found" error 3
- WIP implementation of writable canned queries 3
- Adding a "recreate" flag to the `Database` constructor 3
- --plugin-secret over-rides existing metadata.json plugin config 3
- Pull repository contributors 3
- Mechanism for forcing column-type, over-riding auto-detection 3
- Issue and milestone should have foreign key to repo 3
- Issue comments don't appear to populate issues foreign key 3
- Configuration directory mode 3
- Fall back to authentication via ENV 3
- Create index on issue_comments(user) and other foreign keys 3
- Mechanism for creating views if they don't yet exist 3
- Add notlike table filter 3
- Question: Any fixed date for the release with the uft8-encoding fix? 3
- Way of seeing full schema for a database 3
- Add PyPI project urls to setup.py 3
- Error pages not correctly loading CSS 3
- request.url and request.scheme should obey force_https_urls config setting 3
- CSRF protection for /-/messages tool and writable canned queries 3
- Documentation for new "params" setting for canned queries 3
- Ability to customize what happens when a view permission fails 3
- Documentation is inconsistent about "id" as required field on actor 3
- Document the ds_actor signed cookie 3
- Horizontal scrollbar on changelog page on mobile 3
- Script to generate larger SQLite test files 3
- Support for compound (composite) foreign keys 3
- "Logged in as: XXX - logout" navigation item 3
- Canned query page should show the name of the canned query 3
- Ability to remove a foreign key 3
- Some links don't honor base_url 3
- Add a table of contents to the README 3
- "allow": true for anyone, "allow": false for nobody 3
- Interactive debugging tool for "allow" blocks 3
- Ability to insert files piped to insert-files stdin 3
- Support tokenize option for FTS 3
- Refactor TableView class so things like datasette-graphql can reuse the logic 3
- "datasette install" and "datasette uninstall" commands 3
- db.execute_write_fn(create_tables, block=True) hangs a thread if connection fails 3
- Pass columns to extra CSS/JS/etc plugin hooks 3
- Code for finding SpatiaLite in the usual locations 3
- --load-extension=spatialite shortcut option 3
- insert_all(..., alter=True) should work for new columns introduced after the first 100 records 3
- Datasette plugin to provide custom page for running faceted, ranked searches 3
- Timeline view 3
- table.optimize() should delete junk rows from *_fts_docsize 3
- Documentation for 404.html, 500.html templates 3
- Add --tar option to "datasette publish heroku" 3
- request an "-o" option on "datasette server" to open the default browser at the running url 3
- Add docs for .transform(column_order=) 3
- Default table view JSON should include CREATE TABLE 3
- Better handling of multiple matching template wildcard paths 3
- Documentation covering buildpack deployment 3
- Datasette should default to running Uvicorn with workers=1 3
- from_json jinja2 filter 3
- Remove xfail tests when new httpx is released 3
- json / CSV links are broken in Datasette 0.50 3
- Add a "delete" icon next to filters (in addition to "remove filter") 3
- Fix issues relating to base_url 3
- Fallback to databases in inspect-data.json when no -i options are passed 3
- datasette.urls.static_plugins(...) method 3
- datasette.urls.table(..., format="json") argument 3
- Add horizontal scrollbar to tables 3
- .blob output renderer 3
- Refactor .csv to be an output renderer - and teach register_output_renderer to stream all rows 3
- .csv should link to .blob downloads 3
- Table actions menu plus plugin hook 3
- latest.datasette.io should include plugins from fixtures 3
- database_actions plugin hook 3
- 3.0 release with some minor breaking changes 3
- table.search() improvements plus sqlite-utils search command 3
- Foreign keys with blank titles result in non-clickable links 3
- OperationalError('interrupted') can 500 on row page 3
- Custom widgets for canned query forms 3
- Support linking to compound foreign keys 3
- --load-extension=spatialite not working with datasetteproject/datasette docker image 3
- github-to-sqlite workflows command 3
- "datasette inspect" outputs invalid JSON if an error is logged 3
- "_searchmode=raw" throws an index out of range error when combined with "_search_COLUMN" 3
- Prettier package not actually being cached 3
- Certain database names results in 404: "Database not found: None" 3
- Retire "Ecosystem" page in favour of datasette.io/plugins and /tools 3
- "Statement may not contain PRAGMA" error is not strictly true 3
- `datasette publish upload` mechanism for uploading databases to an existing Datasette instance 3
- ?_size= argument is not persisted by hidden form fields in the table filters 3
- Rename /:memory: to /_memory 3
- gzip support for HTML (and JSON) responses 3
- Re-submitting filter form duplicates _x querystring arguments 3
- Error reading csv files with large column data 3
- Hitting `_csv.Error: field larger than field limit (131072)` 3
- db["my_table"].drop(ignore=True) parameter, plus sqlite-utils drop-table --ignore and drop-view --ignore 3
- Suggest for ArrayFacet possibly confused by blank values 3
- Update Docker Spatialite version to 5.0.1 + add support for Spatialite topology functions 3
- Allow canned query params to specify default values 3
- Escaping FTS search strings 3
- Try implementing SQLite timeouts using .interrupt() instead of using .set_progress_handler() 3
- Handle byte order marks (BOMs) in CSV files 3
- Speed up tests with pytest-xdist 3
- Avoid error sorting by relationships if related tables are not allowed 3
- Columns named "link" display in bold 3
- Improve `path_with_replaced_args()` and friends and document them 3
- Supporting additional output formats, like GeoJSON 3
- Release Datasette 0.57 3
- Add some types, enforce with mypy 3
- DRAFT: A new plugin hook for dynamic metadata 3
- Official Datasette Docker image should use SQLite >= 3.31.0 (for generated columns) 3
- Mechanism for plugins to exclude certain paths from CSRF checks 3
- Use HN algolia endpoint to retrieve trees 3
- utils.parse_metadata() should be a documented internal function 3
- `table.convert(..., where=)` and `sqlite-utils convert ... --where=` 3
- Rename Datasette.__init__(config=) parameter to settings= 3
- Modify base.html template to support optional sticky footer 3
- Try blacken-docs 3
- Win32 "used by another process" error with datasette publish 3
- Datasette 1.0 JSON API (and documentation) 3
- Datasette 1.0 documented template context (maybe via API docs) 3
- "Links from other tables" broken for columns starting with underscore 3
- Research pattern for re-registering existing Click tools with register_commands 3
- A way of creating indexes on newly created tables 3
- Optional caching mechanism for table.lookup() 3
- Custom pages don't work on windows 3
- Redesign CSV export to improve usability 3
- `keep_blank_values=True` when parsing `request.args` 3
- Redesign `facet_results` JSON structure prior to Datasette 1.0 3
- Update janus requirement from <0.8,>=0.6.2 to >=0.6.2,<1.1 3
- Offer `python -m sqlite_utils` as an alternative to `sqlite-utils` 3
- `explain query plan select` is too strict about whitespace 3
- List `--fmt` options in the docs 3
- `sqlite-utils bulk` command 3
- Add a CLI reference page to the docs, inspired by sqlite-utils 3
- Tests failing against Python 3.6 3
- Link: rel="alternate" to JSON for queries too 3
- Support IF NOT EXISTS for table creation 3
- Update Dockerfile generated by `datasette publish` 3
- Refactor URL routing to enable testing 3
- Make route matched pattern groups more consistent 3
- Reconsider ensure_permissions() logic, can it be less confusing? 3
- Make "<Binary: 2427344 bytes>" easier to read 3
- `sqlite3.NotSupportedError`: deterministic=True requires SQLite 3.8.3 or higher 3
- Refactor `RowView` and remove `RowTableShared` 3
- ?_trace=1 doesn't work on Global Power Plants demo 3
- Remove python-baseconv dependency 3
- CLI eats my cursor 3
- `detect_fts()` identifies the wrong table if tables have names that are subsets of each other 3
- Combining `rows_where()` and `search()` to limit which rows are searched 3
- `sqlite_utils.utils.TypeTracker` should be a documented API 3
- Incorrect syntax highlighting in docs CLI reference 3
- Cross-link CLI to Python docs 3
- Research an upgrade to CodeMirror 6 3
- Remove upper bound dependencies as a default policy 3
- Featured table(s) on the homepage 3
- Ability to merge databases and tables 3
- Preserve query on timeout 3
- Switch to keyword-only arguments for a bunch of internal methods 3
- Support JSON values returned from .convert() functions 3
- docker image is duplicating db files somehow 3
- Private database page should show padlock on every table 3
- Flaky test: test_serve_localhost_http 3
- allow_signed_tokens setting for disabling API signed token mechanism 3
- datasette create-token CLI command 3
- Release 0.63 3
- Make `cursor.rowcount` accessible (wontfix) 3
- mypy failures in CI 3
- latest.datasette.io Cloud Run deploys failing 3
- Incorrect link from the API explorer to the JSON API documentation 3
- Upgrade for Sphinx 6.0 (once Furo has support for it) 3
- array facet: don't materialize unnecessary columns 3
- Test failure: FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError 3
- Initial proof of concept with ChatGPT 3
- Implement a SQL view to make it easier to query files in a nested folder 3
- sphinx.builders.linkcheck build error 3
- AttributeError: 'EntryPoints' object has no attribute 'get' for flake8 on Python 3.7 3
- Drop support for Python 3.7 3
- Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File 3
- register_command plugin hook 3
- `datasette install -e` option 3
- feat: Implement a prepare_connection plugin hook 3
- `prepare_connection()` plugin hook 3
- Plugin hook for adding new output formats 3
- Implement and document extras for the new query view page 3
- Implement canned queries against new query JSON work 3
- Turn DatabaseDownload into an async view function 3
- database color shows only on index page, not other pages 3
- …
user 1
- simonw · 8,883 ✖
id | html_url | issue_url | node_id | user | created_at | updated_at | author_association ▼ | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
513437463 | https://github.com/dogsheep/healthkit-to-sqlite/issues/1#issuecomment-513437463 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDUxMzQzNzQ2Mw== | simonw 9599 | 2019-07-20T05:19:59Z | 2019-07-20T05:19:59Z | MEMBER | I ran xml_analyser against the XML HealthKit
The most interesting bit is this:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Use XML Analyser to figure out the structure of the export XML 470637068 | |
513439411 | https://github.com/dogsheep/healthkit-to-sqlite/issues/2#issuecomment-513439411 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/2 | MDEyOklzc3VlQ29tbWVudDUxMzQzOTQxMQ== | simonw 9599 | 2019-07-20T05:58:57Z | 2019-07-20T05:58:57Z | MEMBER |
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Import workouts 470637152 | |
513440090 | https://github.com/dogsheep/healthkit-to-sqlite/issues/4#issuecomment-513440090 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/4 | MDEyOklzc3VlQ29tbWVudDUxMzQ0MDA5MA== | simonw 9599 | 2019-07-20T06:11:50Z | 2019-07-20T06:11:50Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Import Records 470640505 | ||
513514978 | https://github.com/dogsheep/healthkit-to-sqlite/issues/5#issuecomment-513514978 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDUxMzUxNDk3OA== | simonw 9599 | 2019-07-21T02:55:12Z | 2019-07-21T02:55:12Z | MEMBER | I'm going to show this by default. Users can pass |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add progress bar 470691622 | |
513625406 | https://github.com/dogsheep/healthkit-to-sqlite/issues/5#issuecomment-513625406 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDUxMzYyNTQwNg== | simonw 9599 | 2019-07-22T03:20:16Z | 2019-07-22T03:20:16Z | MEMBER | It now renders like this:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add progress bar 470691622 | |
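The two "Add progress bar" comments above describe showing a progress bar by default while the export is processed, with an option to turn it off. A minimal stdlib sketch of that behaviour (the function names, bar format, and `silent` flag are illustrative, not healthkit-to-sqlite's actual code, which uses a Click progress bar):

```python
import sys

def render_bar(done, total, width=30):
    """Return a text progress bar like '[###############...............] 50%'."""
    filled = int(width * done / total)
    pct = int(100 * done / total)
    return "[" + "#" * filled + "." * (width - filled) + f"] {pct}%"

def process_with_progress(items, silent=False):
    """Process items, writing a progress bar to stderr unless silenced."""
    total = len(items)
    for i, item in enumerate(items, 1):
        # ... import the record here ...
        if not silent:
            sys.stderr.write("\r" + render_bar(i, total))
    if not silent:
        sys.stderr.write("\n")
```

Writing the bar to stderr keeps stdout clean for any piped output, which is why CLI tools typically default the bar on and add a flag to suppress it.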
513626742 | https://github.com/dogsheep/healthkit-to-sqlite/issues/6#issuecomment-513626742 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/6 | MDEyOklzc3VlQ29tbWVudDUxMzYyNjc0Mg== | simonw 9599 | 2019-07-22T03:28:55Z | 2019-07-22T03:28:55Z | MEMBER | Here's what it looks like now as separate tables: |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Break up records into different tables for each type 470856782 | |
514496725 | https://github.com/dogsheep/healthkit-to-sqlite/issues/7#issuecomment-514496725 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7 | MDEyOklzc3VlQ29tbWVudDUxNDQ5NjcyNQ== | simonw 9599 | 2019-07-24T06:20:59Z | 2019-07-24T06:20:59Z | MEMBER | I'm using https://pypi.org/project/memory-profiler/ to explore this in more detail:
Then:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Script uses a lot of RAM 472097220 | |
514498221 | https://github.com/dogsheep/healthkit-to-sqlite/issues/7#issuecomment-514498221 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7 | MDEyOklzc3VlQ29tbWVudDUxNDQ5ODIyMQ== | simonw 9599 | 2019-07-24T06:26:49Z | 2019-07-24T06:26:49Z | MEMBER | Adding |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Script uses a lot of RAM 472097220 | |
514500253 | https://github.com/dogsheep/healthkit-to-sqlite/issues/7#issuecomment-514500253 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7 | MDEyOklzc3VlQ29tbWVudDUxNDUwMDI1Mw== | simonw 9599 | 2019-07-24T06:34:28Z | 2019-07-24T06:34:28Z | MEMBER | Clearing the root element each time saved even more: |
{ "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 2, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Script uses a lot of RAM 472097220 | |
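The three "Script uses a lot of RAM" comments above trace the fix: profile with memory-profiler, switch to incremental parsing, and clear elements as they are handled, with clearing the root element each time giving the biggest saving. A minimal sketch of that `iterparse` pattern (the `Record` tag matches HealthKit's export format; the function name is illustrative):

```python
from xml.etree.ElementTree import iterparse

def stream_records(path):
    """Yield each Record element's attributes without keeping the whole tree in RAM."""
    # iterparse builds the tree incrementally; the first "start" event
    # hands us the root element so we can prune it as we go.
    it = iterparse(path, events=("start", "end"))
    _, root = next(it)
    for event, elem in it:
        if event == "end" and elem.tag == "Record":
            yield dict(elem.attrib)  # copy before clearing
            elem.clear()  # free this element's children and attributes
            root.clear()  # drop references accumulated on the root (the big win)
```

Without the `root.clear()` call the root keeps a reference to every parsed child, so memory still grows linearly with the file even though processing is streaming.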
515226724 | https://github.com/dogsheep/healthkit-to-sqlite/issues/9#issuecomment-515226724 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9 | MDEyOklzc3VlQ29tbWVudDUxNTIyNjcyNA== | simonw 9599 | 2019-07-25T21:46:01Z | 2019-07-25T21:46:01Z | MEMBER | I can work around this here (prior to the fix in sqlite-utils) by setting the batch size to something a bit lower here. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Too many SQL variables 472429048 | |
515322294 | https://github.com/dogsheep/healthkit-to-sqlite/issues/9#issuecomment-515322294 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9 | MDEyOklzc3VlQ29tbWVudDUxNTMyMjI5NA== | simonw 9599 | 2019-07-26T06:07:12Z | 2019-07-26T06:07:12Z | MEMBER | @tholo this should be fixed in just-released version 0.3.2 - could you run a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Too many SQL variables 472429048 | |
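The "Too many SQL variables" comments above concern SQLite's bound-parameter limit (`SQLITE_MAX_VARIABLE_NUMBER`, 999 by default in older builds): a multi-row INSERT binds rows × columns parameters, so the batch size must shrink as the column count grows. A sketch of the sizing arithmetic (the helper is illustrative; sqlite-utils exposes the knob as the `batch_size=` argument to `insert_all()`):

```python
SQLITE_MAX_VARS = 999  # default SQLITE_MAX_VARIABLE_NUMBER in older SQLite builds

def max_batch_size(num_columns, limit=SQLITE_MAX_VARS):
    """Largest number of rows per multi-row INSERT without exceeding the limit."""
    # Each row contributes one bound "?" placeholder per column.
    return max(1, limit // num_columns)
```

For example a 20-column table caps out at 49 rows per statement, which is why inserting wide HealthKit records with the default batch size tripped the error.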
526701674 | https://github.com/dogsheep/swarm-to-sqlite/issues/2#issuecomment-526701674 | https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/2 | MDEyOklzc3VlQ29tbWVudDUyNjcwMTY3NA== | simonw 9599 | 2019-08-30T18:24:26Z | 2019-08-30T18:24:26Z | MEMBER | I renamed |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
--save option to dump checkins to a JSON file on disk 487598468 | |
526853542 | https://github.com/dogsheep/swarm-to-sqlite/issues/4#issuecomment-526853542 | https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/4 | MDEyOklzc3VlQ29tbWVudDUyNjg1MzU0Mg== | simonw 9599 | 2019-08-31T18:06:32Z | 2019-08-31T18:06:32Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Online tool for getting a Foursquare OAuth token 487601121 | ||
527200332 | https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-527200332 | https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3 | MDEyOklzc3VlQ29tbWVudDUyNzIwMDMzMg== | simonw 9599 | 2019-09-02T16:32:20Z | 2019-09-02T16:32:39Z | MEMBER | Also needed: an option for "fetch all checkins created within the last X days". This should help provide support for that Swarm feature where you can retroactively check in to places in the past. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Option to fetch only checkins more recent than the current max checkin 487600595 | |
527682713 | https://github.com/dogsheep/twitter-to-sqlite/issues/4#issuecomment-527682713 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4 | MDEyOklzc3VlQ29tbWVudDUyNzY4MjcxMw== | simonw 9599 | 2019-09-03T23:48:57Z | 2019-09-03T23:48:57Z | MEMBER | One interesting challenge here is that the JSON format for tweets in the archive is subtly different from the JSON format currently returned by the API. If we want to keep the tweets in the same database table (which feels like the right thing to me) we'll need to handle this. One thing we can do is have a column for We can also ensure that tweets from the API always over-write the version that came from the archive (using |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command for importing data from a Twitter Export file 488835586 | |
527684202 | https://github.com/dogsheep/twitter-to-sqlite/issues/5#issuecomment-527684202 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDUyNzY4NDIwMg== | simonw 9599 | 2019-09-03T23:56:28Z | 2019-09-03T23:56:28Z | MEMBER | I previously used betamax here: https://github.com/simonw/github-contents/blob/master/test_github_contents.py |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Write tests that simulate the Twitter API 488874815 | |
527954898 | https://github.com/dogsheep/twitter-to-sqlite/issues/2#issuecomment-527954898 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2 | MDEyOklzc3VlQ29tbWVudDUyNzk1NDg5OA== | simonw 9599 | 2019-09-04T15:31:46Z | 2019-09-04T15:31:46Z | MEMBER | I'm going to call this |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"twitter-to-sqlite user-timeline" command for pulling tweets by a specific user 488833698 | |
527955302 | https://github.com/dogsheep/twitter-to-sqlite/issues/2#issuecomment-527955302 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2 | MDEyOklzc3VlQ29tbWVudDUyNzk1NTMwMg== | simonw 9599 | 2019-09-04T15:32:39Z | 2019-09-04T15:32:39Z | MEMBER | Rate limit is 900 / 15 minutes which is 1 call per second. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"twitter-to-sqlite user-timeline" command for pulling tweets by a specific user 488833698 | |
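That 900-per-15-minutes budget works out to one call per second, which a paging loop can respect with a fixed delay between requests (a hedged sketch; the command's actual pacing logic may differ):

```python
import time

WINDOW_SECONDS = 15 * 60   # Twitter rate-limit window
WINDOW_REQUESTS = 900      # user_timeline allowance per window

# Minimum delay between calls that stays inside the limit
delay = WINDOW_SECONDS / WINDOW_REQUESTS  # 1.0 second

def paced(pages, sleep=time.sleep):
    # Sleep between successive API pages so a long backfill
    # never trips the 900/15-minute limit
    for i, page in enumerate(pages):
        if i:
            sleep(delay)
        yield page

# sleep stubbed out here so the example runs instantly
fetched = list(paced(["page1", "page2", "page3"], sleep=lambda s: None))
```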
527990908 | https://github.com/dogsheep/twitter-to-sqlite/issues/2#issuecomment-527990908 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2 | MDEyOklzc3VlQ29tbWVudDUyNzk5MDkwOA== | simonw 9599 | 2019-09-04T16:57:24Z | 2019-09-04T16:57:24Z | MEMBER | I just tried this using |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"twitter-to-sqlite user-timeline" command for pulling tweets by a specific user 488833698 | |
529239307 | https://github.com/dogsheep/twitter-to-sqlite/issues/8#issuecomment-529239307 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8 | MDEyOklzc3VlQ29tbWVudDUyOTIzOTMwNw== | simonw 9599 | 2019-09-08T20:36:49Z | 2019-09-08T20:36:49Z | MEMBER |
If you omit the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
--sql and --attach options for feeding commands from SQL queries 490803176 | |
529240286 | https://github.com/dogsheep/twitter-to-sqlite/issues/8#issuecomment-529240286 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8 | MDEyOklzc3VlQ29tbWVudDUyOTI0MDI4Ng== | simonw 9599 | 2019-09-08T20:48:33Z | 2019-09-08T20:48:33Z | MEMBER |
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
--sql and --attach options for feeding commands from SQL queries 490803176 | |
530028567 | https://github.com/dogsheep/twitter-to-sqlite/issues/9#issuecomment-530028567 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/9 | MDEyOklzc3VlQ29tbWVudDUzMDAyODU2Nw== | simonw 9599 | 2019-09-10T16:59:25Z | 2019-09-10T16:59:25Z | MEMBER | By default in SQLite foreign key constraints are not enforced (you need to run We will take advantage of this - even though the In the future we may add a command that can backfill missing user records. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
followers-ids and friends-ids subcommands 491791152 | |
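The foreign-key behaviour described above is easy to demonstrate with the stdlib `sqlite3` module: constraints are ignored until `PRAGMA foreign_keys = ON` is issued, so id-only rows can reference users that do not exist yet (illustrative table names, not necessarily the tool's exact schema):

```python
import sqlite3

# isolation_level=None gives autocommit mode, which matters because
# PRAGMA foreign_keys is a no-op inside an open transaction
db = sqlite3.connect(":memory:", isolation_level=None)
db.executescript("""
CREATE TABLE users (id INTEGER PRIMARY KEY, screen_name TEXT);
CREATE TABLE following (
    followed_id INTEGER REFERENCES users(id),
    follower_id INTEGER REFERENCES users(id)
);
""")

# Constraints are OFF by default, so inserting ids with no matching
# users row succeeds - exactly what followers-ids and friends-ids
# rely on when they record ids without full user profiles
db.execute("INSERT INTO following VALUES (12, 34)")
ok_without_pragma = True

# With enforcement switched on, the same kind of insert is rejected
db.execute("PRAGMA foreign_keys = ON")
try:
    db.execute("INSERT INTO following VALUES (56, 78)")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
```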
530417631 | https://github.com/dogsheep/twitter-to-sqlite/issues/8#issuecomment-530417631 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8 | MDEyOklzc3VlQ29tbWVudDUzMDQxNzYzMQ== | simonw 9599 | 2019-09-11T14:52:44Z | 2019-09-14T19:09:22Z | MEMBER |
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
--sql and --attach options for feeding commands from SQL queries 490803176 | |
531404891 | https://github.com/dogsheep/twitter-to-sqlite/issues/8#issuecomment-531404891 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8 | MDEyOklzc3VlQ29tbWVudDUzMTQwNDg5MQ== | simonw 9599 | 2019-09-13T22:01:57Z | 2019-09-13T22:01:57Z | MEMBER | I also wrote about this in https://simonwillison.net/2019/Sep/13/weeknotestwitter-sqlite-datasette-rure/ |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
--sql and --attach options for feeding commands from SQL queries 490803176 | |
531516956 | https://github.com/dogsheep/github-to-sqlite/issues/3#issuecomment-531516956 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/3 | MDEyOklzc3VlQ29tbWVudDUzMTUxNjk1Ng== | simonw 9599 | 2019-09-14T21:56:31Z | 2019-09-14T21:56:31Z | MEMBER | https://api.github.com/users/simonw/repos It would be useful to be able to fetch stargazers, forks etc. as well. Not sure if that should be a separate command or a Probably a separate command since |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command to fetch all repos belonging to a user or organization 493670426 | |
531517083 | https://github.com/dogsheep/github-to-sqlite/issues/3#issuecomment-531517083 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/3 | MDEyOklzc3VlQ29tbWVudDUzMTUxNzA4Mw== | simonw 9599 | 2019-09-14T21:58:42Z | 2019-09-14T21:58:42Z | MEMBER | Split stargazers into #4 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command to fetch all repos belonging to a user or organization 493670426 | |
531517138 | https://github.com/dogsheep/github-to-sqlite/issues/4#issuecomment-531517138 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4 | MDEyOklzc3VlQ29tbWVudDUzMTUxNzEzOA== | simonw 9599 | 2019-09-14T21:59:59Z | 2019-09-14T21:59:59Z | MEMBER | Paginate through https://api.github.com/repos/simonw/datasette/stargazers Send |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command to fetch stargazers for one or more repos 493670730 | |
538711918 | https://github.com/dogsheep/twitter-to-sqlite/issues/11#issuecomment-538711918 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/11 | MDEyOklzc3VlQ29tbWVudDUzODcxMTkxOA== | simonw 9599 | 2019-10-06T04:54:17Z | 2019-10-06T04:54:17Z | MEMBER | Shipped in 0.6. Here's the documentation: https://github.com/dogsheep/twitter-to-sqlite#capturing-tweets-in-real-time-with-track-and-follow |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Commands for recording real-time tweets from the streaming API 503045221 | |
538804815 | https://github.com/dogsheep/twitter-to-sqlite/issues/13#issuecomment-538804815 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/13 | MDEyOklzc3VlQ29tbWVudDUzODgwNDgxNQ== | simonw 9599 | 2019-10-07T00:33:49Z | 2019-10-07T00:33:49Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
statuses-lookup command 503085013 | ||
538847446 | https://github.com/dogsheep/pocket-to-sqlite/issues/1#issuecomment-538847446 | https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDUzODg0NzQ0Ng== | simonw 9599 | 2019-10-07T05:41:17Z | 2019-10-07T05:41:17Z | MEMBER | Prototype code:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Use better pagination (and implement progress bar) 503233021 | |
538847796 | https://github.com/dogsheep/pocket-to-sqlite/issues/2#issuecomment-538847796 | https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/2 | MDEyOklzc3VlQ29tbWVudDUzODg0Nzc5Ng== | simonw 9599 | 2019-10-07T05:43:30Z | 2019-10-07T05:43:30Z | MEMBER | We can persist the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Track and use the 'since' value 503234169 | |
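Persisting the `since` value between runs could look like this minimal sketch using the stdlib `sqlite3` module (the table and column names here are guesses, not pocket-to-sqlite's actual schema):

```python
import sqlite3

db = sqlite3.connect(":memory:")
# One-row-per-key table for remembering API cursors between runs;
# the next invocation reads the stored value back and passes it to
# the API so only newer items are fetched
db.execute("CREATE TABLE since_values (key TEXT PRIMARY KEY, value TEXT)")

def save_since(db, key, value):
    db.execute(
        "INSERT OR REPLACE INTO since_values (key, value) VALUES (?, ?)",
        (key, value),
    )

def get_since(db, key):
    row = db.execute(
        "SELECT value FROM since_values WHERE key = ?", (key,)
    ).fetchone()
    return row[0] if row else None

save_since(db, "pocket", "1570426990")
resumed = get_since(db, "pocket")
```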
540879620 | https://github.com/dogsheep/twitter-to-sqlite/issues/4#issuecomment-540879620 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4 | MDEyOklzc3VlQ29tbWVudDU0MDg3OTYyMA== | simonw 9599 | 2019-10-11T02:59:16Z | 2019-10-11T02:59:16Z | MEMBER | Also import ad preferences and all that other junk. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command for importing data from a Twitter Export file 488835586 | |
541112108 | https://github.com/dogsheep/twitter-to-sqlite/issues/17#issuecomment-541112108 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17 | MDEyOklzc3VlQ29tbWVudDU0MTExMjEwOA== | simonw 9599 | 2019-10-11T15:30:15Z | 2019-10-11T15:30:15Z | MEMBER | It should delete the tables entirely. That way it will work even if the table schema has changed. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
import command should empty all archive-* tables first 505674949 | |
541112588 | https://github.com/dogsheep/twitter-to-sqlite/issues/17#issuecomment-541112588 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17 | MDEyOklzc3VlQ29tbWVudDU0MTExMjU4OA== | simonw 9599 | 2019-10-11T15:31:30Z | 2019-10-11T15:31:30Z | MEMBER | No need for an option:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
import command should empty all archive-* tables first 505674949 | |
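Deleting the tables entirely can be done by querying `sqlite_master` for every `archive-*` table and dropping each one, so a changed schema in a new export can never clash with the old one (a sketch with the stdlib `sqlite3` module; the real command presumably works through `sqlite-utils`):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute('CREATE TABLE "archive-tweet" (id INTEGER)')
db.execute('CREATE TABLE "archive-follower" (id INTEGER)')
db.execute("CREATE TABLE tweets (id INTEGER)")

# Find every archive-* table...
archive_tables = [
    row[0]
    for row in db.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' AND name LIKE 'archive-%'"
    )
]
# ...and drop it outright (DROP TABLE cannot take a bound parameter,
# so the identifier is quoted into the statement)
for name in archive_tables:
    db.execute('DROP TABLE "{}"'.format(name))

remaining = [
    r[0] for r in db.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
]
```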
541118773 | https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541118773 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18 | MDEyOklzc3VlQ29tbWVudDU0MTExODc3Mw== | simonw 9599 | 2019-10-11T15:48:31Z | 2019-10-11T15:48:31Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command to import home-timeline 505928530 | ||
541118934 | https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541118934 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18 | MDEyOklzc3VlQ29tbWVudDU0MTExODkzNA== | simonw 9599 | 2019-10-11T15:48:54Z | 2019-10-11T15:48:54Z | MEMBER | Rate limit is tight: 15 requests every 15 mins! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command to import home-timeline 505928530 | |
541119834 | https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541119834 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18 | MDEyOklzc3VlQ29tbWVudDU0MTExOTgzNA== | simonw 9599 | 2019-10-11T15:51:22Z | 2019-10-11T16:51:33Z | MEMBER | In order to support multiple user timelines being saved in the same database, I'm going to import the tweets into the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command to import home-timeline 505928530 | |
541141169 | https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541141169 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18 | MDEyOklzc3VlQ29tbWVudDU0MTE0MTE2OQ== | simonw 9599 | 2019-10-11T16:51:29Z | 2019-10-11T16:51:29Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command to import home-timeline 505928530 | ||
541248629 | https://github.com/dogsheep/twitter-to-sqlite/issues/19#issuecomment-541248629 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19 | MDEyOklzc3VlQ29tbWVudDU0MTI0ODYyOQ== | simonw 9599 | 2019-10-11T22:48:56Z | 2019-10-11T22:48:56Z | MEMBER |
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
since_id support for home-timeline 506087267 | |
541387822 | https://github.com/dogsheep/github-to-sqlite/issues/6#issuecomment-541387822 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6 | MDEyOklzc3VlQ29tbWVudDU0MTM4NzgyMg== | simonw 9599 | 2019-10-13T05:27:39Z | 2019-10-13T05:27:39Z | MEMBER | This should be fixed by https://github.com/dogsheep/github-to-sqlite/commit/552543a74970f8a3a3f87f887be23a0c6eb1cb5b |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
sqlite3.OperationalError: table users has no column named bio 504238461 | |
541387941 | https://github.com/dogsheep/github-to-sqlite/issues/6#issuecomment-541387941 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6 | MDEyOklzc3VlQ29tbWVudDU0MTM4Nzk0MQ== | simonw 9599 | 2019-10-13T05:30:19Z | 2019-10-13T05:30:19Z | MEMBER | Fix released in 0.5: https://github.com/dogsheep/github-to-sqlite/releases/tag/0.5 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
sqlite3.OperationalError: table users has no column named bio 504238461 | |
541388038 | https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-541388038 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20 | MDEyOklzc3VlQ29tbWVudDU0MTM4ODAzOA== | simonw 9599 | 2019-10-13T05:31:58Z | 2019-10-13T05:31:58Z | MEMBER | For favourites a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
--since support for various commands for refresh-by-cron 506268945 | |
541493242 | https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-541493242 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3 | MDEyOklzc3VlQ29tbWVudDU0MTQ5MzI0Mg== | simonw 9599 | 2019-10-14T03:35:36Z | 2019-10-14T03:35:36Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command for running a search and saving tweets for that search 488833975 | ||
541721437 | https://github.com/dogsheep/github-to-sqlite/issues/7#issuecomment-541721437 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/7 | MDEyOklzc3VlQ29tbWVudDU0MTcyMTQzNw== | simonw 9599 | 2019-10-14T14:44:12Z | 2019-10-14T14:44:12Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
issue-comments command for importing issue comments 506276893 | ||
541748580 | https://github.com/dogsheep/twitter-to-sqlite/issues/10#issuecomment-541748580 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10 | MDEyOklzc3VlQ29tbWVudDU0MTc0ODU4MA== | simonw 9599 | 2019-10-14T15:30:44Z | 2019-10-14T15:30:44Z | MEMBER | Had several recommendations for https://github.com/tqdm/tqdm which is what goodreads-to-sqlite uses. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Rethink progress bars for various commands 492297930 | |
542333836 | https://github.com/dogsheep/twitter-to-sqlite/issues/21#issuecomment-542333836 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/21 | MDEyOklzc3VlQ29tbWVudDU0MjMzMzgzNg== | simonw 9599 | 2019-10-15T18:00:48Z | 2019-10-15T18:00:48Z | MEMBER | I'll use |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Fix & escapes in tweet text 506432572 | |
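The comment above is truncated, but one plausible fix for those escapes is the stdlib `html.unescape` function (an assumption about the approach, not confirmed by the comment):

```python
import html

def fix_escapes(text):
    # Twitter's API HTML-encodes &, < and > in tweet text;
    # html.unescape reverses that encoding
    return html.unescape(text)

fixed = fix_escapes("Datasette &amp; Dogsheep &gt; everything")
```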
542832952 | https://github.com/dogsheep/twitter-to-sqlite/issues/19#issuecomment-542832952 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19 | MDEyOklzc3VlQ29tbWVudDU0MjgzMjk1Mg== | simonw 9599 | 2019-10-16T18:30:11Z | 2019-10-16T18:30:11Z | MEMBER | The
The |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
since_id support for home-timeline 506087267 | |
542849963 | https://github.com/dogsheep/twitter-to-sqlite/issues/19#issuecomment-542849963 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19 | MDEyOklzc3VlQ29tbWVudDU0Mjg0OTk2Mw== | simonw 9599 | 2019-10-16T19:13:06Z | 2019-10-16T19:13:06Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
since_id support for home-timeline 506087267 | ||
542854749 | https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-542854749 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20 | MDEyOklzc3VlQ29tbWVudDU0Mjg1NDc0OQ== | simonw 9599 | 2019-10-16T19:26:01Z | 2019-10-16T19:26:01Z | MEMBER | I'm not going to do this for "accounts that have followed me" and "new accounts that I have followed" - instead I will recommend running the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
--since support for various commands for refresh-by-cron 506268945 | |
542855081 | https://github.com/dogsheep/twitter-to-sqlite/issues/12#issuecomment-542855081 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12 | MDEyOklzc3VlQ29tbWVudDU0Mjg1NTA4MQ== | simonw 9599 | 2019-10-16T19:26:56Z | 2019-10-16T19:26:56Z | MEMBER | This may be the first case where I want to be able to repair existing databases rather than discarding their contents. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Extract "source" into a separate lookup table 503053800 | |
542855427 | https://github.com/dogsheep/twitter-to-sqlite/issues/12#issuecomment-542855427 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12 | MDEyOklzc3VlQ29tbWVudDU0Mjg1NTQyNw== | simonw 9599 | 2019-10-16T19:27:55Z | 2019-10-16T19:27:55Z | MEMBER | I can do that by keeping |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Extract "source" into a separate lookup table 503053800 | |
542858025 | https://github.com/dogsheep/twitter-to-sqlite/issues/12#issuecomment-542858025 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12 | MDEyOklzc3VlQ29tbWVudDU0Mjg1ODAyNQ== | simonw 9599 | 2019-10-16T19:35:31Z | 2019-10-16T19:36:09Z | MEMBER | Maybe this means I need an |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Extract "source" into a separate lookup table 503053800 | |
542875885 | https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-542875885 | https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3 | MDEyOklzc3VlQ29tbWVudDU0Mjg3NTg4NQ== | simonw 9599 | 2019-10-16T20:23:08Z | 2019-10-16T20:23:08Z | MEMBER | https://developer.foursquare.com/docs/api/users/checkins documents
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Option to fetch only checkins more recent than the current max checkin 487600595 | |
542876047 | https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-542876047 | https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3 | MDEyOklzc3VlQ29tbWVudDU0Mjg3NjA0Nw== | simonw 9599 | 2019-10-16T20:23:36Z | 2019-10-16T20:23:36Z | MEMBER | I'm going to go with |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Option to fetch only checkins more recent than the current max checkin 487600595 | |
542882604 | https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-542882604 | https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3 | MDEyOklzc3VlQ29tbWVudDU0Mjg4MjYwNA== | simonw 9599 | 2019-10-16T20:41:23Z | 2019-10-16T20:41:23Z | MEMBER | Documented here: https://github.com/dogsheep/swarm-to-sqlite/blob/0.2/README.md#usage |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Option to fetch only checkins more recent than the current max checkin 487600595 | |
543217890 | https://github.com/dogsheep/twitter-to-sqlite/issues/23#issuecomment-543217890 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/23 | MDEyOklzc3VlQ29tbWVudDU0MzIxNzg5MA== | simonw 9599 | 2019-10-17T15:03:10Z | 2019-10-17T15:03:10Z | MEMBER | Thinking about this further: the concept of migrations may end up being in direct conflict with the I'm going to forge ahead anyway and build this because I think it will be an interesting exploration, but it's very likely this turns out to be a bad idea in the long run! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Extremely simple migration system 508190730 | |
543222239 | https://github.com/dogsheep/twitter-to-sqlite/issues/23#issuecomment-543222239 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/23 | MDEyOklzc3VlQ29tbWVudDU0MzIyMjIzOQ== | simonw 9599 | 2019-10-17T15:12:33Z | 2019-10-17T15:12:33Z | MEMBER | Migrations will run only if you open a database that previously existed (as opposed to opening a brand new empty database). This means that the first time you run a command against a fresh database, migrations will not run and the This also means that each migration needs to be able to sanity check the database to see if it should run or not. If it should NOT run, it will do nothing but still be marked as having executed by adding to the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Extremely simple migration system 508190730 | |
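The scheme described above — a `migrations` table recording which migrations have executed, with each migration sanity-checking the schema before altering anything — can be sketched like this (function and column names are illustrative):

```python
import sqlite3
import time

def migrate(db, migrations):
    # Track which migrations have executed, so re-running the
    # command is a no-op for the ones already applied
    db.execute(
        "CREATE TABLE IF NOT EXISTS migrations (name TEXT PRIMARY KEY, applied_at TEXT)"
    )
    applied = {row[0] for row in db.execute("SELECT name FROM migrations")}
    for name, fn in migrations:
        if name in applied:
            continue
        # A migration that decides not to act is still recorded
        # as applied, per the comment above
        fn(db)
        db.execute(
            "INSERT INTO migrations (name, applied_at) VALUES (?, ?)",
            (name, time.strftime("%Y-%m-%dT%H:%M:%S")),
        )

calls = []

def add_source_column(db):
    # Sanity check: only alter the table if the column is missing
    cols = [r[1] for r in db.execute("PRAGMA table_info(tweets)")]
    if "source" not in cols:
        db.execute("ALTER TABLE tweets ADD COLUMN source TEXT")
    calls.append("ran")

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tweets (id INTEGER PRIMARY KEY)")
migrate(db, [("add_source", add_source_column)])
migrate(db, [("add_source", add_source_column)])  # second run skips it
```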
543265058 | https://github.com/dogsheep/twitter-to-sqlite/issues/25#issuecomment-543265058 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25 | MDEyOklzc3VlQ29tbWVudDU0MzI2NTA1OA== | simonw 9599 | 2019-10-17T16:51:12Z | 2019-10-17T16:51:12Z | MEMBER | This migration function only runs if there is a table called I think this can happen if the database has just been freshly created (by a command that fetches the user's user timeline for example) and the command is then run a SECOND time. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ensure migrations don't accidentally create foreign key twice 508578780 | |
543266947 | https://github.com/dogsheep/twitter-to-sqlite/issues/25#issuecomment-543266947 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25 | MDEyOklzc3VlQ29tbWVudDU0MzI2Njk0Nw== | simonw 9599 | 2019-10-17T16:56:06Z | 2019-10-17T16:56:06Z | MEMBER | I wrote a test that proves that this is a problem. Should be an easy fix though. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ensure migrations don't accidentally create foreign key twice 508578780 | |
543269396 | https://github.com/dogsheep/twitter-to-sqlite/issues/10#issuecomment-543269396 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10 | MDEyOklzc3VlQ29tbWVudDU0MzI2OTM5Ng== | simonw 9599 | 2019-10-17T17:02:07Z | 2019-10-17T17:02:07Z | MEMBER | A neat trick that Click does is detecting if an interactive terminal is attached and NOT showing a progress bar if there isn't one. Need to figure out how to do that with tqdm. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Rethink progress bars for various commands 492297930 | |
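Both Click's and tqdm's checks boil down to asking whether the output stream is attached to an interactive terminal; the same test can be written directly (a minimal sketch):

```python
import io
import sys

def should_show_progress(stream=None):
    # Only render a progress bar when the output stream is an
    # interactive terminal - piped or redirected output gets none
    stream = stream or sys.stderr
    return hasattr(stream, "isatty") and stream.isatty()

# A plain StringIO is not a terminal, so no bar would be shown
piped = should_show_progress(io.StringIO())
```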
543270714 | https://github.com/dogsheep/twitter-to-sqlite/issues/10#issuecomment-543270714 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10 | MDEyOklzc3VlQ29tbWVudDU0MzI3MDcxNA== | simonw 9599 | 2019-10-17T17:05:16Z | 2019-10-17T17:05:16Z | MEMBER | https://github.com/pallets/click/blob/716a5be90f56ce6cd506bb53d5739d09374b1636/click/_termui_impl.py#L93 is how Click does this:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Rethink progress bars for various commands 492297930 | |
543271000 | https://github.com/dogsheep/twitter-to-sqlite/issues/10#issuecomment-543271000 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10 | MDEyOklzc3VlQ29tbWVudDU0MzI3MTAwMA== | simonw 9599 | 2019-10-17T17:05:59Z | 2019-10-17T17:05:59Z | MEMBER | Looks like tqdm already does a TTY check here: https://github.com/tqdm/tqdm/blob/89b73bdc30c099c5b53725806e7edf3a121c9b3a/tqdm/std.py#L889-L890 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Rethink progress bars for various commands 492297930 | |
543273540 | https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-543273540 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3 | MDEyOklzc3VlQ29tbWVudDU0MzI3MzU0MA== | simonw 9599 | 2019-10-17T17:12:51Z | 2019-10-17T17:12:51Z | MEMBER | Just importing tweets here isn't enough - how are we supposed to know which tweets were imported by which search? So I think the right thing to do here is to also create a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command for running a search and saving tweets for that search 488833975 | |
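One way to record which tweets were imported by which search is a table of search runs plus a many-to-many table linking each run to the tweets it returned (a hypothetical schema sketched with the stdlib `sqlite3` module; the table names are mine, since the comment is truncated):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE tweets (id INTEGER PRIMARY KEY, full_text TEXT);
-- one row per executed search
CREATE TABLE search_runs (
    id INTEGER PRIMARY KEY,
    query TEXT,
    ran_at TEXT
);
-- many-to-many: which tweets each run returned
CREATE TABLE search_runs_tweets (
    search_run INTEGER REFERENCES search_runs(id),
    tweet INTEGER REFERENCES tweets(id),
    PRIMARY KEY (search_run, tweet)
);
""")

db.execute("INSERT INTO search_runs VALUES (1, 'datasette', '2019-10-17')")
db.execute("INSERT INTO tweets VALUES (101, 'Trying out Datasette')")
db.execute("INSERT INTO search_runs_tweets VALUES (1, 101)")

# Which tweets were imported by which search?
found = db.execute("""
    SELECT search_runs.query, tweets.full_text
    FROM tweets
    JOIN search_runs_tweets ON tweets.id = search_runs_tweets.tweet
    JOIN search_runs ON search_runs.id = search_runs_tweets.search_run
""").fetchall()
```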
543290744 | https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-543290744 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3 | MDEyOklzc3VlQ29tbWVudDU0MzI5MDc0NA== | simonw 9599 | 2019-10-17T17:57:14Z | 2019-10-17T17:57:14Z | MEMBER | I have a working command now. I'm going to ship it early because it could do with some other people trying it out. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command for running a search and saving tweets for that search 488833975 | |
544335363 | https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-544335363 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20 | MDEyOklzc3VlQ29tbWVudDU0NDMzNTM2Mw== | simonw 9599 | 2019-10-21T03:32:04Z | 2019-10-21T03:32:04Z | MEMBER | In case anyone is interested, here's an extract from the crontab I'm running these under at the moment:
|
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
--since support for various commands for refresh-by-cron 506268945 | |
544646516 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-544646516 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDU0NDY0NjUxNg== | simonw 9599 | 2019-10-21T18:30:14Z | 2019-10-21T18:30:14Z | MEMBER | Thanks to help from Dr. Laura Cantino at Science Hack Day San Francisco I've been able to pull together this query:
See also https://www.snpedia.com/index.php/Rs12913832 - in particular this table: |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | |
544648863 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-544648863 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDU0NDY0ODg2Mw== | simonw 9599 | 2019-10-21T18:36:03Z | 2019-10-21T18:36:03Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | ||
547713287 | https://github.com/dogsheep/twitter-to-sqlite/issues/26#issuecomment-547713287 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/26 | MDEyOklzc3VlQ29tbWVudDU0NzcxMzI4Nw== | simonw 9599 | 2019-10-30T02:36:13Z | 2019-10-30T02:36:13Z | MEMBER | Shipped this in 0.13: https://github.com/dogsheep/twitter-to-sqlite/releases/tag/0.13 See also this Twitter thread: https://twitter.com/simonw/status/1189369677509623809 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command for importing mentions timeline 513074501 | |
549094195 | https://github.com/dogsheep/github-to-sqlite/pull/8#issuecomment-549094195 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8 | MDEyOklzc3VlQ29tbWVudDU0OTA5NDE5NQ== | simonw 9599 | 2019-11-03T00:43:16Z | 2019-11-03T00:43:28Z | MEMBER | Also need to take #5 into account - if this command creates incomplete user records, how do we repair them? And make sure that if we run this command first any future commands that populate users don't break (probably just a case of using |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
stargazers command, refs #4 516763727 | |
549094229 | https://github.com/dogsheep/github-to-sqlite/issues/5#issuecomment-549094229 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDU0OTA5NDIyOQ== | simonw 9599 | 2019-11-03T00:44:03Z | 2019-11-03T00:44:03Z | MEMBER | Might not need an incomplete boolean - may be possible to handle this with |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add "incomplete" boolean to users table for incomplete profiles 493671014 | |
549095217 | https://github.com/dogsheep/twitter-to-sqlite/issues/27#issuecomment-549095217 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27 | MDEyOklzc3VlQ29tbWVudDU0OTA5NTIxNw== | simonw 9599 | 2019-11-03T01:06:25Z | 2019-11-03T01:06:25Z | MEMBER | Wow, that It looks like this needs to be combined with this API - https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-retweets-id - to fetch the details of up to 100 recent users who actually DID retweet an individual status. But that has a one-every-12-seconds rate limit on it. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
retweets-of-me command 514459062 | |
549095317 | https://github.com/dogsheep/twitter-to-sqlite/issues/27#issuecomment-549095317 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27 | MDEyOklzc3VlQ29tbWVudDU0OTA5NTMxNw== | simonw 9599 | 2019-11-03T01:08:10Z | 2019-11-03T01:08:10Z | MEMBER | Hmm... one thing that could be useful is that I'm not sure if the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
retweets-of-me command 514459062 | |
549095463 | https://github.com/dogsheep/twitter-to-sqlite/issues/27#issuecomment-549095463 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27 | MDEyOklzc3VlQ29tbWVudDU0OTA5NTQ2Mw== | simonw 9599 | 2019-11-03T01:10:52Z | 2019-11-03T01:10:52Z | MEMBER | I imagine it won't, since the data I would be recording and then passing to |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
retweets-of-me command 514459062 | |
549095641 | https://github.com/dogsheep/twitter-to-sqlite/issues/27#issuecomment-549095641 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27 | MDEyOklzc3VlQ29tbWVudDU0OTA5NTY0MQ== | simonw 9599 | 2019-11-03T01:12:58Z | 2019-11-03T01:12:58Z | MEMBER | It looks like Twitter really want you to subscribe to a premium API for this kind of thing and consume retweets via webhooks: https://developer.twitter.com/en/docs/accounts-and-users/subscribe-account-activity/api-reference I'm going to give up on this for the moment. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
retweets-of-me command 514459062 | |
549096321 | https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-549096321 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3 | MDEyOklzc3VlQ29tbWVudDU0OTA5NjMyMQ== | simonw 9599 | 2019-11-03T01:27:55Z | 2019-11-03T01:28:17Z | MEMBER | It would be neat if this could support |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command for running a search and saving tweets for that search 488833975 | |
549226399 | https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-549226399 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3 | MDEyOklzc3VlQ29tbWVudDU0OTIyNjM5OQ== | simonw 9599 | 2019-11-04T05:11:57Z | 2019-11-04T05:11:57Z | MEMBER | I'm going to add a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command for running a search and saving tweets for that search 488833975 | |
549228535 | https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-549228535 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3 | MDEyOklzc3VlQ29tbWVudDU0OTIyODUzNQ== | simonw 9599 | 2019-11-04T05:31:55Z | 2019-11-04T05:31:55Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command for running a search and saving tweets for that search 488833975 | ||
549230337 | https://github.com/dogsheep/github-to-sqlite/issues/10#issuecomment-549230337 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/10 | MDEyOklzc3VlQ29tbWVudDU0OTIzMDMzNw== | simonw 9599 | 2019-11-04T05:47:18Z | 2019-11-04T05:47:18Z | MEMBER | This definition isn't quite right - it's not pulling the identity of the user who starred the repo ( |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add this repos_starred view 516967682 | |
549230583 | https://github.com/dogsheep/github-to-sqlite/pull/8#issuecomment-549230583 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8 | MDEyOklzc3VlQ29tbWVudDU0OTIzMDU4Mw== | simonw 9599 | 2019-11-04T05:49:26Z | 2019-11-04T05:49:26Z | MEMBER | Adding the view from #10 would be useful here too. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
stargazers command, refs #4 516763727 | |
549233778 | https://github.com/dogsheep/github-to-sqlite/pull/8#issuecomment-549233778 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8 | MDEyOklzc3VlQ29tbWVudDU0OTIzMzc3OA== | simonw 9599 | 2019-11-04T06:14:40Z | 2019-11-04T06:14:40Z | MEMBER | Spotted a tricky problem: running But then... when it gets to the I need to find a way of NOT over-writing a good record with a thinner one. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
stargazers command, refs #4 516763727 | |
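One common way to avoid clobbering a full user record with the thinner stargazer variant described above is to insert only when the row is missing. A minimal sqlite3 sketch (the table and column names are illustrative, not the actual github-to-sqlite schema):

```python
import sqlite3

def save_user(db, user):
    # INSERT OR IGNORE leaves any existing (richer) row untouched,
    # so a thin record fetched via the stargazers API cannot
    # overwrite a full profile fetched earlier.
    db.execute(
        "INSERT OR IGNORE INTO users (id, login, name) VALUES (?, ?, ?)",
        (user["id"], user["login"], user.get("name")),
    )

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, login TEXT, name TEXT)")
save_user(db, {"id": 1, "login": "simonw", "name": "Simon Willison"})
save_user(db, {"id": 1, "login": "simonw"})  # thin record, ignored
name = db.execute("SELECT name FROM users WHERE id = 1").fetchone()[0]
print(name)  # Simon Willison
```

A real fix might instead merge the two records column by column; this sketch only shows the simplest "first write wins" approach.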
550388354 | https://github.com/dogsheep/github-to-sqlite/issues/4#issuecomment-550388354 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4 | MDEyOklzc3VlQ29tbWVudDU1MDM4ODM1NA== | simonw 9599 | 2019-11-06T16:26:55Z | 2019-11-06T16:26:55Z | MEMBER | Here's a query I figured out using a window function that shows cumulative stargazers over time:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command to fetch stargazers for one or more repos 493670730 | |
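The SQL for the cumulative-stargazers query was lost in this export, but a running total over time is typically written with a window function. A sketch against a hypothetical `stars` table (window functions need SQLite 3.25+, which ships with modern Python builds):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE stars (starred_at TEXT)")
db.executemany(
    "INSERT INTO stars VALUES (?)",
    [("2019-01-01",), ("2019-01-02",), ("2019-01-02",), ("2019-01-05",)],
)
# COUNT(*) OVER an ordered window gives a running total; rows with
# tied timestamps are peers and share the same cumulative count.
rows = db.execute(
    """
    SELECT starred_at,
           COUNT(*) OVER (ORDER BY starred_at) AS cumulative_stars
    FROM stars
    ORDER BY starred_at
    """
).fetchall()
print(rows)
```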
550783316 | https://github.com/dogsheep/healthkit-to-sqlite/issues/10#issuecomment-550783316 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10 | MDEyOklzc3VlQ29tbWVudDU1MDc4MzMxNg== | simonw 9599 | 2019-11-07T05:16:56Z | 2019-11-07T05:34:29Z | MEMBER | It looks like Apple changed the location of these in iOS 13 - they are now in separate |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Failed to import workout points 519038979 | |
550806302 | https://github.com/dogsheep/healthkit-to-sqlite/issues/10#issuecomment-550806302 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10 | MDEyOklzc3VlQ29tbWVudDU1MDgwNjMwMg== | simonw 9599 | 2019-11-07T05:33:31Z | 2019-11-07T05:33:31Z | MEMBER | The XML now includes references to these new files: |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Failed to import workout points 519038979 | |
550824838 | https://github.com/dogsheep/healthkit-to-sqlite/issues/10#issuecomment-550824838 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10 | MDEyOklzc3VlQ29tbWVudDU1MDgyNDgzOA== | simonw 9599 | 2019-11-07T05:47:07Z | 2019-11-07T05:47:07Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Failed to import workout points 519038979 | ||
550828084 | https://github.com/dogsheep/healthkit-to-sqlite/issues/10#issuecomment-550828084 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10 | MDEyOklzc3VlQ29tbWVudDU1MDgyODA4NA== | simonw 9599 | 2019-11-07T05:49:24Z | 2019-11-07T05:49:24Z | MEMBER | So the fix there is going to be to detect the new This will be a little tricky because that function will need access to the zip file. It probably won't work at all for the mode where the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Failed to import workout points 519038979 | |
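The fix described above, reading newly referenced per-workout files out of the export zip, could be sketched like this (the `.gpx` path and XML shape here are assumptions for illustration, not the real HealthKit export format):

```python
import io
import zipfile
import xml.etree.ElementTree as ET

def read_gpx_points(zf, gpx_path):
    # Parse a workout route file referenced from export.xml,
    # reading it directly out of the still-open zip.
    with zf.open(gpx_path) as fp:
        tree = ET.parse(fp)
    return [(pt.get("lat"), pt.get("lon")) for pt in tree.iter("trkpt")]

# Build a tiny fake export zip to exercise the helper
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr(
        "workout-routes/route_1.gpx",
        '<gpx><trk><trkseg><trkpt lat="37.77" lon="-122.41"/></trkseg></trk></gpx>',
    )
with zipfile.ZipFile(buf) as zf:
    points = read_gpx_points(zf, "workout-routes/route_1.gpx")
print(points)  # [('37.77', '-122.41')]
```

As the comment notes, this only works while the zip is available, which is why the mode that streams the XML alone is the tricky case.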
552129686 | https://github.com/dogsheep/twitter-to-sqlite/issues/29#issuecomment-552129686 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29 | MDEyOklzc3VlQ29tbWVudDU1MjEyOTY4Ng== | simonw 9599 | 2019-11-09T19:27:39Z | 2019-11-09T19:27:39Z | MEMBER | I think this is fixed by the latest version of |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
`import` command fails on empty files 518725064 | |
552129921 | https://github.com/dogsheep/twitter-to-sqlite/issues/29#issuecomment-552129921 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29 | MDEyOklzc3VlQ29tbWVudDU1MjEyOTkyMQ== | simonw 9599 | 2019-11-09T19:30:42Z | 2019-11-09T19:30:42Z | MEMBER | Confirmed, that seems to fix it:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
`import` command fails on empty files 518725064 | |
552131798 | https://github.com/dogsheep/twitter-to-sqlite/issues/30#issuecomment-552131798 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30 | MDEyOklzc3VlQ29tbWVudDU1MjEzMTc5OA== | simonw 9599 | 2019-11-09T19:54:45Z | 2019-11-09T19:54:45Z | MEMBER | Good catch - not sure how that bug crept in. Removing line 116 looks like the right fix to me. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
`followers` fails because `transform_user` is called twice 518739697 | |
552133449 | https://github.com/dogsheep/twitter-to-sqlite/issues/29#issuecomment-552133449 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29 | MDEyOklzc3VlQ29tbWVudDU1MjEzMzQ0OQ== | simonw 9599 | 2019-11-09T20:15:15Z | 2019-11-09T20:15:15Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
`import` command fails on empty files 518725064 | ||
552133468 | https://github.com/dogsheep/twitter-to-sqlite/issues/30#issuecomment-552133468 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30 | MDEyOklzc3VlQ29tbWVudDU1MjEzMzQ2OA== | simonw 9599 | 2019-11-09T20:15:27Z | 2019-11-09T20:15:27Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
`followers` fails because `transform_user` is called twice 518739697 | ||
552133488 | https://github.com/dogsheep/twitter-to-sqlite/issues/28#issuecomment-552133488 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/28 | MDEyOklzc3VlQ29tbWVudDU1MjEzMzQ4OA== | simonw 9599 | 2019-11-09T20:15:42Z | 2019-11-09T20:15:42Z | MEMBER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add indexes to followers table 515658861 | ||
552135263 | https://github.com/dogsheep/twitter-to-sqlite/issues/31#issuecomment-552135263 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31 | MDEyOklzc3VlQ29tbWVudDU1MjEzNTI2Mw== | simonw 9599 | 2019-11-09T20:38:35Z | 2019-11-09T20:38:35Z | MEMBER | Command still needs documentation and a bit more testing. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"friends" command (similar to "followers") 520508502 | |
559883311 | https://github.com/dogsheep/github-to-sqlite/issues/14#issuecomment-559883311 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14 | MDEyOklzc3VlQ29tbWVudDU1OTg4MzMxMQ== | simonw 9599 | 2019-11-29T21:30:37Z | 2019-11-29T21:30:37Z | MEMBER | I should build the command to persist ETags and obey their polling guidelines:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command for importing events 530491074 | |
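Persisting ETags and replaying them as `If-None-Match` headers (GitHub's recommended polling pattern) might look like the sketch below; the store is a plain dict here rather than a database table, and no real HTTP call is made:

```python
class ETagStore:
    """Remember the ETag for each URL and skip unchanged responses."""

    def __init__(self):
        self.etags = {}

    def headers_for(self, url):
        # Send If-None-Match so GitHub can answer 304 Not Modified,
        # which does not count against the rate limit.
        etag = self.etags.get(url)
        return {"If-None-Match": etag} if etag else {}

    def record(self, url, status, etag):
        # Only a 200 response carries fresh data (and a new ETag);
        # a 304 means our stored state is still current.
        if status == 200 and etag:
            self.etags[url] = etag
        return status == 200

store = ETagStore()
url = "https://api.github.com/users/simonw/events"
print(store.headers_for(url))           # {}
store.record(url, 200, '"abc123"')
print(store.headers_for(url))           # {'If-None-Match': '"abc123"'}
print(store.record(url, 304, None))     # False - nothing new to save
```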
559902818 | https://github.com/dogsheep/github-to-sqlite/issues/14#issuecomment-559902818 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14 | MDEyOklzc3VlQ29tbWVudDU1OTkwMjgxOA== | simonw 9599 | 2019-11-30T01:32:38Z | 2019-11-30T01:32:38Z | MEMBER | Prototype:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command for importing events 530491074 | |
594151327 | https://github.com/dogsheep/github-to-sqlite/issues/12#issuecomment-594151327 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/12 | MDEyOklzc3VlQ29tbWVudDU5NDE1MTMyNw== | simonw 9599 | 2020-03-03T20:26:15Z | 2020-03-03T20:32:23Z | MEMBER | Better version (since this also includes JSON array of repository topics):
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add this view for seeing new releases 520756546 | |
594154644 | https://github.com/dogsheep/github-to-sqlite/pull/8#issuecomment-594154644 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8 | MDEyOklzc3VlQ29tbWVudDU5NDE1NDY0NA== | simonw 9599 | 2020-03-03T20:33:57Z | 2020-03-03T20:33:57Z | MEMBER |
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
stargazers command, refs #4 516763727 | |
594155249 | https://github.com/dogsheep/github-to-sqlite/issues/12#issuecomment-594155249 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/12 | MDEyOklzc3VlQ29tbWVudDU5NDE1NTI0OQ== | simonw 9599 | 2020-03-03T20:35:17Z | 2020-03-03T20:35:17Z | MEMBER |
I think that approach can be improved by first checking if the view exists, then dropping it, then recreating it. Could even try to see if the view exists and matches what we were going to set it to, and do nothing if that is the case. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add this view for seeing new releases 520756546 | |
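The check-then-recreate idea can be sketched with plain sqlite3: look the view up in `sqlite_master`, and only drop and recreate it when its stored SQL differs from what we want (helper and view names here are illustrative):

```python
import sqlite3

def ensure_view(db, name, select_sql):
    # sqlite_master stores the CREATE VIEW statement verbatim,
    # so we can compare before touching anything.
    row = db.execute(
        "SELECT sql FROM sqlite_master WHERE type = 'view' AND name = ?",
        (name,),
    ).fetchone()
    create = f"CREATE VIEW {name} AS {select_sql}"
    if row and row[0] == create:
        return "unchanged"
    if row:
        db.execute(f"DROP VIEW {name}")
    db.execute(create)
    return "created"

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE releases (id INTEGER, published_at TEXT)")
first = ensure_view(db, "recent_releases",
                    "SELECT * FROM releases ORDER BY published_at DESC")
second = ensure_view(db, "recent_releases",
                     "SELECT * FROM releases ORDER BY published_at DESC")
print(first, second)  # created unchanged
```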
597354514 | https://github.com/dogsheep/github-to-sqlite/issues/17#issuecomment-597354514 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/17 | MDEyOklzc3VlQ29tbWVudDU5NzM1NDUxNA== | simonw 9599 | 2020-03-10T22:37:45Z | 2020-03-10T22:37:45Z | MEMBER | I should add an option to stop the moment you see a commit you have fetched before. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command for importing commits 578883725 | |
597358364 | https://github.com/dogsheep/github-to-sqlite/issues/17#issuecomment-597358364 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/17 | MDEyOklzc3VlQ29tbWVudDU5NzM1ODM2NA== | simonw 9599 | 2020-03-10T22:50:20Z | 2020-03-11T01:18:36Z | MEMBER | By default it will stop when it sees a commit that has already been stored. You will be able to override that behaviour using |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Command for importing commits 578883725 | |
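The stop-on-seen behaviour could be sketched as a generator that walks pages of commits and halts at the first SHA already stored, with a flag to force a full re-fetch (all names here are illustrative, not the actual github-to-sqlite internals):

```python
def new_commits(pages, known_shas, all_pages=False):
    # Yield commits until we hit one we have already stored,
    # unless all_pages forces walking every page regardless.
    for page in pages:
        for commit in page:
            if commit["sha"] in known_shas and not all_pages:
                return
            yield commit

pages = [
    [{"sha": "c3"}, {"sha": "c2"}],
    [{"sha": "c1"}],  # c1 is already in the database
]
fresh = [c["sha"] for c in new_commits(pages, known_shas={"c1"})]
everything = list(new_commits(pages, known_shas={"c1"}, all_pages=True))
print(fresh)            # ['c3', 'c2']
print(len(everything))  # 3
```

Stopping early is safe because the commits API returns newest first, so anything after a known commit has been seen already.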
601861908 | https://github.com/dogsheep/twitter-to-sqlite/issues/34#issuecomment-601861908 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/34 | MDEyOklzc3VlQ29tbWVudDYwMTg2MTkwOA== | simonw 9599 | 2020-03-20T18:56:44Z | 2020-03-20T18:56:44Z | MEMBER | Could this be a bug in |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
IndexError running user-timeline command 585266763 |
CREATE TABLE [issue_comments] ( [html_url] TEXT, [issue_url] TEXT, [id] INTEGER PRIMARY KEY, [node_id] TEXT, [user] INTEGER REFERENCES [users]([id]), [created_at] TEXT, [updated_at] TEXT, [author_association] TEXT, [body] TEXT, [reactions] TEXT, [issue] INTEGER REFERENCES [issues]([id]) , [performed_via_github_app] TEXT); CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]); CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);