issue_comments
996 rows where author_association = "NONE" sorted by html_url
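The listing below is the result of a simple filtered, sorted query with grouped-count facets. A minimal sketch of the equivalent SQL, run against a toy in-memory table that mimics the `issue_comments` schema (the column names `html_url`, `author_association`, `issue`, and `user` are assumed from the filter and facets shown on this page):

```python
import sqlite3

# Toy stand-in for the github-to-sqlite issue_comments table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE issue_comments (html_url TEXT, author_association TEXT, "
    "issue INTEGER, user INTEGER)"
)
conn.executemany(
    "INSERT INTO issue_comments VALUES (?, ?, ?, ?)",
    [
        ("https://github.com/simonw/datasette/issues/1#c1", "NONE", 1, 101),
        ("https://github.com/simonw/datasette/issues/1#c2", "OWNER", 1, 102),
        ("https://github.com/simonw/datasette/issues/2#c3", "NONE", 2, 101),
    ],
)

# The page's filter: rows where author_association = "NONE", sorted by html_url.
rows = conn.execute(
    "SELECT html_url FROM issue_comments "
    "WHERE author_association = 'NONE' ORDER BY html_url"
).fetchall()
print(len(rows))  # 2 in this toy data; 996 on the live page

# A facet is just a grouped count over the same filter, e.g. the "user" facet:
facet = conn.execute(
    "SELECT user, count(*) FROM issue_comments "
    "WHERE author_association = 'NONE' GROUP BY user ORDER BY count(*) DESC"
).fetchall()
print(facet)  # [(101, 2)]
```

The `issue` and `user` lists that follow are exactly this kind of grouped count, ordered by frequency.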
issue facet (622 values)
- Transformation type `--type DATETIME` 14
- link_or_copy_directory() error - Invalid cross-device link 13
- WIP: Add Gmail takeout mbox import 12
- .json and .csv exports fail to apply base_url 11
- base_url configuration setting 10
- Extract columns cannot create foreign key relation: sqlite3.OperationalError: table sqlite_master may not be modified 10
- Documentation with recommendations on running Datasette in production without using Docker 9
- JavaScript plugin hooks mechanism similar to pluggy 9
- Add GraphQL endpoint 8
- Call for birthday presents: if you're using Datasette, let us know how you're using it here 8
- Full text search of all tables at once? 7
- Populate "endpoint" key in ASGI scope 7
- Figure out some interesting example SQL queries 7
- Add Gmail takeout mbox import (v2) 7
- Incorrect URLs when served behind a proxy with base_url set 6
- publish heroku does not work on Windows 10 6
- Update for Big Sur 6
- Improve the display of facets information 6
- De-tangling Metadata before Datasette 1.0 6
- Metadata should be a nested arbitrary KV store 5
- Windows installation error 5
- Ways to improve fuzzy search speed on larger data sets? 5
- Redesign default .json format 5
- UNIQUE constraint failed: workouts.id 5
- Feature Request: Gmail 5
- Plugin hook for dynamic metadata 5
- i18n support 5
- datasette --root running in Docker doesn't reliably show the magic URL 5
- Datasette serve should accept paths/URLs to CSVs and other file formats 4
- Mechanism for ranking results from SQLite full-text search 4
- Port Datasette to ASGI 4
- Wildcard support in query parameters 4
- Handle really wide tables better 4
- Prototoype for Datasette on PostgreSQL 4
- Support column descriptions in metadata.json 4
- .delete_where() does not auto-commit (unlike .insert() or .upsert()) 4
- "Stream all rows" is not at all obvious 4
- Possible to deploy as a python app (for Rstudio connect server)? 4
- Document how to send multiple values for "Named parameters" 4
- Add support for Jinja2 version 3.0 4
- Win32 "used by another process" error with datasette publish 4
- introduce new option for datasette package to use a slim base image 4
- CLI eats my cursor 4
- datasette package --spatialite throws error during build 4
- How to redirect from "/" to a specific db/table 4
- Package as standalone binary 3
- Plugin that adds an authentication layer of some sort 3
- datasette publish lambda plugin 3
- Explore if SquashFS can be used to shrink size of packaged Docker containers 3
- make uvicorn optional dependancy (because not ok on windows python yet) 3
- bump uvicorn to 0.9.0 to be Python-3.8 friendly 3
- updating metadata.json without recreating the app 3
- upsert_all() throws issue when upserting to empty table 3
- base_url doesn't seem to work when adding criteria and clicking "apply" 3
- Fallback to databases in inspect-data.json when no -i options are passed 3
- Some workout columns should be float, not text 3
- Archive import appears to be broken on recent exports 3
- Use structlog for logging 3
- KeyError: 'Contents' on running upload 3
- photo-to-sqlite: command not found 3
- sqlite-utils extract could handle nested objects 3
- Update Docker Spatialite version to 5.0.1 + add support for Spatialite topology functions 3
- improve table horizontal scroll experience 3
- feature: support "events" 3
- Rename Datasette.__init__(config=) parameter to settings= 3
- [Enhancement] Please allow 'insert-files' to insert content as text. 3
- KeyError: 'created_at' for private accounts? 3
- JSON link on row page is 404 if base_url setting is used 3
- Creating tables with custom datatypes 3
- query result page is using 400mb of browser memory 40x size of html page and 400x size of csv data 3
- SQL query field can't begin by a comment 3
- Feature request: output number of ignored/replaced rows for insert command 3
- Expand foreign key references in row view as well 3
- When reverse proxying datasette with nginx an URL element gets erronously added 3
- Link to JSON for the list of tables 2
- Option to open readonly but not immutable 2
- Support WITH query 2
- I18n and L10n support 2
- add "format sql" button to query page, uses sql-formatter 2
- 500 from missing table name 2
- Ability to sort (and paginate) by column 2
- Figure out how to bundle a more up-to-date SQLite 2
- Escaping named parameters in canned queries 2
- Validate metadata.json on startup 2
- Support cross-database joins 2
- datasette inspect takes a very long time on large dbs 2
- Installation instructions, including how to use the docker image 2
- Problems handling column names containing spaces or - 2
- Zeit API v1 does not work for new users - need to migrate to v2 2
- How to pass named parameter into spatialite MakePoint() function 2
- Datasette Library 2
- Mechanism for turning nested JSON into foreign keys / many-to-many 2
- Too many SQL variables 2
- "Invalid SQL" page should let you edit the SQL 2
- Support Python 3.8, stop supporting Python 3.5 2
- Make database level information from metadata.json available in the index.html template 2
- Mechanism for adding arbitrary pages like /about 2
- Exception running first command: IndexError: list index out of range 2
- Allow creation of virtual tables at startup 2
- Escape_fts5_query-hookimplementation does not work with queries to standard tables 2
- Allow injecting configuration data from plugins 2
- --cp option for datasette publish and datasette package for shipping additional files and directories 2
- ?_searchmode=raw option for running FTS searches without escaping characters 2
- Authentication (and permissions) as a core concept 2
- Update aiofiles requirement from ~=0.4.0 to >=0.4,<0.6 2
- [Feature Request] Support Repo Name in Search 🥺 2
- Consider pagination of canned queries 2
- initial windows ci setup 2
- github-to-sqlite should handle rate limits better 2
- .extract() shouldn't extract null values 2
- Make it possible to download BLOB data from the Datasette UI 2
- changes to allow for compound foreign keys 2
- Support for generated columns 2
- sqlite-utils should suggest --csv if JSON parsing fails 2
- Better error message for *_fts methods against views 2
- Access Denied Error in Windows 2
- Not all quoted statuses get fetched? 2
- SSL Error 2
- Installing datasette via docker: Path 'fixtures.db' does not exist 2
- Share button for copying current URL 2
- Facets timing out but work when filtering 2
- I'm creating a plugin to export a spreadsheet file (.ods or .xlsx) 2
- Update itsdangerous requirement from ~=1.1 to >=1.1,<3.0 2
- bool type not supported 2
- Cannot set type JSON 2
- basic support for events 2
- Serve all db files in a folder 2
- feature request: document minimum permissions for service account for cloudrun 2
- Manage /robots.txt in Datasette core, block robots by default 2
- Deploy a live instance of demos/apache-proxy 2
- Use datasette-table Web Component to guide the design of the JSON API for 1.0 2
- Support for CHECK constraints 2
- Table+query JSON and CSV links broken when using `base_url` setting 2
- Make it easier to insert geometries, with documentation and maybe code 2
- base_url or prefix does not work with _exact match 2
- `deterministic=True` fails on versions of SQLite prior to 3.8.3 2
- [feature] immutable mode for a directory, not just individual sqlite file 2
- `sqlite3.NotSupportedError`: deterministic=True requires SQLite 3.8.3 or higher 2
- Research: demonstrate if parallel SQL queries are worthwhile 2
- Allow making m2m relation of a table to itself 2
- illegal UTF-16 surrogate 2
- Reading rows from a file => AttributeError: '_io.StringIO' object has no attribute 'readinto' 2
- Ability to insert multi-line files 2
- Setting to turn off table row counts entirely 2
- devrel/python api: Pylance type hinting 2
- Reconsider the Datasette first-run experience 2
- don't use immutable=1, only mode=ro 2
- Datasette with many and large databases > Memory use 2
- Cannot enable FTS5 despite it being available 2
- DOC: xml.etree.ElementTree.ParseError due to healthkit version 12 2
- Suggestion: Hiding columns 2
- How to use Datasette with apache webserver on GCP? 2
- Character encoding problem 2
- feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 2
- GitHub Action to lint Python code with ruff 2
- 500 "attempt to write a readonly database" error caused by "PRAGMA schema_version" 2
- photos-to-sql not found? 2
- Permissions in metadata.yml / metadata.json 2
- [feature request]`datasette install plugins.json` options 2
- Plugin hook for database queries that are run 2
- TemplateAssertionError: no filter named 'tojson' 1
- TemplateAssertionError: no filter named 'tojson' 1
- datasette publish can fail if /tmp is on a different device 1
- apsw as alternative sqlite3 binding (for full text search) 1
- Ability to customize presentation of specific columns in HTML view 1
- A primary key column that has foreign key restriction associated won't rendering label column 1
- proposal new option to disable user agents cache 1
- Ability to bundle metadata and templates inside the SQLite file 1
- Cleaner mechanism for handling custom errors 1
- Allow plugins to define additional URL routes and views 1
- prepare_context() plugin hook 1
- SQLite code decoupled from Datasette 1
- Add new metadata key persistent_urls which removes the hash from all database urls 1
- Add links to example Datasette instances to appropiate places in docs 1
- Documentation for URL hashing, redirects and cache policy 1
- Handle spatialite geometry columns better 1
- Support for external database connectors 1
- Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way 1
- render_cell(value) plugin hook 1
- Search all apps during heroku publish 1
- CSV export in "Advanced export" pane doesn't respect query 1
- How to pass configuration to plugins? 1
- How does persistence work? 1
- .insert/.upsert/.insert_all/.upsert_all should add missing columns 1
- Add query parameter to hide SQL textarea 1
- Upgrade to Jinja2==2.10.1 1
- Option to facet by date using month or year 1
- Additional options to gcloud build command in cloudrun - timeout 1
- Accessibility for non-techie newsies? 1
- Exporting sqlite database(s)? 1
- Option to display binary data 1
- Get Datasette tests passing on Windows in GitHub Actions 1
- "about" parameter in metadata does not appear when alone 1
- Is it possible to publish to Heroku despite slug size being too large? 1
- Handle case-insensitive headers in a nicer way 1
- Stream all results for arbitrary SQL and canned queries 1
- Use keyed rows - fixes #521 1
- Support unicode in url 1
- extracts= option for insert/update/etc 1
- Unexpected keyword argument 'hidden' 1
- Datasette Edit 1
- Ability to list views, and to access db["view_name"].rows / rows_where / etc 1
- Added support for multi arch builds 1
- Queries per DB table in metadata.json 1
- upgrade to uvicorn-0.9 to be Python-3.8 friendly 1
- Support queries at the table level 1
- Datasette FTS detection bug 1
- "friends" command (similar to "followers") 1
- Publish to Heroku is broken: "WARNING: You must pass the application as an import string to enable 'reload' or 'workers" 1
- Feature request: enable extensions loading 1
- Implement ON DELETE and ON UPDATE actions for foreign keys 1
- fts5 syntax error when using punctuation 1
- Assets table with downloads 1
- order_by mechanism 1
- How do I use the app.css as style sheet? 1
- --port option to expose a port other than 8001 in "datasette package" 1
- Problem with square bracket in CSV column name 1
- Cashe-header missing in http-response 1
- Ability to customize columns used by extracts= feature 1
- datasette publish cloudrun --memory option 1
- Adding a "recreate" flag to the `Database` constructor 1
- Only set .last_rowid and .last_pk for single update/inserts, not for .insert_all()/.upsert_all() with multiple records 1
- Import EXIF data into SQLite - lens used, ISO, aperture etc 1
- Integrate image content hashing 1
- Error when I click on "View and edit SQL" 1
- strange behavior using accented characters 1
- Replace "datasette publish --extra-options" with "--setting" 1
- Fall back to authentication via ENV 1
- Expose scores from ZCOMPUTEDASSETATTRIBUTES 1
- Question: Access to immutable database-path 1
- fts search on a column doesn't work anymore due to escape_fts 1
- Ability to serve thumbnailed Apple Photo from its place on disk 1
- bpylist.archiver.CircularReference: archive has a cycle with uid(13) 1
- Enable wildcard-searches by default 1
- Invalid SQL no such table: main.uploads 1
- Error pages not correctly loading CSS 1
- Group permission checks by request on /-/permissions debug page 1
- Reload support for config_dir mode. 1
- Fall back to FTS4 if FTS5 is not available 1
- Update pytest-asyncio requirement from <0.13,>=0.10 to >=0.10,<0.15 1
- Magic parameters for canned queries 1
- New pattern for views that return either JSON or HTML, available for plugins 1
- Skip counting hidden tables 1
- Load only python files from plugins-dir. 1
- Use None as a default arg 1
- Don't install tests package 1
- Feature: pull request reviews and comments 1
- Update pytest requirement from <5.5.0,>=5.2.2 to >=5.2.2,<6.1.0 1
- Support reverse pagination (previous page, has-previous-items) 1
- Travis should not build the master branch, only the main branch 1
- 'datasette --get' option, refs #926 1
- Don't hang in db.execute_write_fn() if connection fails 1
- Run CI on GitHub Actions, not Travis 1
- Try out CodeMirror SQL hints 1
- favorites --stop_after=N stops after min(N, 200) 1
- request an "-o" option on "datasette server" to open the default browser at the running url 1
- Idea: transitive closure tables for tree structures 1
- Progress bar for sqlite-utils insert 1
- Update pytest requirement from <6.1.0,>=5.2.2 to >=5.2.2,<6.2.0 1
- Allow facet by primary keys, fixes #985 1
- Redesign application homepage 1
- Run tests against Python 3.9 1
- Document setting Google Cloud SDK properties 1
- datasette.client internal requests mechanism 1
- from_json jinja2 filter 1
- Add json_loads and json_dumps jinja2 filters 1
- Update janus requirement from <0.6,>=0.4 to >=0.4,<0.7 1
- Update asgiref requirement from ~=3.2.10 to >=3.2.10,<3.4.0 1
- Fix table name in spatialite example command 1
- About loading spatialite 1
- export.xml file name varies with different language settings 1
- Make `package` command deal with a configuration directory argument 1
- Bring date parsing into Datasette core 1
- DOC: Fix syntax error 1
- /db/table/-/blob/pk/column.blob download URL 1
- Include LICENSE in sdist 1
- Add minimum supported python 1
- Add template block prior to extra URL loaders 1
- Switch to .blob render extension for BLOB downloads 1
- Radical new colour scheme and base styles, courtesy of @natbat 1
- Update aiofiles requirement from <0.6,>=0.4 to >=0.4,<0.7 1
- New explicit versioning mechanism 1
- .blob output renderer 1
- Nav menu plus menu_links() hook 1
- load_template() plugin hook 1
- DigitalOcean buildpack memory errors for large sqlite db? 1
- Use FTS4 in fixtures 1
- import EX_CANTCREAT means datasette fails to work on Windows 1
- SQLite does not have case sensitive columns 1
- Use f-strings 1
- Discussion: Adding support for fetching only fresh tweets 1
- Fix --metadata doc usage 1
- GENERATED column support 1
- generated_columns table in fixtures.py 1
- Fix misaligned table actions cog 1
- Fix startup error on windows 1
- Fix footer not sticking to bottom in short pages 1
- "_searchmode=raw" throws an index out of range error when combined with "_search_COLUMN" 1
- sqlite3.OperationalError: near "(": syntax error 1
- More flexible CORS support in core, to encourage good security practices 1
- JavaScript to help plugins interact with the fragment part of the URL 1
- Update pytest requirement from <6.2.0,>=5.2.2 to >=5.2.2,<6.3.0 1
- killed by oomkiller on large location-history 1
- Maintain an in-memory SQLite table of connected databases and their tables 1
- --since support for favorites 1
- Modernize code to Python 3.6+ 1
- Mechanism for executing JavaScript unit tests 1
- Adopt Prettier for JavaScript code formatting 1
- Install Prettier via package.json 1
- GitHub Actions workflow to build and sign macOS binary executables 1
- Certain database names results in 404: "Database not found: None" 1
- Add fts offset docs. 1
- XML parse error 1
- WIP: Plugin includes 1
- Release 0.54 1
- Immutable Database w/ Canned Queries 1
- Use context manager instead of plain open 1
- /usr/local/opt/python3/bin/python3.6: bad interpreter: No such file or directory 1
- Add compile option to Dockerfile to fix failing test (fixes #696) 1
- Error reading csv files with large column data 1
- --no-headers option for CSV and TSV 1
- 500 error caused by faceting if a column called `n` exists 1
- ensure immutable databses when starting in configuration directory mode with 1
- Vega charts are plotted only for rows on the visible page, cluster maps only for rows in the remaining pages 1
- --crossdb option for joining across databases 1
- Custom pages don't work with base_url setting 1
- Allow facetting on custom queries 1
- fix small typo 1
- Sticky table column headers would be useful, especially on the query page 1
- Async support 1
- Add back styling to lists within table cells (fixes #1141) 1
- Capture "Ctrl + Enter" or "⌘ + Enter" to send SQL query? 1
- Minor type in IP adress 1
- Allow canned query params to specify default values 1
- Fix: code quality issues 1
- Escaping FTS search strings 1
- Some links aren't properly URL encoded. 1
- FTS quote functionality from datasette 1
- Plugin hook that could support 'order by random()' for table view 1
- Support for HTTP Basic Authentication 1
- support for Apache Arrow / parquet files I/O 1
- Full text search possibly broken? 1
- Use SQLite conn.interrupt() instead of sqlite_timelimit() 1
- Unit tests for the Dockerfile 1
- Invalid SQL: "no such table: pragma_database_list" on database page 1
- Minor Docs Update. Added `--app` to fly install command. 1
- Support to annotate photos on other than macOS OSes 1
- Add testres-db tool 1
- Fix little typo 1
- Better default display of arrays of items 1
- Use pytest-xdist to speed up tests 1
- Update docs: explain allow_download setting 1
- Dockerfile: use Ubuntu 20.10 as base 1
- Update pytest-asyncio requirement from <0.15,>=0.10 to >=0.10,<0.16 1
- Avoid error sorting by relationships if related tables are not allowed 1
- Bump black from 20.8b1 to 21.4b0 1
- Bump black from 20.8b1 to 21.4b1 1
- Bump black from 20.8b1 to 21.4b2 1
- Upgrade to GitHub-native Dependabot 1
- Bump black from 21.4b2 to 21.5b0 1
- Add Docker multi-arch support with Buildx 1
- Bump black from 21.4b2 to 21.5b1 1
- Update click requirement from ~=7.1.1 to >=7.1.1,<8.1.0 1
- Update jinja2 requirement from <2.12.0,>=2.10.3 to >=2.10.3,<3.1.0 1
- Support Unicode characters in metadata.json 1
- Update aiofiles requirement from <0.7,>=0.4 to >=0.4,<0.8 1
- Fix small typo 1
- ?_col=/?_nocol= to show/hide columns on the table page 1
- Re-display user's query with an error message if an error occurs 1
- DRAFT: add test and scan for docker images 1
- Error: Use either --since or --since_id, not both 1
- Using enable_fts before search term 1
- Make custom pages compatible with base_url setting 1
- Consider using CSP to protect against future XSS 1
- Update trustme requirement from <0.8,>=0.7 to >=0.7,<0.9 1
- Bump black from 21.5b2 to 21.6b0 1
- JSON export dumps JSON fields as TEXT 1
- sqlite-utils memory command for directly querying CSV/JSON data 1
- add -h support closes #276 1
- Update pytest-xdist requirement from <2.3,>=2.2.1 to >=2.2.1,<2.4 1
- Mypy fixes for rows_from_file() 1
- Test against Python 3.10-dev 1
- Fix + improve get_metadata plugin hook docs 1
- Update asgiref requirement from <3.4.0,>=3.2.10 to >=3.2.10,<3.5.0 1
- absolute_url() behind a proxy assembles incorrect http://127.0.0.1:8001/ URLs 1
- Option for importing CSV data using the SQLite .import mechanism 1
- Documentation on using Datasette as a library 1
- Bump black from 21.6b0 to 21.7b0 1
- Read lines with JSON object 1
- 403 when getting token 1
- sqlite-utils convert command and db[table].convert(...) method 1
- Spelling corrections plus CI job for codespell 1
- Show count of facet values if ?_facet_size=max 1
- `sqlite-utils insert --flatten` option to flatten nested JSON 1
- Add reference page to documentation using Sphinx autodoc 1
- Column metadata 1
- Update trustme requirement from <0.9,>=0.7 to >=0.7,<0.10 1
- Rethink how .ext formats (v.s. ?_format=) works before 1.0 1
- Add --merged-by flag to pull-requests sub command 1
- Duplicate Column 1
- Make sure that case-insensitive column names are unique 1
- Ability to insert file contents as text, in addition to blob 1
- Update pluggy requirement from ~=0.13.0 to >=0.13,<1.1 1
- Bump black from 21.7b0 to 21.8b0 1
- xml.etree.ElementTree.Parse Error - mismatched tag 1
- Correct naming of tool in readme 1
- Update beautifulsoup4 requirement from <4.10.0,>=4.8.1 to >=4.8.1,<4.11.0 1
- Test against 3.10-dev 1
- Add Authorization header when CORS flag is set 1
- Bump black from 21.7b0 to 21.9b0 1
- Update pytest-xdist requirement from <2.4,>=2.2.1 to >=2.2.1,<2.5 1
- Invalid JSON output when no rows 1
- Fix compatibility with Python 3.10 1
- Update pytest-timeout requirement from <1.5,>=1.4.2 to >=1.4.2,<2.1 1
- Test against Python 3.10 1
- Update pytest-asyncio requirement from <0.16,>=0.10 to >=0.10,<0.17 1
- Publish to Docker Hub failing with "libcrypt.so.1: cannot open shared object file" 1
- Add functionality to read Parquet files. 1
- Bump black from 21.9b0 to 21.10b0 1
- Default values for `--attach` and `--param` options 1
- Datasette should have an option to output CSV with semicolons 1
- Update docutils requirement from <0.18 to <0.19 1
- New pattern for async view classes 1
- Bump black from 21.9b0 to 21.11b0 1
- Bump black from 21.9b0 to 21.11b1 1
- base_url is omitted in JSON and CSV views 1
- Add new `"sql_file"` key to Canned Queries in metadata? 1
- Update janus requirement from <0.7,>=0.6.2 to >=0.6.2,<0.8 1
- Execution on Windows 1
- Update aiofiles requirement from <0.8,>=0.4 to >=0.4,<0.9 1
- Test against pysqlite3 running SQLite 3.37 1
- Bump black from 21.11b1 to 21.12b0 1
- Update pytest-xdist requirement from <2.5,>=2.2.1 to >=2.2.1,<2.6 1
- Data Pull fails for "Essential" level access to the Twitter API (for Documentation) 1
- TableView refactor 1
- filters_from_request plugin hook, now used in TableView 1
- Update janus requirement from <0.8,>=0.6.2 to >=0.6.2,<1.1 1
- --lines and --text and --convert and --import 1
- Initial prototype of .analyze() methods 1
- `sqlite-utils bulk` command 1
- Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.18 1
- Add new spatialite helper methods 1
- Update pytest-timeout requirement from <2.1,>=1.4.2 to >=1.4.2,<2.2 1
- Documentation should clarify /stable/ vs /latest/ 1
- Potential simplified publishing mechanism 1
- Bump black from 21.12b0 to 22.1.0 1
- Ensure template_path always uses "/" to match jinja 1
- Reconsider policy on blocking queries containing the string "pragma" 1
- Test against Python 3.11-dev 1
- Index page `/` has no CORS headers 1
- Try test suite against macOS and Windows 1
- sqlite3.OperationalError: no such table: main.my_activity 1
- Update pytest requirement from <6.3.0,>=5.2.2 to >=5.2.2,<7.1.0 1
- Advanced class-based `conversions=` mechanism 1
- Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.19 1
- Update Dockerfile generated by `datasette publish` 1
- Add SpatiaLite helpers to CLI 1
- Configuration directory mode does not pick up other file extensions than .db 1
- Optional Pandas integration 1
- Use dash encoding for table names and row primary keys in URLs 1
- Add /opt/homebrew to where spatialite extension can be found 1
- Update pytest requirement from <7.1.0,>=5.2.2 to >=5.2.2,<7.2.0 1
- Tilde encoding 1
- Options for how `r.parsedate()` should handle invalid dates 1
- insert fails on JSONL with whitespace 1
- Ignore common generated files 1
- Document how to use a `--convert` function that runs initialization code first 1
- "Error: near "(": syntax error" when using sqlite-utils indexes CLI 1
- Update jinja2 requirement from <3.1.0,>=2.10.3 to >=2.10.3,<3.2.0 1
- Bump black from 22.1.0 to 22.3.0 1
- Update click requirement from <8.1.0,>=7.1.1 to >=7.1.1,<8.2.0 1
- Update beautifulsoup4 requirement from <4.11.0,>=4.8.1 to >=4.8.1,<4.12.0 1
- Datasette feature for publishing snapshots of query results 1
- Add timeout option to Cloudrun build 1
- Custom page variables aren't decoded 1
- Document how to use `PRAGMA temp_store` to avoid errors when running VACUUM against huge databases 1
- When running `auth` command, don't overwrite an existing auth.json file 1
- Misleading progress bar against utf-16-le CSV input 1
- Add scrollbars to table presentation in default layout 1
- Combining `rows_where()` and `search()` to limit which rows are searched 1
- Bump furo from 2022.4.7 to 2022.6.4.1 1
- Extract facet portions of table.html out into included templates 1
- Bump furo from 2022.4.7 to 2022.6.21 1
- Bump black from 22.1.0 to 22.6.0 1
- Keep track of config_dir 1
- Add duplicate table feature 1
- Update pytest-asyncio requirement from <0.19,>=0.17 to >=0.17,<0.20 1
- minor a11y: <select> has no visual indicator when tabbed to 1
- in extract code, check equality with IS instead of = for nulls 1
- feature request: pivot command 1
- Link to installation instructions 1
- Cross-link CLI to Python docs 1
- Discord badge 1
- beanbag-docutils>=2.0 1
- -a option is used for "--auth" and for "--all" 1
- Updating metadata.json on Datasette for MacOS 1
- db[table].create(..., transform=True) and create-table --transform 1
- Test `--load-extension` in GitHub Actions 1
- sqlite-utils query --functions mechanism for registering extra functions 1
- Support entrypoints for `--load-extension` 1
- Add an option for specifying column names when inserting CSV data 1
- Conda Forge 1
- search_sql add include_rank option 1
- Don't use upper bound dependencies, refs #1800 1
- Workaround for test failure: RuntimeError: There is no current event loop 1
- Add organization support to repos command 1
- truncate_cells_html does not work for links? 1
- progressbar for inserts/upserts of all fileformats, closes #485 1
- Specify foreign key against compound key in other table 1
- Database() constructor currently defaults is_mutable to False 1
- `sqlite-utils transform` should set empty strings to null when converting text columns to integer/float 1
- Bump furo from 2022.6.21 to 2022.9.15 1
- [SPIKE] Don't truncate query CSVs 1
- Keyword-only arguments for a bunch of internal methods 1
- Convert &_hide_sql=1 to #_hide_sql 1
- Add documentation for serving via OpenRC 1
- render_cell documentation example doesn't match the method signature 1
- Bump furo from 2022.9.15 to 2022.9.29 1
- use inspect data for hash and file size 1
- Make hash and size a lazy property 1
- Open Datasette link in new tab 1
- fix: enable-fts permanently save triggers 1
- feat: recreate fts triggers after table transform 1
- check_visibility can now take multiple permissions into account 1
- API to insert a single record into an existing table 1
- Default API token authentication mechanism 1
- Allow surrogates in parameters 1
- /db/table/-/upsert API 1
- Errors when using table filters behind a proxy 1
- Merge 1.0-dev branch back to main 1
- Upgrade to CodeMirror 6, add SQL autocomplete 1
- Use DOMContentLoaded instead of load event for CodeMirror initialization 1
- Typo in JSON API `Updating a row` documentation 1
- /db/table/-/upsert 1
- Bump furo from 2022.9.29 to 2022.12.7 1
- "permissions" blocks in metadata.json/yaml 1
- register_permissions() plugin hook 1
- invoke_startup() is not run in some conditions, e.g. gunicorn/uvicorn workers, breaking lots of things 1
- Port as many tests as possible to async def tests against ds_client 1
- Bump sphinx from 5.3.0 to 6.0.0 1
- Bump sphinx from 5.3.0 to 6.1.0 1
- Bump sphinx from 5.3.0 to 6.1.1 1
- Bump blacken-docs from 1.12.1 to 1.13.0 1
- Stuck on loading screen 1
- Document custom json encoder 1
- ?_extra= support (draft) 1
- Datasette is not compatible with SQLite's strict quoting compilation option 1
- Show referring tables and rows when the referring foreign key is compound 1
- use single quotes for string literals, fixes #2001 1
- array facet: don't materialize unnecessary columns 1
- Deploy demo job is failing due to rate limit 1
- Error 500 - not clear the cause 1
- Error: Invalid setting 'hash_urls' in settings.json in 0.64.1 1
- add Python 3.11 classifier 1
- remove an unused `app` var in cli.py 1
- Potential feature: special support for `?a=1&a=2` on the query page 1
- Increase performance using macnotesapp 1
- Add paths for homebrew on Apple silicon 1
- Bump furo from 2022.12.7 to 2023.3.23 1
- Add permalink virtual field to items table 1
- rows: --transpose or psql extended view-like functionality 1
- Make detailed notes on how table, query and row views work right now 1
- Add paths for homebrew on Apple silicon 1
- Support self-referencing FKs in `Table.create` 1
- Litestream-restored db: errors on 3.11 and 3.10.8; but works on py3.10.7 and 3.10.6 1
- `table.upsert_all` fails to write rows when `not_null` is present 1
- [BUG] Cannot insert new data to deployed instance 1
- sphinx.builders.linkcheck build error 1
- Bump sphinx from 6.1.3 to 7.0.1 1
- Analyze tables options: --common-limit, --no-most, --no-least 1
- TUI powered by Trogon 1
- Reformatted CLI examples in docs 1
- Bump furo from 2023.3.27 to 2023.5.20 1
- `IndexError` when doing `.insert(..., pk='id')` after `insert_all` 1
- New View base class 1
- `--settings settings.json` option 1
- Use sqlean if available in environment 1
- Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File 1
- Bump blacken-docs from 1.14.0 to 1.15.0 1
- feat: Implement a prepare_connection plugin hook 1
- cannot use jinja filters in display? 1
- Bump sphinx from 6.1.3 to 7.1.0 1
- Bump furo from 2023.3.27 to 2023.7.26 1
- datasette serve when invoked with --reload interprets the serve command as a file 1
- Bump sphinx from 6.1.3 to 7.1.1 1
- Bump sphinx from 6.1.3 to 7.1.2 1
- Bump blacken-docs, furo, blacken-docs 1
- Bump the python-packages group with 1 update 1
- Bump the python-packages group with 2 updates 1
- .transform() instead of modifying sqlite_master for add_foreign_keys 1
- Bump the python-packages group with 3 updates 1
- If a row has a primary key of `null` various things break 1
- Bump sphinx, furo, blacken-docs dependencies 1
- Start a new `datasette.yaml` configuration file, with settings support 1
- Test Datasette on multiple SQLite versions 1
- Bump the python-packages group with 3 updates 1
- Cascade for restricted token view-table/view-database/view-instance operations 1
- Fix hupper.start_reloader entry point 1
- Bump sphinx, furo, blacken-docs dependencies 1
- -s/--setting x y gets merged into datasette.yml, refs #2143, #2156 1
- Add new `--internal internal.db` option, deprecate legacy `_internal` database 1
- DATASETTE_LOAD_PLUGINS environment variable for loading specific plugins 1
- Bump the python-packages group with 1 update 1
- click-default-group>=1.2.3 1
- Use $DATASETTE_INTERNAL in absence of --internal 1
- Test against Python 3.12 preview 1
- .transform() now preserves rowid values, refs #592 1
- actors_from_ids plugin hook and datasette.actors_from_ids() method 1
- `datasette.yaml` plugin support 1
- Bump the python-packages group with 3 updates 1
- Server hang on parallel execution of queries to named in-memory databases 1
- Raise an exception if a "plugins" block exists in metadata.json 1
- Move `permissions`, `allow` blocks, canned queries and more out of `metadata.yaml` and into `datasette.yaml` 1
- Stop using parallel SQL queries for tables 1
- Cascading DELETE not working with Table.delete(pk) 1
- Discord invite link returns 401 1
- Bump the python-packages group with 1 update 1
- Add spatialite arm64 linux path 1
- Bump the python-packages group with 1 update 1
- Fix query for suggested facets with column named value 1
- Add more STRICT table support 1
- CSV export fails for some `text` foreign key references 1
user 336
- codecov[bot] 240
- aborruso 19
- chrismp 18
- carlmjohnson 14
- tballison 13
- psychemedia 11
- stonebig 11
- frafra 10
- maxhawkins 10
- terrycojones 10
- dracos 10
- rayvoelker 10
- 20after4 9
- clausjuhl 9
- UtahDave 8
- tomchristie 8
- bsilverm 8
- 4l1fe 8
- zaneselvans 7
- mhalle 7
- zeluspudding 7
- cobiadigital 7
- cldellow 6
- khimaros 6
- CharlesNepote 6
- ocdtrekkie 6
- tsibley 5
- khusmann 5
- rdmurphy 5
- MarkusH 5
- lovasoa 5
- Mjboothaus 5
- dazzag24 5
- ar-jan 5
- xavdid 5
- davidhaley 5
- SteadBytes 5
- fs111 4
- yozlet 4
- Btibert3 4
- dholth 4
- jungle-boogie 4
- ColinMaudry 4
- nitinpaultifr 4
- Kabouik 4
- hydrosquall 4
- dvizard 4
- henry501 4
- pjamargh 4
- frankieroberto 3
- obra 3
- janimo 3
- atomotic 3
- briandorsey 3
- pkoppstein 3
- yschimke 3
- philroche 3
- coldclimate 3
- wsxiaoys 3
- johnfelipe 3
- mdrovdahl 3
- xrotwang 3
- robroc 3
- dmick 3
- betatim 3
- dufferzafar 3
- Florents-Tselai 3
- aki-k 3
- ashishdotme 3
- yejiyang 3
- henrikek 3
- swyxio 3
- Segerberg 3
- jsancho-gpl 3
- gk7279 3
- learning4life 3
- mattmalcher 3
- FabianHertwig 3
- polyrand 3
- justmars 3
- garethr 2
- nelsonjchen 2
- dsisnero 2
- hubgit 2
- jayvdb 2
- jackowayed 2
- ftrain 2
- chrishas35 2
- tannewt 2
- HaveF 2
- pkulchenko 2
- coleifer 2
- gavinband 2
- aviflax 2
- iloveitaly 2
- tholo 2
- mungewell 2
- frankier 2
- lchski 2
- tmaier 2
- hcarter333 2
- amitkoth 2
- eads 2
- virtadpt 2
- leafgarland 2
- glyph 2
- rafguns 2
- strada 2
- eelkevdbos 2
- ligurio 2
- n8henrie 2
- soobrosa 2
- nathancahill 2
- mustafa0x 2
- bsmithgall 2
- noslouch 2
- willingc 2
- nattaylor 2
- durkie 2
- cclauss 2
- wulfmann 2
- philshem 2
- bram2000 2
- zzeleznick 2
- plpxsk 2
- jeqo 2
- chapmanjacobd 2
- nickvazz 2
- aaronyih1 2
- luxint 2
- jussiarpalahti 2
- sachaj 2
- lagolucas 2
- stevecrawshaw 2
- chekos 2
- ctsrc 2
- ad-si 2
- smithdc1 2
- gsajko 2
- jcmkk3 2
- null92 2
- publicmatt 2
- rachelmarconi 2
- tunguyenatwork 2
- LVerneyPEReN 2
- tmcl-it 2
- anotherjesse 1
- jarib 1
- jokull 1
- danp 1
- fernand0 1
- precipice 1
- llimllib 1
- gijs 1
- blaine 1
- ashanan 1
- gravis 1
- nkirsch 1
- mrchrisadams 1
- dkam 1
- harperreed 1
- nileshtrivedi 1
- chrismytton 1
- nedbat 1
- furilo 1
- kindly 1
- prabhur 1
- palfrey 1
- dmd 1
- pquentin 1
- Uninen 1
- rtanglao 1
- carsonyl 1
- nryberg 1
- step21 1
- stefanocudini 1
- rcoup 1
- scoates 1
- hpk42 1
- annapowellsmith 1
- cadeef 1
- thorn0 1
- yurivish 1
- pax 1
- lucapette 1
- jmelloy 1
- Krazybug 1
- dvhthomas 1
- dckc 1
- phubbard 1
- sethvincent 1
- andrewdotn 1
- aitoehigie 1
- julienma 1
- michaelmcandrew 1
- drewda 1
- stiles 1
- saulpw 1
- adamalton 1
- terinjokes 1
- thadk 1
- camallen 1
- robintw 1
- astrojuanlu 1
- ipmb 1
- steren 1
- aidansteele 1
- 0x1997 1
- jonafato 1
- gwk 1
- knutwannheden 1
- davidszotten 1
- chrislkeller 1
- kevboh 1
- eaubin 1
- yunzheng 1
- mhkeller 1
- lfdebrux 1
- karlcow 1
- heyarne 1
- ryanfox 1
- sopel 1
- cephillips 1
- ryascott 1
- sirnacnud 1
- simonrjones 1
- justinpinkney 1
- merwok 1
- mattkiefer 1
- snth 1
- adarshp 1
- joshmgrant 1
- bcongdon 1
- nickdirienzo 1
- hannseman 1
- kaihendry 1
- urbas 1
- metamoof 1
- brimstone 1
- adamchainz 1
- PabloLerma 1
- heussd 1
- RayBB 1
- BryantD 1
- limar 1
- drkane 1
- Gagravarr 1
- radusuciu 1
- esagara 1
- agguser 1
- rclement 1
- dyllan-to-you 1
- justinallen 1
- jordaneremieff 1
- wdccdw 1
- wpears 1
- progpow 1
- DavidPratten 1
- ltrgoddard 1
- costrouc 1
- jratike80 1
- ment4list 1
- ccorcos 1
- choldgraf 1
- Olshansk 1
- qqilihq 1
- jdangerx 1
- fidiego 1
- OverkillGuy 1
- QAInsights 1
- secretGeek 1
- fkuhn 1
- jameslittle230 1
- Profpatsch 1
- dskrad 1
- kwladyka 1
- Carib0u 1
- fatihky 1
- phoenixjun 1
- JesperTreetop 1
- wenhoujx 1
- bapowell 1
- yairlenga 1
- chris48s 1
- ChristopherWilks 1
- Maltazar 1
- hueyy 1
- wuhland 1
- eric-burel 1
- foscoj 1
- dvot197007 1
- kokes 1
- RamiAwar 1
- csusanu 1
- rprimet 1
- metab0t 1
- spdkils 1
- sturzl 1
- jrdmb 1
- robmarkcole 1
- jfeiwell 1
- coisnepe 1
- chmaynard 1
- erlend-aasland 1
- amlestin 1
- tf13 1
- alecstein 1
- bendnorman 1
- noklam 1
- jakewilkins 1
- Thomascountz 1
- eigenfoo 1
- GmGniap 1
- rdtq 1
- AnkitKundariya 1
- LucasElArruda 1
- duarteocarmo 1
- sarcasticadmin 1
- yqlbu 1
- Rik-de-Kort 1
- patricktrainer 1
- xmichele 1
- RhetTbull 1
- miuku 1
- philipp-heinrich 1
- jimmybutton 1
- thewchan 1
- izzues 1
- thisismyfuckingusername 1
- kirajano 1
- J450n-4-W 1
- mlaparie 1
- Dhyanesh97 1
- knowledgecamp12 1
- McEazy2700 1
- cycle-data 1
id | html_url ▼ | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
1493442956 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/6#issuecomment-1493442956 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/6 | IC_kwDOJHON9s5ZBCGM | amlestin 14314871 | 2023-04-02T21:20:43Z | 2023-04-02T21:25:37Z | NONE | I'm experiencing something similar. My apostrophes (') turn into (’) and the output is truncated. Hoping to debug next weekend |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Character encoding problem 1617602868 | |
1508784533 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/6#issuecomment-1508784533 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/6 | IC_kwDOJHON9s5Z7jmV | sirnacnud 579727 | 2023-04-14T15:22:09Z | 2023-04-14T15:22:09Z | NONE | Just changing the encoding in |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Character encoding problem 1617602868 | |
1468898285 | https://github.com/dogsheep/apple-notes-to-sqlite/issues/8#issuecomment-1468898285 | https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/8 | IC_kwDOJHON9s5XjZvt | RhetTbull 41546558 | 2023-03-14T22:00:21Z | 2023-03-14T22:00:21Z | NONE | Well that's embarrassing. I made a fork using macnotesapp and it's actually slower. This is because the Scripting Bridge sometimes fails to return the folder and thus macnotesapp resorts to AppleScript in this situation. The repeated AppleScript calls on a large library are slower than your "slurp it all in" approach. I've got some ideas about how to improve this--will make another attempt if I can fix the issues. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Increase performance using macnotesapp 1617823309 | |
1646950438 | https://github.com/dogsheep/dogsheep-beta/issues/37#issuecomment-1646950438 | https://api.github.com/repos/dogsheep/dogsheep-beta/issues/37 | IC_kwDOC8SPRc5iKngm | rprimet 10352819 | 2023-07-23T20:18:26Z | 2023-07-23T20:18:26Z | NONE | My bad, although I could not find how to use |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
cannot use jinja filters in display? 1817281557 | |
748436115 | https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-748436115 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15 | MDEyOklzc3VlQ29tbWVudDc0ODQzNjExNQ== | nickvazz 8573886 | 2020-12-19T07:43:38Z | 2020-12-19T07:47:36Z | NONE | Hey Simon! I really enjoy datasette so far, just started trying it out today following your iPhone photos example. I am not sure if you had run into this or not, but it seems like they might have changed one of the column names from
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Expose scores from ZCOMPUTEDASSETATTRIBUTES 612151767 | |
633234781 | https://github.com/dogsheep/dogsheep-photos/issues/20#issuecomment-633234781 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/20 | MDEyOklzc3VlQ29tbWVudDYzMzIzNDc4MQ== | dmd 41439 | 2020-05-24T13:56:13Z | 2020-05-24T13:56:13Z | NONE | As that seems to be closed, can you give a hint on how to make this work? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ability to serve thumbnailed Apple Photo from its place on disk 613006393 | |
748436195 | https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-748436195 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21 | MDEyOklzc3VlQ29tbWVudDc0ODQzNjE5NQ== | nickvazz 8573886 | 2020-12-19T07:44:32Z | 2020-12-19T07:44:49Z | NONE | I have also run into this a bit, would it be possible to post your |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
bpylist.archiver.CircularReference: archive has a cycle with uid(13) 615474990 | |
751125270 | https://github.com/dogsheep/dogsheep-photos/issues/28#issuecomment-751125270 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/28 | MDEyOklzc3VlQ29tbWVudDc1MTEyNTI3MA== | jmelloy 129786 | 2020-12-24T22:26:22Z | 2020-12-24T22:26:22Z | NONE | This comes up if you’ve run the photo export without running an s3 upload. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Invalid SQL no such table: main.uploads 624490929 | |
934207940 | https://github.com/dogsheep/dogsheep-photos/issues/3#issuecomment-934207940 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/3 | IC_kwDOD079W843ruHE | jratike80 1751612 | 2021-10-05T08:57:41Z | 2021-10-05T08:57:41Z | NONE | Maybe the exif-loader from the SpatiaLite project could be useful as a reference even though it is written in C and it also saves images as blobs https://www.gaia-gis.it/fossil/spatialite-tools/file?name=exif_loader.c&ci=tip. The tool is also integrated into the spatialite-gui application. I found some user documentation from the web archive http://web.archive.org/web/20180629041238/https://www.gaia-gis.it/spatialite-2.3.1/spatialite-exif-2.3.1.html. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Import EXIF data into SQLite - lens used, ISO, aperture etc 602533481 | |
791053721 | https://github.com/dogsheep/dogsheep-photos/issues/32#issuecomment-791053721 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/32 | MDEyOklzc3VlQ29tbWVudDc5MTA1MzcyMQ== | dsisnero 6213 | 2021-03-05T00:31:27Z | 2021-03-05T00:31:27Z | NONE | I am getting the same thing for US West (N. California) us-west-1 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
KeyError: 'Contents' on running upload 803333769 | |
882091516 | https://github.com/dogsheep/dogsheep-photos/issues/32#issuecomment-882091516 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/32 | IC_kwDOD079W840k6X8 | aaronyih1 10793464 | 2021-07-18T17:29:39Z | 2021-07-18T17:33:02Z | NONE | Same here for US West (N. California) us-west-1. Running on Catalina. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
KeyError: 'Contents' on running upload 803333769 | |
884688833 | https://github.com/dogsheep/dogsheep-photos/issues/32#issuecomment-884688833 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/32 | IC_kwDOD079W840u0fB | aaronyih1 10793464 | 2021-07-22T06:40:25Z | 2021-07-22T06:40:25Z | NONE | The solution here is to upload an image to the bucket first. It is caused because it does not properly handle the case when there are no images in the bucket. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
KeyError: 'Contents' on running upload 803333769 | |
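The `KeyError: 'Contents'` failure described in the comments above can be sketched in a few lines: S3's list-objects response simply omits the `Contents` key when the bucket has no matching objects, so indexing it directly raises `KeyError`. This is a minimal illustration with a hand-built response dict (a real call would come from boto3's `client.list_objects_v2(Bucket=...)`); the helper name `existing_keys` is illustrative, not from dogsheep-photos.

```python
def existing_keys(response):
    # response["Contents"] would raise KeyError on an empty bucket;
    # .get() with a default list handles the no-objects case safely.
    return {obj["Key"] for obj in response.get("Contents", [])}

empty_bucket_response = {"IsTruncated": False}  # no "Contents" key at all
print(existing_keys(empty_bucket_response))  # set()

populated_response = {"Contents": [{"Key": "a.jpg"}, {"Key": "b.jpg"}]}
print(sorted(existing_keys(populated_response)))  # ['a.jpg', 'b.jpg']
```

This also matches the workaround reported above: uploading one image first guarantees the `Contents` key is present.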
777951854 | https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-777951854 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33 | MDEyOklzc3VlQ29tbWVudDc3Nzk1MTg1NA== | leafgarland 675335 | 2021-02-12T03:54:39Z | 2021-02-12T03:54:39Z | NONE | I think that is a typo in the docs, you can use
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
photo-to-sqlite: command not found 803338729 | |
778002092 | https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778002092 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33 | MDEyOklzc3VlQ29tbWVudDc3ODAwMjA5Mg== | robmarkcole 11855322 | 2021-02-12T06:19:32Z | 2021-02-12T06:19:32Z | NONE | hi @leafgarland that results in a new error:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
photo-to-sqlite: command not found 803338729 | |
778014990 | https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778014990 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33 | MDEyOklzc3VlQ29tbWVudDc3ODAxNDk5MA== | leafgarland 675335 | 2021-02-12T06:54:14Z | 2021-02-12T06:54:14Z | NONE | Ahh, that might be because macOS Big Sur has changed the structure of the photos db. Might need to wait for a later release, there is a PR which adds support for Big Sur. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
photo-to-sqlite: command not found 803338729 | |
813249000 | https://github.com/dogsheep/dogsheep-photos/issues/35#issuecomment-813249000 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/35 | MDEyOklzc3VlQ29tbWVudDgxMzI0OTAwMA== | ligurio 1151557 | 2021-04-05T07:37:57Z | 2021-04-05T07:37:57Z | NONE | There are trained ML models used in Photoprism: - https://dl.photoprism.org/tensorflow/nasnet.zip - https://dl.photoprism.org/tensorflow/nsfw.zip |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support to annotate photos on other than macOS OSes 842695374 | |
906015471 | https://github.com/dogsheep/dogsheep-photos/issues/7#issuecomment-906015471 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/7 | IC_kwDOD079W842ALLv | dkam 18232 | 2021-08-26T02:01:01Z | 2021-08-26T02:01:01Z | NONE | Perceptual hashes might be what you're after : http://phash.org |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Integrate image content hashing 602585497 | |
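The perceptual-hash suggestion above (phash.org) can be sketched with a pure-Python "average hash" (aHash), the simplest member of that family. This assumes the image has already been decoded and downscaled to a small grayscale matrix (a real pipeline would resize with Pillow or use the `imagehash` library); the function names here are illustrative.

```python
def average_hash(pixels):
    """pixels: list of rows of grayscale values (e.g. 8x8). Returns an int
    whose bits mark which pixels are brighter than the image's average."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(a, b):
    # Perceptually similar images yield hashes with a small Hamming distance,
    # unlike cryptographic hashes, which change completely on any edit.
    return bin(a ^ b).count("1")

img = [[10, 200], [220, 30]]
near = [[12, 198], [219, 28]]  # slightly perturbed copy of img
print(hamming_distance(average_hash(img), average_hash(near)))  # 0
```

The small (here zero) distance between the original and the perturbed copy is what makes this useful for duplicate detection across re-encodes and resizes.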
1035717429 | https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-1035717429 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31 | IC_kwDOD079W849u8s1 | harperreed 18504 | 2022-02-11T01:55:38Z | 2022-02-11T01:55:38Z | NONE | I would love this merged! |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update for Big Sur 771511344 | |
1190995982 | https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-1190995982 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31 | IC_kwDOD079W85G_SgO | jakewilkins 19231792 | 2022-07-21T03:26:38Z | 2023-04-14T22:41:31Z | NONE | 👋 Any update on getting this merged? Alternatively, is there a work around for this issue to unblock myself? edit to add: huge fan of both this project and Edit again to add:
Yes, there is. I was able to apply the patch of this PR and it applies (mostly) cleanly and works.
|
{ "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update for Big Sur 771511344 | |
1382655354 | https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-1382655354 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31 | IC_kwDOD079W85SaaV6 | fidiego 2704860 | 2023-01-14T04:08:36Z | 2023-01-14T04:08:36Z | NONE | I just tried this branch and saw some errors. I installed this PR locally with:
System Details
**OS:** MacOS Monterey
**Python Version:** Python 3.10.8

Stacktrace
```python
Traceback (most recent call last):
  File "/Users/df/.venvs/photo-experiments/bin/dogsheep-photos", line 8, in <module>
    sys.exit(cli())
  File "/Users/df/.venvs/photo-experiments/lib/python3.10/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/Users/df/.venvs/photo-experiments/lib/python3.10/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/Users/df/.venvs/photo-experiments/lib/python3.10/site-packages/click/core.py", line 1657, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/df/.venvs/photo-experiments/lib/python3.10/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/df/.venvs/photo-experiments/lib/python3.10/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/Users/df/.venvs/photo-experiments/lib/python3.10/site-packages/dogsheep_photos/cli.py", line 254, in apple_photos
    sha256 = calculate_hash(pathlib.Path(photo.path))
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pathlib.py", line 960, in __new__
    self = cls._from_parts(args)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pathlib.py", line 594, in _from_parts
    drv, root, parts = self._parse_args(args)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pathlib.py", line 578, in _parse_args
    a = os.fspath(a)
TypeError: expected str, bytes or os.PathLike object, not NoneType
``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update for Big Sur 771511344 | |
1656696679 | https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-1656696679 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31 | IC_kwDOD079W85ivy9n | coldclimate 319473 | 2023-07-29T10:10:29Z | 2023-07-29T10:10:29Z | NONE | +1 to getting this merged down. For future googlers, I installed by...
|
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update for Big Sur 771511344 | |
1669877769 | https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-1669877769 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31 | IC_kwDOD079W85jiFAJ | chrismytton 22996 | 2023-08-08T15:52:52Z | 2023-08-08T15:52:52Z | NONE | You can also install this with pip using this oneliner:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update for Big Sur 771511344 | |
811362316 | https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-811362316 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31 | MDEyOklzc3VlQ29tbWVudDgxMTM2MjMxNg== | PabloLerma 871250 | 2021-03-31T19:14:39Z | 2021-03-31T19:14:39Z | NONE | 👋 could I help somehow for this to be merged? As Big Sur is going to be more used as the time goes I think it would be nice to merge and publish a new version. Nice work! |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update for Big Sur 771511344 | |
1006708046 | https://github.com/dogsheep/dogsheep-photos/pull/36#issuecomment-1006708046 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/36 | IC_kwDOD079W848ASVO | scoates 71983 | 2022-01-06T16:04:46Z | 2022-01-06T16:04:46Z | NONE | This one got me, today, too. 👍 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Correct naming of tool in readme 988493790 | |
1656694854 | https://github.com/dogsheep/dogsheep-photos/pull/38#issuecomment-1656694854 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/38 | IC_kwDOD079W85ivyhG | coldclimate 319473 | 2023-07-29T10:00:45Z | 2023-07-29T10:00:45Z | NONE | Ran across https://github.com/dogsheep/dogsheep-photos/issues/33 which is the same subject. My PR just fixes docs |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
photos-to-sql not found? 1827427757 | |
1656694944 | https://github.com/dogsheep/dogsheep-photos/pull/38#issuecomment-1656694944 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/38 | IC_kwDOD079W85ivyig | coldclimate 319473 | 2023-07-29T10:01:19Z | 2023-07-29T10:01:19Z | NONE | Duplicate of https://github.com/dogsheep/dogsheep-photos/pull/36 - closing. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
photos-to-sql not found? 1827427757 | |
1021264135 | https://github.com/dogsheep/dogsheep.github.io/pull/6#issuecomment-1021264135 | https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/6 | IC_kwDODMzF1s4830EH | ligurio 1151557 | 2022-01-25T14:52:40Z | 2022-01-25T14:52:40Z | NONE | @simonw, could you review? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add testres-db tool 842765105 | |
777690332 | https://github.com/dogsheep/evernote-to-sqlite/issues/11#issuecomment-777690332 | https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/11 | MDEyOklzc3VlQ29tbWVudDc3NzY5MDMzMg== | dskrad 3613583 | 2021-02-11T18:16:01Z | 2021-02-11T18:16:01Z | NONE | I solved this issue by modifying line 31 of utils.py
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
XML parse error 792851444 | |
911772943 | https://github.com/dogsheep/evernote-to-sqlite/issues/14#issuecomment-911772943 | https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/14 | IC_kwDOEhK-wc42WI0P | step21 46968 | 2021-09-02T14:53:11Z | 2021-09-02T14:53:11Z | NONE | Additionally, assuming the line numbers match up with the provided enex file, the mentioned line plus one before and after is as follows: ``` ]]> ``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xml.etree.ElementTree.Parse Error - mismatched tag 986829194 | |
765495861 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765495861 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTQ5NTg2MQ== | cobiadigital 25372415 | 2021-01-22T15:44:00Z | 2021-01-22T15:44:00Z | NONE | Risk of autoimmune disorders: https://www.snpedia.com/index.php/Genotype
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | |
765498984 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765498984 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTQ5ODk4NA== | cobiadigital 25372415 | 2021-01-22T15:48:25Z | 2021-01-22T15:49:33Z | NONE | The "Warrior Gene" https://www.snpedia.com/index.php/Rs4680
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | |
765502845 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765502845 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTUwMjg0NQ== | cobiadigital 25372415 | 2021-01-22T15:53:19Z | 2021-01-22T15:53:19Z | NONE | rs7903146 Influences risk of Type-2 diabetes
https://www.snpedia.com/index.php/Rs7903146
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | |
765506901 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765506901 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTUwNjkwMQ== | cobiadigital 25372415 | 2021-01-22T15:58:41Z | 2021-01-22T15:58:58Z | NONE | Both rs10757274 and rs2383206 can both indicate higher risks of heart disease https://www.snpedia.com/index.php/Rs2383206
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | |
765523517 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765523517 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTUyMzUxNw== | cobiadigital 25372415 | 2021-01-22T16:20:25Z | 2021-01-22T16:20:25Z | NONE | rs53576: the oxytocin receptor (OXTR) gene
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | |
765525338 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765525338 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTUyNTMzOA== | cobiadigital 25372415 | 2021-01-22T16:22:44Z | 2021-01-22T16:22:44Z | NONE | rs1333049 associated with coronary artery disease https://www.snpedia.com/index.php/Rs1333049
```
select rsid, genotype,
  case genotype
    when 'CC' then '1.9x increased risk for coronary artery disease'
    when 'CG' then '1.5x increased risk for CAD'
    when 'GG' then 'normal'
  end as interpretation
from genome
where rsid = 'rs1333049'
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | |
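The CASE-based genotype interpretation pattern used in the rs1333049 comment above can be run end-to-end against a toy `genome` table in SQLite. The table and column names follow the genome-to-sqlite convention used in the comments; the risk wording is copied from the comment's example and is illustrative, not medical advice.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE genome (rsid TEXT, genotype TEXT)")
db.execute("INSERT INTO genome VALUES ('rs1333049', 'CG')")

# Map each genotype to a human-readable interpretation in SQL.
row = db.execute("""
    SELECT rsid, genotype,
           CASE genotype
               WHEN 'CC' THEN '1.9x increased risk for coronary artery disease'
               WHEN 'CG' THEN '1.5x increased risk for CAD'
               WHEN 'GG' THEN 'normal'
           END AS interpretation
    FROM genome WHERE rsid = 'rs1333049'
""").fetchone()
print(row)  # ('rs1333049', 'CG', '1.5x increased risk for CAD')
```

The same pattern generalizes to every SNP in this thread: one CASE expression per rsid, keyed on the genotype column.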
831004775 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-831004775 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDgzMTAwNDc3NQ== | cobiadigital 25372415 | 2021-05-03T03:46:23Z | 2021-05-03T03:46:23Z | NONE | RS1800955 is related to novelty seeking and ADHD https://www.snpedia.com/index.php/Rs1800955
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | |
605439685 | https://github.com/dogsheep/github-to-sqlite/issues/15#issuecomment-605439685 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/15 | MDEyOklzc3VlQ29tbWVudDYwNTQzOTY4NQ== | garethr 2029 | 2020-03-28T12:17:01Z | 2020-03-28T12:17:01Z | NONE | That looks great, thanks! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Assets table with downloads 544571092 | |
571412923 | https://github.com/dogsheep/github-to-sqlite/issues/16#issuecomment-571412923 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/16 | MDEyOklzc3VlQ29tbWVudDU3MTQxMjkyMw== | jayvdb 15092 | 2020-01-07T03:06:46Z | 2020-01-07T03:06:46Z | NONE | I re-tried after doing |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Exception running first command: IndexError: list index out of range 546051181 | |
602136481 | https://github.com/dogsheep/github-to-sqlite/issues/16#issuecomment-602136481 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/16 | MDEyOklzc3VlQ29tbWVudDYwMjEzNjQ4MQ== | jayvdb 15092 | 2020-03-22T02:08:57Z | 2020-03-22T02:08:57Z | NONE | I'd love to be using your library as a better cached gh layer for a new library I have built, replacing large parts of the very ugly https://github.com/jayvdb/pypidb/blob/master/pypidb/_github.py , and then probably being able to rebuild the setuppy chunk as a feature here at a later stage. I would also need tokenless and netrc support, but I would be happy to add those bits. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Exception running first command: IndexError: list index out of range 546051181 | |
622279374 | https://github.com/dogsheep/github-to-sqlite/issues/33#issuecomment-622279374 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/33 | MDEyOklzc3VlQ29tbWVudDYyMjI3OTM3NA== | garethr 2029 | 2020-05-01T07:12:47Z | 2020-05-01T07:12:47Z | NONE | I also got it working with:
|
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Fall back to authentication via ENV 609950090 | |
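The "fall back to authentication via ENV" behaviour discussed above can be sketched as: prefer an explicit auth file, otherwise read a token from the environment. The `{"github_personal_token": ...}` file format matches github-to-sqlite's `auth.json`; the helper name `resolve_token` and the `GITHUB_TOKEN` variable name are illustrative assumptions here, not confirmed project API.

```python
import json
import os

def resolve_token(auth_path=None, env=os.environ):
    """Return a GitHub token from an auth file if present, else from the env."""
    if auth_path and os.path.exists(auth_path):
        with open(auth_path) as f:
            return json.load(f).get("github_personal_token")
    return env.get("GITHUB_TOKEN")  # ENV fallback, as in the comments above

# With no auth file, the environment wins:
print(resolve_token(env={"GITHUB_TOKEN": "ghp_example"}))  # ghp_example
```

Passing `env` explicitly keeps the function testable without touching the real environment.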
623038148 | https://github.com/dogsheep/github-to-sqlite/issues/38#issuecomment-623038148 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/38 | MDEyOklzc3VlQ29tbWVudDYyMzAzODE0OA== | zzeleznick 5779832 | 2020-05-03T01:18:57Z | 2020-05-03T01:18:57Z | NONE | Thanks, @simonw! I feel a little foolish in hindsight, but I'm on the same page now and am glad to have discovered first-hand a motivation for this |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[Feature Request] Support Repo Name in Search 🥺 611284481 | |
623044643 | https://github.com/dogsheep/github-to-sqlite/issues/38#issuecomment-623044643 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/38 | MDEyOklzc3VlQ29tbWVudDYyMzA0NDY0Mw== | zzeleznick 5779832 | 2020-05-03T02:34:32Z | 2020-05-03T02:34:32Z | NONE |
| starred_at | starred_by | repo_name |
| --- | --- | --- |
| 2020-02-11T01:08:59Z | zzeleznick | dogsheep/twitter-to-sqlite |
| 2020-01-11T21:57:34Z | zzeleznick | simonw/datasette |
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[Feature Request] Support Repo Name in Search 🥺 611284481 | |
1359468823 | https://github.com/dogsheep/github-to-sqlite/issues/46#issuecomment-1359468823 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/46 | IC_kwDODFdgUs5RB9kX | choldgraf 1839645 | 2022-12-20T14:39:39Z | 2022-12-20T14:40:15Z | NONE | Just a quick +1 to this one from me - I would like to do a better job of tracking who is reviewing one another's pull requests in repositories, since this is a specific kind of maintenance work that I think often goes unrewarded. I can't seem to figure this out just by looking at the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Feature: pull request reviews and comments 664485022 | |
1208757153 | https://github.com/dogsheep/github-to-sqlite/issues/51#issuecomment-1208757153 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/51 | IC_kwDODFdgUs5IDCuh | hydrosquall 9020979 | 2022-08-09T00:29:44Z | 2022-08-09T00:29:44Z | NONE | I've been looking into how to get this data out of GitHub (especially now that there are "secondary rate limits" without an advertised allowance separate from the regular rate limits). I've had decent success with the Airbyte github extractor (aside from one data quality issue https://github.com/airbytehq/airbyte/pull/15420 ). Airbyte splits data extraction between the GraphQL and REST endpoints depending on the resource type, but they're very comprehensive. Before this, I tried a few solutions in my own custom wrapper mentioned in this thread + its children https://github.com/PyGithub/PyGithub/issues/1989 , but they weren't working as expected. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
github-to-sqlite should handle rate limits better 703246031 | |
1279224780 | https://github.com/dogsheep/github-to-sqlite/issues/51#issuecomment-1279224780 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/51 | IC_kwDODFdgUs5MP2vM | chapmanjacobd 7908073 | 2022-10-14T16:34:07Z | 2022-10-14T16:34:07Z | NONE | also, it says that authenticated requests have a much higher "rate limit". Unauthenticated requests only get 60 req/hour ?? seems more like a quota than a "rate limit" (although I guess that is semantic equivalence) You would want to use
But a more complete solution would bring authenticated requests to the other subcommands. I'm surprised only |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
github-to-sqlite should handle rate limits better 703246031 | |
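The budget difference the comments above describe — roughly 60 requests/hour unauthenticated versus 5,000 authenticated on GitHub's REST API — comes down to sending an `Authorization` header and watching the `X-RateLimit-Remaining` response header. A minimal sketch (the helper names are made up for illustration; the header shapes follow GitHub's documented conventions, not github-to-sqlite's actual code):

```python
# Illustrative helpers, not github-to-sqlite's implementation.

def github_headers(token=None):
    """Build request headers for the GitHub REST API.

    With a personal access token, GitHub applies the much larger
    authenticated rate limit (~5,000 req/hour vs ~60 unauthenticated).
    """
    headers = {"Accept": "application/vnd.github+json"}
    if token:
        headers["Authorization"] = "Bearer {}".format(token)
    return headers


def rate_limited(response_headers):
    """True when GitHub reports no remaining requests in this window.

    GitHub returns the budget in the X-RateLimit-Remaining header;
    a client should sleep until X-RateLimit-Reset when it hits zero.
    """
    return int(response_headers.get("X-RateLimit-Remaining", "1")) == 0
```

A caller would pass `github_headers(token)` to its HTTP library and check `rate_limited()` on each response before issuing the next request.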
860895838 | https://github.com/dogsheep/github-to-sqlite/issues/64#issuecomment-860895838 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/64 | MDEyOklzc3VlQ29tbWVudDg2MDg5NTgzOA== | khimaros 231498 | 2021-06-14T18:23:21Z | 2021-06-14T21:37:35Z | NONE | i have a basic working version at https://github.com/khimaros/github-to-sqlite this can be tested with caveat: the GitHub API doesn't seem to provide a complete history of events. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
feature: support "events" 920636216 | |
861035862 | https://github.com/dogsheep/github-to-sqlite/issues/64#issuecomment-861035862 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/64 | MDEyOklzc3VlQ29tbWVudDg2MTAzNTg2Mg== | khimaros 231498 | 2021-06-14T22:29:20Z | 2021-06-14T22:29:20Z | NONE | it looks like the v4 GraphQL API is the only way to get data beyond 90 days from GitHub. this is a significant change, but may be worth considering in the future. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
feature: support "events" 920636216 | |
861087651 | https://github.com/dogsheep/github-to-sqlite/issues/64#issuecomment-861087651 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/64 | MDEyOklzc3VlQ29tbWVudDg2MTA4NzY1MQ== | khimaros 231498 | 2021-06-15T00:48:37Z | 2021-06-15T00:48:37Z | NONE | @simonw -- i've created an omega-query that fetched most of what was interesting to me for a single user. found by poking around in the "Explorer" tab in https://docs.github.com/en/graphql/overview/explorer note: pagination is still required via
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
feature: support "events" 920636216 | |
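The pagination khimaros mentions uses GraphQL connection cursors: request `first: N` nodes, then pass `pageInfo.endCursor` back as `after` until `hasNextPage` is false. A hedged sketch of that loop, independent of any real client — the `run_query` callable and the response shape are assumptions modeled on GitHub's connection format, not code from this issue:

```python
def paginate(run_query):
    """Yield nodes from a GraphQL-style paginated connection.

    run_query(cursor) is assumed to return a dict shaped like GitHub's
    connection objects: {"nodes": [...], "pageInfo": {"hasNextPage": bool,
    "endCursor": str or None}}. The caller embeds `first`/`after` in the
    actual query string.
    """
    cursor = None
    while True:
        page = run_query(cursor)
        yield from page["nodes"]
        if not page["pageInfo"]["hasNextPage"]:
            break
        cursor = page["pageInfo"]["endCursor"]
```

In practice `run_query` would POST to `https://api.github.com/graphql` with the cursor interpolated into the query's `after:` argument.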
1847317568 | https://github.com/dogsheep/github-to-sqlite/issues/79#issuecomment-1847317568 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/79 | IC_kwDODFdgUs5uG9RA | nedbat 23789 | 2023-12-08T14:50:13Z | 2023-12-08T14:50:13Z | NONE | Adding |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Deploy demo job is failing due to rate limit 1570375808 | |
1266141699 | https://github.com/dogsheep/github-to-sqlite/pull/65#issuecomment-1266141699 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/65 | IC_kwDODFdgUs5Ld8oD | khimaros 231498 | 2022-10-03T22:35:03Z | 2022-10-03T22:35:03Z | NONE | @simonw rebased against latest, please let me know if i should drop this PR. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
basic support for events 923270900 | |
885964242 | https://github.com/dogsheep/github-to-sqlite/pull/65#issuecomment-885964242 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/65 | IC_kwDODFdgUs40zr3S | khimaros 231498 | 2021-07-23T23:45:35Z | 2021-07-23T23:45:35Z | NONE | @simonw is this PR of interest to you? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
basic support for events 923270900 | |
929651819 | https://github.com/dogsheep/github-to-sqlite/pull/66#issuecomment-929651819 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/66 | IC_kwDODFdgUs43aVxr | sarcasticadmin 30531572 | 2021-09-28T21:50:31Z | 2021-09-28T21:50:31Z | NONE | @simonw any feedback/thoughts? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add --merged-by flag to pull-requests sub command 975161924 | |
1238190601 | https://github.com/dogsheep/github-to-sqlite/pull/76#issuecomment-1238190601 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/76 | IC_kwDODFdgUs5JzUoJ | OverkillGuy 2757699 | 2022-09-06T13:58:20Z | 2022-09-06T13:59:08Z | NONE | Tested PR just now in private org, fetched info for >2k repos flawlessly!
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add organization support to repos command 1363280254 | |
1073152522 | https://github.com/dogsheep/google-takeout-to-sqlite/issues/10#issuecomment-1073152522 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/10 | IC_kwDODFE5qs4_9wIK | csusanu 9290214 | 2022-03-20T02:38:07Z | 2022-03-20T02:38:07Z | NONE | This line needs to say |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
sqlite3.OperationalError: no such table: main.my_activity 1123393829 | |
747130908 | https://github.com/dogsheep/google-takeout-to-sqlite/issues/2#issuecomment-747130908 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/2 | MDEyOklzc3VlQ29tbWVudDc0NzEzMDkwOA== | khimaros 231498 | 2020-12-17T00:47:04Z | 2020-12-17T00:47:43Z | NONE | it looks like almost all of the memory consumption is coming from another direction here may be to use the new "Semantic Location History" data which is already broken down by year and month. it also provides much more interesting data, such as estimated address, form of travel, etc. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
killed by oomkiller on large location-history 769376447 | |
780817596 | https://github.com/dogsheep/google-takeout-to-sqlite/issues/4#issuecomment-780817596 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4 | MDEyOklzc3VlQ29tbWVudDc4MDgxNzU5Ng== | UtahDave 306240 | 2021-02-17T20:01:35Z | 2021-02-17T20:01:35Z | NONE | I've got this almost working. Just needs some polish |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Feature Request: Gmail 778380836 | |
781451701 | https://github.com/dogsheep/google-takeout-to-sqlite/issues/4#issuecomment-781451701 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4 | MDEyOklzc3VlQ29tbWVudDc4MTQ1MTcwMQ== | Btibert3 203343 | 2021-02-18T16:06:21Z | 2021-02-18T16:06:21Z | NONE | Awesome! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Feature Request: Gmail 778380836 | |
783688547 | https://github.com/dogsheep/google-takeout-to-sqlite/issues/4#issuecomment-783688547 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4 | MDEyOklzc3VlQ29tbWVudDc4MzY4ODU0Nw== | UtahDave 306240 | 2021-02-22T21:31:28Z | 2021-02-22T21:31:28Z | NONE | @Btibert3 I've opened a PR with my initial attempt at this. Would you be willing to give this a try? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Feature Request: Gmail 778380836 | |
790198930 | https://github.com/dogsheep/google-takeout-to-sqlite/issues/4#issuecomment-790198930 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4 | MDEyOklzc3VlQ29tbWVudDc5MDE5ODkzMA== | Btibert3 203343 | 2021-03-04T00:58:40Z | 2021-03-04T00:58:40Z | NONE | I am just seeing this sorry, yes! I will kick the tires later on tonight. My apologies for the delay. |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Feature Request: Gmail 778380836 | |
790934616 | https://github.com/dogsheep/google-takeout-to-sqlite/issues/4#issuecomment-790934616 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4 | MDEyOklzc3VlQ29tbWVudDc5MDkzNDYxNg== | Btibert3 203343 | 2021-03-04T20:54:44Z | 2021-03-04T20:54:44Z | NONE | Sorry for the delay, I got sidetracked after class last night. I am getting the following error: ``` /content# google-takeout-to-sqlite mbox takeout.db Takeout/Mail/gmail.mbox Usage: google-takeout-to-sqlite [OPTIONS] COMMAND [ARGS]... Try 'google-takeout-to-sqlite --help' for help. Error: No such command 'mbox'. ``` On the box, I installed with pip after cloning: https://github.com/UtahDave/google-takeout-to-sqlite.git |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Feature Request: Gmail 778380836 | |
783794520 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-783794520 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDc4Mzc5NDUyMA== | UtahDave 306240 | 2021-02-23T01:13:54Z | 2021-02-23T01:13:54Z | NONE | Also, @simonw I created a test based off the existing tests. I think it's working correctly |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
784638394 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-784638394 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDc4NDYzODM5NA== | UtahDave 306240 | 2021-02-24T00:36:18Z | 2021-02-24T00:36:18Z | NONE | I noticed that @simonw is using black for formatting. I ran black on my additions in this PR. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
790389335 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790389335 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDc5MDM4OTMzNQ== | UtahDave 306240 | 2021-03-04T07:32:04Z | 2021-03-04T07:32:04Z | NONE |
The wait is from python loading the mbox file. This happens regardless of whether you're getting the length of the mbox. The mbox module is on the slow side. It is possible to do one's own parsing of the mbox, but I kind of wanted to avoid doing that. |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
790391711 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790391711 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDc5MDM5MTcxMQ== | UtahDave 306240 | 2021-03-04T07:36:24Z | 2021-03-04T07:36:24Z | NONE |
Ah, that's good to know. I think explicitly creating the tables will be a great improvement. I'll add that. Also, I noticed after I opened this PR that the Thanks for the feedback. I should have time tomorrow to put together some improvements. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
791089881 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-791089881 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDc5MTA4OTg4MQ== | maxhawkins 28565 | 2021-03-05T02:03:19Z | 2021-03-05T02:03:19Z | NONE | I just tried to run this on a small VPS instance with 2GB of memory and it crashed out of memory while processing a 12GB mbox from Takeout. Is it possible to stream the emails to sqlite instead of loading them all into memory and upserting at once? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
791530093 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-791530093 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDc5MTUzMDA5Mw== | UtahDave 306240 | 2021-03-05T16:28:07Z | 2021-03-05T16:28:07Z | NONE |
@maxhawkins a limitation of the python mbox module is that it loads the entire mbox into memory. I did find another approach to this problem that didn't use the builtin python mbox module and created a generator so that it didn't have to load the whole mbox into memory. I was hoping to use standard library modules, but this might be a good reason to investigate that approach a bit more. My worry is making sure a custom processor handles all the ins and outs of the mbox format correctly. Hm. As I'm writing this, I thought of something. I think I can parse each message one at a time, and then use an mbox function to load each message using the python mbox module. That way the mbox module can still deal with the specifics of the mbox format, but I can use a generator. I'll give that a try. Thanks for the feedback @maxhawkins and @simonw. @simonw can we hold off on merging this until I can test this new approach? |
{ "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
849708617 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-849708617 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDg0OTcwODYxNw== | maxhawkins 28565 | 2021-05-27T15:01:42Z | 2021-05-27T15:01:42Z | NONE | Any updates? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
884672647 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-884672647 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | IC_kwDODFE5qs40uwiH | maxhawkins 28565 | 2021-07-22T05:56:31Z | 2021-07-22T14:03:08Z | NONE | How does this commit look? https://github.com/maxhawkins/google-takeout-to-sqlite/commit/72802a83fee282eb5d02d388567731ba4301050d It seems that Takeout's mbox format is pretty simple, so we can get away with just splitting the file on lines beginning with I was able to load a 12GB takeout mbox without the program using more than a couple hundred MB of memory during the import process. It does make us lose the progress bar, but maybe I can add that back in a later commit. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
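The splitting approach maxhawkins describes can be sketched as a generator that buffers one message at a time between mbox `From ` separator lines. This is an illustrative reconstruction, not the linked commit's code; a fully robust parser would also handle `>From ` (mboxrd) escaping, which this sketch ignores:

```python
# Illustrative sketch of streaming an mbox without loading it all into
# memory, by splitting on 'From ' separator lines. Not the actual
# google-takeout-to-sqlite code; ignores mboxrd '>From ' escaping.
import email


def stream_mbox_messages(fp):
    """Yield raw message byte blocks from an mbox file object.

    Memory use stays around one message at a time, regardless of the
    total size of the mbox file.
    """
    buf = []
    for line in fp:
        if line.startswith(b"From ") and buf:
            yield b"".join(buf)
            buf = []
        buf.append(line)
    if buf:
        yield b"".join(buf)


def parse_block(block):
    """Drop the envelope 'From ' line, then parse with the email module."""
    _, _, rest = block.partition(b"\n")
    return email.message_from_bytes(rest)
```

Each parsed message could then be upserted into SQLite individually, which matches the "couple hundred MB for a 12GB mbox" behaviour reported above.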
885022230 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-885022230 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | IC_kwDODFE5qs40wF4W | maxhawkins 28565 | 2021-07-22T15:51:46Z | 2021-07-22T15:51:46Z | NONE | One thing I noticed is this importer doesn't save attachments along with the body of the emails. It would be nice if those got stored as blobs in a separate attachments table so attachments can be included while fetching search results. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
885094284 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-885094284 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | IC_kwDODFE5qs40wXeM | maxhawkins 28565 | 2021-07-22T17:41:32Z | 2021-07-22T17:41:32Z | NONE | I added a follow-up commit that deals with emails that don't have a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
885098025 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-885098025 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | IC_kwDODFE5qs40wYYp | UtahDave 306240 | 2021-07-22T17:47:50Z | 2021-07-22T17:47:50Z | NONE | Hi @maxhawkins , I'm sorry, I haven't had any time to work on this. I'll have some time tomorrow to test your commits. I think they look great. I'm great with your commits superseding my initial attempt here. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
888075098 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-888075098 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | IC_kwDODFE5qs407vNa | maxhawkins 28565 | 2021-07-28T07:18:56Z | 2021-07-28T07:18:56Z | NONE |
I did some investigation into this issue and made a fix here. The problem was that some messages (like gchat logs) don't have a @simonw While looking into this I found something unexpected about how sqlite_utils handles upserts if the pkey column is |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
1002735370 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8#issuecomment-1002735370 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8 | IC_kwDODFE5qs47xIcK | Btibert3 203343 | 2021-12-29T18:58:23Z | 2021-12-29T18:58:23Z | NONE | @maxhawkins how hard would it be to add an entry to the table that includes the HTML version of the email, if it exists? I just tried the PR branch on a very small mbox file, and it worked great. My use case is a research project and I need to access more than just the plain text body. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add Gmail takeout mbox import (v2) 954546309 | |
1003437288 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8#issuecomment-1003437288 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8 | IC_kwDODFE5qs47zzzo | maxhawkins 28565 | 2021-12-31T19:06:20Z | 2021-12-31T19:06:20Z | NONE |
Shouldn't be hard. The easiest way is probably to remove the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add Gmail takeout mbox import (v2) 954546309 | |
1708945716 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8#issuecomment-1708945716 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8 | IC_kwDODFE5qs5l3HE0 | iloveitaly 150855 | 2023-09-06T19:12:33Z | 2023-09-06T19:12:33Z | NONE | @maxhawkins curious why you didn't use the stdlib |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add Gmail takeout mbox import (v2) 954546309 | |
1710380941 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8#issuecomment-1710380941 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8 | IC_kwDODFE5qs5l8leN | maxhawkins 28565 | 2023-09-07T15:39:59Z | 2023-09-07T15:39:59Z | NONE |
The mailbox module parses the entire mbox into memory. Using the lower-level library lets us stream the emails in one at a time to support larger archives. Both libraries are in the stdlib. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add Gmail takeout mbox import (v2) 954546309 | |
1710950671 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8#issuecomment-1710950671 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8 | IC_kwDODFE5qs5l-wkP | iloveitaly 150855 | 2023-09-08T01:22:49Z | 2023-09-08T01:22:49Z | NONE | Makes sense, thanks for explaining! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add Gmail takeout mbox import (v2) 954546309 | |
894581223 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8#issuecomment-894581223 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8 | IC_kwDODFE5qs41Ujnn | maxhawkins 28565 | 2021-08-07T00:57:48Z | 2021-08-07T00:57:48Z | NONE | Just added two more fixes:
I was able to run this on my Takeout export and everything seems to work fine. @simonw let me know if this looks good to merge. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add Gmail takeout mbox import (v2) 954546309 | |
896378525 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8#issuecomment-896378525 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8 | IC_kwDODFE5qs41baad | maxhawkins 28565 | 2021-08-10T23:28:45Z | 2021-08-10T23:28:45Z | NONE | I added parsing of text/html emails using BeautifulSoup. Around half of the emails in my archive don't include a text/plain payload so adding html parsing makes a good chunk of them searchable. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add Gmail takeout mbox import (v2) 954546309 | |
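The PR itself uses BeautifulSoup for the text/html extraction described above; purely to illustrate the idea without a third-party dependency, here is a stdlib-only approximation built on `html.parser` (the class and function names are mine, not the PR's):

```python
# Stdlib-only sketch of extracting searchable text from a text/html
# email payload -- an approximation of the BeautifulSoup approach in
# the PR, not its actual code.
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect the text content of an HTML document, ignoring tags."""

    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)


def html_to_text(html):
    parser = TextExtractor()
    parser.feed(html)
    # Normalize whitespace so the result is reasonable for FTS indexing
    return " ".join(s.strip() for s in parser.chunks if s.strip())
```

BeautifulSoup's `get_text()` handles malformed markup far more gracefully, which is presumably why the PR chose it; this sketch just shows the shape of the transformation.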
1489110168 | https://github.com/dogsheep/hacker-news-to-sqlite/pull/6#issuecomment-1489110168 | https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/6 | IC_kwDODtX3eM5YwgSY | xavdid 1231935 | 2023-03-29T18:36:16Z | 2023-03-29T18:36:16Z | NONE | @simonw can you take a look when you have a chance? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add permalink virtual field to items table 1641117021 | |
711083698 | https://github.com/dogsheep/healthkit-to-sqlite/issues/11#issuecomment-711083698 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/11 | MDEyOklzc3VlQ29tbWVudDcxMTA4MzY5OA== | jarib 572 | 2020-10-17T21:39:15Z | 2020-10-17T21:39:15Z | NONE | Nice! Works perfectly. Thanks for the quick response and great tooling in general. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
export.xml file name varies with different language settings 723838331 | |
1163917719 | https://github.com/dogsheep/healthkit-to-sqlite/issues/12#issuecomment-1163917719 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/12 | IC_kwDOC8tyDs5FX_mX | Mjboothaus 956433 | 2022-06-23T04:35:02Z | 2022-06-23T04:35:02Z | NONE | In terms of unique identifiers - could you use values stored in |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Some workout columns should be float, not text 727848625 | |
877805513 | https://github.com/dogsheep/healthkit-to-sqlite/issues/12#issuecomment-877805513 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/12 | MDEyOklzc3VlQ29tbWVudDg3NzgwNTUxMw== | Mjboothaus 956433 | 2021-07-11T14:03:01Z | 2021-07-11T14:03:01Z | NONE | Hi Simon -- just experimenting with your excellent software! Up to this point in time I have been using the (paid) HealthFit App to export my workouts from my Apple Watch, one walk at a time, into either .GPX or .FIT format, and then using another library to suck it into Python and eventually here to my "Emmaus Walking" app: https://share.streamlit.io/mjboothaus/emmaus_walking/emmaus_walking/app.py I just used I did notice the issue with various numeric fields being stored in the SQLite db as TEXT for now and just thought I'd flag it - but you've already self-reported this issue. Keep up the great work! I was curious if you have any thoughts about periodically exporting "export.zip" and how to just update the SQLite file instead of re-creating it each time. Hopefully Apple will give some thought to managing this data in a more sensible fashion as it grows over time. Ideally one could pull it from iCloud (where it is allegedly being backed up). |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Some workout columns should be float, not text 727848625 | |
877874117 | https://github.com/dogsheep/healthkit-to-sqlite/issues/12#issuecomment-877874117 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/12 | MDEyOklzc3VlQ29tbWVudDg3Nzg3NDExNw== | Mjboothaus 956433 | 2021-07-11T23:03:37Z | 2021-07-11T23:03:37Z | NONE | P.s. wondering if you have explored using the spatialite functionality with the location data in workouts? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Some workout columns should be float, not text 727848625 | |
1073123231 | https://github.com/dogsheep/healthkit-to-sqlite/issues/14#issuecomment-1073123231 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/14 | IC_kwDOC8tyDs4_9o-f | lchski 343884 | 2022-03-19T22:39:29Z | 2022-03-19T22:39:29Z | NONE | I have this issue, too, with a fresh export. None of my When I run the script, a
Are there maybe duplicate workouts in the data, which’d cause multiple rows to share the same |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
UNIQUE constraint failed: workouts.id 771608692 | |
1073139067 | https://github.com/dogsheep/healthkit-to-sqlite/issues/14#issuecomment-1073139067 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/14 | IC_kwDOC8tyDs4_9s17 | lchski 343884 | 2022-03-20T00:54:18Z | 2022-03-20T00:54:18Z | NONE | Update: this appears to be because of running the command twice without clearing the DB in between. Tries to insert a Workout that already exists, causing a collision on the (auto-generated) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
UNIQUE constraint failed: workouts.id 771608692 | |
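The failure mode lchski describes — re-running the import tries to insert workouts whose ids already exist — and the way an upsert would make repeat runs idempotent can be demonstrated with plain `sqlite3`. The schema below is illustrative, not healthkit-to-sqlite's actual one:

```python
# Sketch: why a second run of a plain-INSERT importer hits
# "UNIQUE constraint failed", and how an upsert avoids it.
# Hypothetical schema; requires SQLite 3.24+ for ON CONFLICT.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE workouts (id TEXT PRIMARY KEY, duration REAL)")
row = ("abc123", 30.0)

# First run: insert succeeds
db.execute("INSERT INTO workouts VALUES (?, ?)", row)

# Second run without clearing the DB: plain INSERT collides on the pk
try:
    db.execute("INSERT INTO workouts VALUES (?, ?)", row)
    collided = False
except sqlite3.IntegrityError:
    collided = True

# An upsert instead updates the existing row, so re-runs are safe
db.execute(
    "INSERT INTO workouts VALUES (?, ?) "
    "ON CONFLICT(id) DO UPDATE SET duration = excluded.duration",
    row,
)
```

sqlite-utils exposes this as `.upsert()`/`.upsert_all()`, which is presumably the direction a fix for this issue would take.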
1629123734 | https://github.com/dogsheep/healthkit-to-sqlite/issues/14#issuecomment-1629123734 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/14 | IC_kwDOC8tyDs5hGnSW | philipp-heinrich 44622670 | 2023-07-10T14:46:52Z | 2023-07-10T14:46:52Z | NONE | @simonw any chance to get this fixed soon? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
UNIQUE constraint failed: workouts.id 771608692 | |
798436026 | https://github.com/dogsheep/healthkit-to-sqlite/issues/14#issuecomment-798436026 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/14 | MDEyOklzc3VlQ29tbWVudDc5ODQzNjAyNg== | n8henrie 1234956 | 2021-03-13T14:23:16Z | 2021-03-13T14:23:16Z | NONE | This PR allows my import to succeed. It looks like some events don't have an If a record has neither of these, I changed it to just print the record (for debugging) and For some odd reason this ran fine at first, and now (after removing the generated db and trying again) I'm getting a different error (duplicate column name). Looks like it may have run when I had two successive runs without remembering to delete the db in between. Will try to refactor. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
UNIQUE constraint failed: workouts.id 771608692 | |
798468572 | https://github.com/dogsheep/healthkit-to-sqlite/issues/14#issuecomment-798468572 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/14 | MDEyOklzc3VlQ29tbWVudDc5ODQ2ODU3Mg== | n8henrie 1234956 | 2021-03-13T14:47:31Z | 2021-03-13T14:47:31Z | NONE | Ok, new PR works. I'm not I still end up with a lot of activities that are missing an
I also end up with some unhappy characters (in the skipped events), such as: But it's successfully making it through the file, and the resulting db opens in datasette, so I'd call that progress. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
UNIQUE constraint failed: workouts.id 771608692 | |
903950096 | https://github.com/dogsheep/healthkit-to-sqlite/issues/21#issuecomment-903950096 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/21 | IC_kwDOC8tyDs414S8Q | FabianHertwig 32016596 | 2021-08-23T17:00:59Z | 2021-08-23T17:00:59Z | NONE | I think the issue is that I have records like these:
And if sqlite is case insensitive, then |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Duplicate Column 977128935 | |
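The collision behind this issue is that SQLite compares column names case-insensitively, so two HealthKit record types differing only in letter case map to the same column. One illustrative way to make a set of names unique under that rule — a sketch of the general technique, not the project's actual fix:

```python
# Illustrative helper: make column names unique under SQLite's
# case-insensitive name matching by suffixing collisions.
def dedupe_columns(names):
    seen = {}  # lowercased name -> number of times used so far
    out = []
    for name in names:
        key = name.lower()
        n = seen.get(key, 0)
        # First occurrence keeps its name; later ones get _2, _3, ...
        out.append(name if n == 0 else "{}_{}".format(name, n + 1))
        seen[key] = n + 1
    return out
```

Run over the extracted record keys before issuing `ALTER TABLE ... ADD COLUMN`, this would prevent the "duplicate column name" error while keeping both values addressable.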
1464786643 | https://github.com/dogsheep/healthkit-to-sqlite/issues/24#issuecomment-1464786643 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/24 | IC_kwDOC8tyDs5XTt7T | Mjboothaus 956433 | 2023-03-11T02:01:27Z | 2023-03-11T02:01:27Z | NONE | Thanks for reporting this and providing a solution -- I was puzzled by this error when I revisited my walking data and experienced this issue. I haven't tried the fix yet. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
DOC: xml.etree.ElementTree.ParseError due to healthkit version 12 1515883470 | |
1464796494 | https://github.com/dogsheep/healthkit-to-sqlite/issues/24#issuecomment-1464796494 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/24 | IC_kwDOC8tyDs5XTwVO | Mjboothaus 956433 | 2023-03-11T02:23:42Z | 2023-03-11T02:23:42Z | NONE | @simonw - maybe put in some error handling to trap for poorly formed XML (from Apple engineers) so that it suggests that there are problems with export.zip rather than odd looking Python errors :) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
DOC: xml.etree.ElementTree.ParseError due to healthkit version 12 1515883470 | |
514745798 | https://github.com/dogsheep/healthkit-to-sqlite/issues/9#issuecomment-514745798 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9 | MDEyOklzc3VlQ29tbWVudDUxNDc0NTc5OA== | tholo 166463 | 2019-07-24T18:25:36Z | 2019-07-24T18:25:36Z | NONE | This is on macOS 10.14.6, with Python 3.7.4, packages in the virtual environment: ```
Package             Version
aiofiles            0.4.0
Click               7.0
click-default-group 1.2.1
datasette           0.29.2
h11                 0.8.1
healthkit-to-sqlite 0.3.1
httptools           0.0.13
hupper              1.8.1
importlib-metadata  0.18
Jinja2              2.10.1
MarkupSafe          1.1.1
Pint                0.8.1
pip                 19.2.1
pluggy              0.12.0
setuptools          41.0.1
sqlite-utils        1.7
tabulate            0.8.3
uvicorn             0.8.4
uvloop              0.12.2
websockets          7.0
zipp                0.5.2
``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Too many SQL variables 472429048 | |
515370687 | https://github.com/dogsheep/healthkit-to-sqlite/issues/9#issuecomment-515370687 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9 | MDEyOklzc3VlQ29tbWVudDUxNTM3MDY4Nw== | tholo 166463 | 2019-07-26T09:01:19Z | 2019-07-26T09:01:19Z | NONE | Yes, that did fix the issue I was seeing — it will now import my complete HealthKit data. Thorsten
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Too many SQL variables 472429048 | |
904642396 | https://github.com/dogsheep/healthkit-to-sqlite/pull/13#issuecomment-904642396 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/13 | IC_kwDOC8tyDs41679c | FabianHertwig 32016596 | 2021-08-24T13:27:40Z | 2021-08-24T13:28:26Z | NONE | This would fix #21 and make #22 obsolete. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
SQLite does not have case sensitive columns 743071410 | |
904641261 | https://github.com/dogsheep/healthkit-to-sqlite/pull/22#issuecomment-904641261 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/22 | IC_kwDOC8tyDs4167rt | FabianHertwig 32016596 | 2021-08-24T13:26:20Z | 2021-08-24T13:26:20Z | NONE | Did not see that #13 fixes the same issue in a similar way. You can decide which one to merge ;) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Make sure that case-insensitive column names are unique 978086284 | |
1239516561 | https://github.com/dogsheep/pocket-to-sqlite/issues/10#issuecomment-1239516561 | https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/10 | IC_kwDODLZ_YM5J4YWR | ashanan 11887 | 2022-09-07T15:07:38Z | 2022-09-07T15:07:38Z | NONE | Thanks! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
When running `auth` command, don't overwrite an existing auth.json file 1246826792 | |
1221521377 | https://github.com/dogsheep/pocket-to-sqlite/issues/11#issuecomment-1221521377 | https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11 | IC_kwDODLZ_YM5Izu_h | fernand0 2467 | 2022-08-21T10:51:37Z | 2022-08-21T10:51:37Z | NONE | I didn't see there is a PR about this: https://github.com/dogsheep/pocket-to-sqlite/pull/7 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
-a option is used for "--auth" and for "--all" 1345452427 | |
774726123 | https://github.com/dogsheep/pocket-to-sqlite/issues/9#issuecomment-774726123 | https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/9 | MDEyOklzc3VlQ29tbWVudDc3NDcyNjEyMw== | jfeiwell 12669260 | 2021-02-07T18:21:08Z | 2021-02-07T18:21:08Z | NONE | @simonw any ideas here? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
SSL Error 801780625 | |
774730656 | https://github.com/dogsheep/pocket-to-sqlite/issues/9#issuecomment-774730656 | https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/9 | MDEyOklzc3VlQ29tbWVudDc3NDczMDY1Ng== | merwok 635179 | 2021-02-07T18:45:04Z | 2021-02-07T18:45:04Z | NONE | That URL uses TLS 1.3, but maybe only if the client supports it. It could be your Python version or your SSL library that’s not recent enough. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
SSL Error 801780625 |
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
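The rows on this page come from filtering the schema above on `author_association = "NONE"`, sorted by `html_url`. A minimal sketch of that query in Python's `sqlite3`, using a trimmed copy of the schema and a couple of invented example rows (the URLs and bodies below are placeholders, not real data from this table):

```python
import sqlite3

# Trimmed copy of the issue_comments schema shown above, keeping only
# the columns needed to demonstrate the author_association filter.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE issue_comments (
    html_url TEXT,
    id INTEGER PRIMARY KEY,
    author_association TEXT,
    body TEXT,
    issue INTEGER
);
CREATE INDEX idx_issue_comments_issue ON issue_comments (issue);
""")

# Two illustrative rows, invented for this sketch.
conn.executemany(
    "INSERT INTO issue_comments (html_url, id, author_association, body, issue)"
    " VALUES (?, ?, ?, ?, ?)",
    [
        ("https://github.com/example#issuecomment-1", 1, "NONE", "Thanks!", 622),
        ("https://github.com/example#issuecomment-2", 2, "OWNER", "Fixed.", 622),
    ],
)

# The filter behind this page: comments where author_association = "NONE",
# sorted by html_url.
rows = conn.execute(
    "SELECT id FROM issue_comments"
    " WHERE author_association = ? ORDER BY html_url",
    ("NONE",),
).fetchall()
print(rows)  # [(1,)]
```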