id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 1880968405,PR_kwDOJHON9s5ZhYny,14,fix: fix the problem of Chinese character garbling,2698003,open,0,,,0,2023-09-04T23:48:28Z,2023-09-04T23:48:28Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/apple-notes-to-sqlite/pulls/14,"1. The code spells the encoding in two different ways, `mac_roman` and `macroman`; it is unclear whether one of these is a typo. 2. When the content contains Chinese characters, the exported text comes out garbled. Changing the encoding to `utf8` fixes the issue.",611552758,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1650984552,PR_kwDOJHON9s5NbyYN,13,use universal command,14314871,open,0,,,0,2023-04-02T15:10:54Z,2023-04-02T15:37:34Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/apple-notes-to-sqlite/pulls/13,,611552758,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1650981564,I_kwDOJHON9s5iZ_q8,12,Error running pytest,14314871,open,0,,,0,2023-04-02T15:02:36Z,2023-04-02T15:07:10Z,,NONE,,"`______________________________________________________ ERROR collecting tests/test_apple_notes_to_sqlite.py _______________________________________________________ ImportError while importing test module '/Users/lol/development/apple-notes-to-sqlite/tests/test_apple_notes_to_sqlite.py'. Hint: make sure your test modules/packages have valid Python names. Traceback: /opt/homebrew/Cellar/python@3.9/3.9.16/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/__init__.py:127: in import_module return _bootstrap._gcd_import(name[level:], package, level) tests/test_apple_notes_to_sqlite.py:2: in from apple_notes_to_sqlite.cli import cli, COUNT_SCRIPT, FOLDERS_SCRIPT E ModuleNotFoundError: No module named 'apple_notes_to_sqlite'` Solution: This is likely a PYTHONPATH issue due to having pytest installed both globally and in the venv.
We can guarantee the tests run by adding the current directory to sys.path automatically using `python -m pytest`. The alternative is to activate the venv, install pytest, deactivate, then activate the venv again (https://stackoverflow.com/questions/35045038/how-do-i-use-pytest-with-virtualenv)",611552758,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1617962395,I_kwDOJHON9s5gcCWb,10,Include schema in README,9599,closed,0,,,0,2023-03-09T20:38:59Z,2023-03-09T20:48:18Z,2023-03-09T20:48:18Z,MEMBER,,As seen in other tools like https://github.com/simonw/git-history,611552758,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617938730,I_kwDOJHON9s5gb8kq,9,"Default to just storing plaintext, store HTML if `--html` is passed",9599,open,0,,,0,2023-03-09T20:19:06Z,2023-03-09T20:19:06Z,,MEMBER,,"The full `body` version of the notes can get HUGE, due to embedded images. It turns out for my own purposes I'm usually happy with just the `plaintext` version. I'm tempted to say you don't get HTML unless you pass a `--html` option.",611552758,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616440856,I_kwDOJHON9s5gWO4Y,5,Configure full text search,9599,open,0,,,0,2023-03-09T05:20:46Z,2023-03-09T05:20:46Z,,MEMBER,,"FTS would be useful. Maybe even extract the plain text from the notes to make that index easier to create, rather than creating it against the HTML.
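For example, with the sqlite-utils Python API this could be as simple as the following sketch (an assumption on my part: a `notes` table that already has `title` and `plaintext` columns):

```python
import sqlite_utils

# a sketch, assuming notes.db contains a notes table with title and plaintext columns
db = sqlite_utils.Database('notes.db')

# index the plain text rather than the HTML body; create_triggers keeps
# the FTS index in sync as rows are inserted, updated or deleted
db['notes'].enable_fts(['title', 'plaintext'], create_triggers=True)
```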
Can use the `plaintext` property for that.",611552758,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616422013,I_kwDOJHON9s5gWKR9,3,`apple-notes-to-sqlite --dump` option,9599,closed,0,,,0,2023-03-09T05:05:49Z,2023-03-09T05:06:14Z,2023-03-09T05:06:14Z,MEMBER,,"Option that doesn't write to the database at all, it just outputs all the notes to stdout as newline-delimited JSON.",611552758,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1943259395,I_kwDOEhK-wc5z08kD,16, time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ',3746270,open,0,,,0,2023-10-14T13:24:39Z,2023-10-14T13:24:39Z,,NONE,," ``` evernote-to-sqlite enex evernote.db ./我的笔记.enex Importing from ENEX [#####-------------------------------] 14% Traceback (most recent call last): File ""/usr/local/bin/evernote-to-sqlite"", line 8, in sys.exit(cli()) ^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1157, in __call__ return self.main(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1078, in main rv = self.invoke(ctx) ^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1688, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1434, in invoke return ctx.invoke(self.callback, **ctx.params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 783, in invoke return __callback(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/utils.py"", line 46, in save_note ""created"": convert_datetime(created), ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/utils.py"", line 111, in convert_datetime return datetime.datetime.strptime(s, ""%Y%m%dT%H%M%SZ"").isoformat() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/_strptime.py"", line 568, in _strptime_datetime tt, fraction, gmtoff_fraction = _strptime(data_string, format) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/_strptime.py"", line 349, in _strptime raise ValueError(""time data %r does not match format %r"" % ValueError: time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ' ``` The enex file was exported by the Evernote Mac client. ",303218369,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1042759769,PR_kwDOEhK-wc4uAJb9,15,include note tags in the export,436138,open,0,,,0,2021-11-02T20:04:31Z,2021-11-02T20:04:31Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/evernote-to-sqlite/pulls/15,"When parsing the Evernote `<note>` elements, the
script will now also parse any nested `<tag>` elements, writing them out into a separate sqlite table. Here is an example of how to query the data after the script has run: ``` select notes.*, (select group_concat(tag) from notes_tags where notes_tags.note_id=notes.id) as tags from notes; ``` My .enex source file is 3+ years old so I am assuming the structure hasn't changed. Interestingly, my _notebook names_ show up in the _tags_ list where the tag name is prefixed with `notebook_`, so this could maybe help work around the first limitation mentioned in the [evernote-to-sqlite blog post](https://simonwillison.net/2020/Oct/16/building-evernote-sqlite-exporter/). ",303218369,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 892383270,MDExOlB1bGxSZXF1ZXN0NjQ1MTAwODQ4,12,Recovering of malformed ENEX file,8431437,open,0,,,0,2021-05-15T07:49:31Z,2021-05-15T19:57:50Z,,FIRST_TIMER,dogsheep/evernote-to-sqlite/pulls/12,"Hey .. Awesome work developing this project; I found it very useful and it saved me some work.. Thanks.. :) Some background to this PR... I've been searching around for a tool allowing me to transform my personal collection of Evernote notes to a format easier to search and potentially easier to import into future services. I then discovered a problem processing my large data (~5GB) with the existing source, which uses Python's builtin xml-parser: it was unfortunately unable to succeed without an exception breaking the process. In my first attempt I tried to adapt it to the more robust lxml package, which allows huge data and has a ""recover"" mode, but even though it worked better it also failed to process the whole data. Even the memory-efficient etree.iterparse() unfortunately got into trouble. And with no luck finding any other library that could successfully parse this enormous file, I instead chose to build a ""hugexmlparser"" module that parses the huge file using yield (on a byte-by-byte level) and lets you set a maximum size to cater for potentially malformed or undesirably large attachments in the export, and should succeed while covering potential exceptions. Some cases were found where the parser discovers malformed XML within a note, so in those cases it also tries to save as much as possible by escaping (to be dealt with at a later stage, better than nothing), and if an end tag is missing before a new (malformed?) note it adds one after encountering a new start-tag. The code for the recovery process is a bit rough and certainly has room for refactoring, but at the moment it seems to achieve what I wanted. Now, with the above, we pass this to a minorly changed version of save_note_recovery() to assure the existing behaviour still works. This is also added as a new recover-enex command in click, keeping the original options. A couple of new tests were added as well to check this command. This currently works for me, but I thought I might share a PR in case you find use for it yourself, or it proves useful to others finding this repository. As a second step, when time allows, it would be nice to also be able to easily export from SQLite to formatted HTML/MD with attachments saved... but that might perhaps be better as a separate project ...
or if you or someone else has something that might be shared to save some trouble, I would be interested ;-) ",303218369,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 748370021,MDExOlB1bGxSZXF1ZXN0NTI1MzcxMDI5,8,"fix import error if note has no ""updated"" element",4028322,closed,0,,,0,2020-11-22T22:51:05Z,2021-02-11T22:34:06Z,2021-02-11T22:34:06Z,CONTRIBUTOR,dogsheep/evernote-to-sqlite/pulls/8,"I got the following error when executing evernote-to-sqlite enex evernote.db evernote.enex ``` ... File ""evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""evernote_to_sqlite/utils.py"", line 28, in save_note updated = note.find(""updated"").text AttributeError: 'NoneType' object has no attribute 'text' ``` It seems that in some cases the updated element is not added to the note; this is a part of the problematic note: ``` 20201019T074518Z web.clip7 webclipper.evernote ```",303218369,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 718938321,MDU6SXNzdWU3MTg5MzgzMjE=,3,Use a content hash for the note IDs,9599,closed,0,,,0,2020-10-11T22:13:46Z,2020-10-11T23:15:04Z,2020-10-11T23:15:04Z,MEMBER,,"Without a GUID note IDs are pretty ineffective, but using a hash of the contents will at least avoid creating identical duplicates in the future. https://sqlite-utils.readthedocs.io/en/stable/python-api.html#setting-an-id-based-on-the-hash-of-the-row-contents",303218369,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718938046,MDU6SXNzdWU3MTg5MzgwNDY=,2,Convert dates to a better format,9599,closed,0,,,0,2020-10-11T22:12:33Z,2020-10-11T23:15:03Z,2020-10-11T23:15:03Z,MEMBER,,"They currently look like this: https://github.com/dogsheep/evernote-to-sqlite/blob/9d8efd17580f6ddf76745c145d1e69dd24e52b64/tests/test_evernote_to_sqlite.py#L35-L36",303218369,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1871935751,I_kwDOD079W85vk3kH,40, ImportError: cannot import name 'formatargspec' from 'inspect',36752421,closed,0,,,0,2023-08-29T15:36:31Z,2023-08-31T03:18:07Z,2023-08-31T03:18:06Z,NONE,,"I get the following error when running ""pip3 install dogsheep-photos"" "" from inspect import ismethod, isclass, formatargspec ImportError: cannot import name 'formatargspec' from 'inspect' (/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/inspect.py).
Did you mean: 'formatargvalues'?"" Python 3.12.0rc1 sqlite 3.43.0 datasette, version 0.64.3",256834907,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1827436260,PR_kwDOD079W85WtVyk,39,Missing option in datasette instructions,319473,open,0,,,0,2023-07-29T10:34:48Z,2023-07-29T10:34:48Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/39,Gotta tell it where to look,256834907,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1293698966,PR_kwDOD079W84600uh,37,Fix former command name in readme,578773,open,0,,,0,2022-07-05T02:09:13Z,2022-07-05T02:09:13Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/37,Looks like a previous commit missed a `photo-to-sqlite` → `dogsheep-photos` replacement.,256834907,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/37/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 830283447,MDU6SXNzdWU4MzAyODM0NDc=,34,bucket name,6213,open,0,,,0,2021-03-12T16:40:57Z,2021-03-12T16:40:57Z,,NONE,,I followed the instructions to set up credentials but I am getting an invalid bucket name. Can you put a sample auth.json file in the base that shows the correct format for this? Thanks,256834907,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 655974395,MDExOlB1bGxSZXF1ZXN0NDQ4MzU1Njgw,30,Handle empty bucket on first upload. Allow specifying the endpoint_url for services other than S3 (like b2 and digitalocean spaces),110038,open,0,,,0,2020-07-13T16:15:26Z,2020-07-13T16:15:26Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/30,"Finally got around to trying dogsheep-photos but I want to use backblaze's b2 service instead of AWS S3. I had to add a way to optionally specify the endpoint_url to connect to. Then, with the bucket being empty, the initial key retrieval would fail. There is probably a better way to see that the bucket is empty than doing a test inside the paginator loop.
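Roughly what I mean (a simplified sketch, not the actual patch; the endpoint and bucket values are made up):

```python
import boto3

# sketch: connect to an S3-compatible service such as Backblaze B2 by
# overriding endpoint_url (made-up endpoint for illustration)
client = boto3.client(
    's3',
    endpoint_url='https://s3.us-west-002.backblazeb2.com',
)

# list_objects_v2 pages omit the Contents key entirely when the bucket
# is empty, so guard with .get() instead of assuming it is present
paginator = client.get_paginator('list_objects_v2')
keys = []
for page in paginator.paginate(Bucket='dogsheep-photos'):
    keys.extend(obj['Key'] for obj in page.get('Contents', []))
```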
There is also probably a better way to specify the endpoint_url, as we get and test for it twice using the same code in two different places, but I did not want to spend too much time worrying about it.",256834907,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 621486115,MDU6SXNzdWU2MjE0ODYxMTU=,27,photos_with_apple_metadata view should include labels,9599,open,0,,,0,2020-05-20T06:06:17Z,2020-05-20T06:06:17Z,,MEMBER,,"https://dogsheep-photos.dogsheep.net/public/photos_with_apple_metadata?place_city=New+Orleans&_facet=place_city&_facet_array=albums&_facet_array=persons Here's one way to add that: ```sql select rowid, photo, ( select json_group_array( json_object( 'label', normalized_string, 'href', '/photos/labelled?_hide_sql=1&label=' || normalized_string ) ) from labels where labels.uuid = photos_with_apple_metadata.uuid ) as labels, date, ```",256834907,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 606032950,MDU6SXNzdWU2MDYwMzI5NTA=,11,Try running S3 uploads in a thread pool,9599,closed,0,,,0,2020-04-24T04:34:31Z,2020-04-24T16:45:41Z,2020-04-24T16:45:41Z,MEMBER,,"Since #10 provided such a speedup, can the same thing be done for the actual uploads? http://ls.pwd.io/2013/06/parallel-s3-uploads-using-boto-and-threads-in-python/ suggests it can really help performance.",256834907,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 606028272,MDU6SXNzdWU2MDYwMjgyNzI=,10,Speed up hashing step using threads,9599,closed,0,,,0,2020-04-24T04:20:08Z,2020-04-24T04:32:35Z,2020-04-24T04:32:35Z,MEMBER,,"This TODO from the code: https://github.com/dogsheep/photos-to-sqlite/blob/2e7f2c67cc18b02c75bb64992a05b0196e507252/photos_to_sqlite/cli.py#L82-L90",256834907,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353418822,PR_kwDODtX3eM497MOV,5,The program fails when the user has no submissions,2467,open,0,,,0,2022-08-28T17:25:45Z,2022-08-28T17:25:45Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/hacker-news-to-sqlite/pulls/5,"Tested with: hacker-news-to-sqlite user hacker-news.db fernand0 Result: ` Traceback (most recent call last): File ""/home/ftricas/.pyenv/versions/3.10.6/bin/hacker-news-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/hacker_news_to_sqlite/cli.py"", line 27, in user submitted = user.pop(""submitted"", None) or [] AttributeError: 'NoneType' object has no attribute 'pop' ` There is a problem of style with the patch (though I am not sure what to do about it) because with the new initialization (submitted = []) the `or []` part is not needed. Maybe there is a more adequate way of doing this.",248903544,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1205867842,I_kwDODtX3eM5H4BVC,4,Retrieve the top-level story for a comment,1755789,open,0,,,0,2022-04-15T20:25:39Z,2022-04-15T20:25:39Z,,NONE,,"I think that each comment inserted into the database should include a column `onstory` that contains the ID of the story on which the comment was made. This is exactly equivalent to the link after ""on:"" at the top of an HN comment page ([example](https://news.ycombinator.com/item?id=18358028)). We could do this either by directly retrieving the HTML page and using Beautiful Soup to find that link, or alternatively recurse up the tree in the Firebase API using the `parent` field (probably using `functools.lru_cache` in case a person has commented a bunch of times on the same story).",248903544,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 927385540,MDU6SXNzdWU5MjczODU1NDA=,8,any guidance / experience on imessage-to-sqlite ?,2675621,open,0,,,0,2021-06-22T15:46:16Z,2021-06-22T15:46:16Z,,NONE,,,214746582,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 925384329,MDExOlB1bGxSZXF1ZXN0NjczODcyOTc0,7,Add instagram-to-sqlite,36654812,open,0,,,0,2021-06-19T12:26:16Z,2021-07-28T07:58:59Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/7,"The tool covers only chat imports at the time of opening this PR but I'm planning to import everything else that I feel inquisitive about ref: https://github.com/gavindsouza/instagram-to-sqlite",214746582,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 723499985,MDExOlB1bGxSZXF1ZXN0NTA1MDc2NDE4,5,Add fitbit-to-sqlite,4632208,open,0,,,0,2020-10-16T20:04:05Z,2020-10-16T20:04:05Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/5,,214746582,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 558715564,MDExOlB1bGxSZXF1ZXN0MzcwMDI0Njk3,4,Add beeminder-to-sqlite,706257,closed,0,,,0,2020-02-02T15:51:36Z,2020-10-12T00:36:16Z,2020-10-12T00:36:16Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/4,,214746582,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 543717994,MDExOlB1bGxSZXF1ZXN0MzU3OTc0MzI2,3,Add todoist-to-sqlite,706257,closed,0,,,0,2019-12-30T04:02:59Z,2020-10-12T00:35:58Z,2020-10-12T00:35:57Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/3,"Really enjoying getting into the dogsheep/datasette ecosystem. I made a downloader for Todoist, and I think/hope others might find this useful",214746582,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 541274681,MDU6SXNzdWU1NDEyNzQ2ODE=,2,Add linkedin-to-sqlite,881925,open,0,,,0,2019-12-21T03:13:40Z,2019-12-21T03:13:40Z,,NONE,,"There is an API available. https://developer.linkedin.com/docs/rest-api# At the minimum, I would think contact list and messages would be of interest.",214746582,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 519979091,MDExOlB1bGxSZXF1ZXN0MzM4NjQ3Mzc4,1,Add parkrun-to-sqlite,1101318,closed,0,,,0,2019-11-08T12:05:32Z,2020-10-12T00:35:16Z,2020-10-12T00:35:16Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/1,,214746582,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 797728929,MDU6SXNzdWU3OTc3Mjg5Mjk=,8,QUESTION: extract full text,417363,open,0,,,0,2021-01-31T14:50:10Z,2021-01-31T14:50:10Z,,NONE,,"This may be solved or a feature already, but I couldn't figure it out: is it possible to also extract and store the full text of the saved pages? The same way that Pocket parses the text, it'd be amazing to be able to store (and thus make searchable later) the text. Thank you very much for the project, it's such an amazing idea! ",213286752,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 689848827,MDU6SXNzdWU2ODk4NDg4Mjc=,6,ISO timestamps,9599,open,0,,,0,2020-09-01T06:16:42Z,2020-09-01T06:16:42Z,,MEMBER,,"The `time_added`, `time_updated` and `time_read` columns currently store data like this: September 19, 2019 - 00:30:30 UTC Should use ISO instead, e.g.
`2020-07-26T01:05:24+00:00`",213286752,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 589402939,MDU6SXNzdWU1ODk0MDI5Mzk=,4,"Store authentication information as ""pocket_access_token"" etc",9599,closed,0,,,0,2020-03-27T20:43:22Z,2020-03-27T20:43:59Z,2020-03-27T20:43:59Z,MEMBER,,The `pocket_` prefix will mean that the same `auth.json` file can be used for other Dogsheep tools without Pocket over-riding a value set by some other tool.,213286752,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 664793260,MDU6SXNzdWU2NjQ3OTMyNjA=,2,Yak shave,145425,open,0,,,0,2020-07-23T22:04:18Z,2020-07-23T22:04:18Z,,NONE,,"Just a quick note... The 23andme data is not exactly your genome, but a SNP chip of your genome. It's ""some of your genotypes."" Or about 0.1% of your genome. Nice work in any case! It deserves to be liberated!!!!!",209590345,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1505411725,I_kwDODFdgUs5ZusKN,78,self-hosted or corp github enterprise,549431,open,0,,,0,2022-12-20T22:51:45Z,2022-12-20T22:51:45Z,,NONE,,"We use github enterprise at work and I would like to use this tool to pull info from that site rather than the public github.com instance. Is there an option for this? If not, can one be added for a custom repo URL?",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/78/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1410548368,I_kwDODFdgUs5UE0KQ,77,Feature: Support GitHub discussions,631242,open,0,,,0,2022-10-16T16:53:38Z,2022-10-16T16:53:38Z,,CONTRIBUTOR,,"Hi @simonw I've been a happy user of this tool. Thank you for writing it and sharing it. I wanted to suggest a feature request to support Discussions. For example the VisiData project has discussions https://github.com/saulpw/visidata/discussions , and it would be useful if there was a way to pull that data into the database. However, I'm not offering a pull request.",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/77/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1363244199,I_kwDODFdgUs5RQXSn,75,Fetch repos doesn't support organisations,2757699,open,0,,,0,2022-09-06T12:55:06Z,2022-09-06T12:55:06Z,,NONE,,"Say I want to get all my Github Org's repos info, for data analysis. Not just the public repos, but also the private/internal repos. 
The endpoints are different for organisations, and this tool doesn't take that into account: https://github.com/dogsheep/github-to-sqlite/blob/ace13ec3d98090d99bd71871c286a4a612c96a50/github_to_sqlite/utils.py#L453 https://github.com/dogsheep/github-to-sqlite/blob/ace13ec3d98090d99bd71871c286a4a612c96a50/github_to_sqlite/utils.py#L455 The endpoint for organisation repos is instead ([source](https://docs.github.com/en/rest/repos/repos#list-organization-repositories)): `url = ""https://api.github.com/orgs/{}/repos"".format(username)` Let's add support for organisation repo scraping.",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/75/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1177059481,I_kwDODFdgUs5GKICZ,71,Store commit parents,64686,closed,0,,,0,2022-03-22T17:06:48Z,2022-04-22T12:44:04Z,2022-04-22T12:44:04Z,NONE,,"Hi @simonw 👋 Currently, stored commit data doesn't quite give me the information I'm needing... Committer date and author date are not 100% reliable for dividing a commit history up by release or branch. A PR created before a release but merged after can have earlier dates… — this can be quite frustrating if you're trying to pin down commits for a release: _It should be there!_, but then isn't. (This gets worse using release branches.) Would you be open to adding the `sha` of a `parent` of a commit to the commit table? (As an FK? 🤔 — likely not feasible.) It's part of the [response body](https://docs.github.com/en/rest/reference/commits#get-a-commit): ``` ""parents"": [ { ""url"": ""https://api.github.com/repos/octocat/Hello-World/commits/6dcb09b5b57875f334f61aebed695e2e4193db5e"", ""sha"": ""6dcb09b5b57875f334f61aebed695e2e4193db5e"" } ], ``` I think this list should only have a single entry. (🤔 — not sure why it's a list then...) With this it would be possible to build/reconstruct a chain of commits from the history, that I don't **think** is available as yet (unless you know a better way). It is certainly possible to get sequential lists of commits out of git directly, so the same would be possible combining tools, but wondering if a single tool could do it. What do you think? Thanks! 🏅 ",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1149402080,PR_kwDODFdgUs4zaUta,70,scrape-dependents: enable paging through package menu option if present,36061055,open,0,,,0,2022-02-24T15:07:25Z,2022-02-24T15:07:25Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/70,Some repos organize network dependents by a Package toggle. This PR adds the ability to page through those options and scrape underlying dependents.,207052882,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/70/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1013506559,PR_kwDODFdgUs4skaNS,68,Add support for retrieving teams / members,68329,open,0,,,0,2021-10-01T15:55:02Z,2021-10-01T15:59:53Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/68,Adds a method for retrieving all the teams within an organisation and all the members in those teams.
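Illustrative shape of what gets written (hypothetical sqlite-utils calls, not the exact code in this PR; the join table is described next):

```python
import sqlite_utils

db = sqlite_utils.Database('github.db')

# hypothetical example rows: one team, and one membership linking an
# existing user to it through a compound-primary-key join table
db['teams'].upsert({'id': 1234, 'slug': 'core', 'organization': 'dogsheep'}, pk='id')
db['team_members'].upsert(
    {'team_id': 1234, 'user_id': 9599},
    pk=('team_id', 'user_id'),
    foreign_keys=[('team_id', 'teams'), ('user_id', 'users')],
)
```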
The latter is stored as a join table `team_members` between `teams` and `users`.,207052882,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/68/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 981690086,MDExOlB1bGxSZXF1ZXN0NzIxNjg2NzIx,67,Replacing step ID key with step_id,16374374,open,0,,,0,2021-08-28T01:26:41Z,2021-08-28T01:27:00Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/67,"Workflows that have an `id` in any step result in the following error when running `workflows`: e.g.`github-to-sqlite workflows github.db nixos/nixpkgs` ```Traceback (most recent call last): File ""/usr/local/bin/github-to-sqlite"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/usr/local/lib/python3.8/dist-packages/github_to_sqlite/cli.py"", line 601, in workflows utils.save_workflow(db, repo_id, filename, content) File ""/usr/local/lib/python3.8/dist-packages/github_to_sqlite/utils.py"", line 865, in save_workflow db[""steps""].insert_all( File ""/usr/local/lib/python3.8/dist-packages/sqlite_utils/db.py"", line 2596, in insert_all self.insert_chunk( File ""/usr/local/lib/python3.8/dist-packages/sqlite_utils/db.py"", line 2378, in insert_chunk result = self.db.execute(query, params) File ""/usr/local/lib/python3.8/dist-packages/sqlite_utils/db.py"", line 419, in execute return self.conn.execute(sql, parameters) sqlite3.IntegrityError: datatype mismatch ``` - [Information about the ID key in a step for GHA](https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#jobsjob_idstepsid) - [An example workflow from a public repo](https://github.com/NixOS/nixpkgs/blob/b4cc66827745e525ce7bb54659845ac89788a597/.github/workflows/direct-push.yml#L16) # Changes I'm proposing that the `id` key in a step is replaced with `step_id` so that it no longer interferes with the table `id` for tracking the record.
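Conceptually the change is just this (simplified sketch, not the exact diff):

```python
# simplified sketch: rename the step-level id key before inserting, so it
# cannot collide with the primary key used for tracking the record
for step in steps:  # steps: the parsed list of workflow step dicts
    if 'id' in step:
        step['step_id'] = step.pop('id')
```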
Special thanks to @sarcasticadmin @egiffen and @ruebenramirez for helping a bit on this 😄 ",207052882,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/67/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 897212458,MDU6SXNzdWU4OTcyMTI0NTg=,63,Ability to fetch commits from branches other than the default,9599,open,0,,,0,2021-05-20T17:58:08Z,2021-05-20T17:58:08Z,,MEMBER,,This tool is currently almost entirely ignorant of the concept of branches. One example: you can't retrieve commits from any branch other than the default (usually main).,207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/63/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 797784080,MDU6SXNzdWU3OTc3ODQwODA=,62,Stargazers and workflows commands always require an auth file when using GITHUB_TOKEN ,631242,open,0,,,0,2021-01-31T18:56:05Z,2021-01-31T18:56:05Z,,CONTRIBUTOR,,"Requested fix in https://github.com/dogsheep/github-to-sqlite/pull/59 The stargazers and workflows commands always require an auth file, even when using a `GITHUB_TOKEN`. Other commands don't require the auth file. ",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 758944006,MDU6SXNzdWU3NTg5NDQwMDY=,57,--readme throws 404 error if README does not exist in repo,9599,closed,0,,,0,2020-12-07T23:58:49Z,2020-12-16T18:17:54Z,2020-12-16T18:17:54Z,MEMBER,,It should fail silently (populate the column with a null) instead.,207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/57/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 753122082,MDU6SXNzdWU3NTMxMjIwODI=,56,Link to example tables from the README,9599,closed,0,,,0,2020-11-30T04:01:51Z,2020-11-30T04:10:27Z,2020-11-30T04:10:27Z,MEMBER,,Would help demonstrate how the tool works.,207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/56/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 753026388,MDU6SXNzdWU3NTMwMjYzODg=,55,github-to-sqlite workflows does not correctly replace existing records,9599,closed,0,,,0,2020-11-29T21:58:43Z,2020-11-29T23:48:50Z,2020-11-29T23:48:50Z,MEMBER,,Following #54 - see this TODO: https://github.com/dogsheep/github-to-sqlite/blob/1b23ce11953f9f59c0161ea1f99188b55b5ea11c/github_to_sqlite/utils.py#L700,207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 724264574,MDU6SXNzdWU3MjQyNjQ1NzQ=,52,Option to fetch README and/or HTML-rendered README for repos,9599,closed,0,,,0,2020-10-19T05:10:24Z,2020-10-19T05:33:42Z,2020-10-19T05:33:42Z,MEMBER,,"I'm thinking: github-to-sqlite repos ... --readme # Populates readme column with raw text github-to-sqlite repos ... 
--readme-html # Populates readme_html column with raw HTML https://developer.github.com/v3/repos/contents/#get-a-repository-readme",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/52/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 703216044,MDU6SXNzdWU3MDMyMTYwNDQ=,49,Feature: gists and starred gists,9599,open,0,,,0,2020-09-17T02:30:52Z,2020-09-17T02:30:52Z,,MEMBER,,https://developer.github.com/v3/gists/#list-starred-gists,207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/49/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 660413281,MDU6SXNzdWU2NjA0MTMyODE=,44,Rename tags.repo_id column to tags.repo,9599,closed,0,,,0,2020-07-18T22:13:46Z,2020-07-18T22:15:12Z,2020-07-18T22:15:12Z,MEMBER,,"For improved consistency with other tables. https://observablehq.com/@simonw/datasette-table-diagram ![datasette-table-diagram(1)](https://user-images.githubusercontent.com/9599/87862843-3cca4900-c909-11ea-9c76-58b3f4aca43f.png) ",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 654405302,MDU6SXNzdWU2NTQ0MDUzMDI=,42,Option for importing just specific repos,9599,closed,0,,,0,2020-07-09T23:20:15Z,2020-07-09T23:25:35Z,2020-07-09T23:25:35Z,MEMBER,,"For if you know which specific repos you care about, as opposed to loading everything owned by the authenticated user. github-to-sqlite repos specific.db -r simonw/datasette -r simonw/github-contents ",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 599776345,MDU6SXNzdWU1OTk3NzYzNDU=,24,Feature idea: github-to-sqlite everything ...,9599,open,0,,,0,2020-04-14T18:34:00Z,2020-04-14T18:34:00Z,,MEMBER,,"At the moment if you want to pull all your repos, issues, issue comments etc, you have to do it with a sequence of separate commands.
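Today that sequence looks something like this (an illustrative sketch; the exact arguments vary slightly per command):

```python
import subprocess

# illustrative only: the manual per-resource sequence this issue describes,
# which an 'everything' command could collapse into a single invocation
for command, target in [
    ('repos', 'simonw'),
    ('issues', 'simonw/datasette'),
    ('issue-comments', 'simonw/datasette'),
    ('commits', 'simonw/datasette'),
]:
    subprocess.run(['github-to-sqlite', command, 'github.db', target], check=True)
```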
Consider adding an `everything` or `all` command which fetches everything that the tool knows how to fetch, and is designed to be run on a cron in a way that fetches just new stuff each time.",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/24/reactions"", ""total_count"": 7, ""+1"": 7, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 586567379,MDU6SXNzdWU1ODY1NjczNzk=,22,Handle empty git repositories,9599,closed,0,,,0,2020-03-23T22:49:48Z,2020-03-23T23:13:11Z,2020-03-23T23:13:11Z,MEMBER,,"Got this error: ``` github_to_sqlite.utils.GitHubError: {'message': 'Git Repository is empty.', 'documentation_url': 'https://developer.github.com/v3/repos/commits/#list-commits-on-a-repository'} ``` From https://api.github.com/repos/dogsheep/beta/commits",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586454513,MDU6SXNzdWU1ODY0NTQ1MTM=,20,Upgrade to sqlite-utils 2.x,9599,closed,0,,5225818,0,2020-03-23T19:17:58Z,2020-03-23T19:22:52Z,2020-03-23T19:22:52Z,MEMBER,,,207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520521843,MDU6SXNzdWU1MjA1MjE4NDM=,11,Command to fetch releases,9599,closed,0,,,0,2019-11-09T22:23:30Z,2019-11-09T22:57:00Z,2019-11-09T22:57:00Z,MEMBER,,"https://developer.github.com/v3/repos/releases/#list-releases-for-a-repository `GET /repos/:owner/:repo/releases`",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516769276,MDU6SXNzdWU1MTY3NjkyNzY=,9,Commands do not work without an auth.json file,9599,closed,0,,,0,2019-11-03T01:54:28Z,2019-11-11T05:30:48Z,2019-11-11T05:30:48Z,MEMBER,,"`auth.json` is meant to be optional. If it's not provided, the tool should make heavily rate-limited unauthenticated requests. ``` $ github-to-sqlite repos .data/repos.db simonw Usage: github-to-sqlite repos [OPTIONS] DB_PATH [USERNAME] Try ""github-to-sqlite repos --help"" for help. Error: Invalid value for ""-a"" / ""--auth"": File ""auth.json"" does not exist.
```",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493668862,MDU6SXNzdWU0OTM2Njg4NjI=,2,Extract licenses from repos into a separate table,9599,closed,0,,,0,2019-09-14T21:33:41Z,2019-09-14T21:46:58Z,2019-09-14T21:46:58Z,MEMBER,," ",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493599818,MDU6SXNzdWU0OTM1OTk4MTg=,1,Command for fetching starred repos,9599,closed,0,,,0,2019-09-14T08:36:29Z,2019-09-14T21:30:48Z,2019-09-14T21:30:48Z,MEMBER,,,207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1884499674,PR_kwDODFE5qs5ZtYMc,13,"use poetry for packages, asdf for versioning, and gh actions for ci",150855,open,0,,,0,2023-09-06T17:59:16Z,2023-09-06T17:59:16Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/google-takeout-to-sqlite/pulls/13,"- build: use poetry for package management, asdf for python version
- build: cleanup poetry config, add keywords, ignore dist
- ci: migrate circleci to gh actions
- fix: dup method definition ",206649770,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1557599877,I_kwDODFE5qs5c1xaF,12,location history changes,14809320,open,0,,,0,2023-01-26T03:57:25Z,2023-01-26T03:57:25Z,,NONE,,"not sure if each download is unique, but I had to change some things to work with the takeout zip I made 2023-01-25

filename changed from ""Location History.json"" to ""Records.json""

`""timestampMs""` is not present, `""timestamp""` is roughly iso timestamp

```py
def get_timestamp_ms(raw_timestamp):
    try:
        return datetime.datetime.strptime(raw_timestamp, ""%Y-%m-%dT%H:%M:%SZ"").timestamp()
    except ValueError:
        return datetime.datetime.strptime(raw_timestamp, ""%Y-%m-%dT%H:%M:%S.%fZ"").timestamp()

def save_location_history(db, zf):
    location_history = json.load(
        zf.open(""Takeout/Location History/Records.json"")
    )
    db[""location_history""].upsert_all(
        (
            {
                ""id"": id_for_location_history(row),
                ""latitude"": row[""latitudeE7""] / 1e7,
                ""longitude"": row[""longitudeE7""] / 1e7,
                ""accuracy"": row[""accuracy""],
                ""timestampMs"": get_timestamp_ms(row[""timestamp""]),
                ""when"": row[""timestamp""],
            }
            for row in location_history[""locations""]
        ),
        pk=""id"",
    )

def id_for_location_history(row):
    # We want an ID that is unique but can be sorted by in
    # date order - so we use the isoformat date + the first
    # 6 characters of a hash of the JSON
    first_six = hashlib.sha1(
        json.dumps(row, separators=("","", "":""), sort_keys=True).encode(""utf8"")
    ).hexdigest()[:6]
    return ""{}-{}"".format(
        row['timestamp'],
        first_six,
    )
```

example locations from mine

```json
{
  ""latitudeE7"": 427220206,
  ""longitudeE7"": -923423972,
  ""accuracy"": 10,
  ""deviceTag"": -1312429967,
  ""deviceDesignation"": ""PRIMARY"",
  ""timestamp"": ""2019-01-08T23:31:50.867Z""
}
```

```json
{
  ""latitudeE7"": 427011317,
  ""longitudeE7"": -923448300,
  ""accuracy"": 5,
  ""deviceTag"": -1312429967,
  ""deviceDesignation"": ""PRIMARY"",
  ""timestamp"": ""2019-01-08T23:33:53Z""
},
```",206649770,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/12/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 2}",, 1250287607,PR_kwDODFE5qs44jvRV,11,Update README.md,11887,open,0,,,0,2022-05-27T03:13:59Z,2022-05-27T03:13:59Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/google-takeout-to-sqlite/pulls/11,Fix typo,206649770,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/11/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1046887492,PR_kwDODFE5qs4uMsMJ,9,Removed space from filename My Activity.json,91880982,open,0,,,0,2021-11-08T00:04:31Z,2021-11-08T00:04:31Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/google-takeout-to-sqlite/pulls/9,"File name from google takeout has no space. The code only runs without error if filename is ""MyActivity.json"" and not ""My Activity.json"". Is it a new change by Google?",206649770,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 930946817,MDU6SXNzdWU5MzA5NDY4MTc=,7,KeyError: 'accuracy' when processing Location History,403152,open,0,,,0,2021-06-27T14:39:43Z,2021-06-27T14:39:43Z,,NONE,,"I'm new to both the dogsheep tools and datasette but have been experimenting a bit the last few days and these are really cool tools! I encountered a problem running my Google location history through this tool running the latest release in a docker container: ``` Traceback (most recent call last): File ""/usr/local/bin/google-takeout-to-sqlite"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/google_takeout_to_sqlite/cli.py"", line 49, in my_activity utils.save_location_history(db, zf) File ""/usr/local/lib/python3.9/site-packages/google_takeout_to_sqlite/utils.py"", line 27, in save_location_history db[""location_history""].upsert_all( File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1105, in upsert_all return self.insert_all( File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 990, in insert_all chunk = list(chunk) File ""/usr/local/lib/python3.9/site-packages/google_takeout_to_sqlite/utils.py"", line 33, in ""accuracy"": row[""accuracy""], KeyError: 'accuracy' ``` It looks like the tool assumes the `accuracy` key will be in every location history entry. My first attempt at a local patch to get myself going was to convert accessing the `accuracy` key to a `.get` instead to hopefully make the row nullable but I wasn't quite sure what `sqlite_utils` would do there.
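Concretely, the local patch was along these lines (a simplified sketch of the change described above; the sample entry is made up):

```python
# one location entry that is missing the accuracy key
row = {'latitudeE7': 427220206, 'longitudeE7': -923423972}

record = {
    'latitude': row['latitudeE7'] / 1e7,
    'longitude': row['longitudeE7'] / 1e7,
    # was row['accuracy'], which raised KeyError; .get() yields None,
    # i.e. NULL in the database
    'accuracy': row.get('accuracy'),
}
```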
That did work in that the import happened and so I was going to propose a patch that made that change but in updating the existing test to include an entry with a missing accuracy entry, I noticed the expected type of the field appeared to be changing to a string in the test (and from a quick scan through the sqlite_utils code, probably TEXT in the database). Given this change in column type, it seemed that opening an issue first before proposing a fix seemed warranted. It seems the schema would need to be explicitly specified if you wanted a nullable integer column. Now that I've done a successful import run using my initial fix of calling `.get` on the row dict, I can see with datasette that I only have 7 data points (out of ~250k) that have a null accuracy column. They are all from 2011-2012 in an import that includes points spanning ~2010-2016 so perhaps another approach might be to filter those entries out during import if it really is that infrequent? I'm happy to provide a PR for a fix but figured I'd ask about which direction is preferred first.",206649770,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 769397742,MDU6SXNzdWU3NjkzOTc3NDI=,3,sqlite-utils error on takeout import,231498,open,0,,,0,2020-12-17T01:18:48Z,2020-12-17T01:19:04Z,,NONE,,"``` $ google-takeout-to-sqlite my-activity takeout.db /path/to/zip ... sqlite3.OperationalError: no such table: main.my_activity ``` there is no table create in `utils.py`, unlike other importers such as github-to-sqlite additionally, this package and hackernews-to-sqlite have conflicting `sqlite-utils` dep with datasette and dogsheep-beta",206649770,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/3/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 504720731,MDU6SXNzdWU1MDQ3MjA3MzE=,1,Add more details on how to request data from google takeout correctly.,1055831,open,0,,,0,2019-10-09T15:17:34Z,2019-10-09T15:17:34Z,,NONE,,"The default is to download everything. This can result in an enormous amount of data when you only really need 2 types of data for now: - My Activity - Location History In addition unless you specify that ""My Activity"" is downloaded in JSON format the default is HTML. This then causes the `google-takeout-to-sqlite my-activity takeout.db takeout.zip` command to fail as it only contains html files not json files. 
Thanks",206649770,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1485017981,I_kwDODEpn8M5Yg5N9,2,table identifications has no column named previous_observation_taxon,520541,open,0,,,0,2022-12-08T16:47:17Z,2022-12-08T16:47:17Z,,NONE,,"Installed successfully with pip and ran `inaturalist-to-sqlite inaturalist.db simonw` and got the error: ``` sqlite3.OperationalError: table identifications has no column named previous_observation_taxon ```",206202864,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/inaturalist-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1353411865,I_kwDODEpn8M5Qq20Z,1,Problem with my user,2467,open,0,,,0,2022-08-28T16:59:37Z,2022-08-28T16:59:37Z,,NONE,,"If I call the program with: inaturalist-to-sqlite inaturalist.db ftricas the program exits with an error: `Importing 36 observations Traceback (most recent call last): File ""/home/ftricas/.pyenv/versions/3.10.6/bin/inaturalist-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/inaturalist_to_sqlite/cli.py"", line 51, in cli save_observation(observation, db) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/inaturalist_to_sqlite/utils.py"", line 34, in save_observation db[""observations""] File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 2965, in insert return self.insert_all( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 3068, in insert_all self.create( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 1564, in create self.db.create_table( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 951, in create_table sql = self.create_table_sql( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 765, in create_table_sql foreign_keys = self.resolve_foreign_keys(name, foreign_keys or []) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 702, in resolve_foreign_keys other_table = table.guess_foreign_table(column) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 2061, in guess_foreign_table raise NoObviousTable( sqlite_utils.db.NoObviousTable: No obvious foreign key table for column 'taxon' - tried ['taxon', 'taxons'] ` If I call the program with your user everything seems to go well and then, I can call the program with my own user without problems. 
Moreover, I can call the program again with my own user and everything goes well now. Additional info: the command `sqlite-utils tables inaturalist.db` shows that the correct name may be 'taxons'. There is another small problem with a warning: warnings.warn(""urllib3 ({}) or chardet ({})/charset_normalizer ({}) doesn't match a supported "" ",206202864,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/inaturalist-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1816830546,I_kwDODEm0Qs5sSqJS,73,Twitter v1 API shutdown,6341745,open,0,,,0,2023-07-22T16:57:41Z,2023-07-22T16:57:41Z,,NONE,,"I've been using this project reliably over the past two years to periodically download my liked tweets, but unfortunately since 19th July I get: ``` [2023-07-19 21:00:04.937536] File ""/home/pi/code/liked-tweets/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 202, in fetch_timeline [2023-07-19 21:00:04.937606] raise Exception(str(tweets[""errors""])) [2023-07-19 21:00:04.937678] Exception: [{'message': 'You currently have access to a subset of Twitter API v2 endpoints and limited v1.1 endpoints (e.g. media post, oauth) only. If you need access to this endpoint, you may need a different access level. You can learn more here: https://developer.twitter.com/en/portal/product', 'code': 453}] ``` It appears that Twitter has now shut down their v1 endpoints, which is rather gracious of them, considering they [announced they'd be deprecated on 29th April](https://twittercommunity.com/t/reminder-to-migrate-to-the-new-free-basic-or-enterprise-plans-of-the-twitter-api/189737). Unfortunately, [retrieving likes using the v2 API](https://developer.twitter.com/en/docs/twitter-api/tweets/likes/introduction) is not part of their [free plan](https://developer.twitter.com/en/portal/products). In fact, with the free plan one can only post and delete tweets and retrieve information about oneself. So I'm afraid this is the end of this very nice project. It was very useful, thank you! ",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/73/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",, 1524431805,I_kwDODEm0Qs5a3Pu9,72,"Import thread, including self- and others' replies",601708,open,0,,,0,2023-01-08T09:51:06Z,2023-01-08T09:51:06Z,,NONE,,"statuses-lookup, home-timeline, and mentions (only for the auth'ed user) don't cover this. `twitter-to-sqlite fetch-thread tw-group1.db 1234123412341234` twitter-to-sqlite focuses on archiving users, but does not easily support archiving conversations or community activity. For reference, this is [implemented in twarc](https://sourcegraph.com/github.com/DocNow/twarc/-/blob/twarc/client.py?L708-766&subtree=true), using a search, optionally recursively. Other research suggests that this formerly, or currently, requires a [search query](https://stackoverflow.com/a/30480103/1020467), use of the [undocumented `related_results` API](https://stackoverflow.com/a/9419346/1020467), or requested inclusion of the [newer conversation_id](https://stackoverflow.com/a/68115718/1020467) with a subsequent query.
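A rough sketch of the conversation_id approach against the v2 recent search endpoint (hypothetical helper; recent search only covers roughly the last seven days and requires a suitable access level):

```python
import requests

def fetch_thread(conversation_id, bearer_token):
    # Page through every tweet in a conversation via v2 recent search.
    url = ""https://api.twitter.com/2/tweets/search/recent""
    params = {""query"": f""conversation_id:{conversation_id}"", ""max_results"": 100}
    headers = {""Authorization"": f""Bearer {bearer_token}""}
    while True:
        data = requests.get(url, params=params, headers=headers).json()
        yield from data.get(""data"", [])
        next_token = data.get(""meta"", {}).get(""next_token"")
        if next_token is None:
            return
        params[""next_token""] = next_token
```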
",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/72/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1513238455,PR_kwDODEm0Qs5GUoPm,71,"Archive: Fix ""ni devices"" typo in importer",26161409,open,0,,,0,2022-12-28T23:33:31Z,2022-12-28T23:33:31Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/71,,206156866,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1513238314,PR_kwDODEm0Qs5GUoN6,70,Archive: Import Twitter Circle data,26161409,open,0,,,0,2022-12-28T23:33:09Z,2022-12-28T23:33:09Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/70,,206156866,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/70/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1513238152,PR_kwDODEm0Qs5GUoMM,69,Archive: Import new tweets table name,26161409,open,0,,,0,2022-12-28T23:32:44Z,2022-12-28T23:32:44Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/69,"Given the code here, it seems like in the past this file was named ""tweet.js"". In recent exports, it's named ""tweets.js"". The archive importer needs to be modified to take this into account. Existing logic is reused for importing this table. (However, the resulting table name will be different, matching the different file name -- archive_tweets, rather than archive_tweet).",206156866,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1513237982,PR_kwDODEm0Qs5GUoKL,68,Archive: Import mute table,26161409,open,0,,,0,2022-12-28T23:32:06Z,2022-12-28T23:32:06Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/68,,206156866,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/68/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1513237712,PR_kwDODEm0Qs5GUoG_,67,Add support for app-only bearer tokens,26161409,open,0,,,0,2022-12-28T23:31:20Z,2022-12-28T23:31:20Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/67,"Previously, twitter-to-sqlite only supported OAuth1 authentication, and the token must be on behalf of a user. However, Twitter also supports application-only bearer tokens, documented here: https://developer.twitter.com/en/docs/authentication/oauth-2-0/bearer-tokens This PR adds support to twitter-to-sqlite for using application-only bearer tokens. 
To use, the auth.json file just needs to contain a ""bearer_token"" key instead of ""api_key"", ""api_secret_key"", etc.",206156866,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/67/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1244082183,PR_kwDODEm0Qs44PPLy,66,Ageinfo workaround,11887,open,0,,,0,2022-05-21T21:08:29Z,2022-05-21T21:09:16Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/66,"I'm not sure if this is due to a new format or just because my ageinfo file is blank, but trying to import an archive would crash when it got to that file. This PR adds a guard clause in the `ageinfo` transformer and sets a default value that doesn't throw an exception. Seems likely to be the same issue mentioned by danp in https://github.com/dogsheep/twitter-to-sqlite/issues/54, my ageinfo file looks the same. Added that same ageinfo file to the test archive as well to help confirm my workaround didn't break anything. Let me know if you want any changes!",206156866,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/66/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1160327106,PR_kwDODEm0Qs4z_V3w,65,"Update Twitter dev link, clarify apps vs projects",2657547,open,0,,,0,2022-03-05T11:56:08Z,2022-03-05T11:56:08Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/65,"Twitter pushes you heavily towards v2 projects instead of v1 apps – I know the README mentions v1 API compatibility at the top, but I still nearly got turned around here.",206156866,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/65/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1097332098,I_kwDODEm0Qs5BZ_WC,64,Include all entities for tweets,111631,open,0,,,0,2022-01-09T23:35:28Z,2022-01-09T23:35:28Z,,NONE,,"Per our conversation [on Twitter](https://twitter.com/mschoening/status/1480312477246054401): It would be neat if all entities (including URLs) were captured. This way you can ensure, that URLs are parsed out exactly the same way Twitter parses URLs – we all know parsing URLs with a regex ain't fun. 
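For example, a minimal sketch (hypothetical helper name) of reading the URL entities the v1.1 API already provides on each tweet:

```python
def tweet_urls(tweet):
    # Tweets carry pre-parsed URL entities, so no regex is needed to
    # extract links from the text.
    return [
        u[""expanded_url""]
        for u in tweet.get(""entities"", {}).get(""urls"", [])
    ]
```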
Right now, I believe the tool filters out all entities that are not of type `media`.",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/64/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1091850530,I_kwDODEm0Qs5BFFEi,63,Import archive error 'withheld_in_countries',521097,open,0,,,0,2022-01-01T16:58:59Z,2022-01-01T16:58:59Z,,NONE,,"Importing the Twitter archive, I received this error: ```bash $ twitter-to-sqlite import archive.db twitter-2021-12-31-.zip birdwatch-note-rating: not yet implemented birdwatch-note: not yet implemented branch-links: not yet implemented community-tweet: not yet implemented contact: not yet implemented device-token: not yet implemented direct-message-mute: not yet implemented mute: not yet implemented periscope-account-information: not yet implemented periscope-ban-information: not yet implemented periscope-broadcast-metadata: not yet implemented periscope-comments-made-by-user: not yet implemented periscope-expired-broadcasts: not yet implemented periscope-followers: not yet implemented periscope-profile-description: not yet implemented professional-data: not yet implemented protected-history: not yet implemented reply-prompt: not yet implemented screen-name-change: not yet implemented smartblock: not yet implemented spaces-metadata: not yet implemented sso: not yet implemented Traceback (most recent call last): File ""/home/paulox/.virtualenvs/dogsheep/bin/twitter-to-sqlite"", line 8, in <module> sys.exit(cli()) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 759, in import_ archive.import_from_file(db, filename, content) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/twitter_to_sqlite/archive.py"", line 246, in import_from_file db[table_name].insert_all(rows, pk=pk, replace=True) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2625, in insert_all self.insert_chunk( File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2406, in insert_chunk result = self.db.execute(query, params) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 422, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: table archive_tweet has no column named withheld_in_countries ``` I found only a single tweet with the key `withheld_in_countries` in `tweet.js`, which seems to be the problem: ```JSON [ { ""tweet"" : { ""retweeted"" : false, ""source"" : ""Twitter for Android"", ""entities"" : { ""hashtags"" : [ { ""text"" : ""NowOnAndroid"", ""indices"" : [ ""64"", ""77"" ] } ], ""symbols"" : [ ], ""user_mentions"" 
: [ { ""name"" : ""Periscope"", ""screen_name"" : ""PeriscopeCo"", ""indices"" : [ ""3"", ""15"" ], ""id_str"" : ""1111111111"", ""id"" : ""222222222"" } ], ""urls"" : [ { ""url"" : ""https://t.co/xxxxxxxxx"", ""expanded_url"" : ""https://vine.co/v/xxxxxxxxx"", ""display_url"" : ""vine.co/v/xxxxxxxxxx"", ""indices"" : [ ""78"", ""101"" ] } ] }, ""display_text_range"" : [ ""0"", ""101"" ], ""favorite_count"" : ""0"", ""id_str"" : ""1111111111111111111111"", ""truncated"" : false, ""retweet_count"" : ""0"", ""withheld_in_countries"" : [ ""TR"" ], ""id"" : ""000000000000000000"", ""possibly_sensitive"" : false, ""created_at"" : ""Fri Aug 14 06:04:03 +0000 2015"", ""favorited"" : false, ""full_text"" : ""RT @periscopeco: Travel the world. LIVE. The Global Map is here #NowOnAndroid https://t.co/NZXdsPWROk"", ""lang"" : ""en"" } } ] ``` I solved the error by removing the key from `tweet.js`, but I'm reporting it here to help improve the project.",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/63/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 703218448,MDU6SXNzdWU3MDMyMTg0NDg=,51,Documentation for twitter-to-sqlite fetch,9599,open,0,,,0,2020-09-17T02:38:10Z,2020-09-17T02:38:10Z,,MEMBER,,"It's mentioned in passing in the README but it deserves its own section: ``` $ twitter-to-sqlite fetch \ ""https://api.twitter.com/1.1/account/verify_credentials.json"" \ | grep '""id""' | head -n 1 ```",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/51/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 610284471,MDU6SXNzdWU2MTAyODQ0NzE=,46,Error running 'search' for the first time,9599,closed,0,,,0,2020-04-30T18:11:20Z,2020-04-30T18:11:58Z,2020-04-30T18:11:58Z,MEMBER,,"``` % twitter-to-sqlite search infodemic.db '#infodemic' Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/bin/twitter-to-sqlite"", line 11, in <module> load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 867, in search for tweet in tweets: File ""/Users/simon/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 165, in fetch_timeline [since_type_id, since_key], sqlite3.OperationalError: no such table: since_ids ```",206156866,issue,,,"{""url"": 
""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/46/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602181581,MDU6SXNzdWU2MDIxODE1ODE=,44,"tweet[""source""] can be an empty string",9599,closed,0,,,0,2020-04-17T19:18:26Z,2020-04-17T22:01:44Z,2020-04-17T22:01:44Z,MEMBER,,"Got this exception: ``` File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 641, in extract_and_save_source details = m.groupdict() AttributeError: 'NoneType' object has no attribute 'groupdict' ``` I traced it back to this tweet: https://twitter.com/osder/status/578712651393576960 ``` (Pdb) source_re re.compile('<a href=""(?P<url>.*?)"".*?>(?P<name>.*?)</a>') (Pdb) locals()['source'] '' (Pdb) u > /Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(393)save_tweets() -> tweet[""source""] = extract_and_save_source(db, tweet[""source""]) (Pdb) tweet {'created_at': '2015-03-20T00:20:22+00:00', 'id': 578712651393576960, 'full_text': '@osder', 'truncated': False, 'display_text_range': [0, 6], 'source': '', 'in_reply_to_status_id': 578712521382715392, 'in_reply_to_user_id': 1545741, 'in_reply_to_screen_name': 'osder', 'geo': None, 'coordinates': None, 'place': None, 'contributors': None, 'is_quote_status': False, 'retweet_count': 0, 'favorite_count': 0, 'favorited': False, 'retweeted': False, 'lang': 'und', 'user': 1545741} ```",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602173589,MDU6SXNzdWU2MDIxNzM1ODk=,42,Error running user-timeline with --sql and --ids together,9599,closed,0,,,0,2020-04-17T19:02:06Z,2020-04-17T23:34:40Z,2020-04-17T23:34:40Z,MEMBER,,"``` $ twitter-to-sqlite user-timeline tweets.db --sql='select id from users' --ids Traceback (most recent call last): File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite"", line 11, in <module> load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 284, in user_timeline ""@{:"" + str(max(len(identifier) for identifier in identifiers)) + ""}"" File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 284, in <genexpr> ""@{:"" + str(max(len(identifier) for identifier in identifiers)) + ""}"" TypeError: object of type 'int' has no len() ``` But this DID work - casting to strings: 
``` $ twitter-to-sqlite user-timeline tweets.db --sql='select """" || id from users' --ids ... this worked ... ```",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 591613579,MDU6SXNzdWU1OTE2MTM1Nzk=,41,"Bug: recorded a since_id for None, None",9599,closed,0,,,0,2020-04-01T04:29:43Z,2020-04-01T04:31:11Z,2020-04-01T04:31:11Z,MEMBER,,"This shouldn't happen in the `since_ids` table (relates to #39): ",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/41/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585306847,MDU6SXNzdWU1ODUzMDY4NDc=,36,twitter-to-sqlite followers/friends --sql / --attach,9599,closed,0,,,0,2020-03-20T20:20:33Z,2020-03-20T23:12:38Z,2020-03-20T23:12:38Z,MEMBER,,"Split from #8. The `friends` and `followers` commands don't yet support `--sql` and `--attach`. (`friends-ids` and `followers-ids` do though).",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 561454071,MDU6SXNzdWU1NjE0NTQwNzE=,32,"Documentation for "" favorites"" command",9599,closed,0,,,0,2020-02-07T06:50:11Z,2020-02-07T06:59:10Z,2020-02-07T06:59:10Z,MEMBER,,"It looks like I forgot to document this one in the README. https://github.com/dogsheep/twitter-to-sqlite/blob/6ebd482619bd94180e54bb7b56549c413077d329/twitter_to_sqlite/cli.py#L183-L194",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508553387,MDExOlB1bGxSZXF1ZXN0MzI5MzI0MzY4,24,Tweet source extraction and new migration system,9599,closed,0,,,0,2019-10-17T15:24:56Z,2019-10-17T15:49:29Z,2019-10-17T15:49:24Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/24,Closes #12 and #23,206156866,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 508024032,MDU6SXNzdWU1MDgwMjQwMzI=,22,Ability to import from uncompressed archive or from specific files,9599,closed,0,,,0,2019-10-16T18:31:57Z,2019-10-16T18:53:36Z,2019-10-16T18:53:36Z,MEMBER,,"Currently you can only import like this: $ twitter-to-sqlite import path-to-twitter.zip It would be useful if you could import from a folder that was decompressed from that zip: $ twitter-to-sqlite import path-to-twitter/ AND from individual files within that folder - since that would allow you to e.g. 
selectively import certain files: $ twitter-to-sqlite import path-to-twitter/favorites.js path-to-twitter/tweets.js",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505673645,MDU6SXNzdWU1MDU2NzM2NDU=,16,Do a better job with archived direct message threads,9599,open,0,,,0,2019-10-11T06:55:21Z,2019-10-11T06:55:27Z,,MEMBER,,https://github.com/dogsheep/twitter-to-sqlite/blob/fb2698086d766e0333a55bb73435e7283feeb438/twitter_to_sqlite/archive.py#L98-L99,206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 505666744,MDExOlB1bGxSZXF1ZXN0MzI3MDUxNjcz,15,"twitter-to-sqlite import command, refs #4",9599,closed,0,,,0,2019-10-11T06:37:14Z,2019-10-11T06:45:01Z,2019-10-11T06:45:01Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/15,,206156866,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 503244410,MDU6SXNzdWU1MDMyNDQ0MTA=,14,"When importing favorites, record which user favorited them",9599,closed,0,,,0,2019-10-07T05:45:11Z,2019-10-14T03:30:25Z,2019-10-14T03:30:25Z,MEMBER,,"This code currently just dumps them into the `tweets` table without recording who it was who had favorited them. https://github.com/dogsheep/twitter-to-sqlite/blob/436a170d74ec70903d1b4ca430c2c6b6435cdfcc/twitter_to_sqlite/cli.py#L152-L157",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 490798130,MDU6SXNzdWU0OTA3OTgxMzA=,7,users-lookup command for fetching users,9599,closed,0,,,0,2019-09-08T19:47:59Z,2019-09-08T20:32:13Z,2019-09-08T20:32:13Z,MEMBER,,"https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-users-lookup ``` https://api.twitter.com/1.1/users/lookup.json?user_id=783214,6253282 https://api.twitter.com/1.1/users/lookup.json?screen_name=simonw,cleopaws ``` CLI design: ``` $ twitter-to-sqlite users-lookup simonw cleopaws $ twitter-to-sqlite users-lookup 783214 6253282 --ids ```",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 489419782,MDU6SXNzdWU0ODk0MTk3ODI=,6,Extract extended_entities into a media table,9599,closed,0,,,0,2019-09-04T21:59:10Z,2019-09-04T22:08:01Z,2019-09-04T22:08:01Z,MEMBER,," ",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488833136,MDU6SXNzdWU0ODg4MzMxMzY=,1,"Imported followers should go in ""users"", relationships in ""following""",9599,closed,0,,,0,2019-09-03T21:27:37Z,2019-09-04T20:23:04Z,2019-09-04T20:23:04Z,MEMBER,,"Right now `twitter-to-sqlite followers` dumps everything in a `followers` table, 
and doesn't actually record which account they are following! It should instead save them all in a global `users` table and then set up m2m relationships in a `following` table. This also means it should create a record for the specified user in order to record both sides of each relationship.",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1661617056,I_kwDODD6af85jCkOg,15,ambiguous column name: createdAt - on checkin_details view,9599,closed,0,,,0,2023-04-11T01:07:47Z,2023-04-11T03:16:37Z,2023-04-11T03:16:37Z,MEMBER,,"It looks like Swarm changed their schema, and now both `venues` and `checkins` have `createdAt` fields, which breaks this view: https://github.com/dogsheep/swarm-to-sqlite/blob/719b6e96a016d0ca8b316d3bed9c2a7a0cb499ee/swarm_to_sqlite/utils.py#L171-L188",205429375,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1393330070,PR_kwDODD6af84__DNJ,14,Photo links,6782721,open,0,,,0,2022-10-01T09:44:15Z,2022-11-18T17:10:49Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/swarm-to-sqlite/pulls/14,"* add a new column to the `checkin_details` view for calculated photo links * support multiple links split by newlines * create the `events` table if there are no events in the history, to avoid SQL errors Fixes #9.",205429375,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 673602857,MDU6SXNzdWU2NzM2MDI4NTc=,9,Define a view that displays photos correctly,9599,open,0,,,0,2020-08-05T14:53:39Z,2020-08-05T14:53:39Z,,MEMBER,,"The `photos` table stores data like this:

id | createdAt | source | prefix | suffix | width | height | visibility | created | user
-- | -- | -- | -- | -- | -- | -- | -- | -- | --
5e12c9708506bc000840262a | January 06, 2020 - 05:45:20 UTC | Swarm for iOS 1 | https://fastly.4sqi.net/img/general/ | /15889193_AXxGk4I1nbzUZuyYqObgbXdJNyEHiwj6AUDq0tPZWtw.jpg | 1920 | 1440 | public | 2020-01-06T05:45:20 | 15889193

The photo URL can be derived from those pieces - define a SQL view which does that (using `datasette-json-html` to display the pictures)",205429375,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 589491711,MDU6SXNzdWU1ODk0OTE3MTE=,7,Upgrade to sqlite-utils 2.x,9599,closed,0,,,0,2020-03-28T02:24:51Z,2020-03-28T02:25:03Z,2020-03-28T02:25:03Z,MEMBER,,,205429375,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487721884,MDU6SXNzdWU0ODc3MjE4ODQ=,5,Treat Foursquare timestamps as UTC,9599,closed,0,,,0,2019-08-31T02:44:47Z,2019-08-31T02:50:41Z,2019-08-31T02:50:41Z,MEMBER,,"Current test failure is due to timezone differences between my laptop and Circle CI: https://circleci.com/gh/dogsheep/swarm-to-sqlite/3 ```
E Full diff:
E - [{'created': '2018-07-01T04:48:19',
E ?                           ^
E + [{'created': '2018-07-01T02:48:19',
E ?                           ^
E   'createdAt': 1530413299,
``` The timestamps I store in `created` should always be UTC.",205429375,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487598042,MDU6SXNzdWU0ODc1OTgwNDI=,1,Implement code to pull checkins from the Foursquare API,9599,closed,0,,,0,2019-08-30T17:40:02Z,2019-08-30T18:23:24Z,2019-08-30T18:23:24Z,MEMBER,,"The tool currently only works with a pre-prepared JSON file of checkins. When called without options, it should prompt the user to paste in a Foursquare OAuth token. The `--token=` option should work too, and should be backed up by an optional environment variable.",205429375,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1515717718,PR_kwDOC8tyDs5Gc-VH,23,Include workout statistics,2129,open,0,,,0,2023-01-01T17:29:57Z,2023-01-01T17:29:57Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/healthkit-to-sqlite/pulls/23,"Not sure when this changed (iOS 16 maybe?), but the `WorkoutStatistics` now has a whole bunch of information about workouts, e.g. for runs it contains the distance (as a `` element). Adding it as another column at least allows me to pull these out (using SQLite's JSON support). I'm running with this patch on my own data now.",197882382,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 975158266,MDU6SXNzdWU5NzUxNTgyNjY=,19,table activity_summary has no column named appleMoveTime,9599,closed,0,,,0,2021-08-20T00:46:44Z,2021-08-20T00:54:34Z,2021-08-20T00:54:34Z,MEMBER,,"Got this error today against a fresh export: table activity_summary has no column named appleMoveTime ",197882382,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 836064851,MDExOlB1bGxSZXF1ZXN0NTk2NjI3Nzgw,18,Add datetime parsing,1234956,open,0,,,0,2021-03-19T14:34:22Z,2021-03-19T14:34:22Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/healthkit-to-sqlite/pulls/18,"Parses the datetime columns so they are subsequently properly recognized as datetime. Fixes https://github.com/dogsheep/healthkit-to-sqlite/issues/17 ",197882382,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,