issues

92 rows sorted by updated_at descending



id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association body repo type pull_request
591613579 MDU6SXNzdWU1OTE2MTM1Nzk= 41 Bug: recorded a since_id for None, None simonw 9599 closed 0     0 2020-04-01T04:29:43Z 2020-04-01T04:31:11Z 2020-04-01T04:31:11Z MEMBER

This shouldn't happen in the since_ids table (relates to #39).

dogsheep/twitter-to-sqlite issue  
590669793 MDU6SXNzdWU1OTA2Njk3OTM= 40 Feature: record history of follower counts simonw 9599 closed 0     5 2020-03-30T23:32:28Z 2020-04-01T04:13:05Z 2020-04-01T04:13:05Z MEMBER

We currently over-write the follower count every time we import a tweet (when we import that user profile again):

https://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/utils.py#L293-L294

It would be neat if we noticed if that user's follower count (and maybe other counts?) had changed since we last saved them and recorded that change in a separate history table. This would be an inexpensive way of building up rough charts of follower count over time.
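A minimal sketch of the change-detection idea using the stdlib sqlite3 module (the table and column names here are assumptions for illustration; the real tool uses sqlite-utils):

```python
import sqlite3

def record_follower_count(conn, user_id, followers_count):
    """Insert a history row only when the count differs from the last saved one."""
    conn.execute(
        """CREATE TABLE IF NOT EXISTS follower_count_history (
               user INTEGER, followers_count INTEGER, recorded_at TEXT
           )"""
    )
    row = conn.execute(
        "SELECT followers_count FROM follower_count_history "
        "WHERE user = ? ORDER BY rowid DESC LIMIT 1",
        (user_id,),
    ).fetchone()
    if row is None or row[0] != followers_count:
        conn.execute(
            "INSERT INTO follower_count_history VALUES (?, ?, datetime('now'))",
            (user_id, followers_count),
        )

conn = sqlite3.connect(":memory:")
record_follower_count(conn, 9599, 100)
record_follower_count(conn, 9599, 100)  # unchanged - no new row
record_follower_count(conn, 9599, 105)  # changed - recorded
history = conn.execute(
    "SELECT followers_count FROM follower_count_history WHERE user = 9599"
).fetchall()
```

Because rows are only written on change, the table stays small while still supporting a rough follower-count-over-time chart.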

dogsheep/twitter-to-sqlite issue  
492297930 MDU6SXNzdWU0OTIyOTc5MzA= 10 Rethink progress bars for various commands simonw 9599 closed 0     5 2019-09-11T15:06:47Z 2020-04-01T03:45:48Z 2020-04-01T03:45:48Z MEMBER

Progress bars and the --silent option are implemented inconsistently across commands at the moment.

This is made more challenging by the fact that for many operations the total length is not known.

https://click.palletsprojects.com/en/7.x/api/#click.progressbar
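click.progressbar can draw a proper bar when it is given a length up front. A hedged stdlib sketch of the degrade-gracefully behaviour we'd want when the total is unknown, plus a --silent switch (helper name and interface are hypothetical):

```python
import sys

def progress(iterable, length=None, silent=False, label="", stream=sys.stderr):
    """Yield items, printing 'n/total' when the total is known, else a bare count."""
    n = 0
    for n, item in enumerate(iterable, 1):
        if not silent:
            if length is not None:
                stream.write(f"\r{label} {n}/{length}")
            else:
                stream.write(f"\r{label} {n}")
            stream.flush()
        yield item
    if not silent:
        stream.write("\n")

# Works the same whether or not the total is known in advance:
seen = list(progress(range(5), length=5, silent=True))
also_seen = list(progress(iter(range(5)), silent=True))  # no known length
```

Standardizing on one wrapper like this would let every command honour --silent consistently.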

dogsheep/twitter-to-sqlite issue  
590666760 MDU6SXNzdWU1OTA2NjY3NjA= 39 --since feature can be confused by retweets simonw 9599 closed 0     11 2020-03-30T23:25:33Z 2020-04-01T03:45:16Z 2020-04-01T03:45:16Z MEMBER

If you run twitter-to-sqlite user-timeline ... --since it's supposed to fetch tweets that those specific users have posted since the last time the command was run.

It does this by seeking out the max ID of their previous tweets:

https://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/cli.py#L305-L311

BUT... this has a nasty flaw: if another account has retweeted one of their recent tweets, that retweeted tweet will already have been loaded into the database - so we may treat it as the most recent since ID and miss a bunch of their tweets!
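The fix that issue #41 above refers to was to stop inferring the cutoff from max(id) and instead record the last ID actually returned by the API in an explicit since_ids table. A hedged sketch of that bookkeeping (the exact schema is an assumption):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS since_ids "
    "(type TEXT, key TEXT, since_id INTEGER, PRIMARY KEY (type, key))"
)

def record_since_id(conn, type_, key, since_id):
    """Remember the newest ID the API actually returned for this command/user."""
    conn.execute(
        "INSERT OR REPLACE INTO since_ids VALUES (?, ?, ?)", (type_, key, since_id)
    )

def get_since_id(conn, type_, key):
    row = conn.execute(
        "SELECT since_id FROM since_ids WHERE type = ? AND key = ?", (type_, key)
    ).fetchone()
    return row[0] if row else None

# After fetching user 9599's timeline and seeing tweets up to ID 105:
record_since_id(conn, "user", "9599", 105)
# A retweeted tweet saved via some other account's timeline never touches this value.
```

Keying on (type, key) means each command tracks its own cutoff per user, so stray tweets saved by other commands can't poison it.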

dogsheep/twitter-to-sqlite issue  
544571092 MDU6SXNzdWU1NDQ1NzEwOTI= 15 Assets table with downloads garethr 2029 closed 0   5225818 4 2020-01-02T13:05:28Z 2020-03-28T12:17:01Z 2020-03-23T19:17:32Z NONE

The releases command extracts the releases table, but data about the individual assets is locked up in the JSON document in the assets field. My main interest is in individual and aggregate download counts. I was wondering if creating a new table with a record per asset might be useful?
If so, I'm happy to send a PR when I get a moment. Do you have opinions about that simply being part of the releases command, or would you prefer a separate command as well?
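A hedged sketch of what unpacking that JSON into a per-asset table could look like (the assets columns are assumed from the GitHub API release payload; the real tool would use sqlite-utils upserts):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE releases (id INTEGER PRIMARY KEY, assets TEXT)")
conn.execute(
    "CREATE TABLE assets "
    "(id INTEGER PRIMARY KEY, release INTEGER, name TEXT, download_count INTEGER)"
)

# A trimmed-down release row, with assets still locked up in a JSON document:
conn.execute("INSERT INTO releases VALUES (?, ?)", (1, json.dumps([
    {"id": 11, "name": "tool-linux.tar.gz", "download_count": 120},
    {"id": 12, "name": "tool-macos.tar.gz", "download_count": 80},
])))

for release_id, assets_json in conn.execute("SELECT id, assets FROM releases"):
    for asset in json.loads(assets_json):
        conn.execute(
            "INSERT OR REPLACE INTO assets VALUES (?, ?, ?, ?)",
            (asset["id"], release_id, asset["name"], asset["download_count"]),
        )

total_downloads = conn.execute("SELECT sum(download_count) FROM assets").fetchone()[0]
```

With a real assets table, aggregate download counts become a one-line SELECT.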

dogsheep/github-to-sqlite issue  
543355051 MDExOlB1bGxSZXF1ZXN0MzU3NjQwMTg2 6 don't break if source is missing mfa 78035 closed 0     1 2019-12-29T10:46:47Z 2020-03-28T02:28:11Z 2020-03-28T02:28:11Z CONTRIBUTOR

Broke for me - very old checkins from 2010 had no source set.

dogsheep/swarm-to-sqlite pull dogsheep/swarm-to-sqlite/pulls/6
589491711 MDU6SXNzdWU1ODk0OTE3MTE= 7 Upgrade to sqlite-utils 2.x simonw 9599 closed 0     0 2020-03-28T02:24:51Z 2020-03-28T02:25:03Z 2020-03-28T02:25:03Z MEMBER dogsheep/swarm-to-sqlite issue  
503234169 MDU6SXNzdWU1MDMyMzQxNjk= 2 Track and use the 'since' value simonw 9599 closed 0     3 2019-10-07T05:02:59Z 2020-03-27T22:22:30Z 2020-03-27T22:22:30Z MEMBER

Pocket says:

Whenever possible, you should use the since parameter, or count and offset parameters when retrieving a user's list. After retrieving the list, you should store the current time (which is provided along with the list response) and pass that in the next request for the list. This way the server only needs to return a small set (changes since that time) instead of the user's entire list every time.

At the bottom of https://getpocket.com/developer/docs/v3/retrieve
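A hedged sketch of the store-and-reuse pattern Pocket describes (the since table and the fake fetch_items are illustrative stand-ins, not the real Pocket client):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE IF NOT EXISTS since (key TEXT PRIMARY KEY, value INTEGER)")

def get_since(conn):
    row = conn.execute("SELECT value FROM since WHERE key = 'pocket'").fetchone()
    return row[0] if row else None

def save_since(conn, value):
    conn.execute("INSERT OR REPLACE INTO since VALUES ('pocket', ?)", (value,))

def fetch_items(since=None):
    """Stand-in for Pocket's /v3/get call; the response includes a server 'since'."""
    everything = {"since": 1586000000, "list": {"1": {}, "2": {}}}
    if since is not None and since >= everything["since"]:
        return {"since": everything["since"], "list": {}}
    return everything

first = fetch_items(get_since(conn))   # first run: full list
save_since(conn, first["since"])
second = fetch_items(get_since(conn))  # later runs: only changes since then
```

The key point is that the stored value comes from the server's response, not the local clock.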

dogsheep/pocket-to-sqlite issue  
503233021 MDU6SXNzdWU1MDMyMzMwMjE= 1 Use better pagination (and implement progress bar) simonw 9599 closed 0     4 2019-10-07T04:58:11Z 2020-03-27T22:13:57Z 2020-03-27T22:13:57Z MEMBER

Right now we attempt to load everything at once - which caps out at 5,000 items and is really slow.

We can do better by implementing pagination using count and offset.
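A hedged sketch of that count/offset loop (fetch_page is a stand-in for the real Pocket API call):

```python
def fetch_page(count, offset):
    """Stand-in for the API call; real code would pass count/offset to Pocket."""
    data = list(range(12))  # pretend the account has 12 saved items
    return data[offset:offset + count]

def paginate(count=5):
    """Yield every item by walking forward one page at a time."""
    offset = 0
    while True:
        page = fetch_page(count, offset)
        if not page:
            break  # an empty page means we've run out of items
        yield from page
        offset += count

items = list(paginate())
```

Since pages arrive incrementally, a progress bar can tick once per page instead of blocking on one giant request.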

dogsheep/pocket-to-sqlite issue  
589402939 MDU6SXNzdWU1ODk0MDI5Mzk= 4 Store authentication information as "pocket_access_token" etc simonw 9599 closed 0     0 2020-03-27T20:43:22Z 2020-03-27T20:43:59Z 2020-03-27T20:43:59Z MEMBER

The pocket_ prefix will mean that the same auth.json file can be used for other Dogsheep tools without Pocket overriding a value set by some other tool.

dogsheep/pocket-to-sqlite issue  
586595839 MDU6SXNzdWU1ODY1OTU4Mzk= 23 Release 1.0 simonw 9599 closed 0   5225818 1 2020-03-24T00:03:55Z 2020-03-24T00:15:50Z 2020-03-24T00:15:50Z MEMBER

Need to compile release notes.

dogsheep/github-to-sqlite issue  
521275281 MDU6SXNzdWU1MjEyNzUyODE= 13 Set up a live demo Datasette instance simonw 9599 closed 0   5225818 9 2019-11-12T01:27:02Z 2020-03-24T00:03:26Z 2020-03-24T00:03:25Z MEMBER

I deployed https://github-to-sqlite-releases-j7hipcg4aq-uc.a.run.app/ by running this:

#!/bin/bash
# Fetch repos for simonw and dogsheep
github-to-sqlite repos github.db simonw dogsheep -a auth.json

# Fetch releases for the repos tagged 'datasette-io'
sqlite-utils github.db "
select full_name from repos where rowid in (
    select repos.rowid from repos, json_each(repos.topics) j
    where j.value = 'datasette-io'
)" --csv --no-headers | while read repo;
    do github-to-sqlite releases \
            github.db $(echo $repo | tr -d '\r') \
            -a auth.json;
        sleep 2;
    done;

And then deploying using this:

$ datasette publish cloudrun github.db \
  --title "github-to-sqlite releases demo" \
  --about_url="https://github.com/simonw/github-to-sqlite" \
  --about='github-to-sqlite' \
  --install=datasette-render-markdown \
  --install=datasette-json-html \
  --service=github-to-sqlite-releases

This should happen automatically for every release. I can run it once a day in Circle CI to keep the demo database up-to-date.

dogsheep/github-to-sqlite issue  
493670730 MDU6SXNzdWU0OTM2NzA3MzA= 4 Command to fetch stargazers for one or more repos simonw 9599 open 0     2 2019-09-14T21:58:22Z 2020-03-23T23:55:34Z   MEMBER

Maybe this:

$ github-to-sqlite stargazers simonw/datasette

It could accept more than one repo.

Maybe have options similar to --sql in twitter-to-sqlite so you can e.g. fetch all stargazers for all of the repos you have fetched into the database already (or all of the repos belonging to owner X).

dogsheep/github-to-sqlite issue  
586561727 MDU6SXNzdWU1ODY1NjE3Mjc= 21 Turn GitHub API errors into exceptions simonw 9599 closed 0   5225818 2 2020-03-23T22:37:24Z 2020-03-23T23:48:23Z 2020-03-23T23:48:22Z MEMBER

This would have really helped in debugging the mess in #13. Running with this auth.json is a useful demo:

{"github_personal_token": ""}
dogsheep/github-to-sqlite issue  
586567379 MDU6SXNzdWU1ODY1NjczNzk= 22 Handle empty git repositories simonw 9599 closed 0     0 2020-03-23T22:49:48Z 2020-03-23T23:13:11Z 2020-03-23T23:13:11Z MEMBER

Got this error:

github_to_sqlite.utils.GitHubError: {'message': 'Git Repository is empty.', 'documentation_url': 'https://developer.github.com/v3/repos/commits/#list-commits-on-a-repository'}

From https://api.github.com/repos/dogsheep/beta/commits

dogsheep/github-to-sqlite issue  
585411547 MDU6SXNzdWU1ODU0MTE1NDc= 18 Commits in GitHub API can have null author simonw 9599 closed 0   5225818 8 2020-03-21T02:20:56Z 2020-03-23T20:44:49Z 2020-03-23T20:44:26Z MEMBER
Traceback (most recent call last):
  File "/home/ubuntu/datasette-venv/bin/github-to-sqlite", line 8, in <module>
    sys.exit(cli())
  File "/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/cli.py", line 235, in commits
    utils.save_commits(db, commits, repo_full["id"])
  File "/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/utils.py", line 290, in save_commits
    commit_to_insert["author"] = save_user(db, commit["author"])
  File "/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/utils.py", line 54, in save_user
    for key, value in user.items()
AttributeError: 'NoneType' object has no attribute 'items'

Got this running the commits command from cron.
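One hedged fix is a None guard at the top of save_user (the helper name matches the traceback; the body below is an illustrative stand-in for the real upsert, and the dropped-field filter is an assumption):

```python
def save_user(db, user):
    """Upsert a user record; commits from the GitHub API can have author: null."""
    if user is None:
        return None  # null author - nothing to save, and no foreign key to set
    to_save = {key: value for key, value in user.items()}
    db.setdefault("users", {})[to_save["id"]] = to_save  # stand-in for the upsert
    return to_save["id"]

db = {}
saved = save_user(db, {"id": 9599, "login": "simonw"})
missing = save_user(db, None)  # no longer raises AttributeError
```

The caller can then store None in the commit's author column instead of crashing mid-import.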

dogsheep/github-to-sqlite issue  
493671014 MDU6SXNzdWU0OTM2NzEwMTQ= 5 Add "incomplete" boolean to users table for incomplete profiles simonw 9599 closed 0     2 2019-09-14T22:01:50Z 2020-03-23T19:23:31Z 2020-03-23T19:23:30Z MEMBER

User profiles that are fetched from e.g. stargazers (#4) are incomplete - they have a login but they don't have a name, company etc.

Add an incomplete boolean flag to the users table to record this. Then later I can add a backfill-users command which loops through and fetches missing data for those incomplete profiles.

dogsheep/github-to-sqlite issue  
586454513 MDU6SXNzdWU1ODY0NTQ1MTM= 20 Upgrade to sqlite-utils 2.x simonw 9599 closed 0   5225818 0 2020-03-23T19:17:58Z 2020-03-23T19:22:52Z 2020-03-23T19:22:52Z MEMBER dogsheep/github-to-sqlite issue  
585850715 MDU6SXNzdWU1ODU4NTA3MTU= 19 Enable full-text search for more stuff (like commits, issues and issue_comments) simonw 9599 closed 0   5225818 2 2020-03-23T00:19:56Z 2020-03-23T19:06:39Z 2020-03-23T19:06:39Z MEMBER

Currently FTS is only enabled for repos and releases.

dogsheep/github-to-sqlite issue  
546051181 MDU6SXNzdWU1NDYwNTExODE= 16 Exception running first command: IndexError: list index out of range jayvdb 15092 open 0     3 2020-01-07T03:01:58Z 2020-03-22T02:08:57Z   NONE

Exception running first command without an existing db or auth.

> mkdir ~/.github/coala
> /usr/bin/github-to-sqlite repos ~/.github/coala coala
Traceback (most recent call last):
  File "/usr/bin/github-to-sqlite", line 11, in <module>
    load_entry_point('github-to-sqlite==0.6', 'console_scripts', 'github-to-sqlite')()
  File "/usr/lib/python3.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/usr/lib/python3.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/usr/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/lib/python3.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/lib/python3.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/usr/lib/python3.7/site-packages/github_to_sqlite/cli.py", line 163, in repos
    utils.save_repo(db, repo)
  File "/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py", line 120, in save_repo
    to_save["owner"] = save_user(db, to_save["owner"])
  File "/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py", line 61, in save_user
    return db["users"].upsert(to_save, pk="id", alter=True).last_pk
  File "/usr/lib/python3.7/site-packages/sqlite_utils/db.py", line 1135, in upsert
    extracts=extracts,
  File "/usr/lib/python3.7/site-packages/sqlite_utils/db.py", line 1162, in upsert_all
    upsert=True,
  File "/usr/lib/python3.7/site-packages/sqlite_utils/db.py", line 1105, in insert_all
    row = list(self.rows_where("rowid = ?", [self.last_rowid]))[0]
IndexError: list index out of range
dogsheep/github-to-sqlite issue  
585526292 MDU6SXNzdWU1ODU1MjYyOTI= 1 Set up full text search simonw 9599 closed 0     1 2020-03-21T15:57:35Z 2020-03-21T19:47:46Z 2020-03-21T19:45:52Z MEMBER

Should run against title and text in items, and about and id in users.
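At the SQL level this means an FTS table mirroring those columns; sqlite-utils' enable_fts() does the equivalent of this hedged FTS4 sketch (the sample row is made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, title TEXT, text TEXT)")
conn.execute(
    "INSERT INTO items VALUES (1, 'Show HN: Datasette', 'explore SQLite databases')"
)

# External-content FTS table mirroring the columns we want searchable:
conn.execute("CREATE VIRTUAL TABLE items_fts USING fts4(title, text, content='items')")
conn.execute("INSERT INTO items_fts (rowid, title, text) SELECT id, title, text FROM items")

hits = conn.execute(
    "SELECT rowid FROM items_fts WHERE items_fts MATCH 'sqlite'"
).fetchall()
```

The users table would get the same treatment with about and id as the indexed columns.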

dogsheep/hacker-news-to-sqlite issue  
490803176 MDU6SXNzdWU0OTA4MDMxNzY= 8 --sql and --attach options for feeding commands from SQL queries simonw 9599 closed 0     4 2019-09-08T20:35:49Z 2020-03-20T23:13:01Z 2020-03-20T23:13:01Z MEMBER

Say you want to fetch Twitter profiles for a list of accounts that are stored in another database:

$ twitter-to-sqlite users-lookup users.db --attach attending.db \
    --sql "select Twitter from attending.attendes where Twitter is not null"

The SQL query you feed in is expected to return a list of screen names suitable for processing further by the command.

Should be supported by all three of:

  • twitter-to-sqlite users-lookup
  • twitter-to-sqlite user-timeline
  • twitter-to-sqlite followers and friends

The --attach option allows other SQLite databases to be attached to the connection. Without it the SQL query will have to read from the single attached database.
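Under the hood --attach maps onto SQLite's ATTACH DATABASE statement; a hedged sketch with the stdlib sqlite3 module (the attendees table is a made-up example matching the command above):

```python
import os
import sqlite3
import tempfile

# Build a throwaway "attending.db" with Twitter screen names in it:
path = os.path.join(tempfile.mkdtemp(), "attending.db")
other = sqlite3.connect(path)
other.execute("CREATE TABLE attendees (name TEXT, twitter TEXT)")
other.executemany(
    "INSERT INTO attendees VALUES (?, ?)",
    [("Simon", "simonw"), ("No Twitter", None)],
)
other.commit()
other.close()

# The main connection attaches it under an alias, so --sql can query it by name:
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ? AS attending", (path,))
screen_names = [row[0] for row in conn.execute(
    "SELECT twitter FROM attending.attendees WHERE twitter IS NOT NULL"
)]
```

The list of screen names that comes back is exactly what the command would then feed to the Twitter API.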

dogsheep/twitter-to-sqlite issue  
585306847 MDU6SXNzdWU1ODUzMDY4NDc= 36 twitter-to-sqlite followers/friends --sql / --attach simonw 9599 closed 0     0 2020-03-20T20:20:33Z 2020-03-20T23:12:38Z 2020-03-20T23:12:38Z MEMBER

Split from #8. The friends and followers commands don't yet support --sql and --attach.

(friends-ids and followers-ids do though).

dogsheep/twitter-to-sqlite issue  
585359363 MDU6SXNzdWU1ODUzNTkzNjM= 38 Screen name display for user-timeline is uneven simonw 9599 closed 0     1 2020-03-20T22:30:23Z 2020-03-20T22:37:17Z 2020-03-20T22:37:17Z MEMBER
CDPHE  [####################################]  67
CHFSKy  [####################################]  3216
DHSWI  [####################################]  41
DPHHSMT  [####################################]  742
Delaware_DHSS  [####################################]  3231
DhhsNevada  [####################################]  639

I could format them to match the length of the longest screen name instead.
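The padding itself is a one-liner; a sketch using the screen names from the output above:

```python
screen_names = ["CDPHE", "CHFSKy", "DHSWI", "DPHHSMT", "Delaware_DHSS", "DhhsNevada"]

# Pad every label to the longest screen name so the bars line up:
width = max(len(name) for name in screen_names)
labels = [name.ljust(width) for name in screen_names]
```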

dogsheep/twitter-to-sqlite issue  
585353598 MDU6SXNzdWU1ODUzNTM1OTg= 37 Handle "User not found" error simonw 9599 open 0     0 2020-03-20T22:14:32Z 2020-03-20T22:14:32Z   MEMBER

While running user-timeline I got this bug (because a screen name I asked for didn't exist):

  File "/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py", line 185, in transform_user
    user["created_at"] = parser.parse(user["created_at"])
KeyError: 'created_at'
>>> import pdb
>>> pdb.pm()
> /Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(185)transform_user()
-> user["created_at"] = parser.parse(user["created_at"])
(Pdb) user
{'errors': [{'code': 50, 'message': 'User not found.'}]}
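A hedged guard that could run before transform_user gets the dict (the helper name is hypothetical; the error payload shape matches the pdb output above):

```python
def check_for_errors(user):
    """The API signals a missing user with an 'errors' key instead of a profile."""
    if "errors" in user:
        messages = ", ".join(error["message"] for error in user["errors"])
        raise ValueError(messages)
    return user

ok = check_for_errors({"id": 1, "screen_name": "simonw"})
try:
    check_for_errors({"errors": [{"code": 50, "message": "User not found."}]})
    raised = None
except ValueError as ex:
    raised = str(ex)
```

That turns the confusing KeyError into an error message that names the problem.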
dogsheep/twitter-to-sqlite issue  
585282212 MDU6SXNzdWU1ODUyODIyMTI= 35 twitter-to-sqlite user-timeline [screen_names] --sql / --attach simonw 9599 closed 0     5 2020-03-20T19:26:07Z 2020-03-20T20:17:00Z 2020-03-20T20:16:35Z MEMBER

Split from #8.

dogsheep/twitter-to-sqlite issue  
561469252 MDExOlB1bGxSZXF1ZXN0MzcyMjczNjA4 33 Upgrade to sqlite-utils 2.2.1 simonw 9599 closed 0     1 2020-02-07T07:32:12Z 2020-03-20T19:21:42Z 2020-03-20T19:21:41Z MEMBER dogsheep/twitter-to-sqlite pull dogsheep/twitter-to-sqlite/pulls/33
585266763 MDU6SXNzdWU1ODUyNjY3NjM= 34 IndexError running user-timeline command simonw 9599 closed 0     2 2020-03-20T18:54:08Z 2020-03-20T19:20:52Z 2020-03-20T19:20:37Z MEMBER
$ twitter-to-sqlite user-timeline data.db --screen_name Allen_Joines
Traceback (most recent call last):
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite", line 11, in <module>
    load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')()
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py", line 256, in user_timeline
    utils.save_tweets(db, chunk)
  File "/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py", line 289, in save_tweets
    db["users"].upsert(user, pk="id", alter=True)
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py", line 1128, in upsert
    conversions=conversions,
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py", line 1157, in upsert_all
    upsert=True,
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py", line 1096, in insert_all
    row = list(self.rows_where("rowid = ?", [self.last_rowid]))[0]
IndexError: list index out of range
dogsheep/twitter-to-sqlite issue  
578883725 MDU6SXNzdWU1Nzg4ODM3MjU= 17 Command for importing commits simonw 9599 closed 0     2 2020-03-10T21:55:12Z 2020-03-11T02:47:37Z 2020-03-11T02:47:37Z MEMBER

Using this API: https://api.github.com/repos/dogsheep/github-to-sqlite/commits

dogsheep/github-to-sqlite issue  
520756546 MDU6SXNzdWU1MjA3NTY1NDY= 12 Add this view for seeing new releases simonw 9599 open 0     2 2019-11-11T06:00:12Z 2020-03-03T20:35:17Z   MEMBER
CREATE VIEW recent_releases AS select
  json_object("label", repos.full_name, "href", repos.html_url) as repo,
  json_object(
    "href",
    releases.html_url,
    "label",
    releases.name
  ) as release,
  substr(releases.published_at, 0, 11) as date,
  releases.body as body_markdown,
  releases.published_at
from
  releases
  join repos on repos.id = releases.repo
order by
  releases.published_at desc
dogsheep/github-to-sqlite issue  
516763727 MDExOlB1bGxSZXF1ZXN0MzM1OTgwMjQ2 8 stargazers command, refs #4 simonw 9599 open 0     4 2019-11-03T00:37:36Z 2020-03-03T20:33:58Z   MEMBER

Needs tests. Refs #4.

dogsheep/github-to-sqlite pull dogsheep/github-to-sqlite/pulls/8
520508502 MDU6SXNzdWU1MjA1MDg1MDI= 31 "friends" command (similar to "followers") simonw 9599 closed 0     1 2019-11-09T20:20:20Z 2020-02-07T07:03:28Z 2020-02-07T07:03:28Z MEMBER

Current list of commands:

  followers          Save followers for specified user (defaults to...
  followers-ids      Populate followers table with IDs of account followers
  friends-ids        Populate followers table with IDs of account friends

Obvious omission here is friends, which would be powered by https://api.twitter.com/1.1/friends/list.json: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-list

dogsheep/twitter-to-sqlite issue  
561454071 MDU6SXNzdWU1NjE0NTQwNzE= 32 Documentation for "favorites" command simonw 9599 closed 0     0 2020-02-07T06:50:11Z 2020-02-07T06:59:10Z 2020-02-07T06:59:10Z MEMBER

It looks like I forgot to document this one in the README.

https://github.com/dogsheep/twitter-to-sqlite/blob/6ebd482619bd94180e54bb7b56549c413077d329/twitter_to_sqlite/cli.py#L183-L194

dogsheep/twitter-to-sqlite issue  
558715564 MDExOlB1bGxSZXF1ZXN0MzcwMDI0Njk3 4 Add beeminder-to-sqlite bcongdon 706257 open 0     0 2020-02-02T15:51:36Z 2020-02-02T15:51:36Z   FIRST_TIME_CONTRIBUTOR dogsheep/dogsheep.github.io pull dogsheep/dogsheep.github.io/pulls/4
543717994 MDExOlB1bGxSZXF1ZXN0MzU3OTc0MzI2 3 Add todoist-to-sqlite bcongdon 706257 open 0     0 2019-12-30T04:02:59Z 2019-12-30T04:02:59Z   FIRST_TIME_CONTRIBUTOR

Really enjoying getting into the dogsheep/datasette ecosystem. I made a downloader for Todoist, and I think/hope others might find this useful.

dogsheep/dogsheep.github.io pull dogsheep/dogsheep.github.io/pulls/3
541274681 MDU6SXNzdWU1NDEyNzQ2ODE= 2 Add linkedin-to-sqlite mnp 881925 open 0     0 2019-12-21T03:13:40Z 2019-12-21T03:13:40Z   NONE

There is an API available. https://developer.linkedin.com/docs/rest-api#

At the minimum, I would think contact list and messages would be of interest.

dogsheep/dogsheep.github.io issue  
530491074 MDU6SXNzdWU1MzA0OTEwNzQ= 14 Command for importing events simonw 9599 open 0     2 2019-11-29T21:28:58Z 2019-11-30T01:32:38Z   MEMBER

Eg from https://api.github.com/users/simonw/events

Docs here: https://developer.github.com/v3/activity/events/#list-events-performed-by-a-user

dogsheep/github-to-sqlite issue  
516769276 MDU6SXNzdWU1MTY3NjkyNzY= 9 Commands do not work without an auth.json file simonw 9599 closed 0     0 2019-11-03T01:54:28Z 2019-11-11T05:30:48Z 2019-11-11T05:30:48Z MEMBER

auth.json is meant to be optional. If it's not provided, the tool should make heavily rate-limited unauthenticated requests.

$ github-to-sqlite repos .data/repos.db simonw
Usage: github-to-sqlite repos [OPTIONS] DB_PATH [USERNAME]
Try "github-to-sqlite repos --help" for help.

Error: Invalid value for "-a" / "--auth": File "auth.json" does not exist.
dogsheep/github-to-sqlite issue  
520521843 MDU6SXNzdWU1MjA1MjE4NDM= 11 Command to fetch releases simonw 9599 closed 0     0 2019-11-09T22:23:30Z 2019-11-09T22:57:00Z 2019-11-09T22:57:00Z MEMBER

https://developer.github.com/v3/repos/releases/#list-releases-for-a-repository

GET /repos/:owner/:repo/releases

dogsheep/github-to-sqlite issue  
518725064 MDU6SXNzdWU1MTg3MjUwNjQ= 29 `import` command fails on empty files jacobian 21148 closed 0     4 2019-11-06T20:34:26Z 2019-11-09T20:33:38Z 2019-11-09T19:36:36Z NONE

If a file in the export is empty (in my case it was account-suspensions.js), twitter-to-sqlite import fails:

$ twitter-to-sqlite import twitter.db ~/Downloads/twitter-2019-11-06-926f4f3be4b3b1fcb1aa387c40cd14f7c8aaf9bbcdb2d78ac14d9989add501bb.zip
Traceback (most recent call last):
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite", line 10, in <module>
    sys.exit(cli())
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py", line 627, in import_
    archive.import_from_file(db, filename, content)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/archive.py", line 224, in import_from_file
    db[table_name].upsert_all(rows, hash_id="pk")
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/sqlite_utils/db.py", line 1113, in upsert_all
    extracts=extracts,
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/sqlite_utils/db.py", line 980, in insert_all
    first_record = next(records)
StopIteration

This appears to be because db.upsert_all is called with no rows -- I think?

I hacked around this by modifying import_from_file to have an if rows: clause:

    for table, rows in to_insert.items():
        if rows:
            table_name = "archive_{}".format(table.replace("-", "_"))
            ...

I'm happy to work up a real PR if that's the right approach, but I'm not sure it is.

dogsheep/twitter-to-sqlite issue  
519979091 MDExOlB1bGxSZXF1ZXN0MzM4NjQ3Mzc4 1 Add parkrun-to-sqlite mrw34 1101318 open 0     0 2019-11-08T12:05:32Z 2019-11-09T20:23:24Z   FIRST_TIME_CONTRIBUTOR dogsheep/dogsheep.github.io pull dogsheep/dogsheep.github.io/pulls/1
515658861 MDU6SXNzdWU1MTU2NTg4NjE= 28 Add indexes to followers table simonw 9599 closed 0     1 2019-10-31T18:40:22Z 2019-11-09T20:15:42Z 2019-11-09T20:11:48Z MEMBER

select follower_id from following where followed_id = 12497 takes over a second for me at the moment.
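A hedged sketch of the index that fixes this lookup (sqlite-utils also has a create_index() method; plain SQL shown here, with EXPLAIN QUERY PLAN confirming the index is used):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE following (followed_id INTEGER, follower_id INTEGER)")

# Without this, the WHERE followed_id = ? lookup scans the whole table:
conn.execute("CREATE INDEX idx_following_followed_id ON following (followed_id)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT follower_id FROM following WHERE followed_id = 12497"
).fetchall()
```

A matching index on follower_id would speed up the reverse lookup the same way.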

dogsheep/twitter-to-sqlite issue  
518739697 MDU6SXNzdWU1MTg3Mzk2OTc= 30 `followers` fails because `transform_user` is called twice jacobian 21148 closed 0     2 2019-11-06T20:44:52Z 2019-11-09T20:15:28Z 2019-11-09T19:55:52Z NONE

Trying to run twitter-to-sqlite followers errors out:

Traceback (most recent call last):
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite", line 10, in <module>
    sys.exit(cli())
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py", line 130, in followers
    go(bar.update)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py", line 116, in go
    utils.save_users(db, [profile])
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py", line 302, in save_users
    transform_user(user)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py", line 181, in transform_user
    user["created_at"] = parser.parse(user["created_at"])
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py", line 1374, in parse
    return DEFAULTPARSER.parse(timestr, **kwargs)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py", line 646, in parse
    res, skipped_tokens = self._parse(timestr, **kwargs)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py", line 725, in _parse
    l = _timelex.split(timestr)         # Splits the timestr into tokens
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py", line 207, in split
    return list(cls(s))
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py", line 76, in __init__
    '{itype}'.format(itype=instream.__class__.__name__))
TypeError: Parser must be a string or character stream, not datetime

This appears to be because https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L111 calls transform_user, and then https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116 calls transform_user again, which fails because the user is already transformed.

I was able to work around this by commenting out https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116.

Shall I work up a patch for that, or is there a better approach?
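An alternative to removing the second call would be making transform_user idempotent, as in this hedged sketch (the timestamp format is Twitter's; the rest of the real function is elided):

```python
from datetime import datetime

def transform_user(user):
    """Parse created_at in place, but only if it is still a string."""
    created_at = user.get("created_at")
    if isinstance(created_at, str):
        user["created_at"] = datetime.strptime(
            created_at, "%a %b %d %H:%M:%S %z %Y"  # e.g. Wed Nov 06 20:44:52 +0000 2019
        )
    return user

user = {"created_at": "Wed Nov 06 20:44:52 +0000 2019"}
transform_user(user)
transform_user(user)  # second call is now a no-op instead of a TypeError
```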

dogsheep/twitter-to-sqlite issue  
519038979 MDU6SXNzdWU1MTkwMzg5Nzk= 10 Failed to import workout points simonw 9599 closed 0     4 2019-11-07T04:50:22Z 2019-11-08T01:18:37Z 2019-11-08T01:18:37Z MEMBER

I just ran the script and it failed to import any workout_points, though it did import workouts.

dogsheep/healthkit-to-sqlite issue  
516967682 MDU6SXNzdWU1MTY5Njc2ODI= 10 Add this repos_starred view simonw 9599 open 0     1 2019-11-04T05:44:38Z 2019-11-04T05:47:18Z   MEMBER
create view repos_starred as select
  stars.starred_at,
  users.login,
  repos.*
from
  repos
  join stars on repos.id = stars.repo
  join users on repos.owner = users.id
order by
  starred_at desc;
dogsheep/github-to-sqlite issue  
488833975 MDU6SXNzdWU0ODg4MzM5NzU= 3 Command for running a search and saving tweets for that search simonw 9599 closed 0     6 2019-09-03T21:29:56Z 2019-11-04T05:31:56Z 2019-11-04T05:31:16Z MEMBER
$ twitter-to-sqlite search dogsheep
dogsheep/twitter-to-sqlite issue  
514459062 MDU6SXNzdWU1MTQ0NTkwNjI= 27 retweets-of-me command simonw 9599 closed 0     4 2019-10-30T07:43:01Z 2019-11-03T01:12:58Z 2019-11-03T01:12:58Z MEMBER

https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-retweets_of_me

dogsheep/twitter-to-sqlite issue  
513074501 MDU6SXNzdWU1MTMwNzQ1MDE= 26 Command for importing mentions timeline simonw 9599 closed 0     1 2019-10-28T03:14:27Z 2019-10-30T02:36:13Z 2019-10-30T02:20:47Z MEMBER

https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-mentions_timeline

Almost identical to home-timeline #18 but it uses https://api.twitter.com/1.1/statuses/mentions_timeline.json instead.

dogsheep/twitter-to-sqlite issue  
496415321 MDU6SXNzdWU0OTY0MTUzMjE= 1 Figure out some interesting example SQL queries simonw 9599 open 0     2 2019-09-20T15:28:07Z 2019-10-21T18:36:03Z   MEMBER

My knowledge of genetics has left me short here. I'd love to be able to provide some interesting example SELECT queries - maybe one that spots if you are likely to have red hair?

dogsheep/genome-to-sqlite issue  
506268945 MDU6SXNzdWU1MDYyNjg5NDU= 20 --since support for various commands for refresh-by-cron simonw 9599 closed 0     3 2019-10-13T03:40:46Z 2019-10-21T03:32:04Z 2019-10-16T19:26:11Z MEMBER

I want to run a cron that updates my Twitter database every X minutes.

It should be able to retrieve the following without needing to paginate through everything:

  • Tweets I have tweeted
  • My home timeline (see #19)
  • Tweets I have favourited

It would be nice if this could be standardized across all commands as a --since option.
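A minimal sketch of the shared pattern, assuming the command's table has a numeric id primary key (as the tweets table does): look up the newest stored ID and pass it to Twitter as since_id so only newer items come back.

```python
import sqlite3

def max_since_id(conn, table="tweets"):
    # Highest ID already stored; None on an empty table.
    return conn.execute(f"select max(id) from [{table}]").fetchone()[0]

def timeline_params(conn):
    # Twitter's since_id parameter excludes everything at or below that ID,
    # so passing the stored maximum fetches only tweets we don't have yet.
    params = {"count": 200}
    since_id = max_since_id(conn)
    if since_id is not None:
        params["since_id"] = since_id
    return params
```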

dogsheep/twitter-to-sqlite issue  
508190730 MDU6SXNzdWU1MDgxOTA3MzA= 23 Extremely simple migration system simonw 9599 closed 0     2 2019-10-17T02:13:57Z 2019-10-17T16:57:17Z 2019-10-17T16:57:17Z MEMBER

Needed for #12. This is going to be an incredibly simple version of the Django migration system.

  • A migrations table, keeping track of which migrations were applied (and when)
  • A migrate() function which applies any pending migrations
  • A MIGRATIONS constant which is a list of functions to be applied

The function names will be detected and used as the names of the migrations.

Every time you run the CLI tool it will call the migrate() function before doing anything else.

Needs to take into account that there might be no tables at all. As such, migration functions should sanity check that the tables they are going to work on actually exist.
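The design above can be sketched roughly like this — an illustration, not the shipped implementation, and the example migration body is hypothetical:

```python
import sqlite3
import datetime

MIGRATIONS = []

def migration(fn):
    # Function name doubles as the migration's name.
    MIGRATIONS.append(fn)
    return fn

def table_exists(db, name):
    return db.execute(
        "select count(*) from sqlite_master where type = 'table' and name = ?",
        (name,),
    ).fetchone()[0] > 0

@migration
def m001_example(db):
    # Sanity-check the table exists before touching it, since the CLI may
    # be running against a brand new, empty database.
    if table_exists(db, "tweets"):
        db.execute("alter table tweets add column source text")

def migrate(db):
    # Called before every command; applies any pending migrations once.
    db.execute(
        "create table if not exists migrations (name text primary key, applied text)"
    )
    applied = {row[0] for row in db.execute("select name from migrations")}
    for fn in MIGRATIONS:
        if fn.__name__ not in applied:
            fn(db)
            db.execute(
                "insert into migrations values (?, ?)",
                (fn.__name__, datetime.datetime.now(datetime.timezone.utc).isoformat()),
            )
```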

dogsheep/twitter-to-sqlite issue  
508578780 MDU6SXNzdWU1MDg1Nzg3ODA= 25 Ensure migrations don't accidentally create foreign key twice simonw 9599 closed 0     2 2019-10-17T16:08:50Z 2019-10-17T16:56:47Z 2019-10-17T16:56:47Z MEMBER

Is it possible for these lines to run against a database table that already has these foreign keys?

https://github.com/dogsheep/twitter-to-sqlite/blob/c9295233f219c446fa2085cace987067488a31b9/twitter_to_sqlite/migrations.py#L21-L22

dogsheep/twitter-to-sqlite issue  
508553387 MDExOlB1bGxSZXF1ZXN0MzI5MzI0MzY4 24 Tweet source extraction and new migration system simonw 9599 closed 0     0 2019-10-17T15:24:56Z 2019-10-17T15:49:29Z 2019-10-17T15:49:24Z MEMBER

Closes #12 and #23

dogsheep/twitter-to-sqlite pull dogsheep/twitter-to-sqlite/pulls/24
503053800 MDU6SXNzdWU1MDMwNTM4MDA= 12 Extract "source" into a separate lookup table simonw 9599 closed 0     3 2019-10-06T05:17:23Z 2019-10-17T15:49:24Z 2019-10-17T15:49:24Z MEMBER

It's pretty bulky and ugly at the moment:

dogsheep/twitter-to-sqlite issue  
487600595 MDU6SXNzdWU0ODc2MDA1OTU= 3 Option to fetch only checkins more recent than the current max checkin simonw 9599 closed 0     4 2019-08-30T17:46:45Z 2019-10-16T20:41:23Z 2019-10-16T20:39:59Z MEMBER

The Foursquare checkins API supports "return every checkin occurring after this point" - I can pass it the maximum createdAt date currently stored in the database. This will allow for quick incremental fetches via a cron.
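A sketch of the incremental fetch, assuming the checkins table keeps the raw createdAt epoch value and that the relevant API parameter is afterTimestamp (worth double-checking against Foursquare's current docs):

```python
import sqlite3

def newest_checkin_timestamp(conn):
    # The maximum stored createdAt epoch is the cut-off for the next fetch.
    return conn.execute("select max(createdAt) from checkins").fetchone()[0]

def checkins_params(conn, since=True):
    params = {"limit": 250}
    if since:
        newest = newest_checkin_timestamp(conn)
        if newest is not None:
            # Assumed parameter name: afterTimestamp returns only checkins
            # created after this point, enabling quick cron refreshes.
            params["afterTimestamp"] = newest
    return params
```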

dogsheep/swarm-to-sqlite issue  
506087267 MDU6SXNzdWU1MDYwODcyNjc= 19 since_id support for home-timeline simonw 9599 closed 0     3 2019-10-11T22:48:24Z 2019-10-16T19:13:06Z 2019-10-16T19:12:46Z MEMBER

Currently every time you run home-timeline we pull all 800 available tweets. We should support since_id (which can be provided explicitly or pulled directly from the database) so the command works efficiently when run e.g. from cron.

dogsheep/twitter-to-sqlite issue  
508024032 MDU6SXNzdWU1MDgwMjQwMzI= 22 Ability to import from uncompressed archive or from specific files simonw 9599 closed 0     0 2019-10-16T18:31:57Z 2019-10-16T18:53:36Z 2019-10-16T18:53:36Z MEMBER

Currently you can only import like this:

$ twitter-to-sqlite import path-to-twitter.zip

It would be useful if you could import from a folder that was decompressed from that zip:

$ twitter-to-sqlite import path-to-twitter/

AND from individual files within that folder - since that would allow you to e.g. selectively import certain files:

$ twitter-to-sqlite import path-to-twitter/favorites.js path-to-twitter/tweets.js
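A sketch of the requested dispatch (not the shipped code): accept a zip archive, a decompressed folder, or individual .js files, and normalize them into a flat list of importable items.

```python
from pathlib import Path

def paths_to_import(target_paths):
    # A directory expands to its .js files; anything else (a .zip archive
    # or a single .js file) is passed through as-is.
    results = []
    for path in map(Path, target_paths):
        if path.is_dir():
            results.extend(sorted(path.glob("*.js")))
        else:
            results.append(path)
    return results
```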
dogsheep/twitter-to-sqlite issue  
506432572 MDU6SXNzdWU1MDY0MzI1NzI= 21 Fix &amp; escapes in tweet text simonw 9599 closed 0     1 2019-10-14T03:37:28Z 2019-10-15T18:48:16Z 2019-10-15T18:48:16Z MEMBER

Shouldn't be storing &amp; here.
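The fix is presumably a single unescape step on the tweet text before storing it; Python's stdlib covers this (a sketch, assuming only standard HTML entities like &amp;, &lt; and &gt; appear, which is what Twitter's API escapes):

```python
import html

def fix_entities(text):
    # Twitter's API HTML-escapes &, < and > in tweet text; unescape
    # before writing to the tweets table.
    return html.unescape(text)

print(fix_entities("Fish &amp; Chips &lt;3"))  # Fish & Chips <3
```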

dogsheep/twitter-to-sqlite issue  
506276893 MDU6SXNzdWU1MDYyNzY4OTM= 7 issue-comments command for importing issue comments simonw 9599 closed 0     1 2019-10-13T05:23:58Z 2019-10-14T14:44:12Z 2019-10-13T05:24:30Z MEMBER

Using this API: https://developer.github.com/v3/issues/comments/

dogsheep/github-to-sqlite issue  
503244410 MDU6SXNzdWU1MDMyNDQ0MTA= 14 When importing favorites, record which user favorited them simonw 9599 closed 0     0 2019-10-07T05:45:11Z 2019-10-14T03:30:25Z 2019-10-14T03:30:25Z MEMBER

This code currently just dumps them into the tweets table without recording who favorited them.

https://github.com/dogsheep/twitter-to-sqlite/blob/436a170d74ec70903d1b4ca430c2c6b6435cdfcc/twitter_to_sqlite/cli.py#L152-L157

dogsheep/twitter-to-sqlite issue  
504238461 MDU6SXNzdWU1MDQyMzg0NjE= 6 sqlite3.OperationalError: table users has no column named bio dazzag24 1055831 closed 0     2 2019-10-08T19:39:52Z 2019-10-13T05:31:28Z 2019-10-13T05:30:19Z NONE
$ github-to-sqlite repos github.db
$ github-to-sqlite starred github.db dazzag24

Traceback (most recent call last):
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/bin/github-to-sqlite", line 10, in <module>
    sys.exit(cli())
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/cli.py", line 106, in starred
    utils.save_stars(db, user, stars)
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py", line 177, in save_stars
    user_id = save_user(db, user)
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py", line 61, in save_user
    return db["users"].upsert(to_save, pk="id").last_pk
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py", line 1067, in upsert
    extracts=extracts,
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py", line 916, in insert
    extracts=extracts,
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py", line 1024, in insert_all
    result = self.db.conn.execute(sql, values)
sqlite3.OperationalError: table users has no column named bio
$ pipenv graph
github-to-sqlite==0.4
  - requests [required: Any, installed: 2.22.0]
    - certifi [required: >=2017.4.17, installed: 2019.9.11]
    - chardet [required: >=3.0.2,<3.1.0, installed: 3.0.4]
    - idna [required: >=2.5,<2.9, installed: 2.8]
    - urllib3 [required: >=1.21.1,<1.26,!=1.25.1,!=1.25.0, installed: 1.25.6]
  - sqlite-utils [required: ~=1.11, installed: 1.11]
    - click [required: Any, installed: 7.0]
    - click-default-group [required: Any, installed: 1.2.2]
      - click [required: Any, installed: 7.0]
    - tabulate [required: Any, installed: 0.8.5]

Python 3.6.8
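The error means the users table was created earlier without a bio column, so the later insert fails. Newer sqlite-utils releases handle this with alter=True on insert/upsert; the underlying idea, sketched with the stdlib (the helper name is illustrative, and insert-or-replace relies on the table's primary key):

```python
import sqlite3

def upsert_with_alter(conn, table, record):
    # Add any columns the incoming record has that the table lacks,
    # then write the row (replacing an existing row with the same pk).
    existing = {row[1] for row in conn.execute(f"PRAGMA table_info([{table}])")}
    for column in record:
        if column not in existing:
            conn.execute(f"alter table [{table}] add column [{column}]")
    cols = ", ".join(f"[{c}]" for c in record)
    qs = ", ".join("?" for _ in record)
    conn.execute(
        f"insert or replace into [{table}] ({cols}) values ({qs})",
        list(record.values()),
    )
```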
dogsheep/github-to-sqlite issue  
505928530 MDU6SXNzdWU1MDU5Mjg1MzA= 18 Command to import home-timeline simonw 9599 closed 0     4 2019-10-11T15:47:54Z 2019-10-11T16:51:33Z 2019-10-11T16:51:12Z MEMBER

Feature request: https://twitter.com/johankj/status/1182563563136868352

Would it be possible to save all tweets in my timeline from the last X days? I would love to see how big a percentage some users are of my daily timeline as a metric on whether I should unfollow them/move them to a list.
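Once the timeline is stored, the requested metric could be a single SQL query — a hedged sketch assuming the tweets table's user column, which twitter-to-sqlite uses as the author foreign key:

```python
import sqlite3

# Share of the saved home timeline that each author accounts for.
SHARE_SQL = """
select user,
       count(*) * 100.0 / (select count(*) from tweets) as percentage
from tweets
group by user
order by percentage desc
"""
```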

dogsheep/twitter-to-sqlite issue  
505674949 MDU6SXNzdWU1MDU2NzQ5NDk= 17 import command should empty all archive-* tables first simonw 9599 closed 0     2 2019-10-11T06:58:43Z 2019-10-11T15:40:08Z 2019-10-11T15:40:08Z MEMBER

Can have a CLI option for NOT doing that.

dogsheep/twitter-to-sqlite issue  
505673645 MDU6SXNzdWU1MDU2NzM2NDU= 16 Do a better job with archived direct message threads simonw 9599 open 0     0 2019-10-11T06:55:21Z 2019-10-11T06:55:27Z   MEMBER

https://github.com/dogsheep/twitter-to-sqlite/blob/fb2698086d766e0333a55bb73435e7283feeb438/twitter_to_sqlite/archive.py#L98-L99

dogsheep/twitter-to-sqlite issue  
488835586 MDU6SXNzdWU0ODg4MzU1ODY= 4 Command for importing data from a Twitter Export file simonw 9599 closed 0     2 2019-09-03T21:34:13Z 2019-10-11T06:45:02Z 2019-10-11T06:45:02Z MEMBER

Twitter lets you export all of your data as an archive file: https://twitter.com/settings/your_twitter_data

A command for importing this data into SQLite would be extremely useful.

$ twitter-to-sqlite import twitter.db path-to-archive.zip
dogsheep/twitter-to-sqlite issue  
505666744 MDExOlB1bGxSZXF1ZXN0MzI3MDUxNjcz 15 twitter-to-sqlite import command, refs #4 simonw 9599 closed 0     0 2019-10-11T06:37:14Z 2019-10-11T06:45:01Z 2019-10-11T06:45:01Z MEMBER dogsheep/twitter-to-sqlite pull dogsheep/twitter-to-sqlite/pulls/15
504720731 MDU6SXNzdWU1MDQ3MjA3MzE= 1 Add more details on how to request data from google takeout correctly. dazzag24 1055831 open 0     0 2019-10-09T15:17:34Z 2019-10-09T15:17:34Z   NONE

The default is to download everything, which can result in an enormous amount of data when you really only need two types of data for now:

  • My Activity
  • Location History

In addition, unless you specify that "My Activity" should be downloaded in JSON format, the default is HTML. This then causes the

google-takeout-to-sqlite my-activity takeout.db takeout.zip

command to fail, as the archive only contains HTML files, not JSON files.

Thanks

dogsheep/google-takeout-to-sqlite issue  
503243784 MDU6SXNzdWU1MDMyNDM3ODQ= 3 Extract images into separate tables simonw 9599 open 0     0 2019-10-07T05:43:01Z 2019-10-07T05:43:01Z   MEMBER

As already done with authors. Slightly harder because images do not have a universally unique ID. Also need to figure out what to do about there being columns for both image and images.

dogsheep/pocket-to-sqlite issue  
503085013 MDU6SXNzdWU1MDMwODUwMTM= 13 statuses-lookup command simonw 9599 closed 0     1 2019-10-06T11:00:20Z 2019-10-07T00:33:49Z 2019-10-07T00:31:44Z MEMBER

For bulk retrieving tweets by their ID.

https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-lookup

Rate limit is 900/15 minutes (1 call per second) but each call can pull up to 100 IDs, so we can pull 6,000 per minute.
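That arithmetic suggests grouping the requested IDs into blocks of at most 100 per request — a sketch (the comma-separated "id" parameter matches the statuses/lookup API):

```python
def chunks(items, size=100):
    # statuses/lookup accepts up to 100 IDs per call; at 900 calls per
    # 15-minute window that works out to 6,000 IDs per minute.
    for i in range(0, len(items), size):
        yield items[i : i + size]

def lookup_params(batch):
    # IDs are passed as a single comma-separated "id" parameter.
    return {"id": ",".join(str(tweet_id) for tweet_id in batch)}
```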

Should support --sql and --attach #8

dogsheep/twitter-to-sqlite issue  
503045221 MDU6SXNzdWU1MDMwNDUyMjE= 11 Commands for recording real-time tweets from the streaming API simonw 9599 closed 0     1 2019-10-06T03:09:30Z 2019-10-06T04:54:17Z 2019-10-06T04:48:31Z MEMBER

https://developer.twitter.com/en/docs/tweets/filter-realtime/api-reference/post-statuses-filter

We can support tracking keywords and following specific users.

dogsheep/twitter-to-sqlite issue  
493670426 MDU6SXNzdWU0OTM2NzA0MjY= 3 Command to fetch all repos belonging to a user or organization simonw 9599 closed 0     2 2019-09-14T21:54:21Z 2019-09-17T00:17:53Z 2019-09-17T00:17:53Z MEMBER

How about this:

$ github-to-sqlite repos simonw
dogsheep/github-to-sqlite issue  
493668862 MDU6SXNzdWU0OTM2Njg4NjI= 2 Extract licenses from repos into a separate table simonw 9599 closed 0     0 2019-09-14T21:33:41Z 2019-09-14T21:46:58Z 2019-09-14T21:46:58Z MEMBER

dogsheep/github-to-sqlite issue  
493599818 MDU6SXNzdWU0OTM1OTk4MTg= 1 Command for fetching starred repos simonw 9599 closed 0     0 2019-09-14T08:36:29Z 2019-09-14T21:30:48Z 2019-09-14T21:30:48Z MEMBER dogsheep/github-to-sqlite issue  
491791152 MDU6SXNzdWU0OTE3OTExNTI= 9 followers-ids and friends-ids subcommands simonw 9599 closed 0     1 2019-09-10T16:58:15Z 2019-09-10T17:36:55Z 2019-09-10T17:36:55Z MEMBER

These will import follower and friendship IDs into the following tables, using these APIs:

https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-followers-ids
https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-ids

dogsheep/twitter-to-sqlite issue  
490798130 MDU6SXNzdWU0OTA3OTgxMzA= 7 users-lookup command for fetching users simonw 9599 closed 0     0 2019-09-08T19:47:59Z 2019-09-08T20:32:13Z 2019-09-08T20:32:13Z MEMBER

https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-users-lookup

https://api.twitter.com/1.1/users/lookup.json?user_id=783214,6253282
https://api.twitter.com/1.1/users/lookup.json?screen_name=simonw,cleopaws

CLI design:

$ twitter-to-sqlite users-lookup simonw cleopaws
$ twitter-to-sqlite users-lookup 783214 6253282 --ids
dogsheep/twitter-to-sqlite issue  
489419782 MDU6SXNzdWU0ODk0MTk3ODI= 6 Extract extended_entities into a media table simonw 9599 closed 0     0 2019-09-04T21:59:10Z 2019-09-04T22:08:01Z 2019-09-04T22:08:01Z MEMBER

dogsheep/twitter-to-sqlite issue  
488833136 MDU6SXNzdWU0ODg4MzMxMzY= 1 Imported followers should go in "users", relationships in "following" simonw 9599 closed 0     0 2019-09-03T21:27:37Z 2019-09-04T20:23:04Z 2019-09-04T20:23:04Z MEMBER

Right now twitter-to-sqlite followers dumps everything into a followers table and doesn't actually record which account they are following!

It should instead save them all in a global users table and then set up m2m relationships in a following table. This also means it should create a record for the specified user in order to record both sides of each relationship.
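A sketch of that schema change, assuming users and following tables with the obvious columns (the function and column names here are illustrative, not the shipped code):

```python
import sqlite3

def save_followers(conn, account_id, followers):
    # Every fetched profile goes into the shared users table...
    for profile in followers:
        conn.execute(
            "insert or replace into users (id, screen_name) values (?, ?)",
            (profile["id"], profile["screen_name"]),
        )
    # ...and each relationship is recorded follower -> followed in the
    # following m2m table, so both sides of the edge are preserved.
    conn.executemany(
        "insert or ignore into following (followed_id, follower_id) values (?, ?)",
        [(account_id, profile["id"]) for profile in followers],
    )
```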

dogsheep/twitter-to-sqlite issue  
488833698 MDU6SXNzdWU0ODg4MzM2OTg= 2 "twitter-to-sqlite user-timeline" command for pulling tweets by a specific user simonw 9599 closed 0     3 2019-09-03T21:29:12Z 2019-09-04T20:02:11Z 2019-09-04T20:02:11Z MEMBER

Twitter only allows up to 3,200 tweets to be retrieved from https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html

I'm going to do:

$ twitter-to-sqlite tweets simonw
dogsheep/twitter-to-sqlite issue  
488874815 MDU6SXNzdWU0ODg4NzQ4MTU= 5 Write tests that simulate the Twitter API simonw 9599 open 0     1 2019-09-03T23:55:35Z 2019-09-03T23:56:28Z   MEMBER

I can use betamax for this: https://pypi.org/project/betamax/

dogsheep/twitter-to-sqlite issue  
487601121 MDU6SXNzdWU0ODc2MDExMjE= 4 Online tool for getting a Foursquare OAuth token simonw 9599 closed 0     1 2019-08-30T17:48:14Z 2019-08-31T18:07:26Z 2019-08-31T18:07:26Z MEMBER

I will link to this from the documentation. See also this conversation on Twitter: https://twitter.com/simonw/status/1166822603023011840

I've decided to go with "copy and paste in a token" rather than hooking up a local web server that can have tokens passed to it.

dogsheep/swarm-to-sqlite issue  
487721884 MDU6SXNzdWU0ODc3MjE4ODQ= 5 Treat Foursquare timestamps as UTC simonw 9599 closed 0     0 2019-08-31T02:44:47Z 2019-08-31T02:50:41Z 2019-08-31T02:50:41Z MEMBER

Current test failure is due to timezone differences between my laptop and Circle CI:

https://circleci.com/gh/dogsheep/swarm-to-sqlite/3

E         Full diff:
E         - [{'created': '2018-07-01T04:48:19',
E         ?                           ^
E         + [{'created': '2018-07-01T02:48:19',
E         ?                           ^
E         'createdAt': 1530413299,

The timestamps I store in created should always be UTC.
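The fix follows directly: derive created from the epoch createdAt field in UTC rather than local time, e.g.:

```python
from datetime import datetime, timezone

def created_from_epoch(created_at):
    # Converting the epoch value in UTC makes the stored string identical
    # on every machine, regardless of local timezone (the failure above).
    return datetime.fromtimestamp(created_at, timezone.utc).strftime(
        "%Y-%m-%dT%H:%M:%S"
    )

print(created_from_epoch(1530413299))  # 2018-07-01T02:48:19
```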

dogsheep/swarm-to-sqlite issue  
487598468 MDU6SXNzdWU0ODc1OTg0Njg= 2 --save option to dump checkins to a JSON file on disk simonw 9599 closed 0     1 2019-08-30T17:41:06Z 2019-08-31T02:40:21Z 2019-08-31T02:40:21Z MEMBER

This is a complement to the --load option - mainly useful for development purposes.

(I'll rename --file to --load as part of this issue).

dogsheep/swarm-to-sqlite issue  
487598042 MDU6SXNzdWU0ODc1OTgwNDI= 1 Implement code to pull checkins from the Foursquare API simonw 9599 closed 0     0 2019-08-30T17:40:02Z 2019-08-30T18:23:24Z 2019-08-30T18:23:24Z MEMBER

The tool currently only works with a pre-prepared JSON file of checkins.

When called without options, it should prompt the user to paste in a Foursquare OAuth token.

The --token= option should work too, and should be backed up by an optional environment variable.

dogsheep/swarm-to-sqlite issue  
472429048 MDU6SXNzdWU0NzI0MjkwNDg= 9 Too many SQL variables tholo 166463 closed 0     4 2019-07-24T18:24:17Z 2019-07-26T10:01:05Z 2019-07-26T10:01:05Z NONE

Decided to try importing my data, and ran into this:

Traceback (most recent call last):
  File "/Users/tholo/Source/health/bin/healthkit-to-sqlite", line 10, in <module>
    sys.exit(cli())
  File "/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/cli.py", line 50, in cli
    convert_xml_to_sqlite(fp, db, progress_callback=bar.update)
  File "/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py", line 41, in convert_xml_to_sqlite
    write_records(records, db)
  File "/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py", line 80, in write_records
    column_order=["startDate", "endDate", "value", "unit"],
  File "/Users/tholo/Source/health/lib/python3.7/site-packages/sqlite_utils/db.py", line 911, in insert_all
    result = self.db.conn.execute(sql, values)
sqlite3.OperationalError: too many SQL variables

Added some debug output in sqlite_utils/db.py, which resulted in:

                INSERT INTO [rBodyMassIndex] ([creationDate], [endDate], [metadata_HKWasUserEntered], [metadata_Health Mate App Version], [metadata_Modified Date], [metadata_Withings Link], [metadata_Withings User Identifier], [sourceName], [sourceVersion], [startDate], [unit], [value]) VALUES
                    (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
                ,
                    (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
                [... the same 12-placeholder row repeated, 100 rows in total ...]
                ;

with the attached data:

['2019-06-27 22:55:10 -0700', '2011-06-22 21:05:53 -0700', '0', '4.4.2', '2011-06-23 04:05:53 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308801953&type=1', '301293', 'Health Mate', '4040200', '2011-06-22 21:05:53 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-06-23 09:36:27 -0700', '0', '4.4.2', '2011-06-23 16:36:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308846987&type=1', '301293', 'Health Mate', '4040200', '2011-06-23 09:36:27 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-06-23 23:54:07 -0700', '0', '4.4.2', '2011-06-24 06:55:19 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308898447&type=1', '301293', 'Health Mate', '4040200', '2011-06-23 23:54:07 -0700', 'count', '30.679', '2019-06-27 22:55:10 -0700', '2011-06-24 09:13:40 -0700', '0', '4.4.2', '2011-06-24 16:14:35 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308932020&type=1', '301293', 'Health Mate', '4040200', '2011-06-24 09:13:40 -0700', 'count', '30.3549', '2019-06-27 22:55:10 -0700', '2011-06-25 08:30:08 -0700', '0', '4.4.2', '2011-06-25 15:30:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309015808&type=1', '301293', 'Health Mate', '4040200', '2011-06-25 08:30:08 -0700', 'count', '30.3395', '2019-06-27 22:55:10 -0700', '2011-06-26 07:47:51 -0700', '0', '4.4.2', '2011-06-26 14:48:27 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309099671&type=1', '301293', 'Health Mate', '4040200', '2011-06-26 07:47:51 -0700', 'count', '30.2315', '2019-06-27 22:55:10 -0700', '2011-06-28 08:48:26 -0700', '0', '4.4.2', '2011-06-28 15:49:13 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309276106&type=1', '301293', 'Health Mate', '4040200', '2011-06-28 08:48:26 -0700', 'count', '30.0617', '2019-06-27 22:55:10 -0700', '2011-06-29 09:21:16 -0700', '0', '4.4.2', '2011-06-29 16:21:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309364476&type=1', '301293', 'Health 
[…issue body truncated in this export: a long flat list of Withings 'Health Mate' measurement tuples for userid 301293 (source version '4040200') — each reading consisting of a withings-bd2://timeline/measure?userid=301293&date=…&type=1 deep link, the measurement timestamp, type 'count', the measured value, and the 2019-06-27 import timestamp — covering readings from 2011-06-29 through 2013-10-23 and ending with the value '31.8056']
dogsheep/healthkit-to-sqlite issue  
472097220 MDU6SXNzdWU0NzIwOTcyMjA= 7 Script uses a lot of RAM simonw 9599 closed 0     3 2019-07-24T06:11:11Z 2019-07-24T06:35:52Z 2019-07-24T06:35:52Z MEMBER

I'm using an XML pull parser which should avoid the need to slurp the whole XML file into memory, but it's not working - the script still uses over 1GB of RAM when it runs according to Activity Monitor.

I think this is because I'm still causing the full root element to be incrementally loaded into memory, just in case I try to access it later.

http://effbot.org/elementtree/iterparse.htm says I should use elem.clear() as I go. It also says:

The above pattern has one drawback; it does not clear the root element, so you will end up with a single element with lots of empty child elements. If your files are huge, rather than just large, this might be a problem. To work around this, you need to get your hands on the root element.

So I will try that recipe and see if it helps.
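That recipe can be sketched roughly like this — a minimal, self-contained version using only the standard library, where the `Record` tag name is an assumption based on the HealthKit export format:

```python
from xml.etree.ElementTree import iterparse

def iter_records(xml_path):
    # Sketch of the effbot recipe: grab a reference to the root element
    # from the first "start" event, then clear each processed element
    # *and* the root as we go, so completed elements can be garbage
    # collected instead of accumulating as empty children of the root.
    events = iterparse(xml_path, events=("start", "end"))
    _, root = next(events)  # first event is the start of the root element
    for event, el in events:
        if event == "end" and el.tag == "Record":
            yield dict(el.attrib)
            el.clear()    # free this record's own attributes/children
            root.clear()  # drop the reference the root keeps to it
```

Because the root is cleared after every record, peak memory should stay roughly proportional to one record rather than the whole file.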

dogsheep/healthkit-to-sqlite issue  
472104705 MDExOlB1bGxSZXF1ZXN0MzAwNTgwMjIx 8 Use less RAM simonw 9599 closed 0     0 2019-07-24T06:35:01Z 2019-07-24T06:35:52Z 2019-07-24T06:35:52Z MEMBER

Closes #7

dogsheep/healthkit-to-sqlite pull dogsheep/healthkit-to-sqlite/pulls/8
470691622 MDU6SXNzdWU0NzA2OTE2MjI= 5 Add progress bar simonw 9599 closed 0     2 2019-07-20T16:29:07Z 2019-07-22T03:30:13Z 2019-07-22T02:49:22Z MEMBER

Showing a progress bar would be nice, using Click.

The easiest way to do this would probably be to hook it up to the length of the compressed content, and update it as this code pushes more XML bytes through the parser:

https://github.com/dogsheep/healthkit-to-sqlite/blob/d64299765064501f4efdd9a0b21dbdba9ec4287f/healthkit_to_sqlite/utils.py#L6-L10
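A rough sketch of that idea, using `click.progressbar` keyed to the file's size on disk — the `handle_chunk` callback is a hypothetical stand-in for the real parsing code, not part of healthkit-to-sqlite:

```python
import os

import click

def parse_with_progress(xml_path, handle_chunk, chunk_size=1024 * 1024):
    # Read the export file in fixed-size chunks, feeding each chunk to
    # the XML parser (via the handle_chunk stand-in) while advancing a
    # Click progress bar sized to the file's total byte length.
    total = os.path.getsize(xml_path)
    with open(xml_path, "rb") as fp:
        with click.progressbar(length=total, label="Importing") as bar:
            while True:
                chunk = fp.read(chunk_size)
                if not chunk:
                    break
                handle_chunk(chunk)
                bar.update(len(chunk))
```

Since the bar is driven by bytes read rather than records parsed, it stays accurate even though the number of records isn't known up front.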

dogsheep/healthkit-to-sqlite issue  
470856782 MDU6SXNzdWU0NzA4NTY3ODI= 6 Break up records into different tables for each type simonw 9599 closed 0     1 2019-07-22T01:54:59Z 2019-07-22T03:28:55Z 2019-07-22T03:28:50Z MEMBER

I don't think there's much benefit to having all of the different record types stored in the same enormous table. Here's what I get when I use _facet=type:

I'm going to try splitting these up into separate tables - so HKQuantityTypeIdentifierBodyMassIndex becomes a table called rBodyMassIndex - and see if that's nicer to work with.
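A minimal sketch of that split using plain sqlite3 — the table-naming rule comes from the issue text, but the function name and the TEXT-only column typing are simplifications for illustration:

```python
import sqlite3

def insert_record_by_type(conn, record):
    # Route each record to its own table, named by replacing the
    # "HKQuantityTypeIdentifier" prefix with "r" — so a record of type
    # HKQuantityTypeIdentifierBodyMassIndex lands in rBodyMassIndex.
    # Identifiers are [bracket]-quoted, matching the schema style used
    # elsewhere in this database; all columns are TEXT for simplicity.
    table = "r" + record["type"].replace("HKQuantityTypeIdentifier", "")
    cols = [k for k in record if k != "type"]
    conn.execute(
        "CREATE TABLE IF NOT EXISTS [{}] ({})".format(
            table, ", ".join("[{}] TEXT".format(c) for c in cols)
        )
    )
    conn.execute(
        "INSERT INTO [{}] ({}) VALUES ({})".format(
            table,
            ", ".join("[{}]".format(c) for c in cols),
            ", ".join("?" for _ in cols),
        ),
        [record[c] for c in cols],
    )
```

With the `type` column folded into the table name, each per-type table only needs the columns that type actually uses.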

dogsheep/healthkit-to-sqlite issue  
470637152 MDU6SXNzdWU0NzA2MzcxNTI= 2 Import workouts simonw 9599 closed 0     1 2019-07-20T05:20:21Z 2019-07-20T06:21:41Z 2019-07-20T06:21:41Z MEMBER

From #1

dogsheep/healthkit-to-sqlite issue  
470640505 MDU6SXNzdWU0NzA2NDA1MDU= 4 Import Records simonw 9599 closed 0     1 2019-07-20T06:11:20Z 2019-07-20T06:21:41Z 2019-07-20T06:21:41Z MEMBER

From #1:

 'Record': {'attr_counts': {'creationDate': 2672233,
                            'device': 2665111,
                            'endDate': 2672233,
                            'sourceName': 2672233,
                            'sourceVersion': 2671779,
                            'startDate': 2672233,
                            'type': 2672233,
                            'unit': 2650012,
                            'value': 2672232},
            'child_counts': {'HeartRateVariabilityMetadataList': 2318,
                             'MetadataEntry': 287974},
            'count': 2672233,
            'parent_counts': {'Correlation': 2, 'HealthData': 2672231}},
dogsheep/healthkit-to-sqlite issue  
470637206 MDU6SXNzdWU0NzA2MzcyMDY= 3 Import ActivitySummary simonw 9599 closed 0     0 2019-07-20T05:21:00Z 2019-07-20T05:58:07Z 2019-07-20T05:58:07Z MEMBER

From #1

  'ActivitySummary': {'attr_counts': {'activeEnergyBurned': 980,
                                     'activeEnergyBurnedGoal': 980,
                                     'activeEnergyBurnedUnit': 980,
                                     'appleExerciseTime': 980,
                                     'appleExerciseTimeGoal': 980,
                                     'appleStandHours': 980,
                                     'appleStandHoursGoal': 980,
                                     'dateComponents': 980},
                     'child_counts': {},
                     'count': 980,
                     'parent_counts': {'HealthData': 980}},
dogsheep/healthkit-to-sqlite issue  
470637068 MDU6SXNzdWU0NzA2MzcwNjg= 1 Use XML Analyser to figure out the structure of the export XML simonw 9599 closed 0     1 2019-07-20T05:19:02Z 2019-07-20T05:20:09Z 2019-07-20T05:20:09Z MEMBER

https://github.com/simonw/xml_analyser

dogsheep/healthkit-to-sqlite issue  


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] TEXT REFERENCES [users]([id]),
   [milestone] TEXT REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [repo] TEXT,
   [type] TEXT
, [pull_request] TEXT);
Powered by Datasette · Query took 146.397ms · About: github-to-sqlite