html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,issue,performed_via_github_app https://github.com/dogsheep/twitter-to-sqlite/issues/48#issuecomment-663143160,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/48,663143160,MDEyOklzc3VlQ29tbWVudDY2MzE0MzE2MA==,9599,2020-07-23T17:46:07Z,2020-07-23T17:46:07Z,MEMBER,"Frustratingly, these links don't work on PyPI: https://pypi.org/project/twitter-to-sqlite/ There's an issue about that here: https://github.com/pypa/readme_renderer/issues/169","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663976976, https://github.com/dogsheep/github-to-sqlite/issues/50#issuecomment-693788387,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/50,693788387,MDEyOklzc3VlQ29tbWVudDY5Mzc4ODM4Nw==,9599,2020-09-17T03:36:47Z,2020-09-17T03:36:58Z,MEMBER,"Fun demo of the `--nl` option: github-to-sqlite get /users/simonw/repos --paginate --nl | sqlite-utils insert simonw.db repos - --nl ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703218756, https://github.com/dogsheep/dogsheep-beta/issues/16#issuecomment-695877627,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/16,695877627,MDEyOklzc3VlQ29tbWVudDY5NTg3NzYyNw==,9599,2020-09-21T02:42:29Z,2020-09-21T02:42:29Z,MEMBER,"Fun twist: assuming `timestamp` is always stored as UTC, I need the interface to be timezone aware so I can see e.g. everything from 4th July 2020 in the San Francisco timezone definition of 4th July 2020.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",694493566, https://github.com/dogsheep/github-to-sqlite/issues/53#issuecomment-735485677,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/53,735485677,MDEyOklzc3VlQ29tbWVudDczNTQ4NTY3Nw==,9599,2020-11-30T00:36:09Z,2020-11-30T00:36:09Z,MEMBER,Given rate limits (see #51) this command might be better implemented by running a `git clone` into a temporary directory - doing so would retrieve all of the files in one go.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753000405, https://github.com/dogsheep/twitter-to-sqlite/issues/30#issuecomment-552131798,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30,552131798,MDEyOklzc3VlQ29tbWVudDU1MjEzMTc5OA==,9599,2019-11-09T19:54:45Z,2019-11-09T19:54:45Z,MEMBER,Good catch - not sure how that bug crept in. 
Removing line 116 looks like the right fix to me.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",518739697, https://github.com/dogsheep/pocket-to-sqlite/issues/5#issuecomment-684425714,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/5,684425714,MDEyOklzc3VlQ29tbWVudDY4NDQyNTcxNA==,9599,2020-09-01T06:18:32Z,2020-09-01T06:18:32Z,MEMBER,"Good suggestion, I'll setup a demo somewhere.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",629473827, https://github.com/dogsheep/hacker-news-to-sqlite/issues/3#issuecomment-886241674,https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/3,886241674,IC_kwDODtX3eM400vmK,9599,2021-07-25T18:41:17Z,2021-07-25T18:41:17Z,MEMBER,Got a TIL out of this: https://til.simonwillison.net/jq/extracting-objects-recursively,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",952189173, https://github.com/dogsheep/dogsheep-photos/issues/4#issuecomment-615957385,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/4,615957385,MDEyOklzc3VlQ29tbWVudDYxNTk1NzM4NQ==,9599,2020-04-18T21:56:16Z,2020-04-18T21:58:11Z,MEMBER,Got this working! I'll do EXIF in a separate ticket #3.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",602533539, https://github.com/dogsheep/github-to-sqlite/issues/4#issuecomment-623010272,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4,623010272,MDEyOklzc3VlQ29tbWVudDYyMzAxMDI3Mg==,9599,2020-05-02T20:39:14Z,2020-05-02T20:39:14Z,MEMBER,"Graph of cumulative stars for Datasette over time: https://github-to-sqlite.dogsheep.net/github?sql=select%0D%0A++yyyymmdd%2C%0D%0A++sum%28n%29+over+%28%0D%0A++++order+by%0D%0A++++++yyyymmdd+rows+unbounded+preceding%0D%0A++%29+as+cumulative_count%0D%0Afrom%0D%0A++%28%0D%0A++++select%0D%0A++++++substr%28starred_at%2C+0%2C+11%29+as+yyyymmdd%2C%0D%0A++++++count%28*%29+as+n%0D%0A++++from%0D%0A++++++stars%0D%0A++++where+repo+%3D+107914493%0D%0A++++group+by%0D%0A++++++yyyymmdd%0D%0A++%29#g.mark=line&g.x_column=yyyymmdd&g.x_type=temporal&g.y_column=cumulative_count&g.y_type=quantitative Stars per day (as a label bar chart, so very wide): https://github-to-sqlite.dogsheep.net/github?sql=%0D%0A++++select%0D%0A++++++substr%28starred_at%2C+0%2C+11%29+as+yyyymmdd%2C%0D%0A++++++count%28*%29+as+n%0D%0A++++from%0D%0A++++++stars%0D%0A++++where+repo+%3D+107914493%0D%0A++++group+by%0D%0A++++++yyyymmdd%0D%0A++#g.mark=bar&g.x_column=yyyymmdd&g.x_type=ordinal&g.y_column=n&g.y_type=quantitative ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",493670730, https://github.com/dogsheep/pocket-to-sqlite/issues/11#issuecomment-1221621466,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11,1221621466,IC_kwDODLZ_YM5I0Hba,9599,2022-08-21T21:09:47Z,2022-08-21T21:09:47Z,MEMBER,"Great catch, thanks. I'm going to use it to mean `--auth` - since other tools in the Dogsheep family have the same convention. 
`--all` will be the only way to specify all.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1345452427, https://github.com/dogsheep/twitter-to-sqlite/issues/10#issuecomment-541748580,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10,541748580,MDEyOklzc3VlQ29tbWVudDU0MTc0ODU4MA==,9599,2019-10-14T15:30:44Z,2019-10-14T15:30:44Z,MEMBER,Had several recommendations for https://github.com/tqdm/tqdm which is what goodreads-to-sqlite uses.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",492297930, https://github.com/dogsheep/github-to-sqlite/issues/64#issuecomment-861041597,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/64,861041597,MDEyOklzc3VlQ29tbWVudDg2MTA0MTU5Nw==,9599,2021-06-14T22:44:54Z,2021-06-14T22:44:54Z,MEMBER,Have you found a way to access events in GraphQL? I can only see way to access a timeline of events for a single issue or a single pull request. See also https://github.community/t/get-event-equivalent-for-v4/13600/2,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",920636216, https://github.com/dogsheep/github-to-sqlite/issues/4#issuecomment-550388354,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4,550388354,MDEyOklzc3VlQ29tbWVudDU1MDM4ODM1NA==,9599,2019-11-06T16:26:55Z,2019-11-06T16:26:55Z,MEMBER,"Here's a query I figured out using a window function that shows cumulative stargazers over time: ```sql select yyyymmdd, sum(n) over ( order by yyyymmdd rows unbounded preceding ) as cumulative_count from ( select substr(starred_at, 0, 11) as yyyymmdd, count(*) as n from stars group by yyyymmdd ) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",493670730, https://github.com/dogsheep/apple-notes-to-sqlite/issues/11#issuecomment-1462962682,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11,1462962682,IC_kwDOJHON9s5XMwn6,9599,2023-03-09T23:20:35Z,2023-03-09T23:22:41Z,MEMBER,"Here's a query that returns all notes in folder 1, including notes in descendant folders: ```sql with recursive nested_folders(folder_id, descendant_folder_id) as ( -- base case: select all immediate children of the root folder select id, id from folders where parent is null union all -- recursive case: select all children of the previous level of nested folders select nf.folder_id, f.id from nested_folders nf join folders f on nf.descendant_folder_id = f.parent ) -- Find notes within all descendants of folder 1 select * from notes where folder in ( select descendant_folder_id from nested_folders where folder_id = 1 ); ``` With assistance from ChatGPT. 
Prompts were: ``` SQLite schema: CREATE TABLE [folders] ( [id] INTEGER PRIMARY KEY, [long_id] TEXT, [name] TEXT, [parent] INTEGER, FOREIGN KEY([parent]) REFERENCES [folders]([id]) ); Write a recursive CTE that returns the following: folder_id | descendant_folder_id With a row for every nested child of every folder - so the top level folder has lots of rows ``` Then I tweaked it a bit, then ran this: ``` WITH RECURSIVE nested_folders(folder_id, descendant_folder_id) AS ( -- base case: select all immediate children of the root folder SELECT id, id FROM folders WHERE parent IS NULL UNION ALL -- recursive case: select all children of the previous level of nested folders SELECT nf.folder_id, f.id FROM nested_folders nf JOIN folders f ON nf.descendant_folder_id = f.parent ) -- select all rows from the recursive CTE SELECT * from notes where folder in (select descendant_folder_id FROM nested_folders where folder_id = 1) Convert all SQL keywords to lower case, and re-indent ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1618130434, https://github.com/dogsheep/github-to-sqlite/issues/74#issuecomment-1188223933,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/74,1188223933,IC_kwDODFdgUs5G0tu9,9599,2022-07-18T19:40:50Z,2022-07-18T19:42:41Z,MEMBER,"Here's how the demo is deployed: https://github.com/dogsheep/github-to-sqlite/blob/dbac2e5dd8a562b45d8255a265859cf8020ca22a/.github/workflows/deploy-demo.yml#L103-L119 I'm suspicious of `py-gfm`, which is used like this: https://github.com/dogsheep/github-to-sqlite/blob/dbac2e5dd8a562b45d8255a265859cf8020ca22a/demo-metadata.json#L49-L51","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1308461063, https://github.com/dogsheep/dogsheep-photos/issues/16#issuecomment-623811131,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/16,623811131,MDEyOklzc3VlQ29tbWVudDYyMzgxMTEzMQ==,9599,2020-05-05T03:16:18Z,2020-05-05T03:16:18Z,MEMBER,"Here's how to convert two integers unto a UUID using Java. Not sure if it's the solution I need though (or how to do the same thing in Python): https://repl.it/repls/EuphoricSomberClasslibrary ```java import java.util.UUID; class Main { public static void main(String[] args) { java.util.UUID uuid = new java.util.UUID( 2544182952487526660L, -3640314103732024685L ); System.out.println( uuid ); } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",612287234, https://github.com/dogsheep/evernote-to-sqlite/issues/5#issuecomment-706834800,https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/5,706834800,MDEyOklzc3VlQ29tbWVudDcwNjgzNDgwMA==,9599,2020-10-12T03:24:57Z,2020-10-16T20:16:28Z,MEMBER,"Here's my first attempt at a plugin for this: ```python from datasette import hookimpl import jinja2 START = """" TEMPLATE = """"""
{}
"""""".strip() EN_MEDIA_SCRIPT = """""" Array.from(document.querySelectorAll('en-media')).forEach(el => { let hash = el.getAttribute('hash'); let type = el.getAttribute('type'); let path = `/evernote/resources_data/${hash}.json?_shape=array`; fetch(path).then(r => r.json()).then(rows => { let b64 = rows[0].data.encoded; let data = `data:${type};base64,${b64}`; el.innerHTML = ``; }); }); """""" @hookimpl def render_cell(value, table): if not table: # Don't render content from arbitrary SQL queries, could be XSS hole return if not value or not isinstance(value, str): return value = value.strip() if value.startswith(START) and value.endswith(END): trimmed = value[len(START) : -len(END)] trimmed = trimmed.split("">"", 1)[1] # Replace those horrible double newlines trimmed = trimmed.replace(""

"", ""
"") return jinja2.Markup(TEMPLATE.format(trimmed)) @hookimpl def extra_body_script(): return EN_MEDIA_SCRIPT ``` It works! It does however demonstrate that Evernote's ""clip this webpage"" feature means there is a LOT of weird HTML that can get into a note. It looks like they've filtered out the scripts but I wouldn't bet on it - they certainly don't filter out many of the inline styles. So running Bleach is almost certainly a good idea.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",718938889, https://github.com/dogsheep/dogsheep-photos/issues/20#issuecomment-624408738,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/20,624408738,MDEyOklzc3VlQ29tbWVudDYyNDQwODczOA==,9599,2020-05-06T02:21:05Z,2020-05-06T02:21:32Z,MEMBER,"Here's rendering code from my hacked-together not-yet-released S3 image proxy: ```python from starlette.responses import Response from PIL import Image, ExifTags import pyheif for ORIENTATION_TAG in ExifTags.TAGS.keys(): if ExifTags.TAGS[ORIENTATION_TAG] == ""Orientation"": break ... # Load it into Pillow if ext == ""heic"": heic = pyheif.read_heif(image_response.content) image = Image.frombytes(mode=heic.mode, size=heic.size, data=heic.data) else: image = Image.open(io.BytesIO(image_response.content)) # Does EXIF tell us to rotate it? try: exif = dict(image._getexif().items()) if exif[ORIENTATION_TAG] == 3: image = image.rotate(180, expand=True) elif exif[ORIENTATION_TAG] == 6: image = image.rotate(270, expand=True) elif exif[ORIENTATION_TAG] == 8: image = image.rotate(90, expand=True) except (AttributeError, KeyError, IndexError): pass # Resize based on ?w= and ?h=, if set width, height = image.size w = request.query_params.get(""w"") h = request.query_params.get(""h"") if w is not None or h is not None: if h is None: # Set h based on w w = int(w) h = int((float(height) / width) * w) elif w is None: h = int(h) # Set w based on h w = int((float(width) / height) * h) w = int(w) h = int(h) image.thumbnail((w, h)) # ?bw= converts to black and white if request.query_params.get(""bw""): image = image.convert(""L"") # ?q= sets the quality - defaults to 75 quality = 75 q = request.query_params.get(""q"") if q and q.isdigit() and 1 <= int(q) <= 100: quality = int(q) # Output as JPEG or PNG output_image = io.BytesIO() image_type = ""JPEG"" kwargs = {""quality"": quality} if image.format == ""PNG"": image_type = ""PNG"" kwargs = {} image.save(output_image, image_type, **kwargs) return Response( output_image.getvalue(), media_type=""image/jpeg"", headers={""cache-control"": ""s-maxage={}, public"".format(365 * 24 * 60 * 60)}, ) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",613006393, https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-623730934,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15,623730934,MDEyOklzc3VlQ29tbWVudDYyMzczMDkzNA==,9599,2020-05-04T22:00:38Z,2020-05-04T22:00:48Z,MEMBER,"Here's the query to create the new table: ```sql create table apple_photos_scores as select ZGENERICASSET.ZUUID, ZGENERICASSET.ZOVERALLAESTHETICSCORE, ZGENERICASSET.ZCURATIONSCORE, ZGENERICASSET.ZPROMOTIONSCORE, ZGENERICASSET.ZHIGHLIGHTVISIBILITYSCORE, ZCOMPUTEDASSETATTRIBUTES.ZBEHAVIORALSCORE, ZCOMPUTEDASSETATTRIBUTES.ZFAILURESCORE, ZCOMPUTEDASSETATTRIBUTES.ZHARMONIOUSCOLORSCORE, ZCOMPUTEDASSETATTRIBUTES.ZIMMERSIVENESSSCORE, ZCOMPUTEDASSETATTRIBUTES.ZINTERACTIONSCORE, 
ZCOMPUTEDASSETATTRIBUTES.ZINTERESTINGSUBJECTSCORE, ZCOMPUTEDASSETATTRIBUTES.ZINTRUSIVEOBJECTPRESENCESCORE, ZCOMPUTEDASSETATTRIBUTES.ZLIVELYCOLORSCORE, ZCOMPUTEDASSETATTRIBUTES.ZLOWLIGHT, ZCOMPUTEDASSETATTRIBUTES.ZNOISESCORE, ZCOMPUTEDASSETATTRIBUTES.ZPLEASANTCAMERATILTSCORE, ZCOMPUTEDASSETATTRIBUTES.ZPLEASANTCOMPOSITIONSCORE, ZCOMPUTEDASSETATTRIBUTES.ZPLEASANTLIGHTINGSCORE, ZCOMPUTEDASSETATTRIBUTES.ZPLEASANTPATTERNSCORE, ZCOMPUTEDASSETATTRIBUTES.ZPLEASANTPERSPECTIVESCORE, ZCOMPUTEDASSETATTRIBUTES.ZPLEASANTPOSTPROCESSINGSCORE, ZCOMPUTEDASSETATTRIBUTES.ZPLEASANTREFLECTIONSSCORE, ZCOMPUTEDASSETATTRIBUTES.ZPLEASANTSYMMETRYSCORE, ZCOMPUTEDASSETATTRIBUTES.ZSHARPLYFOCUSEDSUBJECTSCORE, ZCOMPUTEDASSETATTRIBUTES.ZTASTEFULLYBLURREDSCORE, ZCOMPUTEDASSETATTRIBUTES.ZWELLCHOSENSUBJECTSCORE, ZCOMPUTEDASSETATTRIBUTES.ZWELLFRAMEDSUBJECTSCORE, ZCOMPUTEDASSETATTRIBUTES.ZWELLTIMEDSHOTSCORE from attached.ZGENERICASSET join attached.ZCOMPUTEDASSETATTRIBUTES on attached.ZGENERICASSET.Z_PK = attached.ZCOMPUTEDASSETATTRIBUTES.Z_PK; ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",612151767, https://github.com/dogsheep/healthkit-to-sqlite/issues/6#issuecomment-513626742,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/6,513626742,MDEyOklzc3VlQ29tbWVudDUxMzYyNjc0Mg==,9599,2019-07-22T03:28:55Z,2019-07-22T03:28:55Z,MEMBER,"Here's what it looks like now as separate tables: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",470856782, https://github.com/dogsheep/apple-notes-to-sqlite/issues/11#issuecomment-1462965256,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11,1462965256,IC_kwDOJHON9s5XMxQI,9599,2023-03-09T23:22:12Z,2023-03-09T23:22:12Z,MEMBER,"Here's what the CTE from that looks like: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1618130434, https://github.com/dogsheep/dogsheep-beta/issues/31#issuecomment-748426877,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/31,748426877,MDEyOklzc3VlQ29tbWVudDc0ODQyNjg3Nw==,9599,2020-12-19T06:16:11Z,2020-12-19T06:16:11Z,MEMBER,"Here's why: if ""fts5"" in str(e): But the error being raised here is: sqlite3.OperationalError: no such column: to I'm going to attempt the escaped on on every error.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771316301, https://github.com/dogsheep/twitter-to-sqlite/issues/27#issuecomment-549095317,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27,549095317,MDEyOklzc3VlQ29tbWVudDU0OTA5NTMxNw==,9599,2019-11-03T01:08:10Z,2019-11-03T01:08:10Z,MEMBER,"Hmm... one thing that could be useful is that `retweets_of_me` can support a `--since` parameter - so if run frequently it should hopefully let us know which tweets we would need to run `statuses/retweets/:id.json` against. I'm not sure if the `--since` parameter would show me a tweet that was previously retweeted but has now been retweeted again. 
I'll have a bit of a test and see.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",514459062, https://github.com/dogsheep/dogsheep-photos/issues/4#issuecomment-615946537,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/4,615946537,MDEyOklzc3VlQ29tbWVudDYxNTk0NjUzNw==,9599,2020-04-18T20:48:13Z,2020-04-18T20:48:13Z,MEMBER,"How about generating a signed URL? ```python read_client.generate_presigned_url( ""get_object"", Params={ ""Bucket"": ""dogsheep-photos-simon"", ""Key"": ""this_is_fine.jpg"", }, ExpiresIn=600 ) ``` Gave me https://dogsheep-photos-simon.s3.amazonaws.com/this_is_fine.jpg?AWSAccessKeyId=AKIAWXFXAIOZNZ3JFO7I&Signature=x1zrS4w4OTGAACd7yHp9mYqXvN8%3D&Expires=1587243398 Which does this: ``` ~ $ curl -i 'https://dogsheep-photos-simon.s3.amazonaws.com/this_is_fine.jpg?AWSAccessKeyId=AKIAWXFXAIOZNZ3JFO7I&Signature=x1zrS4w4OTGAACd7yHp9mYqXvN8%3D&Expires=1587243398' HTTP/1.1 307 Temporary Redirect x-amz-bucket-region: us-west-1 x-amz-request-id: E78CD859AEE21D33 x-amz-id-2: 648mx+1+YSGga7NDOU7Q6isfsKnEPWOLC+DI4+x2o9FCc6pSCdIaoHJUbFMI8Vsuh1ADtx46ymU= Location: https://dogsheep-photos-simon.s3-us-west-1.amazonaws.com/this_is_fine.jpg?AWSAccessKeyId=AKIAWXFXAIOZNZ3JFO7I&Signature=x1zrS4w4OTGAACd7yHp9mYqXvN8%3D&Expires=1587243398 Content-Type: application/xml Transfer-Encoding: chunked Date: Sat, 18 Apr 2020 20:47:21 GMT Server: AmazonS3 TemporaryRedirectPlease re-send this request to the specified temporary endpoint. Continue to use the original request endpoint for future requests.dogsheep-photos-simon.s3-us-west-1.amazonaws.comdogsheep-photos-simonE78CD859AEE21D33648mx+1+YSGga7NDOU7Q6isfsKnEPWOLC+DI4+x2o9FCc6pSCdIaoHJUbFMI8Vsuh1ADtx46ymU=~ $ ``` So it redirects to another URL... which returns this: ``` ~ $ curl -i 'https://dogsheep-photos-simon.s3-us-west-1.amazonaws.com/this_is_fine.jpg?AWSAccessKeyId=AKIAWXFXAIOZNZ3JFO7I&Signature=x1zrS4w4OTGAACd7yHp9mYqXvN8%3D&Expires=1587243398' HTTP/1.1 200 OK x-amz-id-2: XafOl6mswj3yz0GJC9+Ptot1ll5sROVwqsMc10CUUfgpaUANTdIx2GhnONb5d1GVFJ6wlS2j3UY= x-amz-request-id: 258387C180411AFE Date: Sat, 18 Apr 2020 20:47:52 GMT Last-Modified: Sat, 18 Apr 2020 20:37:35 GMT ETag: ""ee04081c3182a44a1c6944e94012e977"" Accept-Ranges: bytes Content-Type: binary/octet-stream Content-Length: 53072 Server: AmazonS3 ????JFIF??C ``` So that worked! It did come back with `Content-Type: binary/octet-stream` though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",602533539, https://github.com/dogsheep/dogsheep-photos/issues/26#issuecomment-631226953,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/26,631226953,MDEyOklzc3VlQ29tbWVudDYzMTIyNjk1Mw==,9599,2020-05-20T04:20:34Z,2020-05-20T04:20:34Z,MEMBER,"Huh, it looks like Circle CI picked up the name change automatically. 
https://app.circleci.com/pipelines/github/dogsheep/dogsheep-photos","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",621444763, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790668263,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790668263,MDEyOklzc3VlQ29tbWVudDc5MDY2ODI2Mw==,9599,2021-03-04T14:43:58Z,2021-03-04T14:43:58Z,MEMBER,"I added this code to output a message ID on errors: ```diff print(""Errors: {}"".format(num_errors)) print(traceback.format_exc()) + print(""Message-Id: {}"".format(email.get(""Message-Id"", ""None""))) continue ``` Having found a message ID that had an error, I ran this command to see the context: rg --text --context 20 '44F289B0.000001.02100@SCHWARZE-DWFXMI' ~/gmail.mbox This was for the following error: ``` File ""/Users/simon/Dropbox/Development/google-takeout-to-sqlite/google_takeout_to_sqlite/utils.py"", line 102, in get_mbox message[""date""] = get_message_date(email.get(""Date""), email.get_from()) File ""/Users/simon/Dropbox/Development/google-takeout-to-sqlite/google_takeout_to_sqlite/utils.py"", line 178, in get_message_date datetime_tuple = email.utils.parsedate_tz(mail_date) File ""/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.7/lib/python3.7/email/_parseaddr.py"", line 50, in parsedate_tz res = _parsedate_tz(data) File ""/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.7/lib/python3.7/email/_parseaddr.py"", line 69, in _parsedate_tz data = data.split() AttributeError: 'Header' object has no attribute 'split' ``` Here's what I spotted in the `ripgrep` output: ``` 177133570:Message-Id: <44F289B0.000001.02100@SCHWARZE-DWFXMI> 177133571-Date: Mon, 28 Aug 2006 08:14:08 +0200 (Westeurop�ische Sommerzeit) 177133572-X-Mailer: IncrediMail (5002253) ``` So it could it be that `_parsedate_tz` is having trouble with that `Mon, 28 Aug 2006 08:14:08 +0200 (Westeurop�ische Sommerzeit)` string.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401, https://github.com/dogsheep/twitter-to-sqlite/issues/8#issuecomment-531404891,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8,531404891,MDEyOklzc3VlQ29tbWVudDUzMTQwNDg5MQ==,9599,2019-09-13T22:01:57Z,2019-09-13T22:01:57Z,MEMBER,I also wrote about this in https://simonwillison.net/2019/Sep/13/weeknotestwitter-sqlite-datasette-rure/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",490803176, https://github.com/dogsheep/github-to-sqlite/issues/41#issuecomment-653962708,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/41,653962708,MDEyOklzc3VlQ29tbWVudDY1Mzk2MjcwOA==,9599,2020-07-06T00:43:10Z,2020-07-06T00:43:10Z,MEMBER,I bet it's datasette-search-all.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",651159727, https://github.com/dogsheep/dogsheep-photos/issues/25#issuecomment-631127454,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/25,631127454,MDEyOklzc3VlQ29tbWVudDYzMTEyNzQ1NA==,9599,2020-05-19T22:48:00Z,2020-05-21T15:58:32Z,MEMBER,"I built #23 to help with this. 
$ dogsheep-photos create-subset photos.db public.db \ ""select sha256 from apple_photos where albums like '%Public%'"" And publish with Vercel: $ datasette publish now public.db --project dogsheep-photos \ --about=dogsheep/dogsheep-photos \ --about_url=""https://github.com/dogsheep/dogsheep-photos"" \ --install=datasette-json-html \ --install=datasette-cluster-map","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",621332242, https://github.com/dogsheep/twitter-to-sqlite/issues/12#issuecomment-542855427,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12,542855427,MDEyOklzc3VlQ29tbWVudDU0Mjg1NTQyNw==,9599,2019-10-16T19:27:55Z,2019-10-16T19:27:55Z,MEMBER,I can do that by keeping `source` as a `TEXT` column but turning it into a non-enforced foreign key against a new `sources` table. Then I can run code that scans that column for any values beginning with a `<` and converts them.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",503053800, https://github.com/dogsheep/twitter-to-sqlite/issues/35#issuecomment-601875524,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/35,601875524,MDEyOklzc3VlQ29tbWVudDYwMTg3NTUyNA==,9599,2020-03-20T19:30:27Z,2020-03-20T19:30:27Z,MEMBER,"I can give it a snazzier progress bar to, as requested by #10.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",585282212, https://github.com/dogsheep/github-to-sqlite/issues/37#issuecomment-622978173,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/37,622978173,MDEyOklzc3VlQ29tbWVudDYyMjk3ODE3Mw==,9599,2020-05-02T16:19:31Z,2020-05-02T16:19:47Z,MEMBER,"I can use the new `.create_view(..., replace=True)` parameter in `sqlite-utils` 2.7.2 for this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",610843136, https://github.com/dogsheep/healthkit-to-sqlite/issues/9#issuecomment-515226724,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9,515226724,MDEyOklzc3VlQ29tbWVudDUxNTIyNjcyNA==,9599,2019-07-25T21:46:01Z,2019-07-25T21:46:01Z,MEMBER,I can work around this here (prior to the fix in sqlite-utils) by setting the batch size to something a bit lower here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472429048, https://github.com/dogsheep/github-to-sqlite/issues/4#issuecomment-622990947,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4,622990947,MDEyOklzc3VlQ29tbWVudDYyMjk5MDk0Nw==,9599,2020-05-02T17:54:16Z,2020-05-02T17:54:16Z,MEMBER,"I could add that window function query as a view, but only if the detected version of SQLite supports window functions.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",493670730, https://github.com/dogsheep/github-to-sqlite/issues/45#issuecomment-660553646,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/45,660553646,MDEyOklzc3VlQ29tbWVudDY2MDU1MzY0Ng==,9599,2020-07-18T22:51:41Z,2020-07-18T22:51:41Z,MEMBER,"I could fix this by putting `REFRESH_DB` in a commit message: https://github.com/dogsheep/github-to-sqlite/blob/4ae4aa6f172344b19ff3513707195ee6d2654bd4/.github/workflows/deploy-demo.yml#L41-L46 But... 
doing so would lose the data I've collected in https://github-to-sqlite.dogsheep.net/github/dependents?_sort_desc=first_seen_utc concerning the first time each dependent repo was spotted.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",660429601, https://github.com/dogsheep/github-to-sqlite/issues/18#issuecomment-602815120,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/18,602815120,MDEyOklzc3VlQ29tbWVudDYwMjgxNTEyMA==,9599,2020-03-23T19:40:55Z,2020-03-23T19:43:19Z,MEMBER,I could pull a pk-hashed version of the name/email into separate `raw_author` and `raw_committer` columns perhaps - against a `commit_authors` table. Could be interesting.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",585411547, https://github.com/dogsheep/dogsheep-beta/issues/31#issuecomment-748426501,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/31,748426501,MDEyOklzc3VlQ29tbWVudDc0ODQyNjUwMQ==,9599,2020-12-19T06:12:22Z,2020-12-19T06:12:22Z,MEMBER,I deliberately added support for advanced FTS in https://github.com/dogsheep/dogsheep-beta/commit/cbb2491b85d7ff416d6d429b60109e6c2d6d50b9 for #13 but that's the cause of this bug.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771316301, https://github.com/dogsheep/dogsheep-photos/issues/8#issuecomment-618100434,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/8,618100434,MDEyOklzc3VlQ29tbWVudDYxODEwMDQzNA==,9599,2020-04-23T00:02:53Z,2020-04-23T00:02:53Z,MEMBER,"I don't think it matters one way or the other - I'm storing the sha256 in the filename, so the fact that I could read the MD5 back from the list bucket operation doesn't give me any benefits.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",605147638, https://github.com/dogsheep/github-to-sqlite/issues/58#issuecomment-746735889,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/58,746735889,MDEyOklzc3VlQ29tbWVudDc0NjczNTg4OQ==,9599,2020-12-16T17:59:50Z,2020-12-16T17:59:50Z,MEMBER,"I don't want to add a full HTML parser (like BeautifulSoup) as a dependency for this feature. Since the HTML comes from a single, trusted source (GitHub) I could probably handle this using [regular expressions](https://stackoverflow.com/a/1732454).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",769150394, https://github.com/dogsheep/dogsheep-photos/issues/16#issuecomment-623806533,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/16,623806533,MDEyOklzc3VlQ29tbWVudDYyMzgwNjUzMw==,9599,2020-05-05T02:50:16Z,2020-05-05T02:50:16Z,MEMBER,"I figured there must be a separate database that Photos uses to store the text of the identified labels. I used ""Open Files and Ports"" in Activity Monitor against the Photos app to try and spot candidates... and found `/Users/simon/Pictures/Photos Library.photoslibrary/database/search/psi.sqlite` - a 53MB SQLite database file. 
Here's the schema of that file: ``` $ sqlite3 psi.sqlite .schema CREATE TABLE word_embedding(word TEXT, extended_word TEXT, score DOUBLE); CREATE INDEX word_embedding_index ON word_embedding(word); CREATE VIRTUAL TABLE word_embedding_prefix USING fts5(extended_word) /* word_embedding_prefix(extended_word) */; CREATE TABLE IF NOT EXISTS 'word_embedding_prefix_data'(id INTEGER PRIMARY KEY, block BLOB); CREATE TABLE IF NOT EXISTS 'word_embedding_prefix_idx'(segid, term, pgno, PRIMARY KEY(segid, term)) WITHOUT ROWID; CREATE TABLE IF NOT EXISTS 'word_embedding_prefix_content'(id INTEGER PRIMARY KEY, c0); CREATE TABLE IF NOT EXISTS 'word_embedding_prefix_docsize'(id INTEGER PRIMARY KEY, sz BLOB); CREATE TABLE IF NOT EXISTS 'word_embedding_prefix_config'(k PRIMARY KEY, v) WITHOUT ROWID; CREATE TABLE groups(category INT2, owning_groupid INT, content_string TEXT, normalized_string TEXT, lookup_identifier TEXT, token_ranges_0 INT8, token_ranges_1 INT8, UNIQUE(category, owning_groupid, content_string, lookup_identifier, token_ranges_0, token_ranges_1)); CREATE TABLE assets(uuid_0 INT, uuid_1 INT, creationDate INT, UNIQUE(uuid_0, uuid_1)); CREATE TABLE ga(groupid INT, assetid INT, PRIMARY KEY(groupid, assetid)); CREATE TABLE collections(uuid_0 INT, uuid_1 INT, startDate INT, endDate INT, title TEXT, subtitle TEXT, keyAssetUUID_0 INT, keyAssetUUID_1 INT, typeAndNumberOfAssets INT32, sortDate DOUBLE, UNIQUE(uuid_0, uuid_1)); CREATE TABLE gc(groupid INT, collectionid INT, PRIMARY KEY(groupid, collectionid)); CREATE VIRTUAL TABLE prefix USING fts5(content='groups', normalized_string, category UNINDEXED, tokenize = 'PSITokenizer'); CREATE TABLE IF NOT EXISTS 'prefix_data'(id INTEGER PRIMARY KEY, block BLOB); CREATE TABLE IF NOT EXISTS 'prefix_idx'(segid, term, pgno, PRIMARY KEY(segid, term)) WITHOUT ROWID; CREATE TABLE IF NOT EXISTS 'prefix_docsize'(id INTEGER PRIMARY KEY, sz BLOB); CREATE TABLE IF NOT EXISTS 'prefix_config'(k PRIMARY KEY, v) WITHOUT ROWID; CREATE TABLE lookup(identifier TEXT PRIMARY KEY, category INT2); CREATE TRIGGER trigger_groups_insert AFTER INSERT ON groups BEGIN INSERT INTO prefix(rowid, normalized_string, category) VALUES (new.rowid, new.normalized_string, new.category); END; CREATE TRIGGER trigger_groups_delete AFTER DELETE ON groups BEGIN INSERT INTO prefix(prefix, rowid, normalized_string, category) VALUES('delete', old.rowid, old.normalized_string, old.category); END; CREATE INDEX group_pk ON groups(category, content_string, normalized_string, lookup_identifier); CREATE INDEX asset_pk ON assets(uuid_0, uuid_1); CREATE INDEX ga_assetid ON ga(assetid, groupid); CREATE INDEX collection_pk ON collections(uuid_0, uuid_1); CREATE INDEX gc_collectionid ON gc(collectionid); ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",612287234, https://github.com/dogsheep/dogsheep-photos/issues/16#issuecomment-623805823,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/16,623805823,MDEyOklzc3VlQ29tbWVudDYyMzgwNTgyMw==,9599,2020-05-05T02:45:56Z,2020-05-05T02:45:56Z,MEMBER,I filed an issue with `osxphotos` about this here: https://github.com/RhetTbull/osxphotos/issues/121,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",612287234, 
https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790373024,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790373024,MDEyOklzc3VlQ29tbWVudDc5MDM3MzAyNA==,9599,2021-03-04T07:01:58Z,2021-03-04T07:04:06Z,MEMBER,"I got 9 warnings that look like this: ``` Errors: 1 Traceback (most recent call last): File ""/Users/simon/Dropbox/Development/google-takeout-to-sqlite/google_takeout_to_sqlite/utils.py"", line 103, in get_mbox message[""date""] = get_message_date(email.get(""Date""), email.get_from()) File ""/Users/simon/Dropbox/Development/google-takeout-to-sqlite/google_takeout_to_sqlite/utils.py"", line 167, in get_message_date datetime_tuple = email.utils.parsedate_tz(mail_date) File ""/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.7/lib/python3.7/email/_parseaddr.py"", line 50, in parsedate_tz res = _parsedate_tz(data) File ""/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.7/lib/python3.7/email/_parseaddr.py"", line 69, in _parsedate_tz data = data.split() AttributeError: 'Header' object has no attribute 'split' ``` It would be useful if those warnings told me the message ID (or similar) of the affected message so I could grep for it in the `mbox` and see what was going on. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401, https://github.com/dogsheep/dogsheep-photos/issues/25#issuecomment-631253852,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/25,631253852,MDEyOklzc3VlQ29tbWVudDYzMTI1Mzg1Mg==,9599,2020-05-20T05:56:17Z,2020-05-21T22:26:16Z,MEMBER,"I have a `deploy-demo.sh` script now: ```bash #!/bin/bash if [ -f public.db ]; then rm public.db fi pipenv run dogsheep-photos create-subset photos.db public.db \ ""select sha256 from apple_photos where albums like '%Public%'"" pipenv run sqlite-utils create-view public.db photos_on_a_map \ ""select date, latitude, longitude, apple_photos.sha256, uploads.ext, json_object( 'title', 'Taken on ' || date, 'image', 'https://photos.simonwillison.net/i/' || uploads.sha256 || '.' || uploads.ext || '?w=400', 'link', 'https://photos.simonwillison.net/i/' || uploads.sha256 || '.' || uploads.ext || '?w=1200' ) as popup from apple_photos join uploads on apple_photos.sha256 = uploads.sha256 where latitude is not null order by date desc"" \ --replace pipenv run datasette publish now public.db --project dogsheep-photos \ --about=dogsheep/dogsheep-photos \ --about_url=""https://github.com/dogsheep/dogsheep-photos"" \ --install=datasette-json-html \ --install=datasette-pretty-json \ --install=datasette-cluster-map>=0.10 \ --title ""Dogsheep Photos demo"" ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",621332242, https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-543290744,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3,543290744,MDEyOklzc3VlQ29tbWVudDU0MzI5MDc0NA==,9599,2019-10-17T17:57:14Z,2019-10-17T17:57:14Z,MEMBER,I have a working command now. 
I'm going to ship it early because it could do with some other people trying it out.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488833975, https://github.com/dogsheep/dogsheep-beta/issues/16#issuecomment-694548909,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/16,694548909,MDEyOklzc3VlQ29tbWVudDY5NDU0ODkwOQ==,9599,2020-09-17T23:15:09Z,2020-09-17T23:15:09Z,MEMBER,"I have sort by date now, #21.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",694493566, https://github.com/dogsheep/evernote-to-sqlite/issues/4#issuecomment-706784028,https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/4,706784028,MDEyOklzc3VlQ29tbWVudDcwNjc4NDAyOA==,9599,2020-10-11T23:20:32Z,2020-10-11T23:20:32Z,MEMBER,I haven't done the FTS on OCR yet. I'm going to move that to another ticket because it requires more thought.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",718938508, https://github.com/dogsheep/twitter-to-sqlite/issues/27#issuecomment-549095463,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27,549095463,MDEyOklzc3VlQ29tbWVudDU0OTA5NTQ2Mw==,9599,2019-11-03T01:10:52Z,2019-11-03T01:10:52Z,MEMBER,"I imagine it won't, since the data I would be recording and then passing to `since_id` would be the highest ID of my own tweets that have been retweeted at least once. So it won't be able to spot if I should check for fresh retweets of a given tweet.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",514459062, https://github.com/dogsheep/github-to-sqlite/issues/18#issuecomment-602846293,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/18,602846293,MDEyOklzc3VlQ29tbWVudDYwMjg0NjI5Mw==,9599,2020-03-23T20:44:40Z,2020-03-23T20:44:40Z,MEMBER,I implemented the `raw_authors` idea.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",585411547, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790693674,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790693674,MDEyOklzc3VlQ29tbWVudDc5MDY5MzY3NA==,9599,2021-03-04T15:18:36Z,2021-03-04T15:18:36Z,MEMBER,"I imported my 10GB mbox with 750,000 emails in it, ran this tool (with a hacked fix for the blob column problem) - and now a search that returns 92 results takes 25.37ms! 
This is fantastic.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401, https://github.com/dogsheep/apple-notes-to-sqlite/issues/11#issuecomment-1462968053,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11,1462968053,IC_kwDOJHON9s5XMx71,9599,2023-03-09T23:24:01Z,2023-03-09T23:24:01Z,MEMBER,"I improved the readability by removing some unnecessary table aliases: ```sql with recursive nested_folders(folder_id, descendant_folder_id) as ( -- base case: select all immediate children of the root folder select id, id from folders where parent is null union all -- recursive case: select all children of the previous level of nested folders select nested_folders.folder_id, folders.id from nested_folders join folders on nested_folders.descendant_folder_id = folders.parent ) -- Find notes within all descendants of folder 1 select * from notes where folder in ( select descendant_folder_id from nested_folders where folder_id = 1 ); ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1618130434, https://github.com/dogsheep/swarm-to-sqlite/issues/11#issuecomment-761967094,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/11,761967094,MDEyOklzc3VlQ29tbWVudDc2MTk2NzA5NA==,9599,2021-01-18T04:11:13Z,2021-01-18T04:11:13Z,MEMBER,"I just got a similar error: ``` File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/swarm_to_sqlite/utils.py"", line 79, in save_checkin checkins_table.m2m(""users"", user, m2m_table=""with"", pk=""id"") File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2048, in m2m id = other_table.insert(record, pk=pk, replace=True).last_pk File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1781, in insert return self.insert_all( File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1899, in insert_all self.insert_chunk( File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1709, in insert_chunk result = self.db.execute(query, params) File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 226, in execute return self.conn.execute(sql, parameters) pysqlite3.dbapi2.OperationalError: table users has no column named countryCode ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743400216, https://github.com/dogsheep/swarm-to-sqlite/issues/13#issuecomment-1502543165,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/13,1502543165,IC_kwDODD6af85Zjv09,9599,2023-04-11T01:10:36Z,2023-04-11T01:11:47Z,MEMBER,"I just had that error myself on macOS while running the tests: ``` ERROR tests/test_save_checkin.py::test_tables - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_venue - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_event - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_sticker - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_likes - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_with_ - sqlite3.OperationalError: 
table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_users - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_photos - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_posts - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_view - sqlite3.OperationalError: table sqlite_master may not be modified ``` `pytest --pdb` shows it happening in the bit that adds foreign keys: ``` > /Users/simon/.local/share/virtualenvs/swarm-to-sqlite-daPW7yIJ/lib/python3.9/site-packages/sqlite_utils/db.py(1096)add_foreign_keys() -> cursor.execute( (Pdb) list 1096 >> cursor.execute( 1097 ""UPDATE sqlite_master SET sql = ? WHERE name = ?"", 1098 (new_sql, table_name), 1099 ) 1100 cursor.execute(""PRAGMA schema_version = %d"" % (schema_version + 1)) 1101 -> cursor.execute(""PRAGMA writable_schema = 0"") 1102 # Have to VACUUM outside the transaction to ensure .foreign_keys property 1103 # can see the newly created foreign key. 1104 self.vacuum() ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1373210675, https://github.com/dogsheep/dogsheep-photos/issues/26#issuecomment-631226481,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/26,631226481,MDEyOklzc3VlQ29tbWVudDYzMTIyNjQ4MQ==,9599,2020-05-20T04:18:29Z,2020-05-20T04:18:29Z,MEMBER,I just renamed the repository.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",621444763, https://github.com/dogsheep/twitter-to-sqlite/issues/2#issuecomment-527990908,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2,527990908,MDEyOklzc3VlQ29tbWVudDUyNzk5MDkwOA==,9599,2019-09-04T16:57:24Z,2019-09-04T16:57:24Z,MEMBER,"I just tried this using `max_id=` pagination as described in [Working with timelines](https://developer.twitter.com/en/docs/tweets/timelines/guides/working-with-timelines) and I got back all 17,759 of my tweets.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488833698, https://github.com/dogsheep/pocket-to-sqlite/issues/1#issuecomment-605325897,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/1,605325897,MDEyOklzc3VlQ29tbWVudDYwNTMyNTg5Nw==,9599,2020-03-27T21:37:26Z,2020-03-27T21:38:37Z,MEMBER,"I keep getting 503 errors even though I appear to be staying within the rate limit: ``` {'Date': 'Fri, 27 Mar 2020 21:35:57 GMT', 'Content-Type': 'application/json', 'Transfer-Encoding': 'chunked', 'Connection': 'keep-alive', 'Server': 'Apache/2.4.25 (Debian)', 'Content-Location': 'get.php', 'Vary': 'negotiate', 'TCN': 'choice', 'Set-Cookie': '...; httponly', 'X-Frame-Options': 'SAMEORIGIN', 'Status': '200 OK', 'X-Limit-Key-Limit': '10000', 'X-Limit-Key-Remaining': '9960', 'X-Limit-Key-Reset': '282', 'X-Source': 'Pocket', 'P3P': 'policyref=""/w3c/p3p.xml"", CP=""ALL CURa ADMa DEVa OUR IND UNI COM NAV INT STA PRE""'} [##----------------------------------] 6% 06:49:27 {'Date': 'Fri, 27 Mar 2020 21:36:06 GMT', 'Content-Type': 'text/html; charset=UTF-8', 'Content-Length': '23', 'Connection': 'keep-alive', 'Server': 'Apache/2.4.25 (Debian)', 'Content-Location': 'get.php', 'Vary': 'negotiate', 'TCN': 'choice', 'Set-Cookie': '...', 'X-Frame-Options': 'SAMEORIGIN', 'X-Error': 'Pocket is currently under heavy 
load. Please wait a moment and try again.', 'X-Error-Code': '199', 'Status': '503 Service Unavailable', 'X-Source': 'Pocket', 'P3P': 'policyref=""/w3c/p3p.xml"", CP=""ALL CURa ADMa DEVa OUR IND UNI COM NAV INT STA PRE""'} ``` I'm going to try doing a few automatic retries any time I see a 503 error.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",503233021, https://github.com/dogsheep/twitter-to-sqlite/issues/39#issuecomment-606309165,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/39,606309165,MDEyOklzc3VlQ29tbWVudDYwNjMwOTE2NQ==,9599,2020-03-30T23:41:31Z,2020-03-30T23:41:31Z,MEMBER,I like the separate `user_timeline_since` table solution.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",590666760, https://github.com/dogsheep/dogsheep-photos/issues/2#issuecomment-615931488,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/2,615931488,MDEyOklzc3VlQ29tbWVudDYxNTkzMTQ4OA==,9599,2020-04-18T19:24:02Z,2020-04-18T19:24:02Z,MEMBER,I made a start on this last week with a https://github.com/simonw/heic-to-jpeg proxy.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",602533352, https://github.com/dogsheep/github-to-sqlite/issues/18#issuecomment-602807178,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/18,602807178,MDEyOklzc3VlQ29tbWVudDYwMjgwNzE3OA==,9599,2020-03-23T19:24:43Z,2020-03-23T19:24:43Z,MEMBER,I need to find an example before I work on this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",585411547, https://github.com/dogsheep/github-to-sqlite/issues/40#issuecomment-643393506,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/40,643393506,MDEyOklzc3VlQ29tbWVudDY0MzM5MzUwNg==,9599,2020-06-12T17:21:14Z,2020-06-12T17:21:14Z,MEMBER,"I only install SQLite for this: https://github.com/dogsheep/github-to-sqlite/blob/c0d54e0260468be38152293df5abd775c068495d/.github/workflows/deploy-demo.yml#L77-L78 I'm going to remove the need to install sqlite3 by making this possible with sqlite-utils: https://github.com/simonw/sqlite-utils/issues/115","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",637899539, https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461234311,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2,1461234311,IC_kwDOJHON9s5XGKqH,9599,2023-03-09T03:56:24Z,2023-03-09T03:56:24Z,MEMBER,"I opened the ""Script Editor"" app on my computer, used Window -> Library to open the Library panel, then clicked on the Notes app there. I got this: So the notes object has these properties: - name (text) : the name of the note (normally the first line of the body) - id (text, r/o) : the unique identifier of the note - container ([folder](applewebdata://621FA8D9-C995-4081-B3B3-149B0EA04C7F#Notes-Suite.folder), r/o) : the folder of the note - body (text) : the HTML content of the note - plaintext (text, r/o) : the plaintext content of the note - creation date (date, r/o) : the creation date of the note - modification date (date, r/o) : the modification date of the note - password protected (boolean, r/o) : Is the note password protected? - shared (boolean, r/o) : Is the note shared? 
I'm going to ignore the concept of attachments for the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1616354999, https://github.com/dogsheep/github-to-sqlite/issues/13#issuecomment-602918689,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/13,602918689,MDEyOklzc3VlQ29tbWVudDYwMjkxODY4OQ==,9599,2020-03-23T23:43:39Z,2020-03-23T23:47:50Z,MEMBER,I pointed https://github-to-sqlite.dogsheep.net/ at it. May take a few minutes for the new certificate to provision though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",521275281, https://github.com/dogsheep/twitter-to-sqlite/issues/5#issuecomment-527684202,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/5,527684202,MDEyOklzc3VlQ29tbWVudDUyNzY4NDIwMg==,9599,2019-09-03T23:56:28Z,2019-09-03T23:56:28Z,MEMBER,I previously used betamax here: https://github.com/simonw/github-contents/blob/master/test_github_contents.py,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488874815, https://github.com/dogsheep/dogsheep-beta/issues/4#issuecomment-684395444,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/4,684395444,MDEyOklzc3VlQ29tbWVudDY4NDM5NTQ0NA==,9599,2020-09-01T06:00:03Z,2020-09-01T06:00:03Z,MEMBER,I ran `sqlite-utils optimize beta.db` against my test DB and the size reduced from 183M to 176M - and a 450ms search ran in 359ms. So not a huge improvement but still worthwhile.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",689839399, https://github.com/dogsheep/healthkit-to-sqlite/issues/1#issuecomment-513437463,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/1,513437463,MDEyOklzc3VlQ29tbWVudDUxMzQzNzQ2Mw==,9599,2019-07-20T05:19:59Z,2019-07-20T05:19:59Z,MEMBER,"I ran xml_analyser against the XML HealthKit `export.xml` file and got the following results: ```python { 'ActivitySummary': {'attr_counts': {'activeEnergyBurned': 980, 'activeEnergyBurnedGoal': 980, 'activeEnergyBurnedUnit': 980, 'appleExerciseTime': 980, 'appleExerciseTimeGoal': 980, 'appleStandHours': 980, 'appleStandHoursGoal': 980, 'dateComponents': 980}, 'child_counts': {}, 'count': 980, 'parent_counts': {'HealthData': 980}}, 'Correlation': {'attr_counts': {'creationDate': 1, 'endDate': 1, 'sourceName': 1, 'sourceVersion': 1, 'startDate': 1, 'type': 1}, 'child_counts': {'MetadataEntry': 1, 'Record': 2}, 'count': 1, 'parent_counts': {'HealthData': 1}}, 'ExportDate': {'attr_counts': {'value': 1}, 'child_counts': {}, 'count': 1, 'parent_counts': {'HealthData': 1}}, 'HealthData': {'attr_counts': {'locale': 1}, 'child_counts': {'ActivitySummary': 980, 'Correlation': 1, 'ExportDate': 1, 'Me': 1, 'Record': 2672231, 'Workout': 663}, 'count': 1, 'parent_counts': {}}, 'HeartRateVariabilityMetadataList': {'attr_counts': {}, 'child_counts': {'InstantaneousBeatsPerMinute': 93653}, 'count': 2318, 'parent_counts': {'Record': 2318}}, 'InstantaneousBeatsPerMinute': {'attr_counts': {'bpm': 93653, 'time': 93653}, 'child_counts': {}, 'count': 93653, 'parent_counts': {'HeartRateVariabilityMetadataList': 93653}}, 'Location': {'attr_counts': {'altitude': 398683, 'course': 398683, 'date': 398683, 'horizontalAccuracy': 398683, 'latitude': 398683, 'longitude': 398683, 'speed': 398683, 'verticalAccuracy': 398683}, 
'child_counts': {}, 'count': 398683, 'parent_counts': {'WorkoutRoute': 398683}}, 'Me': {'attr_counts': {'HKCharacteristicTypeIdentifierBiologicalSex': 1, 'HKCharacteristicTypeIdentifierBloodType': 1, 'HKCharacteristicTypeIdentifierDateOfBirth': 1, 'HKCharacteristicTypeIdentifierFitzpatrickSkinType': 1}, 'child_counts': {}, 'count': 1, 'parent_counts': {'HealthData': 1}}, 'MetadataEntry': {'attr_counts': {'key': 290449, 'value': 290449}, 'child_counts': {}, 'count': 290449, 'parent_counts': {'Correlation': 1, 'Record': 287974, 'Workout': 1928, 'WorkoutRoute': 546}}, 'Record': {'attr_counts': {'creationDate': 2672233, 'device': 2665111, 'endDate': 2672233, 'sourceName': 2672233, 'sourceVersion': 2671779, 'startDate': 2672233, 'type': 2672233, 'unit': 2650012, 'value': 2672232}, 'child_counts': {'HeartRateVariabilityMetadataList': 2318, 'MetadataEntry': 287974}, 'count': 2672233, 'parent_counts': {'Correlation': 2, 'HealthData': 2672231}}, 'Workout': {'attr_counts': {'creationDate': 663, 'device': 230, 'duration': 663, 'durationUnit': 663, 'endDate': 663, 'sourceName': 663, 'sourceVersion': 663, 'startDate': 663, 'totalDistance': 663, 'totalDistanceUnit': 663, 'totalEnergyBurned': 663, 'totalEnergyBurnedUnit': 663, 'workoutActivityType': 663}, 'child_counts': {'MetadataEntry': 1928, 'WorkoutEvent': 2094, 'WorkoutRoute': 340}, 'count': 663, 'parent_counts': {'HealthData': 663}}, 'WorkoutEvent': {'attr_counts': {'date': 2094, 'duration': 837, 'durationUnit': 837, 'type': 2094}, 'child_counts': {}, 'count': 2094, 'parent_counts': {'Workout': 2094}}, 'WorkoutRoute': {'attr_counts': {'creationDate': 340, 'endDate': 340, 'sourceName': 340, 'sourceVersion': 340, 'startDate': 340}, 'child_counts': {'Location': 398683, 'MetadataEntry': 546}, 'count': 340, 'parent_counts': {'Workout': 340}}} ``` The most interesting bit is this: ```python 'HealthData': {'attr_counts': {'locale': 1}, 'child_counts': {'ActivitySummary': 980, 'Correlation': 1, 'ExportDate': 1, 'Me': 1, 'Record': 2672231, 'Workout': 663}, ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",470637068, https://github.com/dogsheep/github-to-sqlite/issues/60#issuecomment-770071568,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60,770071568,MDEyOklzc3VlQ29tbWVudDc3MDA3MTU2OA==,9599,2021-01-29T21:56:15Z,2021-01-29T21:56:15Z,MEMBER,"I really like the way you're using pipes here - really smart. It's similar to how I build the demo database in this GitHub Actions workflow: https://github.com/dogsheep/github-to-sqlite/blob/62dfd3bc4014b108200001ef4bc746feb6f33b45/.github/workflows/deploy-demo.yml#L52-L82 `twitter-to-sqlite` actually has a mechanism for doing this kind of thing, documented at https://github.com/dogsheep/twitter-to-sqlite#providing-input-from-a-sql-query-with---sql-and---attach It lets you do things like: ``` $ twitter-to-sqlite users-lookup my.db --sql=""select follower_id from following"" --ids ``` Maybe I should add something similar to `github-to-sqlite`? 
Feels like it could be really useful.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",797097140, https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694557425,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24,694557425,MDEyOklzc3VlQ29tbWVudDY5NDU1NzQyNQ==,9599,2020-09-17T23:41:01Z,2020-09-17T23:41:01Z,MEMBER,I removed all of the `json.loads()` calls and I'm still getting that `Undefined` error.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703970814, https://github.com/dogsheep/swarm-to-sqlite/issues/2#issuecomment-526701674,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/2,526701674,MDEyOklzc3VlQ29tbWVudDUyNjcwMTY3NA==,9599,2019-08-30T18:24:26Z,2019-08-30T18:24:26Z,MEMBER,I renamed `--file` to `--load` in 0e5b6025c6f9823ff81aa8aae1cbff5c45e57baf,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",487598468, https://github.com/dogsheep/github-to-sqlite/issues/17#issuecomment-597354514,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/17,597354514,MDEyOklzc3VlQ29tbWVudDU5NzM1NDUxNA==,9599,2020-03-10T22:37:45Z,2020-03-10T22:37:45Z,MEMBER,I should add an option to stop the moment you see a commit you have fetched before.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",578883725, https://github.com/dogsheep/github-to-sqlite/issues/14#issuecomment-559883311,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14,559883311,MDEyOklzc3VlQ29tbWVudDU1OTg4MzMxMQ==,9599,2019-11-29T21:30:37Z,2019-11-29T21:30:37Z,MEMBER,"I should build the command to persist ETags and obey their polling guidelines: > Events are optimized for polling with the ""ETag"" header. If no new events have been triggered, you will see a ""304 Not Modified"" response, and your current rate limit will be untouched. There is also an ""X-Poll-Interval"" header that specifies how often (in seconds) you are allowed to poll. In times of high server load, the time may increase. Please obey the header.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",530491074, https://github.com/dogsheep/healthkit-to-sqlite/issues/11#issuecomment-711078917,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/11,711078917,MDEyOklzc3VlQ29tbWVudDcxMTA3ODkxNw==,9599,2020-10-17T20:51:55Z,2020-10-17T20:52:03Z,MEMBER,"I switched my phone to Spanish and ran an export - I got a file called `exportar.zip`. 
Unzipped I still got a `apple_ health_export` folder but the root contained: ``` electrocardiograms/ export_cda.xml exportar.xml workout-routes/ ``` It looks like `export_cda.xml` does not have a translated name, so maybe I can ignore it and look for the _other_ `.xml` file in that directory.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",723838331, https://github.com/dogsheep/pocket-to-sqlite/issues/12#issuecomment-1627563202,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/12,1627563202,IC_kwDODLZ_YM5hAqTC,9599,2023-07-09T01:14:27Z,2023-07-09T01:14:27Z,MEMBER,I tested this locally with `python -m build` and then `pip install ...whl` in a fresh virtual environment.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1795187493, https://github.com/dogsheep/dogsheep-beta/issues/29#issuecomment-747029636,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/29,747029636,MDEyOklzc3VlQ29tbWVudDc0NzAyOTYzNg==,9599,2020-12-16T21:14:03Z,2020-12-16T21:14:03Z,MEMBER,"I think I can do this as a cunning trick in `display_sql`. Consider this example query: https://til.simonwillison.net/tils?sql=select%0D%0A++path%2C%0D%0A++snippet%28til_fts%2C+-1%2C+%27b4de2a49c8%27%2C+%278c94a2ed4b%27%2C+%27...%27%2C+60%29+as+snippet%0D%0Afrom%0D%0A++til%0D%0A++join+til_fts+on+til.rowid+%3D+til_fts.rowid%0D%0Awhere%0D%0A++til_fts+match+escape_fts%28%3Aq%29%0D%0A++and+path+%3D+%27asgi_lifespan-test-httpx.md%27%0D%0A&q=pytest ```sql select path, snippet(til_fts, -1, 'b4de2a49c8', '8c94a2ed4b', '...', 60) as snippet from til join til_fts on til.rowid = til_fts.rowid where til_fts match escape_fts(:q) and path = 'asgi_lifespan-test-httpx.md' ``` The `and path = 'asgi_lifespan-test-httpx.md'` bit means we only get back a specific document - but the snippet highlighting is applied to it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724759588, https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-695113871,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24,695113871,MDEyOklzc3VlQ29tbWVudDY5NTExMzg3MQ==,9599,2020-09-18T22:30:17Z,2020-09-18T22:30:17Z,MEMBER,"I think I know what's going on here: https://github.com/dogsheep/dogsheep-beta/blob/0f1b951c5131d16f3c8559a8e4d79ed5c559e3cb/dogsheep_beta/__init__.py#L166-L171 This is a logic bug - the `compiled` variable could be the template from the previous loop!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703970814, https://github.com/dogsheep/swarm-to-sqlite/issues/13#issuecomment-1502629404,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/13,1502629404,IC_kwDODD6af85ZkE4c,9599,2023-04-11T03:15:47Z,2023-04-11T03:46:17Z,MEMBER,"I think `swarm-to-sqlite` needs to avoid this error, maybe by setting up foreign keys in another way - or even by skipping foreign keys entirely on databases that don't support this kind of operation.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1373210675, 
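The "skip foreign keys on databases that can't handle them" idea in the swarm-to-sqlite #13 comment above might look roughly like this sketch using `sqlite-utils`; the table and column names here are illustrative placeholders, not the actual swarm-to-sqlite schema:

```python
# Rough sketch (not the actual swarm-to-sqlite code): try to add foreign keys,
# but skip them on databases where the operation is not supported.
import sqlite3
import sqlite_utils
from sqlite_utils.db import AlterError

def add_foreign_keys_safely(db: sqlite_utils.Database):
    # Illustrative relationships only - the real schema may differ
    relationships = [
        ("checkins", "venue", "venues", "id"),
        ("checkins", "createdBy", "users", "id"),
    ]
    for table, column, other_table, other_column in relationships:
        try:
            db[table].add_foreign_key(column, other_table, other_column)
        except (AlterError, sqlite3.OperationalError):
            # A SQLite build that refuses the schema alteration, or a foreign
            # key that already exists: skip it and move on.
            continue
```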
https://github.com/dogsheep/twitter-to-sqlite/issues/39#issuecomment-606844521,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/39,606844521,MDEyOklzc3VlQ29tbWVudDYwNjg0NDUyMQ==,9599,2020-03-31T20:01:39Z,2020-03-31T20:01:39Z,MEMBER,"I think `utils.fetch_timeline()` grows a new argument, `since_key`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",590666760, https://github.com/dogsheep/dogsheep-beta/issues/11#issuecomment-686618669,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/11,686618669,MDEyOklzc3VlQ29tbWVudDY4NjYxODY2OQ==,9599,2020-09-03T16:47:34Z,2020-09-03T16:53:25Z,MEMBER,I think a `is_public` integer column which defaults to 0 would be good here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",692125110, https://github.com/dogsheep/twitter-to-sqlite/issues/40#issuecomment-606307376,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/40,606307376,MDEyOklzc3VlQ29tbWVudDYwNjMwNzM3Ng==,9599,2020-03-30T23:35:40Z,2020-03-30T23:39:15Z,MEMBER,"I think five separate tables: * followers_count_history * friends_count_history * listed_count_history * favourites_count_history * statuses_count_history Each with the following structure: * datetime (ISO UTC) * user (ID, foreign key to users) * count (integer) I'm tempted to have a compound primary key here - user, datetime ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",590669793, https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461232709,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2,1461232709,IC_kwDOJHON9s5XGKRF,9599,2023-03-09T03:54:28Z,2023-03-09T03:54:28Z,MEMBER,"I think the AppleScript I want to pass to `osascript` looks like this: ```applescript tell application ""Notes"" repeat with eachNote in every note set noteId to the id of eachNote set noteTitle to the name of eachNote set noteBody to the body of eachNote log ""------------------------"" & ""\n"" log noteId & ""\n"" log noteTitle & ""\n\n"" log noteBody & ""\n"" end repeat end tell ``` But there are a few more properties I'd like to get - created and updated date for example.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1616354999, https://github.com/dogsheep/github-to-sqlite/issues/45#issuecomment-660553711,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/45,660553711,MDEyOklzc3VlQ29tbWVudDY2MDU1MzcxMQ==,9599,2020-07-18T22:52:16Z,2020-07-18T22:52:16Z,MEMBER,"I think the best fix is to download the `github.db` database, manually fix it and then manually deploy it to Cloud Run from my laptop.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",660429601, https://github.com/dogsheep/healthkit-to-sqlite/issues/11#issuecomment-711079056,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/11,711079056,MDEyOklzc3VlQ29tbWVudDcxMTA3OTA1Ng==,9599,2020-10-17T20:53:00Z,2020-10-17T20:53:00Z,MEMBER,"I think the safest thing is to sniff the first few lines of the file. 
Those should be the same no matter the language that was used: ```xml ` content - it's just HTML, not even trying to be XML.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",978743426, https://github.com/dogsheep/apple-notes-to-sqlite/issues/7#issuecomment-1462570187,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7,1462570187,IC_kwDOJHON9s5XLQzL,9599,2023-03-09T18:30:24Z,2023-03-09T18:30:24Z,MEMBER,"I used ChatGPT to write this: ``` osascript -e 'tell application ""Notes"" set allFolders to folders repeat with aFolder in allFolders set folderId to id of aFolder set folderName to name of aFolder set folderContainer to container of aFolder set folderContainerName to name of folderContainer log ""Folder ID: "" & folderId log ""Folder Name: "" & folderName log ""Folder Container: "" & folderContainerName log "" "" --check for nested folders if count of folders of aFolder > 0 then set nestedFolders to folders of aFolder repeat with aNestedFolder in nestedFolders set nestedFolderId to id of aNestedFolder set nestedFolderName to name of aNestedFolder set nestedFolderContainer to container of aNestedFolder set nestedFolderContainerName to name of nestedFolderContainer log "" Nested Folder ID: "" & nestedFolderId log "" Nested Folder Name: "" & nestedFolderName log "" Nested Folder Container: "" & nestedFolderContainerName log "" "" end repeat end if end repeat end tell ' ``` Which for my account output this: ``` Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p6113 Folder Name: Blog posts Folder Container: iCloud Nested Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p7995 Nested Folder Name: Nested inside blog posts Nested Folder Container: Blog posts Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p698 Folder Name: JSK Folder Container: iCloud Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p7995 Folder Name: Nested inside blog posts Folder Container: Blog posts Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p3526 Folder Name: New Folder Folder Container: iCloud Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p3839 Folder Name: New Folder 1 Folder Container: iCloud Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p2 Folder Name: Notes Folder Container: iCloud Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p6059 Folder Name: Quick Notes Folder Container: iCloud Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p7283 Folder Name: UK Christmas 2022 Folder Container: iCloud ``` So I think the correct approach here is to run code at the start to list all of the folders (no need to do fancy recursion though, just a flat list with the parent containers is enough) and create a model of that hierarchy in SQLite. Then when I import notes I can foreign key reference them back to their containing folder. 
I'm tempted to use `rowid` for the foreign keys because the official IDs are pretty long.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1617769847, https://github.com/dogsheep/twitter-to-sqlite/issues/40#issuecomment-607011972,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/40,607011972,MDEyOklzc3VlQ29tbWVudDYwNzAxMTk3Mg==,9599,2020-04-01T03:49:02Z,2020-04-01T03:50:01Z,MEMBER,"I want the datetime value to look like `2020-04-01T03:34:58+00:00` (the format returned by the Twitter API which I am storing in other tables at the moment). ``` >>> datetime.utcnow().isoformat().split('.')[0] + '+00:00' '2020-04-01T03:49:52+00:00' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",590669793, https://github.com/dogsheep/github-to-sqlite/issues/43#issuecomment-660536265,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/43,660536265,MDEyOklzc3VlQ29tbWVudDY2MDUzNjI2NQ==,9599,2020-07-18T20:15:12Z,2020-07-18T20:15:12Z,MEMBER,"I want to create a SQL query which shows me all of my repositories that have commits that are NOT in the most recent release. The releases table doesn't have enough information for this because it doesn't tell you the commit hash associated with each release, just the tag: https://github-to-sqlite.dogsheep.net/github/releases","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",660355904, https://github.com/dogsheep/twitter-to-sqlite/issues/25#issuecomment-543266947,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25,543266947,MDEyOklzc3VlQ29tbWVudDU0MzI2Njk0Nw==,9599,2019-10-17T16:56:06Z,2019-10-17T16:56:06Z,MEMBER,I wrote a test that proves that this is a problem. 
Should be an easy fix though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",508578780, https://github.com/dogsheep/healthkit-to-sqlite/issues/11#issuecomment-711074031,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/11,711074031,MDEyOklzc3VlQ29tbWVudDcxMTA3NDAzMQ==,9599,2020-10-17T20:14:01Z,2020-10-17T20:14:01Z,MEMBER,I'd be happy to teach the tool to look for `export.xml` or `eksport.xml` - and then expand that list to other languages.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",723838331, https://github.com/dogsheep/github-to-sqlite/issues/50#issuecomment-693775622,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/50,693775622,MDEyOklzc3VlQ29tbWVudDY5Mzc3NTYyMg==,9599,2020-09-17T02:48:34Z,2020-09-17T02:48:34Z,MEMBER,I'd like a `--paginate` option that does the same thing as https://github.com/simonw/paginate-json,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703218756, https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694554584,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24,694554584,MDEyOklzc3VlQ29tbWVudDY5NDU1NDU4NA==,9599,2020-09-17T23:31:25Z,2020-09-17T23:31:25Z,MEMBER,"I'd prefer it if errors in these template fragments were displayed as errors inline where the fragment should have been inserted, rather than 500ing the whole page - especially since the template fragments are user-provided and could have all kinds of odd errors in them which should be as easy to debug as possible.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703970814, https://github.com/dogsheep/dogsheep-photos/issues/20#issuecomment-633629944,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/20,633629944,MDEyOklzc3VlQ29tbWVudDYzMzYyOTk0NA==,9599,2020-05-25T15:47:42Z,2020-05-25T15:47:42Z,MEMBER,"I'll add a proper section to the README, but for the moment here's how I do this. First, install `datasette` and the `datasette-media` plugin. Create a `metadata.yaml` file with the following content: ```yaml plugins: datasette-media: photo: sql: |- select path as filepath, 200 as resize_height from apple_photos where uuid = :key photo-big: sql: |- select path as filepath, 1024 as resize_height from apple_photos where uuid = :key ``` Now run `datasette -m metadata.yaml photos.db` - thumbnails will be served at http://127.0.0.1:8001/-/media/photo/F4469918-13F3-43D8-9EC1-734C0E6B60AD and larger sizes of the image at http://127.0.0.1:8001/-/media/photo-big/A8B02C7D-365E-448B-9510-69F80C26304D I also made myself two custom pages, one showing recent images and one showing random images. To do this, install the `datasette-template-sql` plugin and then create a `templates/pages` directory and add these files: `recent-photos.html` ```html

<h1>Recent photos</h1>

{% for photo in sql(""select * from apple_photos order by date desc limit 100"") %}
    <img src=""/-/media/photo/{{ photo.uuid }}"">
{% endfor %}
``` `random-photos.html` ```html

<h1>Random photos</h1>

{% for photo in sql(""with foo as (select * from apple_photos order by date desc limit 5000) select * from foo order by random() limit 100"") %}
    <img src=""/-/media/photo/{{ photo.uuid }}"">
{% endfor %}
``` Now run `datasette -m metadata.yaml photos.db --template-dir=templates/` Visit http://127.0.0.1:8001/random-photos to see some random photos or http://127.0.0.1:8002/recent-photos for recent photos. This is using this mechanism: https://datasette.readthedocs.io/en/stable/custom_templates.html#custom-pages","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",613006393, https://github.com/dogsheep/dogsheep-photos/issues/20#issuecomment-633644225,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/20,633644225,MDEyOklzc3VlQ29tbWVudDYzMzY0NDIyNQ==,9599,2020-05-25T16:30:44Z,2020-05-25T16:30:44Z,MEMBER,I'll add docs on using `datasette-json-html` too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",613006393, https://github.com/dogsheep/dogsheep-beta/issues/7#issuecomment-685966707,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7,685966707,MDEyOklzc3VlQ29tbWVudDY4NTk2NjcwNw==,9599,2020-09-02T20:04:08Z,2020-09-02T20:04:08Z,MEMBER,I'll make `category` a foreign key to a `categories` table so Datasette can automatically show the `name` column.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691265198, https://github.com/dogsheep/github-to-sqlite/issues/13#issuecomment-602862236,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/13,602862236,MDEyOklzc3VlQ29tbWVudDYwMjg2MjIzNg==,9599,2020-03-23T21:20:26Z,2020-03-23T21:20:26Z,MEMBER,I'll run the `commits` and `issues` and `issue-comments` commands in addition to the `releases` command.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",521275281, https://github.com/dogsheep/twitter-to-sqlite/issues/21#issuecomment-542333836,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/21,542333836,MDEyOklzc3VlQ29tbWVudDU0MjMzMzgzNg==,9599,2019-10-15T18:00:48Z,2019-10-15T18:00:48Z,MEMBER,I'll use `html.unescape()` for this: https://docs.python.org/3/library/html.html#html.unescape,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",506432572, https://github.com/dogsheep/dogsheep-photos/issues/6#issuecomment-615979923,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/6,615979923,MDEyOklzc3VlQ29tbWVudDYxNTk3OTkyMw==,9599,2020-04-18T23:36:02Z,2020-04-18T23:36:02Z,MEMBER,"I'll use a Click progress bar. 
To do this I need to first calculate the sum number of bytes in the photos that are going to be uploaded, then run the upload.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",602575575, https://github.com/dogsheep/github-to-sqlite/pull/8#issuecomment-623006004,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8,623006004,MDEyOklzc3VlQ29tbWVudDYyMzAwNjAwNA==,9599,2020-05-02T20:00:26Z,2020-05-02T20:00:26Z,MEMBER,I'm abandoning this in favour of a new implementation.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",516763727, https://github.com/dogsheep/github-to-sqlite/issues/26#issuecomment-614794739,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/26,614794739,MDEyOklzc3VlQ29tbWVudDYxNDc5NDczOQ==,9599,2020-04-16T17:38:28Z,2020-04-16T17:38:28Z,MEMBER,I'm already doing this here: https://github.com/dogsheep/github-to-sqlite/blob/c4aaa50e167cfa9021c7c94260bc3e89e10947bf/github_to_sqlite/utils.py#L246-L250,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",601271612,
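The Click progress bar driven by total bytes, mentioned in the dogsheep-photos #6 comment above, could be sketched roughly as follows; `paths` and `upload` are hypothetical placeholders rather than the real implementation:

```python
# Rough sketch only: drive a Click progress bar by total bytes uploaded.
import os
import click

def upload_all(paths, upload):
    # First pass: total size of everything about to be uploaded
    total_bytes = sum(os.path.getsize(path) for path in paths)
    with click.progressbar(length=total_bytes, label="Uploading photos") as bar:
        for path in paths:
            upload(path)                       # hypothetical upload callable
            bar.update(os.path.getsize(path))  # advance by this file's size
```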