html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,issue,performed_via_github_app
https://github.com/dogsheep/github-to-sqlite/issues/34#issuecomment-622162835,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/34,622162835,MDEyOklzc3VlQ29tbWVudDYyMjE2MjgzNQ==,9599,2020-04-30T22:59:26Z,2020-04-30T22:59:26Z,MEMBER,Documentation: https://github.com/dogsheep/github-to-sqlite/blob/c9f48404481882e8b3af06f35e4801a80ac79ed6/README.md#scraping-dependents-for-a-repository,"{""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 2, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",610408908,
https://github.com/dogsheep/github-to-sqlite/issues/34#issuecomment-622135654,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/34,622135654,MDEyOklzc3VlQ29tbWVudDYyMjEzNTY1NA==,9599,2020-04-30T21:53:44Z,2020-04-30T21:56:06Z,MEMBER,"I think this is the neatest scraping pattern: ```python [a[""href""].lstrip(""/"") for a in soup.select(""a[data-hovercard-type=repository]"")] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",610408908,
https://github.com/dogsheep/github-to-sqlite/issues/34#issuecomment-622136585,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/34,622136585,MDEyOklzc3VlQ29tbWVudDYyMjEzNjU4NQ==,9599,2020-04-30T21:55:51Z,2020-04-30T21:55:51Z,MEMBER,"And to find the ""Next"" pagination link: ```python soup.select("".paginate-container"")[0].find(""a"", text=""Next"") ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",610408908,
https://github.com/dogsheep/github-to-sqlite/issues/34#issuecomment-622133775,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/34,622133775,MDEyOklzc3VlQ29tbWVudDYyMjEzMzc3NQ==,9599,2020-04-30T21:49:27Z,2020-04-30T21:49:27Z,MEMBER,"Proposed command: github-to-sqlite scrape-dependents github.db simonw/datasette I'll pull full details of the scraped repos from the regular API. I'll also record when they were ""first seen"" by the command.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",610408908,
https://github.com/dogsheep/github-to-sqlite/issues/34#issuecomment-622133422,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/34,622133422,MDEyOklzc3VlQ29tbWVudDYyMjEzMzQyMg==,9599,2020-04-30T21:48:39Z,2020-04-30T21:48:39Z,MEMBER,It looks like the only option is to scrape them. I'll do that and then replace with an API as soon as one becomes available.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",610408908,
https://github.com/dogsheep/github-to-sqlite/issues/34#issuecomment-622133298,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/34,622133298,MDEyOklzc3VlQ29tbWVudDYyMjEzMzI5OA==,9599,2020-04-30T21:48:24Z,2020-04-30T21:48:24Z,MEMBER,"Unfortunately it's not available through any GitHub API - I managed to figure out how to get dependencies, but I need dependents. https://github.com/simonw/til/blob/master/github/dependencies-graphql-api.md","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",610408908,