issue_comments
7 rows where issue = 964322136 and user = 9599 sorted by updated_at descending
id | html_url | issue_url | node_id | user | created_at | updated_at ▲ | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
902263367 | https://github.com/simonw/datasette/issues/1426#issuecomment-902263367 | https://api.github.com/repos/simonw/datasette/issues/1426 | IC_kwDOBm6k_c41x3JH | simonw 9599 | 2021-08-19T21:33:51Z | 2021-08-19T21:36:28Z | OWNER | I was worried about whether it's possible to allow access to … From various answers on Stack Overflow it looks like this should handle that:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Manage /robots.txt in Datasette core, block robots by default 964322136 | |
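The rule shape the comment above is reaching for (let crawlers fetch one page but nothing beneath it) can be sanity-checked with Python's standard-library robots.txt parser. A minimal sketch; the `/fixtures` paths are hypothetical stand-ins, since the comment's own example block did not survive extraction:
```
# Sanity-check the "allow a page but block everything beneath it" pattern with
# Python's standard-library robots.txt parser. The /fixtures paths are
# hypothetical stand-ins for whatever the original example used.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /fixtures/searchable",
]

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "https://example.com/fixtures"))               # True
print(parser.can_fetch("*", "https://example.com/fixtures/searchable"))    # False
print(parser.can_fetch("*", "https://example.com/fixtures/searchable/1"))  # False
```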
902260338 | https://github.com/simonw/datasette/issues/1426#issuecomment-902260338 | https://api.github.com/repos/simonw/datasette/issues/1426 | IC_kwDOBm6k_c41x2Zy | simonw 9599 | 2021-08-19T21:28:25Z | 2021-08-19T21:29:40Z | OWNER | Actually it looks like you can send a …
According to https://developers.google.com/search/docs/advanced/sitemaps/build-sitemap |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Manage /robots.txt in Datasette core, block robots by default 964322136 | |
902260799 | https://github.com/simonw/datasette/issues/1426#issuecomment-902260799 | https://api.github.com/repos/simonw/datasette/issues/1426 | IC_kwDOBm6k_c41x2g_ | simonw 9599 | 2021-08-19T21:29:13Z | 2021-08-19T21:29:13Z | OWNER | Bing's equivalent is: https://www.bing.com/webmasters/help/Sitemaps-3b5cf6ed
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Manage /robots.txt in Datasette core, block robots by default 964322136 | |
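A sketch of the sitemap-ping mechanism the two comments above point at. The ping endpoints below are recalled from the linked Google and Bing docs rather than recovered from the stripped comment bodies, and both have since been deprecated, so treat them as illustrative only:
```
# Sketch of pinging search engines with a sitemap URL. The ping endpoints are
# recalled from the linked docs (both have since been deprecated) and the
# sitemap URL is a hypothetical placeholder.
from urllib.error import HTTPError
from urllib.parse import quote
from urllib.request import urlopen

SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical Datasette sitemap

for ping_endpoint in (
    "https://www.google.com/ping?sitemap=",
    "https://www.bing.com/ping?sitemap=",
):
    try:
        with urlopen(ping_endpoint + quote(SITEMAP_URL, safe="")) as response:
            print(ping_endpoint, response.status)
    except HTTPError as error:
        print(ping_endpoint, error.code)
```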
895522818 | https://github.com/simonw/datasette/issues/1426#issuecomment-895522818 | https://api.github.com/repos/simonw/datasette/issues/1426 | IC_kwDOBm6k_c41YJgC | simonw 9599 | 2021-08-09T20:34:10Z | 2021-08-09T20:34:10Z | OWNER | At the very least Datasette should serve a blank `/robots.txt`. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Manage /robots.txt in Datasette core, block robots by default 964322136 | |
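Serving a blank `/robots.txt` is already possible from a one-file plugin via Datasette's `register_routes()` plugin hook (essentially what the datasette-block-robots plugin does). A minimal sketch, with the blocking rule left as a comment:
```
# Hedged sketch of serving an empty /robots.txt from a one-file plugin using
# Datasette's register_routes() hook. File name and rules are illustrative.
from datasette import hookimpl
from datasette.utils.asgi import Response


async def robots_txt(request):
    # Serve a blank robots.txt; swap in "User-agent: *\nDisallow: /" to
    # block all crawlers instead.
    return Response.text("")


@hookimpl
def register_routes():
    return [
        (r"^/robots\.txt$", robots_txt),
    ]
```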
895510773 | https://github.com/simonw/datasette/issues/1426#issuecomment-895510773 | https://api.github.com/repos/simonw/datasette/issues/1426 | IC_kwDOBm6k_c41YGj1 | simonw 9599 | 2021-08-09T20:14:50Z | 2021-08-09T20:19:22Z | OWNER | https://twitter.com/mal/status/1424825895139876870
That's another aspect of this: if you DO want your site crawled, teaching the … Annoyingly, it looks like you need to configure an auth token of some sort in order to use their API though, which is likely too much hassle to be worth building into Datasette itself: https://developers.google.com/search/apis/indexing-api/v3/using-api
```
curl -X POST https://indexing.googleapis.com/v3/urlNotifications:publish \
  -d '{"url": "https://careers.google.com/jobs/google/technical-writer", "type": "URL_UPDATED"}' \
  -H "Content-Type: application/json"

{
  "error": {
    "code": 401,
    "message": "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
    "status": "UNAUTHENTICATED"
  }
}
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Manage /robots.txt in Datasette core, block robots by default 964322136 | |
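For reference, the same unauthenticated request as the curl transcript above, expressed with only the Python standard library; a real call would need the OAuth 2 bearer token the comment considers too much hassle:
```
# The same Indexing API request as the curl transcript above. Without an
# OAuth 2 bearer token it returns the same 401 UNAUTHENTICATED error.
import json
from urllib.error import HTTPError
from urllib.request import Request, urlopen

payload = {
    "url": "https://careers.google.com/jobs/google/technical-writer",
    "type": "URL_UPDATED",
}
request = Request(
    "https://indexing.googleapis.com/v3/urlNotifications:publish",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # "Authorization": "Bearer <oauth2-access-token>",  # required for a real call
    },
    method="POST",
)

try:
    with urlopen(request) as response:
        print(response.status, response.read().decode())
except HTTPError as error:
    print(error.code, error.read().decode())  # 401 without a token
```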
895509536 | https://github.com/simonw/datasette/issues/1426#issuecomment-895509536 | https://api.github.com/repos/simonw/datasette/issues/1426 | IC_kwDOBm6k_c41YGQg | simonw 9599 | 2021-08-09T20:12:57Z | 2021-08-09T20:12:57Z | OWNER | I could try out the … |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Manage /robots.txt in Datasette core, block robots by default 964322136 | |
895500565 | https://github.com/simonw/datasette/issues/1426#issuecomment-895500565 | https://api.github.com/repos/simonw/datasette/issues/1426 | IC_kwDOBm6k_c41YEEV | simonw 9599 | 2021-08-09T20:00:04Z | 2021-08-09T20:00:04Z | OWNER | A few options for how this would work: …
Options could be: …
The "limited" mode is particularly interesting. Could even make it the default, but I think that may be a bit too confusing. Idea would be to get the key pages indexed but use … |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Manage /robots.txt in Datasette core, block robots by default 964322136 |
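A hypothetical sketch of how an allow / deny / limited setting could be turned into robots.txt output. The option names, paths and helper function are illustrative guesses, since the comment's actual bullet lists were lost in extraction:
```
# Hypothetical sketch only: how an allow / deny / limited robots setting could
# map to robots.txt output. Option names, paths and this helper are guesses;
# the comment's real option list did not survive extraction.
def robots_txt_body(mode: str, database_names: list[str]) -> str:
    if mode == "allow":
        return ""  # empty robots.txt: crawl everything
    if mode == "deny":
        return "User-agent: *\nDisallow: /"
    if mode == "limited":
        # Let crawlers fetch the homepage and each database index page, but
        # keep them out of everything beneath those pages.
        lines = ["User-agent: *"]
        for name in database_names:
            lines.append(f"Disallow: /{name}/")
        return "\n".join(lines)
    raise ValueError(f"Unknown robots mode: {mode}")


print(robots_txt_body("limited", ["fixtures"]))
```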
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);