issue_comments
20 rows where author_association = "OWNER" and issue = 679808124 sorted by updated_at descending
id | html_url | issue_url | node_id | user | created_at | updated_at ▲ | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
692340275 | https://github.com/simonw/datasette/issues/940#issuecomment-692340275 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY5MjM0MDI3NQ== | simonw 9599 | 2020-09-14T22:09:35Z | 2020-09-14T22:09:35Z | OWNER | I'm going to cross my fingers and hope that this works - I don't want to leave this issue open until Datasette 0.50. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
692339645 | https://github.com/simonw/datasette/issues/940#issuecomment-692339645 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY5MjMzOTY0NQ== | simonw 9599 | 2020-09-14T22:07:58Z | 2020-09-14T22:07:58Z | OWNER | I shipped the Docker build manually by running the following in a tmate session:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
692337397 | https://github.com/simonw/datasette/issues/940#issuecomment-692337397 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY5MjMzNzM5Nw== | simonw 9599 | 2020-09-14T22:01:56Z | 2020-09-14T22:01:56Z | OWNER | I'm going to switch to using this logic to decide if I should ship to Docker: https://github.community/t/release-prerelease-action-triggers/17275/2
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
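The linked thread describes keying the Docker push off the release event's prerelease flag rather than inspecting the tag name. A minimal sketch of that kind of condition; the job name and step contents here are assumptions, not the actual publish workflow:

```yaml
# Hypothetical excerpt from a publish workflow: only push to Docker Hub
# for a real (non-prerelease) GitHub release.
on:
  release:
    types: [published]

jobs:
  deploy_docker:
    runs-on: ubuntu-latest
    # github.event.release.prerelease is true when the release was marked
    # as a pre-release, so this job is skipped for alphas.
    if: "!github.event.release.prerelease"
    steps:
      - uses: actions/checkout@v2
      - name: Build and push Docker image
        run: echo "docker build / docker push steps go here"
```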
692336564 | https://github.com/simonw/datasette/issues/940#issuecomment-692336564 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY5MjMzNjU2NA== | simonw 9599 | 2020-09-14T21:59:40Z | 2020-09-14T21:59:40Z | OWNER | Using https://github.com/marketplace/actions/debugging-with-tmate to manually submit a new build from within an interactive GitHub Actions session. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
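The tmate action referenced above opens an SSH session into the runner so commands can be run by hand. A minimal sketch of wiring it into a manually triggered workflow (the workflow name and trigger are assumptions):

```yaml
# Hypothetical debug workflow: start a runner and open a tmate SSH session.
name: Debug session
on: workflow_dispatch

jobs:
  debug:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # "Debugging with tmate" marketplace action: prints SSH and web shell
      # connection details in the job log and keeps the runner alive.
      - uses: mxschmitt/action-tmate@v3
```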
692332430 | https://github.com/simonw/datasette/issues/940#issuecomment-692332430 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY5MjMzMjQzMA== | simonw 9599 | 2020-09-14T21:48:59Z | 2020-09-14T21:48:59Z | OWNER | So now I've released Datasette 0.49 but failed to push a new Docker image. This is bad, and I need to fix it. I'd like to push to Docker from GitHub Actions, so I think I'm going to create a one-off workflow task for doing that. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
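A one-off workflow for this can be triggered by hand and do nothing but build and push the image. A sketch of that shape, assuming the image is published as datasetteproject/datasette and that Docker Hub credentials live in repository secrets; the input, secret, and job names are illustrative, not the workflow that was actually added:

```yaml
# Hypothetical one-off workflow: rebuild and push the Docker image by hand.
name: Push Docker image
on:
  workflow_dispatch:
    inputs:
      version:
        description: "Datasette version to build, e.g. 0.49"
        required: true

jobs:
  push_docker:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          # Build from the release tag rather than the default branch.
          ref: ${{ github.event.inputs.version }}
      - name: Build image
        run: docker build -t datasetteproject/datasette:${{ github.event.inputs.version }} .
      - name: Push to Docker Hub
        run: |
          echo "${{ secrets.DOCKER_HUB_PASSWORD }}" | docker login -u "${{ secrets.DOCKER_HUB_USER }}" --password-stdin
          docker push datasetteproject/datasette:${{ github.event.inputs.version }}
```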
692331919 | https://github.com/simonw/datasette/issues/940#issuecomment-692331919 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY5MjMzMTkxOQ== | simonw 9599 | 2020-09-14T21:47:39Z | 2020-09-14T21:47:39Z | OWNER | I bet that's because the
And the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
692331349 | https://github.com/simonw/datasette/issues/940#issuecomment-692331349 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY5MjMzMTM0OQ== | simonw 9599 | 2020-09-14T21:46:11Z | 2020-09-14T21:46:11Z | OWNER | Just released Datasette 0.49 - which shipped to PyPI just fine but skipped the Docker step for some reason! https://github.com/simonw/datasette/runs/1114585275?check_suite_focus=true |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
691781345 | https://github.com/simonw/datasette/issues/940#issuecomment-691781345 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY5MTc4MTM0NQ== | simonw 9599 | 2020-09-14T02:53:25Z | 2020-09-14T02:53:49Z | OWNER | That worked: https://github.com/simonw/datasette/runs/1110040212?check_suite_focus=true ran and deployed https://pypi.org/project/datasette/0.49a1/ to PyPI but it skipped the push to Docker step because there was an "a" in the tag. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
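The alpha release was filtered out by checking the tag for an "a". A sketch of that kind of job-level gate, written against the release event's tag name; the job names and surrounding structure are assumptions:

```yaml
# Hypothetical job excerpt: skip the Docker push for alpha tags such as 0.49a1.
jobs:
  deploy_docker:
    runs-on: ubuntu-latest
    needs: [deploy_pypi]
    # Check the bare tag name ("0.49a1"), not github.ref ("refs/tags/0.49a1"),
    # since the "refs/tags/" prefix itself contains an "a".
    if: "!contains(github.event.release.tag_name, 'a')"
    steps:
      - uses: actions/checkout@v2
      - run: echo "docker build / docker push steps go here"
```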
691779693 | https://github.com/simonw/datasette/issues/940#issuecomment-691779693 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY5MTc3OTY5Mw== | simonw 9599 | 2020-09-14T02:46:39Z | 2020-09-14T02:46:39Z | OWNER | I think those should be single quoted. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
691779510 | https://github.com/simonw/datasette/issues/940#issuecomment-691779510 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY5MTc3OTUxMA== | simonw 9599 | 2020-09-14T02:45:53Z | 2020-09-14T02:45:53Z | OWNER |
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
691779361 | https://github.com/simonw/datasette/issues/940#issuecomment-691779361 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY5MTc3OTM2MQ== | simonw 9599 | 2020-09-14T02:45:04Z | 2020-09-14T02:45:04Z | OWNER | Package deploys are still broken, just got this error trying to ship 0.49a1: https://github.com/simonw/datasette/actions/runs/253099665
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
675538586 | https://github.com/simonw/datasette/issues/940#issuecomment-675538586 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY3NTUzODU4Ng== | simonw 9599 | 2020-08-18T15:11:36Z | 2020-08-18T15:11:36Z | OWNER | I tested this new publish pattern (running the tests in parallel before the deploy step) on |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
675253373 | https://github.com/simonw/datasette/issues/940#issuecomment-675253373 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY3NTI1MzM3Mw== | simonw 9599 | 2020-08-18T05:10:17Z | 2020-08-18T05:10:17Z | OWNER | I'll close this after the next release successfully goes out. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
675251613 | https://github.com/simonw/datasette/issues/940#issuecomment-675251613 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY3NTI1MTYxMw== | simonw 9599 | 2020-08-18T05:05:15Z | 2020-08-18T05:05:15Z | OWNER | I think this is ready. I'll only know for sure the first time I push a release through it though! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
674590583 | https://github.com/simonw/datasette/issues/940#issuecomment-674590583 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY3NDU5MDU4Mw== | simonw 9599 | 2020-08-16T23:15:51Z | 2020-08-18T05:04:43Z | OWNER | This example of jobs depending on each other and sharing data via artifacts looks relevant: https://docs.github.com/en/actions/configuring-and-managing-workflows/persisting-workflow-data-using-artifacts#passing-data-between-jobs-in-a-workflow |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
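That pattern uses needs: to order the jobs and the upload/download artifact actions to pass files between them. A minimal sketch with hypothetical job names, assuming the built packages land in a dist/ directory:

```yaml
# Hypothetical two-job workflow: build once, then publish from the built artifact.
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Build distribution
        run: |
          pip install wheel
          python setup.py sdist bdist_wheel
      - uses: actions/upload-artifact@v2
        with:
          name: dist
          path: dist/

  publish:
    runs-on: ubuntu-latest
    needs: [build]   # waits for build; its artifact is then available to download
    steps:
      - uses: actions/download-artifact@v2
        with:
          name: dist
          path: dist/
      - name: Publish
        run: ls dist/
```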
675250280 | https://github.com/simonw/datasette/issues/940#issuecomment-675250280 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY3NTI1MDI4MA== | simonw 9599 | 2020-08-18T05:01:34Z | 2020-08-18T05:01:42Z | OWNER | I think |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
674589472 | https://github.com/simonw/datasette/issues/940#issuecomment-674589472 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY3NDU4OTQ3Mg== | simonw 9599 | 2020-08-16T23:05:57Z | 2020-08-16T23:05:57Z | OWNER | When I figure this out I'll update the https://github.com/simonw/datasette-plugin/blob/main/datasette-%7B%7Bcookiecutter.hyphenated%7D%7D/.github/workflows/publish.yml default workflow to do this - right now it runs the tests once on just a single version of Python as part of the package deploy to PyPI step. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
674589321 | https://github.com/simonw/datasette/issues/940#issuecomment-674589321 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY3NDU4OTMyMQ== | simonw 9599 | 2020-08-16T23:04:34Z | 2020-08-16T23:04:34Z | OWNER |
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
674589035 | https://github.com/simonw/datasette/issues/940#issuecomment-674589035 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY3NDU4OTAzNQ== | simonw 9599 | 2020-08-16T23:02:23Z | 2020-08-16T23:02:23Z | OWNER | I'd like to set these up as different workflows that depend on each other, if that's possible. I want to start three test runs in parallel (on three different Python versions), then if all three pass kick off the PyPI push (without running more tests), then if that passes do the Docker build and push. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 | |
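That chain maps onto three jobs in a single workflow: a matrix test job, a PyPI deploy job gated on it with needs:, and a Docker job gated on the deploy. A sketch of the shape; the Python versions, extras, and secret names are assumptions rather than the actual publish.yml:

```yaml
# Hypothetical publish workflow shape: parallel tests -> PyPI -> Docker Hub.
name: Publish
on:
  release:
    types: [published]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.6", "3.7", "3.8"]   # three parallel test runs (assumed versions)
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - run: |
          pip install -e '.[test]'
          pytest

  deploy_pypi:
    runs-on: ubuntu-latest
    needs: [test]          # runs only if every matrix leg passes
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: "3.8"
      - name: Build and publish to PyPI
        run: |
          pip install setuptools wheel twine
          python setup.py sdist bdist_wheel
          twine upload dist/*
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}

  deploy_docker:
    runs-on: ubuntu-latest
    needs: [deploy_pypi]   # runs only after the PyPI push succeeds
    steps:
      - uses: actions/checkout@v2
      - name: Build and push Docker image
        run: echo "docker build / docker push steps go here"
```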
674566618 | https://github.com/simonw/datasette/issues/940#issuecomment-674566618 | https://api.github.com/repos/simonw/datasette/issues/940 | MDEyOklzc3VlQ29tbWVudDY3NDU2NjYxOA== | simonw 9599 | 2020-08-16T19:20:58Z | 2020-08-16T19:20:58Z | OWNER | I need to figure out how to build and push the Docker image on releases. Here's the Travis code for that: https://github.com/simonw/datasette/blob/52eabb019d4051084b21524bd0fd9c2731126985/.travis.yml#L38-L47 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Move CI to GitHub Issues 679808124 |
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);