html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,issue,performed_via_github_app
https://github.com/simonw/datasette/issues/940#issuecomment-675538586,https://api.github.com/repos/simonw/datasette/issues/940,675538586,MDEyOklzc3VlQ29tbWVudDY3NTUzODU4Ng==,9599,2020-08-18T15:11:36Z,2020-08-18T15:11:36Z,OWNER,I tested this new publish pattern (running the tests in parallel before the deploy step) on `github-to-sqlite` - skipping the Docker step - and it worked: https://github.com/dogsheep/github-to-sqlite/actions/runs/213809864,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124,
https://github.com/simonw/datasette/issues/940#issuecomment-675253373,https://api.github.com/repos/simonw/datasette/issues/940,675253373,MDEyOklzc3VlQ29tbWVudDY3NTI1MzM3Mw==,9599,2020-08-18T05:10:17Z,2020-08-18T05:10:17Z,OWNER,I'll close this after the next release successfully goes out.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124,
https://github.com/simonw/datasette/issues/940#issuecomment-675251613,https://api.github.com/repos/simonw/datasette/issues/940,675251613,MDEyOklzc3VlQ29tbWVudDY3NTI1MTYxMw==,9599,2020-08-18T05:05:15Z,2020-08-18T05:05:15Z,OWNER,I think this is ready. I'll only know for sure the first time I push a release through it though!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124,
https://github.com/simonw/datasette/issues/940#issuecomment-674590583,https://api.github.com/repos/simonw/datasette/issues/940,674590583,MDEyOklzc3VlQ29tbWVudDY3NDU5MDU4Mw==,9599,2020-08-16T23:15:51Z,2020-08-18T05:04:43Z,OWNER,This example of jobs depending on each other and sharing data via artifacts looks relevant: https://docs.github.com/en/actions/configuring-and-managing-workflows/persisting-workflow-data-using-artifacts#passing-data-between-jobs-in-a-workflow,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124,
https://github.com/simonw/datasette/issues/940#issuecomment-675250280,https://api.github.com/repos/simonw/datasette/issues/940,675250280,MDEyOklzc3VlQ29tbWVudDY3NTI1MDI4MA==,9599,2020-08-18T05:01:34Z,2020-08-18T05:01:42Z,OWNER,I think `${GITHUB_REF#refs/tags/}` is the equivalent of `$TRAVIS_TAG`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124,
https://github.com/simonw/datasette/issues/940#issuecomment-674589472,https://api.github.com/repos/simonw/datasette/issues/940,674589472,MDEyOklzc3VlQ29tbWVudDY3NDU4OTQ3Mg==,9599,2020-08-16T23:05:57Z,2020-08-16T23:05:57Z,OWNER,When I figure this out I'll update the https://github.com/simonw/datasette-plugin/blob/main/datasette-%7B%7Bcookiecutter.hyphenated%7D%7D/.github/workflows/publish.yml default workflow to do this - right now it runs the tests once on just a single version of Python as part of the package deploy to PyPI step.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124,
https://github.com/simonw/datasette/issues/940#issuecomment-674589321,https://api.github.com/repos/simonw/datasette/issues/940,674589321,MDEyOklzc3VlQ29tbWVudDY3NDU4OTMyMQ==,9599,2020-08-16T23:04:34Z,2020-08-16T23:04:34Z,OWNER,"https://docs.github.com/en/actions/getting-started-with-github-actions/core-concepts-for-github-actions#job > A set of steps that execute on the same runner. You can define the dependency rules for how jobs run in a workflow file. Jobs can run at the same time in parallel or run sequentially depending on the status of a previous job. For example, a workflow can have two sequential jobs that build and test code, where the test job is dependent on the status of the build job. If the build job fails, the test job will not run.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124,
https://github.com/simonw/datasette/issues/940#issuecomment-674589035,https://api.github.com/repos/simonw/datasette/issues/940,674589035,MDEyOklzc3VlQ29tbWVudDY3NDU4OTAzNQ==,9599,2020-08-16T23:02:23Z,2020-08-16T23:02:23Z,OWNER,"I'd like to set these up as different workflows that depend on each other, if that's possible. I want to start three test runs in parallel (on three different Python versions), then if all three pass kick off the PyPI push (without running more tests), then if that passes do the Docker build and push.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124,
https://github.com/simonw/datasette/issues/940#issuecomment-674566618,https://api.github.com/repos/simonw/datasette/issues/940,674566618,MDEyOklzc3VlQ29tbWVudDY3NDU2NjYxOA==,9599,2020-08-16T19:20:58Z,2020-08-16T19:20:58Z,OWNER,I need to figure out how to build and push the Docker image on releases. Here's the Travis code for that: https://github.com/simonw/datasette/blob/52eabb019d4051084b21524bd0fd9c2731126985/.travis.yml#L38-L47,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",679808124,