steps


68 rows where repo = 107914493 (the datasette repository)




id  seq  job  repo  uses  name  with  run  env  if
(job and repo cells show the linked row's label followed by its numeric foreign key; multi-line with/env values appear as JSON blocks, and run scripts are indented beneath each row)
133396 1 mirror 26772 datasette 107914493 zofrex/mirror-branch@ea152f124954fa4eb26eea3fe0dbe313a3a08d94 Mirror to "master"
{
    "target-branch": "master",
    "force": false
}
     
133397 2 mirror 26772 datasette 107914493 zofrex/mirror-branch@ea152f124954fa4eb26eea3fe0dbe313a3a08d94 Mirror to "main"
{
    "target-branch": "main",
    "force": false
}
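These two rows record steps from a branch-mirroring workflow. Reconstructed as workflow YAML (a sketch based on the uses, name and with values above, not the original file), the first step would look roughly like:

- name: Mirror to "master"
  uses: zofrex/mirror-branch@ea152f124954fa4eb26eea3fe0dbe313a3a08d94
  with:
    target-branch: "master"
    force: false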
     
135667 1 deploy 27224 datasette 107914493 actions/checkout@v3 Check out datasette        
135668 2 deploy 27224 datasette 107914493 actions/setup-python@v4 Set up Python
{
    "python-version": "3.9"
}
     
135669 3 deploy 27224 datasette 107914493 actions/cache@v3 Configure pip caching
{
    "path": "~/.cache/pip",
    "key": "${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}",
    "restore-keys": "${{ runner.os }}-pip-\n"
}
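The restore-keys value ends in \n, which suggests it was written as a YAML block scalar. A sketch of the likely step, reconstructed from the with values above (the original file is not shown here):

- name: Configure pip caching
  uses: actions/cache@v3
  with:
    path: ~/.cache/pip
    key: ${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}
    restore-keys: |
      ${{ runner.os }}-pip-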
     
135670 4 deploy 27224 datasette 107914493   Install Python dependencies
    python -m pip install --upgrade pip
    python -m pip install -e .[test]
    python -m pip install -e .[docs]
    python -m pip install sphinx-to-sqlite==0.1a1
135671 5 deploy 27224 datasette 107914493   Run tests
    pytest -n auto -m "not serial"
    pytest -m "serial"
if: ${{ github.ref == 'refs/heads/main' }}
135672 6 deploy 27224 datasette 107914493   Build fixtures.db   python tests/fixtures.py fixtures.db fixtures.json plugins --extra-db-filename extra_database.db    
135673 7 deploy 27224 datasette 107914493   Build docs.db
    cd docs
    sphinx-build -b xml . _build
    sphinx-to-sqlite ../docs.db _build
    cd ..
if: ${{ github.ref == 'refs/heads/main' }}
135674 8 deploy 27224 datasette 107914493   Set up the alternate-route demo
    echo '
    from datasette import hookimpl
    @hookimpl
    def startup(datasette):
        db = datasette.get_database("fixtures2")
        db.route = "alternative-route"
    ' > plugins/alternative_route.py
    cp fixtures.db fixtures2.db
135675 9 deploy 27224 datasette 107914493   Make some modifications to metadata.json
    cat fixtures.json | \
      jq '.databases |= . + {"ephemeral": {"allow": {"id": "*"}}}' | \
      jq '.plugins |= . + {"datasette-ephemeral-tables": {"table_ttl": 900}}' \
      > metadata.json
    cat metadata.json
135676 10 deploy 27224 datasette 107914493 google-github-actions/setup-gcloud@v0 Set up Cloud Run
{
    "version": "318.0.0",
    "service_account_email": "${{ secrets.GCP_SA_EMAIL }}",
    "service_account_key": "${{ secrets.GCP_SA_KEY }}"
}
     
135677 11 deploy 27224 datasette 107914493   Deploy to Cloud Run
    gcloud config set run/region us-central1
    gcloud config set project datasette-222320
    export SUFFIX="-${GITHUB_REF#refs/heads/}"
    export SUFFIX=${SUFFIX#-main}
    # Replace 1.0 with one-dot-zero in SUFFIX
    export SUFFIX=${SUFFIX//1.0/one-dot-zero}
    datasette publish cloudrun fixtures.db fixtures2.db extra_database.db \
      -m metadata.json \
      --plugins-dir=plugins \
      --branch=$GITHUB_SHA \
      --version-note=$GITHUB_SHA \
      --extra-options="--setting template_debug 1 --setting trace_debug 1 --crossdb" \
      --install 'datasette-ephemeral-tables>=0.2.2' \
      --service "datasette-latest$SUFFIX" \
      --secret $LATEST_DATASETTE_SECRET
{
    "LATEST_DATASETTE_SECRET": "${{ secrets.LATEST_DATASETTE_SECRET }}"
}
 
135678 12 deploy 27224 datasette 107914493   Deploy to docs as well (only for main)
    # Deploy docs.db to a different service
    datasette publish cloudrun docs.db \
      --branch=$GITHUB_SHA \
      --version-note=$GITHUB_SHA \
      --extra-options="--setting template_debug 1" \
      --service=datasette-docs-latest
if: ${{ github.ref == 'refs/heads/main' }}
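The if column holds the step's condition, so in YAML this step would combine run and if keys. A sketch reconstructed from the row above:

- name: Deploy to docs as well (only for main)
  if: ${{ github.ref == 'refs/heads/main' }}
  run: |
    # Deploy docs.db to a different service
    datasette publish cloudrun docs.db \
      --branch=$GITHUB_SHA \
      --version-note=$GITHUB_SHA \
      --extra-options="--setting template_debug 1" \
      --service=datasette-docs-latest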
135679 1 documentation-links 27225 datasette 107914493 readthedocs/actions/preview@v1  
{
    "project-slug": "datasette"
}
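The empty name column means this step most likely had no name: key and relies on the action's default label. A sketch of the likely YAML:

- uses: readthedocs/actions/preview@v1
  with:
    project-slug: datasette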
     
135680 1 prettier 27226 datasette 107914493 actions/checkout@v2 Check out repo        
135681 2 prettier 27226 datasette 107914493 actions/cache@v2 Configure npm caching
{
    "path": "~/.npm",
    "key": "${{ runner.OS }}-npm-${{ hashFiles('**/package-lock.json') }}",
    "restore-keys": "${{ runner.OS }}-npm-\n"
}
     
135682 3 prettier 27226 datasette 107914493   Install dependencies   npm ci    
135683 4 prettier 27226 datasette 107914493   Run prettier   npm run prettier -- --check    
135684 1 test 27227 datasette 107914493 actions/checkout@v3          
135685 2 test 27227 datasette 107914493 actions/setup-python@v4 Set up Python ${{ matrix.python-version }}
{
    "python-version": "${{ matrix.python-version }}"
}
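The ${{ matrix.python-version }} references imply a matrix strategy at the job level. The actual matrix values are not recorded in this table, so the versions below are purely illustrative:

strategy:
  matrix:
    python-version: ["3.8", "3.9", "3.10", "3.11"]  # hypothetical values, not stored in this table
steps:
  - uses: actions/checkout@v3
  - name: Set up Python ${{ matrix.python-version }}
    uses: actions/setup-python@v4
    with:
      python-version: ${{ matrix.python-version }}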
     
135686 3 test 27227 datasette 107914493 actions/cache@v3 Configure pip caching
{
    "path": "~/.cache/pip",
    "key": "${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}",
    "restore-keys": "${{ runner.os }}-pip-\n"
}
     
135687 4 test 27227 datasette 107914493   Install dependencies   pip install -e '.[test]'    
135688 5 test 27227 datasette 107914493   Run tests   pytest    
135689 1 deploy 27228 datasette 107914493 actions/checkout@v3          
135690 2 deploy 27228 datasette 107914493 actions/setup-python@v4 Set up Python
{
    "python-version": "3.11"
}
     
135691 3 deploy 27228 datasette 107914493 actions/cache@v3 Configure pip caching
{
    "path": "~/.cache/pip",
    "key": "${{ runner.os }}-publish-pip-${{ hashFiles('**/setup.py') }}",
    "restore-keys": "${{ runner.os }}-publish-pip-\n"
}
     
135692 4 deploy 27228 datasette 107914493   Install dependencies   pip install setuptools wheel twine    
135693 5 deploy 27228 datasette 107914493   Publish
    python setup.py sdist bdist_wheel
    twine upload dist/*
{
    "TWINE_USERNAME": "__token__",
    "TWINE_PASSWORD": "${{ secrets.PYPI_TOKEN }}"
}
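Here the env column supplies credentials to a run step rather than to an action; the __token__ username is how PyPI API tokens are passed to twine. A sketch of the step in YAML:

- name: Publish
  env:
    TWINE_USERNAME: __token__
    TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
  run: |
    python setup.py sdist bdist_wheel
    twine upload dist/*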
 
135694 1 deploy_static_docs 27229 datasette 107914493 actions/checkout@v2          
135695 2 deploy_static_docs 27229 datasette 107914493 actions/setup-python@v2 Set up Python
{
    "python-version": "3.9"
}
     
135696 3 deploy_static_docs 27229 datasette 107914493 actions/cache@v2 Configure pip caching
{
    "path": "~/.cache/pip",
    "key": "${{ runner.os }}-publish-pip-${{ hashFiles('**/setup.py') }}",
    "restore-keys": "${{ runner.os }}-publish-pip-\n"
}
     
135697 4 deploy_static_docs 27229 datasette 107914493   Install dependencies
    python -m pip install -e .[docs]
    python -m pip install sphinx-to-sqlite==0.1a1
135698 5 deploy_static_docs 27229 datasette 107914493   Build docs.db
    cd docs
    sphinx-build -b xml . _build
    sphinx-to-sqlite ../docs.db _build
    cd ..
135699 6 deploy_static_docs 27229 datasette 107914493 google-github-actions/setup-gcloud@v0 Set up Cloud Run
{
    "version": "318.0.0",
    "service_account_email": "${{ secrets.GCP_SA_EMAIL }}",
    "service_account_key": "${{ secrets.GCP_SA_KEY }}"
}
     
135700 7 deploy_static_docs 27229 datasette 107914493   Deploy stable-docs.datasette.io to Cloud Run
    gcloud config set run/region us-central1
    gcloud config set project datasette-222320
    datasette publish cloudrun docs.db \
      --service=datasette-docs-stable
135701 1 deploy_docker 27230 datasette 107914493 actions/checkout@v2          
135702 2 deploy_docker 27230 datasette 107914493   Build and push to Docker Hub
    sleep 60 # Give PyPI time to make the new release available
    docker login -u $DOCKER_USER -p $DOCKER_PASS
    export REPO=datasetteproject/datasette
    docker build -f Dockerfile \
      -t $REPO:${GITHUB_REF#refs/tags/} \
      --build-arg VERSION=${GITHUB_REF#refs/tags/} .
    docker tag $REPO:${GITHUB_REF#refs/tags/} $REPO:latest
    docker push $REPO:${GITHUB_REF#refs/tags/}
    docker push $REPO:latest
{
    "DOCKER_USER": "${{ secrets.DOCKER_USER }}",
    "DOCKER_PASS": "${{ secrets.DOCKER_PASS }}"
}
 
135703 1 deploy_docker 27231 datasette 107914493 actions/checkout@v2          
135704 2 deploy_docker 27231 datasette 107914493   Build and push to Docker Hub
    docker login -u $DOCKER_USER -p $DOCKER_PASS
    export REPO=datasetteproject/datasette
    docker build -f Dockerfile \
      -t $REPO:${VERSION_TAG} \
      --build-arg VERSION=${VERSION_TAG} .
    docker push $REPO:${VERSION_TAG}
{
    "DOCKER_USER": "${{ secrets.DOCKER_USER }}",
    "DOCKER_PASS": "${{ secrets.DOCKER_PASS }}",
    "VERSION_TAG": "${{ github.event.inputs.version_tag }}"
}
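VERSION_TAG is read from github.event.inputs.version_tag, so this job was presumably triggered by workflow_dispatch with a version_tag input. A sketch of the trigger and step (the input's description is a guess):

on:
  workflow_dispatch:
    inputs:
      version_tag:
        description: Tag to build and push  # hypothetical wording
        required: true

- name: Build and push to Docker Hub
  env:
    DOCKER_USER: ${{ secrets.DOCKER_USER }}
    DOCKER_PASS: ${{ secrets.DOCKER_PASS }}
    VERSION_TAG: ${{ github.event.inputs.version_tag }}
  run: |
    docker login -u $DOCKER_USER -p $DOCKER_PASS
    export REPO=datasetteproject/datasette
    docker build -f Dockerfile \
      -t $REPO:${VERSION_TAG} \
      --build-arg VERSION=${VERSION_TAG} .
    docker push $REPO:${VERSION_TAG}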
 
135705 1 spellcheck 27232 datasette 107914493 actions/checkout@v2          
135706 2 spellcheck 27232 datasette 107914493 actions/setup-python@v2 Set up Python ${{ matrix.python-version }}
{
    "python-version": 3.9
}
     
135707 3 spellcheck 27232 datasette 107914493 actions/cache@v2 Configure pip caching
{
    "path": "~/.cache/pip",
    "key": "${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}",
    "restore-keys": "${{ runner.os }}-pip-\n"
}
     
135708 4 spellcheck 27232 datasette 107914493   Install dependencies   pip install -e '.[docs]'    
135709 5 spellcheck 27232 datasette 107914493   Check spelling
    codespell docs/*.rst --ignore-words docs/codespell-ignore-words.txt
    codespell datasette -S datasette/static --ignore-words docs/codespell-ignore-words.txt
135710 1 test 27233 datasette 107914493 actions/checkout@v2 Check out datasette        
135711 2 test 27233 datasette 107914493 actions/setup-python@v2 Set up Python
{
    "python-version": 3.9
}
     
135712 3 test 27233 datasette 107914493 actions/cache@v2 Configure pip caching
{
    "path": "~/.cache/pip",
    "key": "${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}",
    "restore-keys": "${{ runner.os }}-pip-\n"
}
     
135713 4 test 27233 datasette 107914493   Install Python dependencies
    python -m pip install --upgrade pip
    python -m pip install -e .[test]
    python -m pip install pytest-cov
135714 5 test 27233 datasette 107914493   Run tests
    ls -lah
    cat .coveragerc
    pytest --cov=datasette --cov-config=.coveragerc --cov-report xml:coverage.xml --cov-report term
    ls -lah
135715 6 test 27233 datasette 107914493 codecov/codecov-action@v1 Upload coverage report
{
    "token": "${{ secrets.CODECOV_TOKEN }}",
    "file": "coverage.xml"
}
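A sketch of the coverage-upload step in YAML, from the uses and with values above:

- name: Upload coverage report
  uses: codecov/codecov-action@v1
  with:
    token: ${{ secrets.CODECOV_TOKEN }}
    file: coverage.xml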
     
135716 1 test 27234 datasette 107914493 actions/checkout@v3          
135717 2 test 27234 datasette 107914493 actions/setup-python@v3 Set up Python 3.10
{
    "python-version": "3.10",
    "cache": "pip",
    "cache-dependency-path": "**/setup.py"
}
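Unlike the jobs above, this one uses setup-python's built-in pip caching instead of a separate actions/cache step. Note that 3.10 must be quoted in YAML or it parses as the float 3.1; the quoted "3.10" stored here suggests the original did quote it. A sketch:

- name: Set up Python 3.10
  uses: actions/setup-python@v3
  with:
    python-version: "3.10"
    cache: pip
    cache-dependency-path: "**/setup.py"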
     
135718 3 test 27234 datasette 107914493 actions/cache@v2 Cache Playwright browsers
{
    "path": "~/.cache/ms-playwright/",
    "key": "${{ runner.os }}-browsers"
}
     
135719 4 test 27234 datasette 107914493   Install Playwright dependencies
    pip install shot-scraper build
    shot-scraper install
135720 5 test 27234 datasette 107914493   Run test   ./test-in-pyodide-with-shot-scraper.sh    
135721 1 test 27235 datasette 107914493 actions/checkout@v3          
135722 2 test 27235 datasette 107914493 actions/setup-python@v4 Set up Python ${{ matrix.python-version }}
{
    "python-version": "${{ matrix.python-version }}"
}
     
135723 3 test 27235 datasette 107914493 actions/cache@v3 Configure pip caching
{
    "path": "~/.cache/pip",
    "key": "${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}",
    "restore-keys": "${{ runner.os }}-pip-\n"
}
     
135724 4 test 27235 datasette 107914493   Build extension for --load-extension test   (cd tests && gcc ext.c -fPIC -shared -o ext.so)    
135725 5 test 27235 datasette 107914493   Install dependencies
    pip install -e '.[test]'
    pip freeze
135726 6 test 27235 datasette 107914493   Run tests
    pytest -n auto -m "not serial"
    pytest -m "serial"
    # And the test that exercises a localhost HTTPS server
    tests/test_datasette_https_server.sh
135727 7 test 27235 datasette 107914493   Check if cog needs to be run   cog --check docs/*.rst    
135728 8 test 27235 datasette 107914493   Check if blacken-docs needs to be run
    # This fails on syntax errors, or if a diff was applied
    blacken-docs -l 60 docs/*.rst
135729 1 build 27236 datasette 107914493 actions/checkout@v2          
135730 2 build 27236 datasette 107914493 mxschmitt/action-tmate@v3 Setup tmate session        
135731 1 build 27237 datasette 107914493 actions/checkout@v2          
135732 2 build 27237 datasette 107914493 mxschmitt/action-tmate@v3 Setup tmate session        
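These last two single-step jobs exist for interactive debugging: mxschmitt/action-tmate opens a tmate (SSH) session into the runner. A sketch of such a job; the runner type is an assumption, as it is not recorded in this table:

jobs:
  build:
    runs-on: ubuntu-latest  # assumption: runner type not stored in the steps table
    steps:
      - uses: actions/checkout@v2
      - name: Setup tmate session
        uses: mxschmitt/action-tmate@v3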

CREATE TABLE [steps] (
   [id] INTEGER PRIMARY KEY,
   [seq] INTEGER,
   [job] INTEGER REFERENCES [jobs]([id]),
   [repo] INTEGER REFERENCES [repos]([id]),
   [uses] TEXT,
   [name] TEXT,
   [with] TEXT,
   [run] TEXT
, [env] TEXT, [if] TEXT);
CREATE INDEX [idx_steps_repo]
    ON [steps] ([repo]);
CREATE INDEX [idx_steps_job]
    ON [steps] ([job]);