steps

12 rows where job = 27859
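
This filtered view corresponds roughly to the following SQL against the steps table (a sketch; Datasette's generated query may differ in quoting and column order):

select id, seq, job, repo, uses, name, [with], run, env, [if]
from steps
where job = 27859
order by id;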

Columns: id, seq, job, repo, uses, name, with, run, env, if

id: 138806 · seq: 1 · job: deploy (27859) · repo: datasette (107914493)
uses: actions/checkout@v3
name: Check out datasette

id: 138807 · seq: 2 · job: deploy (27859) · repo: datasette (107914493)
uses: actions/setup-python@v4
name: Set up Python
with:
    {
        "python-version": "3.9"
    }

id: 138808 · seq: 3 · job: deploy (27859) · repo: datasette (107914493)
uses: actions/cache@v3
name: Configure pip caching
with:
    {
        "path": "~/.cache/pip",
        "key": "${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}",
        "restore-keys": "${{ runner.os }}-pip-\n"
    }

id: 138809 · seq: 4 · job: deploy (27859) · repo: datasette (107914493)
name: Install Python dependencies
run:
    python -m pip install --upgrade pip
    python -m pip install -e .[test]
    python -m pip install -e .[docs]
    python -m pip install sphinx-to-sqlite==0.1a1

id: 138810 · seq: 5 · job: deploy (27859) · repo: datasette (107914493)
name: Run tests
run:
    pytest -n auto -m "not serial"
    pytest -m "serial"
if: ${{ github.ref == 'refs/heads/main' }}

id: 138811 · seq: 6 · job: deploy (27859) · repo: datasette (107914493)
name: Build fixtures.db and other files needed to deploy the demo
run:
    python tests/fixtures.py \
        fixtures.db \
        fixtures-config.json \
        fixtures-metadata.json \
        plugins \
        --extra-db-filename extra_database.db

id: 138812 · seq: 7 · job: deploy (27859) · repo: datasette (107914493)
name: Build docs.db
run:
    cd docs
    DISABLE_SPHINX_INLINE_TABS=1 sphinx-build -b xml . _build
    sphinx-to-sqlite ../docs.db _build
    cd ..
if: ${{ github.ref == 'refs/heads/main' }}

id: 138813 · seq: 8 · job: deploy (27859) · repo: datasette (107914493)
name: Set up the alternate-route demo
run:
    echo '
    from datasette import hookimpl

    @hookimpl
    def startup(datasette):
        db = datasette.get_database("fixtures2")
        db.route = "alternative-route"
    ' > plugins/alternative_route.py
    cp fixtures.db fixtures2.db

id: 138814 · seq: 9 · job: deploy (27859) · repo: datasette (107914493)
name: And the counters writable canned query demo
run:
    cat > plugins/counters.py <<EOF
    from datasette import hookimpl

    @hookimpl
    def startup(datasette):
        db = datasette.add_memory_database("counters")

        async def inner():
            await db.execute_write("create table if not exists counters (name text primary key, value integer)")
            await db.execute_write("insert or ignore into counters (name, value) values ('counter_a', 0)")
            await db.execute_write("insert or ignore into counters (name, value) values ('counter_b', 0)")
            await db.execute_write("insert or ignore into counters (name, value) values ('counter_c', 0)")

        return inner

    @hookimpl
    def canned_queries(database):
        if database == "counters":
            queries = {}
            for name in ("counter_a", "counter_b", "counter_c"):
                queries["increment_{}".format(name)] = {
                    "sql": "update counters set value = value + 1 where name = '{}'".format(name),
                    "on_success_message_sql": "select 'Counter {name} incremented to ' || value from counters where name = '{name}'".format(name=name),
                    "write": True,
                }
                queries["decrement_{}".format(name)] = {
                    "sql": "update counters set value = value - 1 where name = '{}'".format(name),
                    "on_success_message_sql": "select 'Counter {name} decremented to ' || value from counters where name = '{name}'".format(name=name),
                    "write": True,
                }
            return queries
    EOF

id: 138815 · seq: 10 · job: deploy (27859) · repo: datasette (107914493)
uses: google-github-actions/setup-gcloud@v0
name: Set up Cloud Run
with:
    {
        "version": "318.0.0",
        "service_account_email": "${{ secrets.GCP_SA_EMAIL }}",
        "service_account_key": "${{ secrets.GCP_SA_KEY }}"
    }

id: 138816 · seq: 11 · job: deploy (27859) · repo: datasette (107914493)
name: Deploy to Cloud Run
run:
    gcloud config set run/region us-central1
    gcloud config set project datasette-222320
    export SUFFIX="-${GITHUB_REF#refs/heads/}"
    export SUFFIX=${SUFFIX#-main}
    # Replace 1.0 with one-dot-zero in SUFFIX
    export SUFFIX=${SUFFIX//1.0/one-dot-zero}
    datasette publish cloudrun fixtures.db fixtures2.db extra_database.db \
        -m fixtures-metadata.json \
        --plugins-dir=plugins \
        --branch=$GITHUB_SHA \
        --version-note=$GITHUB_SHA \
        --extra-options="--setting template_debug 1 --setting trace_debug 1 --crossdb" \
        --install 'datasette-ephemeral-tables>=0.2.2' \
        --service "datasette-latest$SUFFIX" \
        --secret $LATEST_DATASETTE_SECRET
env:
    {
        "LATEST_DATASETTE_SECRET": "${{ secrets.LATEST_DATASETTE_SECRET }}"
    }

id: 138817 · seq: 12 · job: deploy (27859) · repo: datasette (107914493)
name: Deploy to docs as well (only for main)
run:
    # Deploy docs.db to a different service
    datasette publish cloudrun docs.db \
        --branch=$GITHUB_SHA \
        --version-note=$GITHUB_SHA \
        --extra-options="--setting template_debug 1" \
        --service=datasette-docs-latest
if: ${{ github.ref == 'refs/heads/main' }}
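
The with and env cells hold JSON blobs, and the [if] column is only populated for conditional steps, so SQLite's JSON functions can pull out individual keys and a simple filter finds the gated steps. A sketch, assuming a SQLite build with the JSON functions available and that empty cells are stored as NULL rather than empty strings:

-- Pull the pip cache path out of the "Configure pip caching" step
select id, name, json_extract([with], '$.path') as cache_path
from steps
where job = 27859 and uses like 'actions/cache@%';

-- List the steps gated on refs/heads/main
select seq, name, [if]
from steps
where job = 27859 and [if] is not null;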

CREATE TABLE [steps] (
   [id] INTEGER PRIMARY KEY,
   [seq] INTEGER,
   [job] INTEGER REFERENCES [jobs]([id]),
   [repo] INTEGER REFERENCES [repos]([id]),
   [uses] TEXT,
   [name] TEXT,
   [with] TEXT,
   [run] TEXT,
   [env] TEXT,
   [if] TEXT
);
CREATE INDEX [idx_steps_repo]
    ON [steps] ([repo]);
CREATE INDEX [idx_steps_job]
    ON [steps] ([job]);
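
The job and repo columns are foreign keys, so the "deploy" and "datasette" labels shown above come from the referenced jobs and repos tables. A sketch of that join; the name columns on jobs and repos are an assumption based on the labels rendered in this view:

select
    steps.id,
    steps.seq,
    jobs.name as job_name,    -- assumed column; renders as "deploy" above
    repos.name as repo_name,  -- assumed column; renders as "datasette" above
    steps.uses,
    steps.name as step_name
from steps
join jobs on jobs.id = steps.job
join repos on repos.id = steps.repo
where steps.job = 27859
order by steps.seq;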