4,997 rows sorted by updated_at descending

id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions issue performed_via_github_app
753660814 https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753660814 https://api.github.com/repos/simonw/sqlite-utils/issues/215 MDEyOklzc3VlQ29tbWVudDc1MzY2MDgxNA== simonw 9599 2021-01-03T18:53:05Z 2021-01-03T18:53:05Z OWNER

Here's the current .count property: https://github.com/simonw/sqlite-utils/blob/036ec6d32313487527c66dea613a3e7118b97459/sqlite_utils/db.py#L597-L609

It's implemented on Queryable, which means it's available on both Table and View - the optimization doesn't make sense for views.

I'm a bit cautious about making that property so much more complex. In order to decide if it should try the _counts table first it needs to know:

  • Should it be trusting the counts? I'm thinking a .should_trust_counts property on Database which defaults to True would be good - then advanced users can turn that off if they know the counts should not be trusted.
  • Does the _counts table exist?
  • Are the triggers defined?

Then it can do the query, and if the query fails it can fall back on the count(*). That's quite a lot of extra activity though.
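As a rough sketch of what that more complex logic might look like - written here as a standalone helper rather than the actual property, with the trust_counts flag and the _counts lookup being assumptions drawn from this discussion:

import sqlite_utils

def fast_count(db: sqlite_utils.Database, table: str, trust_counts: bool = True) -> int:
    # Try the cached value first, if the cache is trusted and the _counts table exists
    if trust_counts and "_counts" in db.table_names():
        try:
            row = db.execute(
                "select count from _counts where [table] = ?", [table]
            ).fetchone()
            if row is not None:
                return row[0]
        except Exception:
            pass  # cache missing or broken - fall back to a full count
    # Fallback: the existing count(*) behaviour
    return db.execute("select count(*) from [{}]".format(table)).fetchone()[0]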

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Use _counts to speed up counts 777535402  
753660379 https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753660379 https://api.github.com/repos/simonw/sqlite-utils/issues/215 MDEyOklzc3VlQ29tbWVudDc1MzY2MDM3OQ== simonw 9599 2021-01-03T18:50:15Z 2021-01-03T18:50:15Z OWNER
    def cached_counts(self, tables=None):
        sql = "select [table], count from {}".format(self._counts_table_name)
        if tables:
            sql += " where [table] in ({})".format(", ".join("?" for table in tables))
        return {r[0]: r[1] for r in self.execute(sql, tables).fetchall()}
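For context, a hypothetical usage sketch, assuming the prototype above has been added to the Database class and the _counts table has been populated by the triggers:

import sqlite_utils

db = sqlite_utils.Database("fixtures.db")
print(db.cached_counts())                      # e.g. {"Street_Tree_List": 12345, ...}
print(db.cached_counts(["Street_Tree_List"]))  # counts for specific tables only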
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Use _counts to speed up counts 777535402  
753659260 https://github.com/simonw/sqlite-utils/issues/206#issuecomment-753659260 https://api.github.com/repos/simonw/sqlite-utils/issues/206 MDEyOklzc3VlQ29tbWVudDc1MzY1OTI2MA== simonw 9599 2021-01-03T18:42:01Z 2021-01-03T18:42:01Z OWNER
% sqlite-utils insert blah.db blah global_power_plant_database.csv
Error: Invalid JSON - use --csv for CSV or --tsv for TSV files
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
sqlite-utils should suggest --csv if JSON parsing fails 761915790  
753657180 https://github.com/simonw/datasette/issues/1169#issuecomment-753657180 https://api.github.com/repos/simonw/datasette/issues/1169 MDEyOklzc3VlQ29tbWVudDc1MzY1NzE4MA== simonw 9599 2021-01-03T18:23:30Z 2021-01-03T18:23:30Z OWNER

Also welcome in that PR would be a bit of documentation for contributors, see #1167 - but no problem if you leave that out, I'm happy to add it later.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Prettier package not actually being cached 777677671  
753653260 https://github.com/simonw/datasette/issues/1169#issuecomment-753653260 https://api.github.com/repos/simonw/datasette/issues/1169 MDEyOklzc3VlQ29tbWVudDc1MzY1MzI2MA== simonw 9599 2021-01-03T17:54:40Z 2021-01-03T17:54:40Z OWNER

And @benpickles yes I would land that pull request straight away as-is. Thanks!

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Prettier package not actually being cached 777677671  
753653033 https://github.com/simonw/datasette/issues/1169#issuecomment-753653033 https://api.github.com/repos/simonw/datasette/issues/1169 MDEyOklzc3VlQ29tbWVudDc1MzY1MzAzMw== simonw 9599 2021-01-03T17:52:53Z 2021-01-03T17:52:53Z OWNER

Oh that's so frustrating! I was worried about that - I spotted a few runs that seemed faster and hoped that it meant that the package was coming out of the ~/.npm cache, but evidently that's not the case.

You've convinced me that Datasette itself should have a package.json - the Dependabot argument is a really good one.

But... I'd really love to figure out a general pattern for using npx scripts in GitHub Actions workflows in a cache-friendly way. I have plenty of other projects that I'd love to run Prettier or Uglify or puppeteer-cli in without adding a package.json to them.

Any ideas? The best I can think of is for the workflow itself to write out a package.json file (using echo '{ ... }' > package.json) as part of the run - that way the cache should work (I think) but I don't get a misleading package.json file sitting in the repo.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Prettier package not actually being cached 777677671  
753600999 https://github.com/simonw/datasette/issues/983#issuecomment-753600999 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1MzYwMDk5OQ== MarkusH 475613 2021-01-03T11:11:21Z 2021-01-03T11:11:21Z NONE

With regards to JS/Browser events, given your example of menu items that plugins could add, I could imagine this code to work:

// as part of datasette
datasette.events.AddMenuItem = 'DatasetteAddMenuItemEvent';
document.addEventListener(datasette.events.AddMenuItem, (e) => {
  // do whatever is needed to add the menu item. Data comes from `e.detail`
  alert(e.detail.title + ' ' + e.detail.link);
});

// as part of a plugin
const event = new CustomEvent(datasette.events.AddMenuItem, {detail: {link: '/foo/bar', title: 'Go somewhere'}});
document.dispatchEvent(event);
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
753587963 https://github.com/simonw/datasette/issues/983#issuecomment-753587963 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1MzU4Nzk2Mw== dracos 154364 2021-01-03T09:02:50Z 2021-01-03T10:00:05Z NONE

but I'm already committed to requiring support for () => {} arrow functions

Don't think you are :) (e.g. gzipped, using arrow functions in my example saves 2 bytes over spelling out function). On FMS, over the past month, looking at popular browsers, it looks like we'd have 95.41% arrow support, 94.19% module support, and 4.58% (mostly IE9/IE11/Safari 9) supporting neither.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
753570710 https://github.com/simonw/datasette/issues/983#issuecomment-753570710 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1MzU3MDcxMA== simonw 9599 2021-01-03T05:29:56Z 2021-01-03T05:29:56Z OWNER

I thought about using browser events, but they don't quite match the API that I'm looking to provide. In particular, the great thing about Pluggy is that if you have multiple handlers registered for a specific plugin hook each of those handlers can return a value, and Pluggy will combine those values into a list of replies.

This is great for things like plugin hooks that add extra menu items - each plugin can return a menu item (maybe as a label/URL/click-callback object) and the calling code can then add all of those items to the menu. See https://docs.datasette.io/en/stable/plugin_hooks.html#table-actions-datasette-actor-database-table for a Python example.

I'm on the fence about relying on JavaScript modules. I need to think about browser compatibility for them - but I'm already committed to requiring support for () => {} arrow functions, so maybe I'm committed to module support too already?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
753568428 https://github.com/simonw/datasette/issues/1160#issuecomment-753568428 https://api.github.com/repos/simonw/datasette/issues/1160 MDEyOklzc3VlQ29tbWVudDc1MzU2ODQyOA== simonw 9599 2021-01-03T05:02:32Z 2021-01-03T05:02:32Z OWNER

Should this command include a --fts option for configuring full-text search on one-or-more columns?

I thought about doing that for sqlite-utils insert in https://github.com/simonw/sqlite-utils/issues/202 and decided not to because of the need to include extra options covering the FTS version, porter stemming options and whether or not to create triggers.

But maybe I can set sensible defaults for that with datasette insert ... -f title -f body? Worth thinking about a bit more.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
"datasette insert" command and plugin hook 775666296  
753568264 https://github.com/simonw/sqlite-utils/issues/202#issuecomment-753568264 https://api.github.com/repos/simonw/sqlite-utils/issues/202 MDEyOklzc3VlQ29tbWVudDc1MzU2ODI2NA== simonw 9599 2021-01-03T05:00:24Z 2021-01-03T05:00:24Z OWNER

I'm not going to implement this, because it actually needs several additional options that already exist on sqlite-utils enable-fts:

  --fts4                 Use FTS4
  --fts5                 Use FTS5
  --tokenize TEXT        Tokenizer to use, e.g. porter
  --create-triggers      Create triggers to update the FTS tables when the
                         parent table changes.

I'd rather not add all four of those options to sqlite-utils insert just to support this shortcut.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
sqlite-utils insert -f colname - for configuring full-text search 738514367  
753567969 https://github.com/simonw/sqlite-utils/issues/202#issuecomment-753567969 https://api.github.com/repos/simonw/sqlite-utils/issues/202 MDEyOklzc3VlQ29tbWVudDc1MzU2Nzk2OQ== simonw 9599 2021-01-03T04:55:17Z 2021-01-03T04:55:43Z OWNER

The long version of this can be --fts, same as in csvs-to-sqlite.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
sqlite-utils insert -f colname - for configuring full-text search 738514367  
753567932 https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567932 https://api.github.com/repos/simonw/sqlite-utils/issues/203 MDEyOklzc3VlQ29tbWVudDc1MzU2NzkzMg== simonw 9599 2021-01-03T04:54:43Z 2021-01-03T04:54:43Z OWNER

Another option: expand the ForeignKey object to have .columns and .other_columns properties in addition to the existing .column and .other_column properties. These new plural properties would always return a tuple, which would be a one-item tuple for a non-compound-foreign-key.

The question then is what should .column and .other_column return for compound foreign keys?

I'd be inclined to say they should return None - which would trigger errors in code that encounters a compound foreign key for the first time, but those errors would at least be a strong indicator as to what had gone wrong.

We can label .column and .other_column as deprecated and then remove them in sqlite-utils 4.0.

Since this would still be a breaking change in some minor edge-cases I'm thinking maybe 4.0 needs to happen in order to land this feature. I'm not opposed to doing that, I was just hoping it might be avoidable.
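A sketch of the shape being proposed here - illustrative only, not the actual sqlite-utils class (which is a namedtuple):

from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass(frozen=True)
class ForeignKey:
    table: str
    columns: Tuple[str, ...]        # always a tuple, even for a single column
    other_table: str
    other_columns: Tuple[str, ...]

    @property
    def column(self) -> Optional[str]:
        # Deprecated accessor: None signals a compound foreign key
        return self.columns[0] if len(self.columns) == 1 else None

    @property
    def other_column(self) -> Optional[str]:
        return self.other_columns[0] if len(self.other_columns) == 1 else None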

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
changes to allow for compound foreign keys 743384829  
753567744 https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567744 https://api.github.com/repos/simonw/sqlite-utils/issues/203 MDEyOklzc3VlQ29tbWVudDc1MzU2Nzc0NA== simonw 9599 2021-01-03T04:51:44Z 2021-01-03T04:51:44Z OWNER

One way that this could avoid a breaking change would be to have fk.column and fk.other_column remain as strings for non-compound-foreign-keys, but turn into tuples for a compound foreign key.

This is a bit of an ugly API design, and it could still break existing code that encounters a compound foreign key for the first time - but it would leave code working for the more common case of a non-compound-foreign-key.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
changes to allow for compound foreign keys 743384829  
753567508 https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567508 https://api.github.com/repos/simonw/sqlite-utils/issues/203 MDEyOklzc3VlQ29tbWVudDc1MzU2NzUwOA== simonw 9599 2021-01-03T04:48:17Z 2021-01-03T04:48:17Z OWNER

Sorry for taking so long to review this!

This approach looks great to me - being able to optionally pass a tuple anywhere the API currently expects a column is smart, and it's consistent with how the pk= parameter works elsewhere.

There's just one problem I can see with this: the way it changes the ForeignKey(...) interface to always return a tuple for .column and .other_column, even if that tuple only contains a single item.

This represents a breaking change to the existing API - any code that expects ForeignKey.column to be a single string (which is any code written against the current API) will break.

As such, I'd have to bump the major version of sqlite-utils to 4.0 in order to ship this.

Ideally I'd like to make this change in a way that doesn't represent an API compatibility break. I need to think a bit harder about how that might be achieved.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
changes to allow for compound foreign keys 743384829  
753566184 https://github.com/simonw/sqlite-utils/issues/217#issuecomment-753566184 https://api.github.com/repos/simonw/sqlite-utils/issues/217 MDEyOklzc3VlQ29tbWVudDc1MzU2NjE4NA== simonw 9599 2021-01-03T04:27:38Z 2021-01-03T04:27:38Z OWNER
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Rename .escape() to .quote() 777543336  
753566156 https://github.com/simonw/sqlite-utils/issues/216#issuecomment-753566156 https://api.github.com/repos/simonw/sqlite-utils/issues/216 MDEyOklzc3VlQ29tbWVudDc1MzU2NjE1Ng== simonw 9599 2021-01-03T04:27:14Z 2021-01-03T04:27:14Z OWNER
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
database.triggers_dict introspection property 777540352  
753563757 https://github.com/simonw/sqlite-utils/issues/218#issuecomment-753563757 https://api.github.com/repos/simonw/sqlite-utils/issues/218 MDEyOklzc3VlQ29tbWVudDc1MzU2Mzc1Nw== simonw 9599 2021-01-03T03:49:51Z 2021-01-03T03:49:51Z OWNER
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
"sqlite-utils triggers" command 777560474  
753545757 https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753545757 https://api.github.com/repos/simonw/sqlite-utils/issues/215 MDEyOklzc3VlQ29tbWVudDc1MzU0NTc1Nw== simonw 9599 2021-01-02T23:58:07Z 2021-01-02T23:58:07Z OWNER

Thought: maybe there should be a .reset_counts() method too, in case the table gets out of date with the triggers.

One way that could happen is if a table is dropped and recreated - the counts in the _counts table would likely no longer match the number of rows in that table.
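A minimal sketch of what .reset_counts() might do, assuming the _counts layout used elsewhere in this thread (the method name and exact behaviour are still open):

def reset_counts(db):
    # Recalculate every row in _counts from the current table contents
    with db.conn:
        for table in db.table_names():
            if table == "_counts":
                continue
            db.conn.execute(
                "insert or replace into _counts ([table], count) "
                "values (?, (select count(*) from [{}]))".format(table),
                [table],
            )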

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Use _counts to speed up counts 777535402  
753545381 https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753545381 https://api.github.com/repos/simonw/sqlite-utils/issues/215 MDEyOklzc3VlQ29tbWVudDc1MzU0NTM4MQ== simonw 9599 2021-01-02T23:52:52Z 2021-01-02T23:52:52Z OWNER

Idea: a db.cached_counts() method that returns a dictionary of data from the _counts table. Call it with a list of tables to get back the counts for just those tables.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Use _counts to speed up counts 777535402  
753544914 https://github.com/simonw/sqlite-utils/issues/217#issuecomment-753544914 https://api.github.com/repos/simonw/sqlite-utils/issues/217 MDEyOklzc3VlQ29tbWVudDc1MzU0NDkxNA== simonw 9599 2021-01-02T23:47:42Z 2021-01-02T23:47:42Z OWNER
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Rename .escape() to .quote() 777543336  
753535488 https://github.com/simonw/sqlite-utils/issues/213#issuecomment-753535488 https://api.github.com/repos/simonw/sqlite-utils/issues/213 MDEyOklzc3VlQ29tbWVudDc1MzUzNTQ4OA== simonw 9599 2021-01-02T22:03:48Z 2021-01-02T22:03:48Z OWNER

I got this error while prototyping this:

too many levels of trigger recursion

It looks like that's because SQLite doesn't like triggers on a table that then update that same table - so I'm going to exclude the _counts table from this mechanism.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
db.enable_counts() method 777529979  
753533775 https://github.com/simonw/sqlite-utils/issues/213#issuecomment-753533775 https://api.github.com/repos/simonw/sqlite-utils/issues/213 MDEyOklzc3VlQ29tbWVudDc1MzUzMzc3NQ== simonw 9599 2021-01-02T21:47:10Z 2021-01-02T21:47:10Z OWNER

I'm going to skip virtual tables, which I can identify using this property: https://github.com/simonw/sqlite-utils/blob/1cad7fad3e7a5b734088f5cc545b69a055e636da/sqlite_utils/db.py#L720-L726

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
db.enable_counts() method 777529979  
753531657 https://github.com/simonw/datasette/issues/1012#issuecomment-753531657 https://api.github.com/repos/simonw/datasette/issues/1012 MDEyOklzc3VlQ29tbWVudDc1MzUzMTY1Nw== bollwyvl 45380 2021-01-02T21:25:36Z 2021-01-02T21:25:36Z CONTRIBUTOR

Actually, on more research, I found out this is handled by the trove-classifiers package now, so it's just a one-liner PR instead of fire-up-a-docker-container-and-do-some-migrations

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
For 1.0 update trove classifier in setup.py 718540751  
753524779 https://github.com/simonw/datasette/issues/1168#issuecomment-753524779 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzUyNDc3OQ== simonw 9599 2021-01-02T20:19:26Z 2021-01-02T20:19:26Z OWNER

Idea: version the metadata scheme. If the table is called _metadata_v1 it gives me a clear path to designing a new scheme in the future.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753422324 https://github.com/simonw/sqlite-utils/issues/212#issuecomment-753422324 https://api.github.com/repos/simonw/sqlite-utils/issues/212 MDEyOklzc3VlQ29tbWVudDc1MzQyMjMyNA== simonw 9599 2021-01-02T03:00:34Z 2021-01-02T03:00:34Z OWNER

Here's a prototype:

with db.conn:
    db.conn.executescript("""
CREATE TABLE IF NOT EXISTS [_counts] ([table] TEXT PRIMARY KEY, [count] INTEGER DEFAULT 0);
CREATE TRIGGER IF NOT EXISTS [Street_Tree_List_counts_ai] AFTER INSERT ON [Street_Tree_List] BEGIN
    INSERT OR REPLACE INTO _counts
        VALUES ('Street_Tree_List', COALESCE(
            (SELECT count FROM _counts
                WHERE [table]='Street_Tree_List'),
            0) + 1);
END;
CREATE TRIGGER IF NOT EXISTS [Street_Tree_List_counts_ad] AFTER DELETE ON [Street_Tree_List] BEGIN
    INSERT OR REPLACE INTO _counts
        VALUES ('Street_Tree_List', COALESCE(
            (SELECT count FROM _counts
                WHERE [table]='Street_Tree_List'),
            0) - 1);
END;
INSERT OR REPLACE INTO _counts VALUES ('Street_Tree_List', (select count(*) from [Street_Tree_List]));
""")
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for maintaining cache of table counts using triggers 777392020  
753406744 https://github.com/simonw/sqlite-utils/issues/210#issuecomment-753406744 https://api.github.com/repos/simonw/sqlite-utils/issues/210 MDEyOklzc3VlQ29tbWVudDc1MzQwNjc0NA== simonw 9599 2021-01-02T00:02:39Z 2021-01-02T00:02:39Z OWNER

It looks like https://github.com/ofajardo/pyreadr is a good library for this.

I won't add this to sqlite-utils because it's quite a bulky dependency for a relatively small feature.

Normally I'd write an rdata-to-sqlite tool similar to https://pypi.org/project/dbf-to-sqlite/ - but I'm actually working on a new plugin hook for Datasette that might be an even better fit for this. The idea is to allow Datasette plugins to define input formats - such as RData - which would then make it possible to import them on the command-line with datasette insert my.db file.rdata or by uploading a file through the Datasette web interface.

That work is happening over here: https://github.com/simonw/datasette/issues/1160 - I'll close this issue in favour of a sometime-in-the-future datasette-import-rdata plugin.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Support of RData files 767685961  
753405835 https://github.com/simonw/sqlite-utils/issues/209#issuecomment-753405835 https://api.github.com/repos/simonw/sqlite-utils/issues/209 MDEyOklzc3VlQ29tbWVudDc1MzQwNTgzNQ== simonw 9599 2021-01-01T23:52:06Z 2021-01-01T23:52:06Z OWNER

I just hit this one too. Such a weird bug!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Test failure with sqlite 3.34 in test_cli.py::test_optimize 766156875  
753402423 https://github.com/simonw/datasette/issues/1168#issuecomment-753402423 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzQwMjQyMw== simonw 9599 2021-01-01T23:16:05Z 2021-01-01T23:16:05Z OWNER

One catch: solving the "show me all metadata for everything in this Datasette instance" problem.

Ideally there would be a SQLite table that can be queried for this. But the need to resolve the potentially complex set of precedence rules means that table would be difficult if not impossible to provide at run-time.

Ideally a denormalized table would be available that featured the results of running those precedence rule calculations. But how to handle keeping this up-to-date? It would need to be recalculated any time a _metadata table in any of the attached databases had an update.

This is a much larger problem - but one potential fix would be to use triggers to maintain a "version number" for the _metadata table - similar to SQLite's own built-in schema_version mechanism. Triggers could increment a counter any time a record in that table was added, deleted or updated.

Such a mechanism would have applications outside of just this _metadata system. The ability to attach a version number to any table and have it automatically incremented when that table changes (via triggers) could help with all kinds of other Datasette-at-scale problems, including things like cached table counts.
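A sketch of that trigger-maintained version counter, following the same pattern as the _counts triggers earlier in this thread. The _metadata_version table and trigger names are illustrative, and this assumes a _metadata table already exists in the database file:

import sqlite_utils

db = sqlite_utils.Database("content.db")
with db.conn:
    db.conn.executescript("""
CREATE TABLE IF NOT EXISTS [_metadata_version] ([table] TEXT PRIMARY KEY, [version] INTEGER DEFAULT 0);
CREATE TRIGGER IF NOT EXISTS [_metadata_version_ai] AFTER INSERT ON [_metadata] BEGIN
    INSERT OR REPLACE INTO _metadata_version
        VALUES ('_metadata', COALESCE(
            (SELECT version FROM _metadata_version
                WHERE [table] = '_metadata'),
            0) + 1);
END;
-- analogous AFTER UPDATE and AFTER DELETE triggers would bump the counter too
""")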

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753401001 https://github.com/simonw/datasette/issues/1168#issuecomment-753401001 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzQwMTAwMQ== simonw 9599 2021-01-01T23:01:45Z 2021-01-01T23:01:45Z OWNER

I need to prototype this. Could I do that as a plugin? I think so - I could try out the algorithm for loading metadata and display it on pages using some custom templates.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753400420 https://github.com/simonw/datasette/issues/1168#issuecomment-753400420 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzQwMDQyMA== simonw 9599 2021-01-01T22:53:58Z 2021-01-01T22:53:58Z OWNER

Precedence idea:
- First priority is non-_internal metadata from other databases - if those conflict then the alphabetically-ordered-first database name wins
- Next priority: _internal metadata, which should have been loaded from metadata.json
- Last priority: the _metadata table from that database itself, i.e. the default "baked in" metadata

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753400306 https://github.com/simonw/datasette/issues/1168#issuecomment-753400306 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzQwMDMwNg== simonw 9599 2021-01-01T22:52:44Z 2021-01-01T22:52:44Z OWNER

Also: probably load column metadata as part of the table metadata rather than loading column metadata individually, since it's going to be rare to want the metadata for a single column rather than for an entire table full of columns.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753400265 https://github.com/simonw/datasette/issues/1168#issuecomment-753400265 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzQwMDI2NQ== simonw 9599 2021-01-01T22:52:09Z 2021-01-01T22:52:09Z OWNER

From an implementation perspective, I think the way this works is SQL queries read the relevant metadata from ALL available metadata tables, then Python code solves the precedence rules to produce the final, combined metadata for a database/table/column.
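A tiny sketch of what that Python-side precedence resolution could look like, given rows already fetched from the various _metadata tables. The ordering of candidates is exactly the open question discussed in the surrounding comments:

import json

def resolve_metadata(candidates):
    # candidates: list of (source, metadata_json) pairs, ordered from lowest
    # to highest precedence. Later sources win for conflicting keys.
    combined = {}
    for source, metadata_json in candidates:
        combined.update(json.loads(metadata_json))
    return combined

# Hypothetical example:
print(resolve_metadata([
    ("mydb", '{"title": "My Table", "license": "CC0"}'),
    ("metadata.json", '{"title": "Title from metadata.json"}'),
]))
# {'title': 'Title from metadata.json', 'license': 'CC0'}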

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753399635 https://github.com/simonw/datasette/issues/1168#issuecomment-753399635 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzM5OTYzNQ== simonw 9599 2021-01-01T22:45:21Z 2021-01-01T22:50:21Z OWNER

Would also need to figure out the precedence rules:

  • What happens if the database has a _metadata table with data that conflicts with a remote metadata record from another database? I think the other database should win, because that allows plugins to over-ride the default metadata for something.
  • Do JSON values get merged together? So if one table provides a description and another provides a title do both values get returned?
  • If a database has a license, does that "cascade" down to the tables? What about source and about?
  • What if there are two databases (or more) that provide conflicting metadata for a table in some other database? Also, _internal may have loaded data from metadata.json that conflicts with some other remote table metadata definition.
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753399428 https://github.com/simonw/datasette/issues/1168#issuecomment-753399428 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzM5OTQyOA== simonw 9599 2021-01-01T22:43:14Z 2021-01-01T22:43:22Z OWNER

Could this use a compound primary key on database, table, column? Does that work with null values?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753399366 https://github.com/simonw/datasette/issues/1168#issuecomment-753399366 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzM5OTM2Ng== simonw 9599 2021-01-01T22:42:37Z 2021-01-01T22:42:37Z OWNER

So what would the database schema for this look like?

I'm leaning towards a single table called _metadata, because that's a neater fit for baking the metadata into the database file along with the data that it is describing. Alternatively I could have multiple tables sharing that prefix - _metadata_database and _metadata_tables and _metadata_columns perhaps.

If it's just a single _metadata table, the schema could look like this:

database | table      | column   | metadata
-------- | ---------- | -------- | --------------------------------------
         | mytable    |          | {"title": "My Table"}
         | mytable    | mycolumn | {"description": "Column description"}
otherdb  | othertable |          | {"description": "Table in another DB"}

If the database column is null it means "this is describing a table in the same database file as this _metadata table".

The alternative to the metadata JSON column would be separate columns for each potential metadata value - license, source, about, about_url etc. But that makes it harder for people to create custom metadata fields.
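As a sketch, the CREATE TABLE for that single-table design might look something like this - column names follow the example table above, and whether NULLs in a compound primary key behave acceptably is the question raised in the comment further up:

import sqlite_utils

db = sqlite_utils.Database("content.db")
db.execute("""
    CREATE TABLE IF NOT EXISTS [_metadata] (
        [database] TEXT,   -- null means "the database this file lives in"
        [table] TEXT,      -- null means the metadata describes a whole database
        [column] TEXT,     -- null means the metadata describes a whole table
        [metadata] TEXT,   -- JSON string of metadata keys and values
        PRIMARY KEY ([database], [table], [column])
    )
""")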

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753398542 https://github.com/simonw/datasette/issues/1168#issuecomment-753398542 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzM5ODU0Mg== simonw 9599 2021-01-01T22:37:24Z 2021-01-01T22:37:24Z OWNER

The direction I'm leaning in now is the following:

  • Metadata always lives in SQLite tables
  • These tables can be co-located with the database they describe (same DB file)
  • ... or they can be in a different DB file and reference the other database that they are describing
  • Metadata provided on startup in a metadata.json file is loaded into an in-memory metadata table using that same mechanism

Plugins that want to provide metadata can do so by populating a table. They could even maintain their own in-memory database for this, or they could write to the _internal in-memory database, or they could write to a table in a database on disk.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753392102 https://github.com/simonw/datasette/issues/1168#issuecomment-753392102 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzM5MjEwMg== simonw 9599 2021-01-01T22:06:33Z 2021-01-01T22:06:33Z OWNER

Some SQLite databases include SQL comments in the schema definition which tell you what each column means:

CREATE TABLE User
        -- A table comment
(
        uid INTEGER,    -- A field comment
        flags INTEGER   -- Another field comment
);

The problem with these is that they're not exposed by SQLite through any mechanism other than parsing the CREATE TABLE statement out of the sqlite_master table to extract those comments.

I had an idea to build a plugin that could return these. That would be easy with a "get metadata for this column" plugin hook - in the absence of one, a plugin could still run code on startup that reads the schemas and uses them to populate a metadata database table somewhere.
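A rough sketch of that schema-parsing idea - deliberately naive and regex-based, with a made-up function name; a real plugin would want something more robust:

import re
import sqlite3

def column_comments(db_path, table):
    # Pull the CREATE TABLE statement out of sqlite_master and collect any
    # trailing "--" comment found on each column definition line
    conn = sqlite3.connect(db_path)
    (sql,) = conn.execute(
        "select sql from sqlite_master where type = 'table' and name = ?", [table]
    ).fetchone()
    comments = {}
    for line in sql.splitlines():
        match = re.match(r"\s*(\w+)\s+\w+.*?--\s*(.+)$", line)
        if match:
            comments[match.group(1)] = match.group(2).strip()
    return comments

# For the CREATE TABLE User example above this would return something like:
# {"uid": "A field comment", "flags": "Another field comment"}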

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753391869 https://github.com/simonw/datasette/issues/1168#issuecomment-753391869 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzM5MTg2OQ== simonw 9599 2021-01-01T22:04:30Z 2021-01-01T22:04:30Z OWNER

The sticking point here seems to be the plugin hook. Allowing plugins to over-ride the way the question "give me the metadata for this database/table/column" is answered makes the database-backed metadata mechanisms much more complicated to think about.

What if plugins didn't get to over-ride metadata in this way, but could instead update the metadata in a persistent Datasette-managed storage mechanism?

Then maybe Datasette could do the following:

  • Maintain metadata in _internal that has been loaded from metadata.json
  • Know how to check a database for baked-in metadata (maybe in a _metadata table)
  • Know how to fall back on the _internal metadata if no baked-in metadata is available

If database files were optionally allowed to store metadata about tables that live in another database file this could perhaps solve the plugin needs - since an "edit metadata" plugin would be able to edit records in a separate, dedicated metadata.db database to store new information about tables in other files.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753390791 https://github.com/simonw/datasette/issues/1168#issuecomment-753390791 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzM5MDc5MQ== simonw 9599 2021-01-01T22:00:42Z 2021-01-01T22:00:42Z OWNER

Here are the requirements I'm currently trying to satisfy:

  • It should be possible to query the metadata for ALL attached tables in one place, potentially with pagination and filtering
  • Metadata should be able to exist in the current metadata.json file
  • It should also be possible to bundle metadata in a table in the SQLite database files themselves
  • Plugins should be able to define their own special mechanisms for metadata. This is particularly interesting for providing a UI that allows users to edit the metadata for their existing tables.
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753390262 https://github.com/simonw/datasette/issues/1168#issuecomment-753390262 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzM5MDI2Mg== simonw 9599 2021-01-01T21:58:11Z 2021-01-01T21:58:11Z OWNER

One possibility: plugins could write directly to that in-memory database table. But how would they know to write again should the server restart? Maybe they would write to it once when called by the startup plugin hook, and then update it (and their own backing store) when metadata changes for some reason. Feels a bit messy though.

Also: if I want to support metadata optionally living in a _metadata table colocated with the data in a SQLite database file itself, how would that affect the metadata columns in _internal? How often would Datasette denormalize and copy data across from the on-disk _metadata tables to the _internal in-memory columns?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753389938 https://github.com/simonw/datasette/issues/1168#issuecomment-753389938 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzM4OTkzOA== simonw 9599 2021-01-01T21:54:15Z 2021-01-01T21:54:15Z OWNER

So what if the databases, tables and columns tables in _internal each grew a new metadata text column?

These columns could be populated by Datasette on startup through reading the metadata.json file. But how would plugins interact with them?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753389477 https://github.com/simonw/datasette/issues/1168#issuecomment-753389477 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzM4OTQ3Nw== simonw 9599 2021-01-01T21:49:57Z 2021-01-01T21:49:57Z OWNER

What if metadata was stored in a JSON text column in the existing _internal tables? This would allow for users to invent additional metadata fields in the future beyond the current license, license_url etc fields - without needing a schema change.

The downside of JSON columns generally is that they're harder to run indexed queries against. For metadata I don't think that matters - even with 10,000 tables each with their own metadata a SQL query asking for e.g. "everything that has Apache 2 as the license" would return in just a few ms.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753388809 https://github.com/simonw/datasette/issues/1168#issuecomment-753388809 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzM4ODgwOQ== simonw 9599 2021-01-01T21:47:51Z 2021-01-01T21:47:51Z OWNER

A database that exposes metadata will have the same restriction as the new _internal database that exposes columns and tables, in that it needs to take permissions into account. A user should not be able to view metadata for tables that they are not able to see.

As such, I'd rather bundle any metadata tables into the existing _internal database so I don't have to solve that permissions problem in two places.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753366024 https://github.com/simonw/datasette/issues/1168#issuecomment-753366024 https://api.github.com/repos/simonw/datasette/issues/1168 MDEyOklzc3VlQ29tbWVudDc1MzM2NjAyNA== simonw 9599 2021-01-01T18:48:34Z 2021-01-01T18:48:34Z OWNER

Also: in #188 I proposed bundling metadata in the SQLite database itself alongside the data. This is a great way of ensuring metadata travels with the data when it is downloaded as a SQLite .db file. But how would that play with the idea of an in-memory _metadata table? Could that table perhaps offer views that join data across multiple attached physical databases?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for storing metadata in _metadata tables 777333388  
753224999 https://github.com/simonw/datasette/issues/983#issuecomment-753224999 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1MzIyNDk5OQ== jussiarpalahti 11941245 2020-12-31T23:29:36Z 2020-12-31T23:29:36Z NONE

I have yet to build a Datasette plugin and am unfamiliar with Pluggy. Since browsers have event handling built in, Datasette could communicate with plugins through it. Handlers register as listeners for custom Datasette events and Datasette's JS can then trigger said events.

I was also wondering if you had looked at JavaScript modules for JS plugins? With services like Skypack (https://www.skypack.dev), NPM libraries can be loaded directly into the browser, no build step needed. The same goes for local JS if you adhere to the ES Module spec.

If minification is required then tools such as Snowpack (https://www.snowpack.dev) could fit better. It uses https://github.com/evanw/esbuild for bundling and minification.

On plugins you'd simply:

import {register} from '/assets/js/datasette'
register.on({'click' : my_func})

In the head of Datasette's HTML pages you'd merely import these files as modules one by one.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
753224351 https://github.com/simonw/datasette/issues/1166#issuecomment-753224351 https://api.github.com/repos/simonw/datasette/issues/1166 MDEyOklzc3VlQ29tbWVudDc1MzIyNDM1MQ== simonw 9599 2020-12-31T23:23:29Z 2020-12-31T23:23:29Z OWNER

I should configure the action to only run if changes have been made within the datasette/static directory.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Adopt Prettier for JavaScript code formatting 777140799  
753221646 https://github.com/simonw/datasette/issues/983#issuecomment-753221646 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1MzIyMTY0Ng== simonw 9599 2020-12-31T22:58:47Z 2020-12-31T22:58:47Z OWNER

https://github.com/mishoo/UglifyJS/issues/1905#issuecomment-300485490 says:

sourceMappingURL aren't added by default in 3.x due to one of the feature requests not to - some users are putting them within HTTP response headers instead.

So the command line for that would be:

$ uglifyjs main.js -cmo main.min.js --source-map url=main.min.js.map

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
753221362 https://github.com/simonw/datasette/issues/1164#issuecomment-753221362 https://api.github.com/repos/simonw/datasette/issues/1164 MDEyOklzc3VlQ29tbWVudDc1MzIyMTM2Mg== simonw 9599 2020-12-31T22:55:57Z 2020-12-31T22:55:57Z OWNER

I had to add this as the first line in table.min.js for the source mapping to work:

//# sourceMappingURL=/-/static/table.min.js.map
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for minifying JavaScript that ships with Datasette 776634318  
753220665 https://github.com/simonw/datasette/issues/1164#issuecomment-753220665 https://api.github.com/repos/simonw/datasette/issues/1164 MDEyOklzc3VlQ29tbWVudDc1MzIyMDY2NQ== simonw 9599 2020-12-31T22:49:36Z 2020-12-31T22:49:36Z OWNER

I started with a 7K table.js file.

npx uglifyjs table.js --source-map -o table.min.js gave me a 5.6K table.min.js file.

npx uglifyjs table.js --source-map -o table.min.js --compress --mangle gave me 4.5K.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for minifying JavaScript that ships with Datasette 776634318  
753220412 https://github.com/simonw/datasette/issues/1164#issuecomment-753220412 https://api.github.com/repos/simonw/datasette/issues/1164 MDEyOklzc3VlQ29tbWVudDc1MzIyMDQxMg== simonw 9599 2020-12-31T22:47:36Z 2020-12-31T22:47:36Z OWNER

I'm trying to minify table.js and I ran into a problem:

Uglification failed. Unexpected character '`'

It turns out uglify-js doesn't support ES6 syntax!

But uglify-es does:

npm install uglify-es

Annoyingly it looks like uglify-es uses the same CLI command, uglifyjs. So after installing it this seemed to work:

npx uglifyjs table.js --source-map -o table.min.js

I really don't like how npx uglifyjs could mean different things depending on which package was installed.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for minifying JavaScript that ships with Datasette 776634318  
753219521 https://github.com/simonw/datasette/issues/983#issuecomment-753219521 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1MzIxOTUyMQ== simonw 9599 2020-12-31T22:39:52Z 2020-12-31T22:39:52Z OWNER

For inlining the plugins.min.js file into the Jinja templates I could use the trick described here: https://stackoverflow.com/a/41404611 - which adds a {{ include_file('file.txt') }} function to Jinja.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
753219407 https://github.com/simonw/datasette/issues/983#issuecomment-753219407 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1MzIxOTQwNw== simonw 9599 2020-12-31T22:38:45Z 2020-12-31T22:39:10Z OWNER

You'll be able to add JavaScript plugins using a bunch of different mechanisms:

{
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 1,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
753218817 https://github.com/simonw/datasette/issues/983#issuecomment-753218817 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1MzIxODgxNw== yozlet 173848 2020-12-31T22:32:25Z 2020-12-31T22:32:25Z NONE

Amazing work! And you've put in far more work than I'd expect to reduce the payload (which is admirable).

So, to add a plugin with the current design, it goes in (a) the template or (b) a bookmarklet, right?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
753217917 https://github.com/simonw/datasette/issues/983#issuecomment-753217917 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1MzIxNzkxNw== simonw 9599 2020-12-31T22:23:29Z 2020-12-31T22:23:36Z OWNER

If I'm going to do that, it would be good if subsequent plugins that register against the load event are executed straight away. That's a bit of a weird edge-case in plugin world - it would involve the bulkier code that gets loaded redefining how datasette.plugins.register works to special-case the 'load' hook.

Maybe the tiny bootstrap code could define a datasette.plugins.onload(callbackFunction) method which gets upgraded later into something that fires straight away? Would add more bytes though.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
753217714 https://github.com/simonw/datasette/issues/983#issuecomment-753217714 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1MzIxNzcxNA== simonw 9599 2020-12-31T22:21:33Z 2020-12-31T22:21:33Z OWNER

Eventually I'd like to provide a whole bunch of other datasette.X utility functions that plugins can use - things like datasette.addTabbedContentPane() or similar.

But I don't want to inline those into the page.

So... I think the basic plugin system remains inline - maybe from an inlined file called plugins-bootstrap.js. Then a separate plugins.js contains the rest of the API functionality.

If a plugin wants to take advantage of those APIs, maybe it registers itself using datasette.plugins.register('load', () => ...) - that load hook can then be fired once the bulkier plugin code has been loaded.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
753217127 https://github.com/simonw/datasette/issues/987#issuecomment-753217127 https://api.github.com/repos/simonw/datasette/issues/987 MDEyOklzc3VlQ29tbWVudDc1MzIxNzEyNw== simonw 9599 2020-12-31T22:16:46Z 2020-12-31T22:16:46Z OWNER

I'm going to use class="plugin-content-pre-table" rather than id= - just because I still want to be able to display all of this stuff on the single https://latest.datasette.io/-/patterns page so duplicate IDs are best avoided.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Documented HTML hooks for JavaScript plugin authors 712984738  
753215761 https://github.com/simonw/datasette/issues/983#issuecomment-753215761 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1MzIxNTc2MQ== simonw 9599 2020-12-31T22:07:31Z 2020-12-31T22:07:31Z OWNER

I think I need to keep the mechanism whereby a plugin can return undefined in order to indicate that it has nothing to say for that specific item - that's borrowed from Pluggy and I've used it a bunch in my Python plugins. That makes the code a bit longer.

I'll write some example plugins to help me decide if the filtering-out-of-undefined mechanism is needed or not.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
753215545 https://github.com/simonw/datasette/issues/983#issuecomment-753215545 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1MzIxNTU0NQ== simonw 9599 2020-12-31T22:05:41Z 2020-12-31T22:05:41Z OWNER

Using object destructuring like that is a great idea. I'm going to play with your version - it's delightfully succinct.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
753214664 https://github.com/simonw/datasette/issues/1166#issuecomment-753214664 https://api.github.com/repos/simonw/datasette/issues/1166 MDEyOklzc3VlQ29tbWVudDc1MzIxNDY2NA== simonw 9599 2020-12-31T21:58:04Z 2020-12-31T21:58:04Z OWNER
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Adopt Prettier for JavaScript code formatting 777140799  
753211535 https://github.com/simonw/datasette/issues/1166#issuecomment-753211535 https://api.github.com/repos/simonw/datasette/issues/1166 MDEyOklzc3VlQ29tbWVudDc1MzIxMTUzNQ== simonw 9599 2020-12-31T21:46:04Z 2020-12-31T21:46:04Z OWNER
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Adopt Prettier for JavaScript code formatting 777140799  
753210536 https://github.com/simonw/datasette/issues/1166#issuecomment-753210536 https://api.github.com/repos/simonw/datasette/issues/1166 MDEyOklzc3VlQ29tbWVudDc1MzIxMDUzNg== simonw 9599 2020-12-31T21:45:19Z 2020-12-31T21:45:19Z OWNER

Oops, committed that bad formatting test to main instead of a branch!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Adopt Prettier for JavaScript code formatting 777140799  
753209192 https://github.com/simonw/datasette/issues/1166#issuecomment-753209192 https://api.github.com/repos/simonw/datasette/issues/1166 MDEyOklzc3VlQ29tbWVudDc1MzIwOTE5Mg== simonw 9599 2020-12-31T21:44:22Z 2020-12-31T21:44:22Z OWNER

Tests passed in https://github.com/simonw/datasette/runs/1631677726?check_suite_focus=true

I'm going to try submitting a pull request with badly formatted JavaScript to see if it gets caught.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Adopt Prettier for JavaScript code formatting 777140799  
753200580 https://github.com/simonw/datasette/issues/1166#issuecomment-753200580 https://api.github.com/repos/simonw/datasette/issues/1166 MDEyOklzc3VlQ29tbWVudDc1MzIwMDU4MA== simonw 9599 2020-12-31T21:38:06Z 2020-12-31T21:38:06Z OWNER

I think this should work:

- uses: actions/cache@v2
  with:
    path: ~/.npm
    key: ${{ runner.os }}-node-${{ hashFiles('**/prettier.yml') }}

I'll use the prettier.yml workflow that I'm about to create as the cache key for the NPM cache.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Adopt Prettier for JavaScript code formatting 777140799  
753197957 https://github.com/simonw/datasette/issues/1166#issuecomment-753197957 https://api.github.com/repos/simonw/datasette/issues/1166 MDEyOklzc3VlQ29tbWVudDc1MzE5Nzk1Nw== simonw 9599 2020-12-31T21:36:14Z 2020-12-31T21:36:14Z OWNER

Maybe not that action actually - I wanted to use a pre-built action to avoid installing Prettier every time, but that's what it seems to do: https://github.com/creyD/prettier_action/blob/bb361e2979cff283ca7684908deac8f95400e779/entrypoint.sh#L28-L37

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Adopt Prettier for JavaScript code formatting 777140799  
753195905 https://github.com/simonw/datasette/issues/1166#issuecomment-753195905 https://api.github.com/repos/simonw/datasette/issues/1166 MDEyOklzc3VlQ29tbWVudDc1MzE5NTkwNQ== simonw 9599 2020-12-31T21:34:46Z 2020-12-31T21:34:46Z OWNER

This action looks good - tag 3.2 is equivalent to this commit hash: https://github.com/creyD/prettier_action/tree/bb361e2979cff283ca7684908deac8f95400e779

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Adopt Prettier for JavaScript code formatting 777140799  
753193475 https://github.com/simonw/datasette/issues/1166#issuecomment-753193475 https://api.github.com/repos/simonw/datasette/issues/1166 MDEyOklzc3VlQ29tbWVudDc1MzE5MzQ3NQ== simonw 9599 2020-12-31T21:33:00Z 2020-12-31T21:33:00Z OWNER

I want a CI check that confirms that files conform to prettier - but only datasette/static/*.js files that are not already minified.

This seems to do the job:

npx prettier --check 'datasette/static/*[!.min].js'
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Adopt Prettier for JavaScript code formatting 777140799  
753033121 https://github.com/simonw/datasette/issues/1165#issuecomment-753033121 https://api.github.com/repos/simonw/datasette/issues/1165 MDEyOklzc3VlQ29tbWVudDc1MzAzMzEyMQ== dracos 154364 2020-12-31T19:33:47Z 2020-12-31T19:33:47Z NONE

Sorry to go on about it, but it's my only example ;) And thought it might be of interest/use. Here is FixMyStreet's Cypress workflow https://github.com/mysociety/fixmystreet/blob/master/.github/workflows/cypress.yml with the master script that sets up server etc at https://github.com/mysociety/fixmystreet/blob/master/bin/browser-tests (that has features such as working inside/outside Vagrant, and can do JS code coverage) and then the tests are at https://github.com/mysociety/fixmystreet/tree/master/.cypress/cypress/integration

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for executing JavaScript unit tests 776635426  
752882797 https://github.com/simonw/datasette/issues/983#issuecomment-752882797 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjg4Mjc5Nw== dracos 154364 2020-12-31T08:07:59Z 2020-12-31T15:04:32Z NONE

If you're using arrow functions, you can presumably use default parameters - there's not much difference in support. That would save you 9 bytes. But OTOH you need "use strict"; to use arrow functions etc, and that's 13 bytes.

Your latest 250-byte one, with use strict, gzips to 199 bytes. The following might be 292 bytes, but compresses to 204, basically the same, and works in just about any browser (well, IE9+):

var datasette=datasette||{};datasette.plugins=function(){var d={};return{register:function(b,c,e){d[b]||(d[b]=[]);d[b].push([c,e])},call:function(b,c){c=c||{};var e=[];(d[b]||[]).forEach(function(a){a=a[0].apply(a[0],a[1].map(function(a){return c[a]}));void 0!==a&&e.push(a)});return e}}}();

Source for that is below; I replaced the [fn,parameters] because closure-compiler includes a polyfill for that, and I ran closure-compiler --language_out ECMASCRIPT3:

var datasette = datasette || {};
datasette.plugins = (() => {
    var registry = {};
    return {
        register: (hook, fn, parameters) => {
            if (!registry[hook]) {
                registry[hook] = [];
            }
            registry[hook].push([fn, parameters]);
        },
        call: (hook, args) => {
            args = args || {};
            var results = [];
            (registry[hook] || []).forEach((data) => {
                /* Call with the correct arguments */
                var result = data[0].apply(data[0], data[1].map(parameter => args[parameter]));
                if (result !== undefined) {
                    results.push(result);
                }
            });
            return results;
        }
    };
})();
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752888552 https://github.com/simonw/datasette/issues/983#issuecomment-752888552 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjg4ODU1Mg== dracos 154364 2020-12-31T08:33:11Z 2020-12-31T08:34:27Z NONE

If you could say that all hook functions had to accept one options parameter (and could use object destructuring if they wished to only see a subset), you could have this, which minifies (to all-browser-JS) to 200 bytes, gzips to 146, and works practically the same:

var datasette = datasette || {};
datasette.plugins = (() => {
    var registry = {};
    return {
        register: (hook, fn) => {
            registry[hook] = registry[hook] || [];
            registry[hook].push(fn);
        },
        call: (hook, args) => {
            var results = (registry[hook] || []).map(fn => fn(args||{}));
            return results;
        }
    };
})();

var datasette=datasette||{};datasette.plugins=function(){var b={};return{register:function(a,c){b[a]=b[a]||[];b[a].push(c)},call:function(a,c){return(b[a]||[]).map(function(a){return a(c||{})})}}}();

Called the same, definitions tiny bit different:

datasette.plugins.register('numbers', ({a, b}) => a + b)
datasette.plugins.register('numbers', o => o.a * o.b)
datasette.plugins.call('numbers', {a: 4, b: 6})
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752846267 https://github.com/simonw/datasette/issues/1165#issuecomment-752846267 https://api.github.com/repos/simonw/datasette/issues/1165 MDEyOklzc3VlQ29tbWVudDc1Mjg0NjI2Nw== simonw 9599 2020-12-31T05:10:41Z 2020-12-31T05:13:14Z OWNER
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for executing JavaScript unit tests 776635426  
752839433 https://github.com/simonw/datasette/issues/1165#issuecomment-752839433 https://api.github.com/repos/simonw/datasette/issues/1165 MDEyOklzc3VlQ29tbWVudDc1MjgzOTQzMw== simonw 9599 2020-12-31T04:29:40Z 2020-12-31T04:29:40Z OWNER

Important to absorb the slightly bizarre assertion syntax from Chai - docs here https://www.chaijs.com/api/bdd/

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for executing JavaScript unit tests 776635426  
752828851 https://github.com/simonw/datasette/issues/1165#issuecomment-752828851 https://api.github.com/repos/simonw/datasette/issues/1165 MDEyOklzc3VlQ29tbWVudDc1MjgyODg1MQ== simonw 9599 2020-12-31T03:19:38Z 2020-12-31T03:19:38Z OWNER

I got Cypress working! I added the datasette.plugins code to the table template and ran a test called plugins.spec.js using the following:

context('datasette.plugins API', () => {
    beforeEach(() => {
      cy.visit('/fixtures/compound_three_primary_keys')
    });
    it('should exist', () => {
        let datasette;
        cy.window().then(win => {
            datasette = win.datasette;
        }).then(() => {
            expect(datasette).to.exist;
            expect(datasette.plugins).to.exist;
        });
    });
    it('should register and execute plugins', () => {
        let datasette;
        cy.window().then(win => {
            datasette = win.datasette;
        }).then(() => {
            expect(datasette.plugins.call('numbers')).to.deep.equal([]);
            // Register a plugin
            datasette.plugins.register("numbers", (a, b) => a + b, ['a', 'b']);
            var result = datasette.plugins.call("numbers", {a: 1, b: 2});
            expect(result).to.deep.equal([3]);
            // Second plugin
            datasette.plugins.register("numbers", (a, b) => a * b, ['a', 'b']);
            var result2 = datasette.plugins.call("numbers", {a: 1, b: 2});
            expect(result2).to.deep.equal([3, 2]);
        });
    });
});
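For reference, a spec like this would normally be run with one of the standard Cypress commands (assuming Cypress is installed in the project - the exact setup isn't shown in this comment):

npx cypress open
npx cypress run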
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for executing JavaScript unit tests 776635426  
752780000 https://github.com/simonw/datasette/issues/1165#issuecomment-752780000 https://api.github.com/repos/simonw/datasette/issues/1165 MDEyOklzc3VlQ29tbWVudDc1Mjc4MDAwMA== simonw 9599 2020-12-30T22:41:25Z 2020-12-30T22:41:25Z OWNER

Jest works with Puppeteer: https://jestjs.io/docs/en/puppeteer

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for executing JavaScript unit tests 776635426  
752779820 https://github.com/simonw/datasette/issues/1165#issuecomment-752779820 https://api.github.com/repos/simonw/datasette/issues/1165 MDEyOklzc3VlQ29tbWVudDc1Mjc3OTgyMA== simonw 9599 2020-12-30T22:40:28Z 2020-12-30T22:40:28Z OWNER

I don't know if Jest on the command-line is the right tool for this. It works for the plugins.js script but I'm increasingly going to want to start adding tests for browser JavaScript features - like the https://github.com/simonw/datasette/blob/0.53/datasette/static/table.js script - which will need to run in a browser.

So maybe I should just find a browser testing solution and figure out how to run that under CI in GitHub Actions. Maybe https://www.cypress.io/ ?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for executing JavaScript unit tests 776635426  
752779490 https://github.com/simonw/datasette/issues/1165#issuecomment-752779490 https://api.github.com/repos/simonw/datasette/issues/1165 MDEyOklzc3VlQ29tbWVudDc1Mjc3OTQ5MA== simonw 9599 2020-12-30T22:38:43Z 2020-12-30T22:38:43Z OWNER
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for executing JavaScript unit tests 776635426  
752777744 https://github.com/simonw/datasette/issues/1165#issuecomment-752777744 https://api.github.com/repos/simonw/datasette/issues/1165 MDEyOklzc3VlQ29tbWVudDc1Mjc3Nzc0NA== simonw 9599 2020-12-30T22:30:24Z 2020-12-30T22:30:24Z OWNER

https://www.valentinog.com/blog/jest/ was useful.

I created a static/__tests__ folder and added this file as plugins.spec.js:

const datasette = require("../plugins.js");

describe("Datasette Plugins", () => {
  test("it should have datasette.plugins", () => {
    expect(!!datasette.plugins).toEqual(true);
  });
  test("registering a plugin should work", () => {
    datasette.plugins.register("numbers", (a, b) => a + b, ["a", "b"]);
    var result = datasette.plugins.call("numbers", { a: 1, b: 2 });
    expect(result).toEqual([3]);
    datasette.plugins.register("numbers", (a, b) => a * b, ["a", "b"]);
    var result2 = datasette.plugins.call("numbers", { a: 1, b: 2 });
    expect(result2).toEqual([3, 2]);
  });
});

In static/plugins.js I put this:

var datasette = datasette || {};
datasette.plugins = (() => {
    var registry = {};
    return {
        register: (hook, fn, parameters) => {
            if (!registry[hook]) {
                registry[hook] = [];
            }
            registry[hook].push([fn, parameters]);
        },
        call: (hook, args) => {
            args = args || {};
            var results = [];
            (registry[hook] || []).forEach(([fn, parameters]) => {
                /* Call with the correct arguments */
                var result = fn.apply(fn, parameters.map(parameter => args[parameter]));
                if (result !== undefined) {
                    results.push(result);
                }
            });
            return results;
        }
    };
})();

module.exports = datasette;

Note the module.exports line at the end.

Then inside static/ I ran the following command:

% npx jest -c '{}'
 PASS  __tests__/plugins.spec.js
  Datasette Plugins
    ✓ it should have datasette.plugins (3 ms)
    ✓ registering a plugin should work (1 ms)

Test Suites: 1 passed, 1 total
Tests:       2 passed, 2 total
Snapshots:   0 total
Time:        1.163 s
Ran all test suites.

The -c {} was necessary because I didn't have a Jest configuration or a package.json.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for executing JavaScript unit tests 776635426  
752773508 https://github.com/simonw/datasette/issues/983#issuecomment-752773508 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc3MzUwOA== simonw 9599 2020-12-30T22:10:08Z 2020-12-30T22:11:34Z OWNER

https://twitter.com/dracos/status/1344402639476424706 points out that plugins returning 0 will be ignored.

This should probably check for result !== undefined instead - which knocks the minified size up to 250 bytes.
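Concretely, the change inside call() would be something like this (variable names as in the versions elsewhere in this thread):

/* Before: a plugin that returns 0, "" or false gets dropped */
if (result) {
    results.push(result);
}
/* After: only drop plugins that return undefined */
if (result !== undefined) {
    results.push(result);
}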

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752770488 https://github.com/simonw/datasette/issues/983#issuecomment-752770488 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc3MDQ4OA== simonw 9599 2020-12-30T21:55:35Z 2020-12-30T21:58:26Z OWNER

This one minifies to 241:

var datasette = datasette || {};
datasette.plugins = (() => {
    var registry = {};
    return {
        register: (hook, fn, parameters) => {
            if (!registry[hook]) {
                registry[hook] = [];
            }
            registry[hook].push([fn, parameters]);
        },
        call: (hook, args) => {
            args = args || {};
            var results = [];
            (registry[hook] || []).forEach(([fn, parameters]) => {
                /* Call with the correct arguments */
                var result = fn.apply(fn, parameters.map(parameter => args[parameter]));
                if (result) {
                    results.push(result);
                }
            });
            return results;
        }
    };
})();

var datasette=datasette||{};datasette.plugins=(()=>{var a={};return{register:(t,r,e)=>{a[t]||(a[t]=[]),a[t].push([r,e])},call:(t,r)=>{r=r||{};var e=[];return(a[t]||[]).forEach(([a,t])=>{var s=a.apply(a,t.map(a=>r[a]));s&&e.push(s)}),e}}})();

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752770133 https://github.com/simonw/datasette/issues/983#issuecomment-752770133 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc3MDEzMw== simonw 9599 2020-12-30T21:53:45Z 2020-12-30T21:54:22Z OWNER

FixMyStreet inlines some JavaScript, and it's always a good idea to copy what they're doing when it comes to web performance: https://github.com/mysociety/fixmystreet/blob/23e9564b58a86b783ce47f3c0bf837cbd4fe7282/templates/web/base/common_header_tags.html#L19-L25

Note var fixmystreet=fixmystreet||{}; which is shorter - https://twitter.com/dracos/status/1344399909794045954

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752769452 https://github.com/simonw/datasette/issues/1164#issuecomment-752769452 https://api.github.com/repos/simonw/datasette/issues/1164 MDEyOklzc3VlQ29tbWVudDc1Mjc2OTQ1Mg== simonw 9599 2020-12-30T21:50:16Z 2020-12-30T21:50:16Z OWNER

If I implement this I can automate the CodeMirror minification and remove the bit about running uglify-js against it from the documentation here: https://docs.datasette.io/en/0.53/contributing.html#upgrading-codemirror

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for minifying JavaScript that ships with Datasette 776634318  
752768785 https://github.com/simonw/datasette/issues/1164#issuecomment-752768785 https://api.github.com/repos/simonw/datasette/issues/1164 MDEyOklzc3VlQ29tbWVudDc1Mjc2ODc4NQ== simonw 9599 2020-12-30T21:47:06Z 2020-12-30T21:47:06Z OWNER

If I'm going to minify table.js I'd like to offer a source map for it.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for minifying JavaScript that ships with Datasette 776634318  
752768652 https://github.com/simonw/datasette/issues/1164#issuecomment-752768652 https://api.github.com/repos/simonw/datasette/issues/1164 MDEyOklzc3VlQ29tbWVudDc1Mjc2ODY1Mg== simonw 9599 2020-12-30T21:46:29Z 2020-12-30T21:46:29Z OWNER
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for minifying JavaScript that ships with Datasette 776634318  
752767500 https://github.com/simonw/datasette/issues/983#issuecomment-752767500 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc2NzUwMA== simonw 9599 2020-12-30T21:42:07Z 2020-12-30T21:42:07Z OWNER

Another option: have both "dev" and "production" versions of the plugin mechanism script, and make it easy to switch between the two. Build JavaScript unit tests that exercise the "production" APIs against the development version, plus extra tests that only exercise features available in the development version.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752767174 https://github.com/simonw/datasette/issues/983#issuecomment-752767174 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc2NzE3NA== simonw 9599 2020-12-30T21:40:44Z 2020-12-30T21:40:44Z OWNER

Started a Twitter thread about this here: https://twitter.com/simonw/status/1344392603794477056

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752751490 https://github.com/simonw/datasette/issues/983#issuecomment-752751490 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc1MTQ5MA== simonw 9599 2020-12-30T20:40:04Z 2020-12-30T21:34:22Z OWNER

This one is 683 bytes with Uglify - I like how https://skalman.github.io/UglifyJS-online/ shows you the minified character count as you edit the script:

window.datasette = window.datasette || {};
window.datasette.plugins = (() => {
    var registry = {};
    var definitions = {};
    var stringify = JSON.stringify;

    function extractParameters(fn) {
        var match = /\((.*)\)/.exec(fn.toString());
        if (match && match[1].trim()) {
            return match[1].split(',').map(s => s.trim());
        } else {
            return [];
        }
    }

    function isSubSet(a, b) {
        return a.every(parameter => b.includes(parameter))
    }

    return {
        _r: registry,
        define: (hook, parameters) => {
            definitions[hook] = parameters || [];
        },
        register: (hook, fn, parameters) => {
            parameters = parameters || extractParameters(fn);
            if (!definitions[hook]) {
                throw 'Hook "' + hook + '" not defined';
            }
            /* Check parameters is a subset of definitions[hook] */
            var validParameters = definitions[hook];
            if (!isSubSet(parameters, validParameters)) {
                throw '"' + hook + '" valid args: ' + stringify(validParameters);
            }
            if (!registry[hook]) {
                registry[hook] = [];
            }
            registry[hook].push([fn, parameters]);
        },

        call: (hook, args) => {
            args = args || {};
            if (!definitions[hook]) {
                throw '"' + hook + '" hook not defined';
            }
            if (!isSubSet(Object.keys(args), definitions[hook])) {
                throw '"' + hook + '" valid args: ' + stringify(definitions[hook]);
            }

            var implementations = registry[hook] || [];
            var results = [];
            implementations.forEach(([fn, parameters]) => {
                /* Call with the correct arguments */
                var callWith = parameters.map(parameter => args[parameter]);
                var result = fn.apply(fn, callWith);
                if (result) {
                    results.push(result);
                }
            });
            return results;
        }       
    };
})();

window.datasette=window.datasette||{},window.datasette.plugins=(()=>{var t={},r={},e=JSON.stringify;function i(t,r){return t.every(t=>r.includes(t))}return{_r:t,define:(t,e)=>{r[t]=e||[]},register:(a,n,o)=>{if(o=o||function(t){var r=/\((.*)\)/.exec(t.toString());return r&&r[1].trim()?r[1].split(",").map(t=>t.trim()):[]}(n),!r[a])throw'Hook "'+a+'" not defined';var d=r[a];if(!i(o,d))throw'"'+a+'" valid args: '+e(d);t[a]||(t[a]=[]),t[a].push([n,o])},call:(a,n)=>{if(n=n||{},!r[a])throw'"'+a+'" hook not defined';if(!i(Object.keys(n),r[a]))throw'"'+a+'" valid args: '+e(r[a]);var o=t[a]||[],d=[];return o.forEach(([t,r])=>{var e=r.map(t=>n[t]),i=t.apply(t,e);i&&d.push(i)}),d}}})();

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752760815 https://github.com/simonw/datasette/issues/983#issuecomment-752760815 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc2MDgxNQ== simonw 9599 2020-12-30T21:15:41Z 2020-12-30T21:15:41Z OWNER

I'm going to write a few example plugins and try them out against the longer and shorter versions of the script, to get a better feel for how useful the longer versions with the error handling and explicit definition actually are.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752760054 https://github.com/simonw/datasette/issues/983#issuecomment-752760054 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc2MDA1NA== simonw 9599 2020-12-30T21:12:36Z 2020-12-30T21:14:05Z OWNER

I gotta admit that 262-byte version is pretty tempting, if it's going to end up in the <head> of every single page.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752759885 https://github.com/simonw/datasette/issues/983#issuecomment-752759885 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc1OTg4NQ== simonw 9599 2020-12-30T21:11:52Z 2020-12-30T21:14:00Z OWNER

262 bytes if I remove the parameter introspection code, instead requiring plugin authors to specify the arguments they take:

window.datasette = window.datasette || {};
window.datasette.plugins = (() => {
    var registry = {};
    return {
        register: (hook, fn, parameters) => {
            if (!registry[hook]) {
                registry[hook] = [];
            }
            registry[hook].push([fn, parameters]);
        },
        call: (hook, args) => {
            args = args || {};
            var results = [];
            (registry[hook] || []).forEach(([fn, parameters]) => {
                /* Call with the correct arguments */
                var callWith = parameters.map(parameter => args[parameter]);
                var result = fn.apply(fn, callWith);
                if (result) {
                    results.push(result);
                }
            });
            return results;
        }
    };
})();

window.datasette=window.datasette||{},window.datasette.plugins=(()=>{var a={};return{register:(t,e,r)=>{a[t]||(a[t]=[]),a[t].push([e,r])},call:(t,e)=>{e=e||{};var r=[];return(a[t]||[]).forEach(([a,t])=>{var s=t.map(a=>e[a]),d=a.apply(a,s);d&&r.push(d)}),r}}})();
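With introspection removed, plugin authors pass the parameter names as the third argument to register() - usage would look something like this:

datasette.plugins.register('numbers', (a, b) => a + b, ['a', 'b']);
datasette.plugins.register('numbers', (a, b) => a * b, ['a', 'b']);
datasette.plugins.call('numbers', {a: 4, b: 6}); /* [10, 24] */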

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752758802 https://github.com/simonw/datasette/issues/983#issuecomment-752758802 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc1ODgwMg== simonw 9599 2020-12-30T21:07:33Z 2020-12-30T21:10:10Z OWNER

Removing the datasette.plugins.define() method and associated error handling reduces the uglified version from 683 bytes to 380 bytes. I think the error checking is worth the extra 303 bytes per page load, even if it's only really needed for a better developer experience.

window.datasette = window.datasette || {};
window.datasette.plugins = (() => {
    var registry = {};

    function extractParameters(fn) {
        var match = /\((.*)\)/.exec(fn.toString());
        if (match && match[1].trim()) {
            return match[1].split(',').map(s => s.trim());
        } else {
            return [];
        }
    }
    return {
        register: (hook, fn, parameters) => {
            parameters = parameters || extractParameters(fn);
            if (!registry[hook]) {
                registry[hook] = [];
            }
            registry[hook].push([fn, parameters]);
        },

        call: (hook, args) => {
            args = args || {};
            var implementations = registry[hook] || [];
            var results = [];
            implementations.forEach(([fn, parameters]) => {
                /* Call with the correct arguments */
                var callWith = parameters.map(parameter => args[parameter]);
                var result = fn.apply(fn, callWith);
                if (result) {
                    results.push(result);
                }
            });
            return results;
        }
    };
})();

window.datasette=window.datasette||{},window.datasette.plugins=(()=>{var t={};return{register:(r,a,e)=>{e=e||function(t){var r=/\((.*)\)/.exec(t.toString());return r&&r[1].trim()?r[1].split(",").map(t=>t.trim()):[]}(a),t[r]||(t[r]=[]),t[r].push([a,e])},call:(r,a)=>{a=a||{};var e=t[r]||[],i=[];return e.forEach(([t,r])=>{var e=r.map(t=>a[t]),n=t.apply(t,e);n&&i.push(n)}),i}}})();

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752757910 https://github.com/simonw/datasette/issues/1165#issuecomment-752757910 https://api.github.com/repos/simonw/datasette/issues/1165 MDEyOklzc3VlQ29tbWVudDc1Mjc1NzkxMA== simonw 9599 2020-12-30T21:04:18Z 2020-12-30T21:04:18Z OWNER

https://jestjs.io/ looks worth trying here.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for executing JavaScript unit tests 776635426  
752757289 https://github.com/simonw/datasette/issues/983#issuecomment-752757289 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc1NzI4OQ== simonw 9599 2020-12-30T21:02:20Z 2020-12-30T21:02:20Z OWNER

I'm going to need to add JavaScript unit tests for this new plugin system.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752757075 https://github.com/simonw/datasette/issues/1164#issuecomment-752757075 https://api.github.com/repos/simonw/datasette/issues/1164 MDEyOklzc3VlQ29tbWVudDc1Mjc1NzA3NQ== simonw 9599 2020-12-30T21:01:27Z 2020-12-30T21:01:27Z OWNER

I don't want Datasette contributors to need a working Node.js install to run the tests or work on Datasette unless they are explicitly working on the JavaScript.

I think I'm going to do this with a unit test that runs only if uglify-js is available on the path and confirms that the *.min.js version of each script in the repository correctly matches the results from running uglify-js against it.

That way if anyone checks in a change to JavaScript but forgets to run the minifier the tests will fail in CI.
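Purely as an illustration of the comparison such a test would make, here's a hypothetical standalone Node sketch (the file names are assumptions; the real thing would be a unit test that skips itself when uglify-js isn't available, as described above):

var fs = require("fs");
var UglifyJS = require("uglify-js");

/* Hypothetical list of scripts that ship with a checked-in .min.js version */
["plugins"].forEach(function (name) {
    var source = fs.readFileSync("datasette/static/" + name + ".js", "utf8");
    var minified = UglifyJS.minify(source);
    if (minified.error) {
        throw minified.error;
    }
    var checkedIn = fs.readFileSync("datasette/static/" + name + ".min.js", "utf8");
    if (checkedIn.trim() !== minified.code.trim()) {
        console.error(name + ".min.js is stale - re-run the minifier");
        process.exitCode = 1;
    }
});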

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for minifying JavaScript that ships with Datasette 776634318  
752756612 https://github.com/simonw/datasette/issues/1164#issuecomment-752756612 https://api.github.com/repos/simonw/datasette/issues/1164 MDEyOklzc3VlQ29tbWVudDc1Mjc1NjYxMg== simonw 9599 2020-12-30T20:59:54Z 2020-12-30T20:59:54Z OWNER

I tried a few different pure-Python JavaScript minifying libraries and none of them produced results as good as https://www.npmjs.com/package/uglify-js for the plugin code I'm considering in #983.

So I think I'll need to rely on a Node.js tool for this.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for minifying JavaScript that ships with Datasette 776634318  
752750551 https://github.com/simonw/datasette/issues/983#issuecomment-752750551 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc1MDU1MQ== simonw 9599 2020-12-30T20:36:38Z 2020-12-30T20:37:48Z OWNER

This version minifies to 702 characters:

window.datasette = window.datasette || {};
window.datasette.plugins = (() => {
    var registry = {};
    var definitions = {};
    var stringify = JSON.stringify;

    function extractParameters(fn) {
        var match = /\((.*)\)/.exec(fn.toString());
        if (match && match[1].trim()) {
            return match[1].split(',').map(s => s.trim());
        } else {
            return [];
        }
    }

    function isSubSet(a, b) {
        return a.every(parameter => b.includes(parameter))
    }

    return {
        _registry: registry,
        define: (hook, parameters) => {
            definitions[hook] = parameters || [];
        },
        register: (hook, fn, parameters) => {
            parameters = parameters || extractParameters(fn);
            if (!definitions[hook]) {
                throw '"' + hook + '" is not a defined hook';
            }
            /* Check parameters is a subset of definitions[hook] */
            var validParameters = definitions[hook];
            if (!isSubSet(parameters, validParameters)) {
                throw '"' + hook + '" valid args are ' + stringify(validParameters);
            }
            if (!registry[hook]) {
                registry[hook] = [];
            }
            registry[hook].push([fn, parameters]);
        },

        call: (hook, args) => {
            args = args || {};
            if (!definitions[hook]) {
                throw '"' + hook + '" hook is not defined';
            }
            if (!isSubSet(Object.keys(args), definitions[hook])) {
                throw '"' + hook + '" valid args: ' + stringify(definitions[hook]);
            }

            var implementations = registry[hook] || [];
            var results = [];
            implementations.forEach(([fn, parameters]) => {
                /* Call with the correct arguments */
                var callWith = parameters.map(parameter => args[parameter]);
                var result = fn.apply(fn, callWith);
                if (result) {
                    results.push(result);
                }
            });
            return results;
        }       
    };
})();

Or 701 characters using https://skalman.github.io/UglifyJS-online/

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752749189 https://github.com/simonw/datasette/issues/983#issuecomment-752749189 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc0OTE4OQ== simonw 9599 2020-12-30T20:31:28Z 2020-12-30T20:31:28Z OWNER

Using raw string exceptions - throw '"' + hook + '" hook has not been defined'; - knocks it down to 795 characters.
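In other words, the only change is dropping the Error wrapper:

/* Before */
throw new Error('"' + hook + '" hook has not been defined');
/* After - throw the string itself */
throw '"' + hook + '" hook has not been defined';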

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752748496 https://github.com/simonw/datasette/issues/983#issuecomment-752748496 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc0ODQ5Ng== simonw 9599 2020-12-30T20:28:48Z 2020-12-30T20:28:48Z OWNER

If I'm going to minify it I'll need to figure out a build step in Datasette itself so that I can easily work on that minified version.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752747999 https://github.com/simonw/datasette/issues/983#issuecomment-752747999 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc0Nzk5OQ== simonw 9599 2020-12-30T20:27:00Z 2020-12-30T20:27:00Z OWNER

I need to decide how this code is going to be loaded. Putting it in a blocking <script> element in the head would work, but I'd rather not block loading of the rest of the page. Using a <script async> method would be nicer, but then I have to worry about plugins attempting to register themselves before the page has fully loaded.

Running it through https://javascript-minifier.com/ produces this, which is 855 characters - so maybe I could inline that into the header of the page?

window.datasette={},window.datasette.plugins=function(){var r={},n={};function e(r,n){return r.every(r=>n.includes(r))}return{define:function(r,e){n[r]=e||[]},register:function(t,i,o){if(o=o||function(r){var n=/\((.*)\)/.exec(r.toString());return n&&n[1].trim()?n[1].split(",").map(r=>r.trim()):[]}(i),!n[t])throw new Error('"'+t+'" is not a defined plugin hook');if(!n[t])throw new Error('"'+t+'" is not a defined plugin hook');var a=n[t];if(!e(o,a))throw new Error('"'+t+'" valid parameters are '+JSON.stringify(a));r[t]||(r[t]=[]),r[t].push([i,o])},_registry:r,call:function(t,i){if(i=i||{},!n[t])throw new Error('"'+t+'" hook has not been defined');if(!e(Object.keys(i),n[t]))throw new Error('"'+t+'" valid arguments are '+JSON.stringify(n[t]));var o=r[t]||[],a=[];return o.forEach(([r,n])=>{var e=n.map(r=>i[r]),t=r.apply(r,e);t&&a.push(t)}),a}}}();

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752747169 https://github.com/simonw/datasette/issues/983#issuecomment-752747169 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc0NzE2OQ== simonw 9599 2020-12-30T20:24:07Z 2020-12-30T20:24:07Z OWNER

This version adds datasette.plugins.define() plus extra validation of both .register() and .call():

window.datasette = {};
window.datasette.plugins = (function() {
    var registry = {};
    var definitions = {};

    function extractParameters(fn) {
        var match = /\((.*)\)/.exec(fn.toString());
        if (match && match[1].trim()) {
            return match[1].split(',').map(s => s.trim());
        } else {
            return [];
        }
    }

    function define(hook, parameters) {
        definitions[hook] = parameters || [];
    }

    function isSubSet(a, b) {
        return a.every(parameter => b.includes(parameter))
    }

    function register(hook, fn, parameters) {
        parameters = parameters || extractParameters(fn);
        if (!definitions[hook]) {
            throw new Error('"' + hook + '" is not a defined plugin hook');
        }
        if (!definitions[hook]) {
            throw new Error('"' + hook + '" is not a defined plugin hook');
        }
        /* Check parameters is a subset of definitions[hook] */
        var validParameters = definitions[hook];
        if (!isSubSet(parameters, validParameters)) {
            throw new Error('"' + hook + '" valid parameters are ' + JSON.stringify(validParameters));
        }
        if (!registry[hook]) {
            registry[hook] = [];
        }
        registry[hook].push([fn, parameters]);
    }

    function call(hook, args) {
        args = args || {};
        if (!definitions[hook]) {
            throw new Error('"' + hook + '" hook has not been defined');
        }
        if (!isSubSet(Object.keys(args), definitions[hook])) {
            throw new Error('"' + hook + '" valid arguments are ' + JSON.stringify(definitions[hook]));
        }

        var implementations = registry[hook] || [];
        var results = [];
        implementations.forEach(([fn, parameters]) => {
            /* Call with the correct arguments */
            var callWith = parameters.map(parameter => args[parameter]);
            var result = fn.apply(fn, callWith);
            if (result) {
                results.push(result);
            }
        });
        return results;
    }
    return {
        define: define,
        register: register,
        _registry: registry,
        call: call
    };
})();

Usage:

datasette.plugins.define('numbers', ['a', 'b'])
datasette.plugins.register('numbers', (a, b) => a + b)
datasette.plugins.register('numbers', (a, b) => a * b)
datasette.plugins.call('numbers', {a: 4, b: 6})
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  
752744311 https://github.com/simonw/datasette/issues/983#issuecomment-752744311 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1Mjc0NDMxMQ== simonw 9599 2020-12-30T20:12:50Z 2020-12-30T20:13:02Z OWNER

This could work to define a plugin hook:

datasette.plugins.define('numbers', ['a', 'b'])
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
JavaScript plugin hooks mechanism similar to pluggy 712260429  

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);