{"html_url": "https://github.com/simonw/datasette/issues/1169#issuecomment-754007242", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1169", "id": 754007242, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDAwNzI0Mg==", "user": {"value": 3637, "label": "benpickles"}, "created_at": "2021-01-04T14:29:57Z", "updated_at": "2021-01-04T14:29:57Z", "author_association": "CONTRIBUTOR", "body": "I somewhat share your reluctance to add a package.json to seemingly every project out there but ultimately if they're project dependencies it's important they're managed within the codebase.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777677671, "label": "Prettier package not actually being cached"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1170#issuecomment-754004715", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1170", "id": 754004715, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDAwNDcxNQ==", "user": {"value": 3637, "label": "benpickles"}, "created_at": "2021-01-04T14:25:44Z", "updated_at": "2021-01-04T14:25:44Z", "author_association": "CONTRIBUTOR", "body": "I was going to re-add the filter to only run Prettier when there have been changes in `datasette/static` but that would mean it wouldn't run when the package is updated. That plus the fact that [the last run of the job took only 8 seconds](https://github.com/benpickles/datasette/runs/1640121514) is why I decided not to re-add the filter.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778126516, "label": "Install Prettier via package.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1170#issuecomment-754002859", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1170", "id": 754002859, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDAwMjg1OQ==", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2021-01-04T14:22:52Z", "updated_at": "2021-01-04T14:22:52Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=h1) Report\n> Merging [#1170](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=desc) (a5761cc) into [main](https://codecov.io/gh/simonw/datasette/commit/1e8fa3ac7cb2d6e516c47c306c86ed2334fc3dc0?el=desc) (1e8fa3a) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1170/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## main #1170 +/- ##\n=======================================\n Coverage 91.55% 91.55% \n=======================================\n Files 32 32 \n Lines 3932 3932 \n=======================================\n Hits 3600 3600 \n Misses 332 332 \n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=footer). 
Last update [1e8fa3a...a5761cc](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778126516, "label": "Install Prettier via package.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-753690280", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 753690280, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY5MDI4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T23:13:30Z", "updated_at": "2021-01-03T23:13:30Z", "author_association": "OWNER", "body": "Oh that's interesting, I hadn't thought about plugins firing events - just responding to events fired by the rest of the application.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753671902", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/219", "id": 753671902, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY3MTkwMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T20:31:04Z", "updated_at": "2021-01-03T20:32:13Z", "author_association": "OWNER", "body": "A `table.has_count_triggers` property.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777707544, "label": "reset_counts() method and command"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753671235", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/219", "id": 753671235, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY3MTIzNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T20:25:10Z", "updated_at": "2021-01-03T20:25:10Z", "author_association": "OWNER", "body": "To detect tables, look at the names of the triggers - `{table}{counts_table}_insert` and `{table}{counts_table}_delete`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777707544, "label": "reset_counts() method and command"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753671009", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/219", "id": 753671009, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY3MTAwOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T20:22:53Z", "updated_at": "2021-01-03T20:22:53Z", "author_association": "OWNER", "body": "I think this should be accompanied by a `sqlite-utils reset-counts` command.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777707544, "label": "reset_counts() method and command"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753670833", "issue_url": 
"https://api.github.com/repos/simonw/sqlite-utils/issues/219", "id": 753670833, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY3MDgzMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T20:20:54Z", "updated_at": "2021-01-03T20:20:54Z", "author_association": "OWNER", "body": "This is a little tricky. We should assume that the existing values in the `_counts` table cannot be trusted at all when this method is called - so we should probably clear that table entirely and then re-populate it.\r\n\r\nBut that means we need to figure out which tables in the database have the counts triggers defined.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777707544, "label": "reset_counts() method and command"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753668099", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753668099, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY2ODA5OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T19:55:53Z", "updated_at": "2021-01-03T19:55:53Z", "author_association": "OWNER", "body": "So if you instantiate the `Database()` constructor with `use_counts_table=True` any access to the `.count` properties will go through this table - otherwise regular `count(*)` queries will be executed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753665521", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753665521, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY2NTUyMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T19:31:33Z", "updated_at": "2021-01-03T19:31:33Z", "author_association": "OWNER", "body": "I'm having second thoughts about this being the default behaviour. It's pretty weird. 
I feel like HUGE databases that need this are rare, so having it on by default doesn't make sense.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753662490", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753662490, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY2MjQ5MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T19:05:53Z", "updated_at": "2021-01-03T19:05:53Z", "author_association": "OWNER", "body": "Idea: a `.execute_count()` method that never uses the cache.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753661292", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753661292, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY2MTI5Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T18:56:06Z", "updated_at": "2021-01-03T18:56:23Z", "author_association": "OWNER", "body": "Another option: on creation of the `Database()` object, check to see if the `_counts` table exists and use that as the default for a `use_counts_table` property. Also flip that property to `True` if the user calls `.enable_counts()` at any time.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753661158", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753661158, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY2MTE1OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T18:55:16Z", "updated_at": "2021-01-03T18:55:16Z", "author_association": "OWNER", "body": "Alternative implementation: provided `db.should_trust_counts` is `True`, try running the query:\r\n```sql\r\nselect count from _counts where [table] = ?\r\n```\r\nIf the query fails to return a result OR throws an error because the table doesn't exist, run the `count(*)` query.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753660814", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753660814, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY2MDgxNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T18:53:05Z", "updated_at": "2021-01-03T18:53:05Z", "author_association": "OWNER", "body": "Here's the current `.count` property: https://github.com/simonw/sqlite-utils/blob/036ec6d32313487527c66dea613a3e7118b97459/sqlite_utils/db.py#L597-L609\r\n\r\nIt's implemented on `Queryable` which means it's available on both `Table` and `View` - the optimization 
doesn't make sense for views.\r\n\r\nI'm a bit cautious about making that property so much more complex. In order to decide if it should try the `_counts` table first it needs to know:\r\n\r\n- Should it be trusting the counts? I'm thinking a `.should_trust_counts` property on `Database` which defaults to `True` would be good - then advanced users can turn that off if they know the counts should not be trusted.\r\n- Does the `_counts` table exist?\r\n- Are the triggers defined?\r\n\r\nThen it can do the query, and if the query fails it can fall back on the `count(*)`. That's quite a lot of extra activity though.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753660379", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753660379, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY2MDM3OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T18:50:15Z", "updated_at": "2021-01-03T18:50:15Z", "author_association": "OWNER", "body": "```python\r\n def cached_counts(self, tables=None):\r\n sql = \"select [table], count from {}\".format(self._counts_table_name)\r\n if tables:\r\n sql += \" where [table] in ({})\".format(\", \".join(\"?\" for table in tables))\r\n return {r[0]: r[1] for r in self.execute(sql, tables).fetchall()}\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/206#issuecomment-753659260", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/206", "id": 753659260, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY1OTI2MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T18:42:01Z", "updated_at": "2021-01-03T18:42:01Z", "author_association": "OWNER", "body": "```\r\n% sqlite-utils insert blah.db blah global_power_plant_database.csv\r\nError: Invalid JSON - use --csv for CSV or --tsv for TSV files\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 761915790, "label": "sqlite-utils should suggest --csv if JSON parsing fails"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1169#issuecomment-753657180", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1169", "id": 753657180, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY1NzE4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T18:23:30Z", "updated_at": "2021-01-03T18:23:30Z", "author_association": "OWNER", "body": "Also welcome in that PR would be a bit of documentation for contributors, see #1167 - but no problem if you leave that out, I'm happy to add it later.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777677671, "label": "Prettier package not actually being cached"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1169#issuecomment-753653260", 
"issue_url": "https://api.github.com/repos/simonw/datasette/issues/1169", "id": 753653260, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY1MzI2MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T17:54:40Z", "updated_at": "2021-01-03T17:54:40Z", "author_association": "OWNER", "body": "And @benpickles yes I would land that pull request straight away as-is. Thanks!", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777677671, "label": "Prettier package not actually being cached"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1169#issuecomment-753653033", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1169", "id": 753653033, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY1MzAzMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T17:52:53Z", "updated_at": "2021-01-03T17:52:53Z", "author_association": "OWNER", "body": "Oh that's so frustrating! I was worried about that - I spotted a few runs that seemed faster and hoped that it meant that the package was coming out of the `~/.npm` cache, but evidently that's not the case.\r\n\r\nYou've convinced me that Datasette itself should have a `package.json` - the Dependabot argument is a really good one.\r\n\r\nBut... I'd really love to figure out a general pattern for using `npx` scripts in GitHub Actions workflows in a cache-friendly way. I have plenty of other projects that I'd love to run Prettier or Uglify or `puppeteer-cli` in without adding a `package.json` to them.\r\n\r\nAny ideas? The best I can think of is for the workflow itself to write out a `package.json` file (using `echo '{ ... }' > package.json`) as part of the run - that way the cache should work (I think) but I don't get a misleading `package.json` file sitting in the repo.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777677671, "label": "Prettier package not actually being cached"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-753600999", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 753600999, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzYwMDk5OQ==", "user": {"value": 475613, "label": "MarkusH"}, "created_at": "2021-01-03T11:11:21Z", "updated_at": "2021-01-03T11:11:21Z", "author_association": "NONE", "body": "With regards to JS/Browser events, given your example of menu items that plugins could add, I could imagine this code to work:\r\n\r\n```js\r\n// as part of datasette\r\ndatasette.events.AddMenuItem = 'DatasetteAddMenuItemEvent';\r\ndocument.addEventListener(datasette.events.AddMenuItem, (e) => {\r\n // do whatever is needed to add the menu item. 
Data comes from `e`\r\n alert(e.title + ' ' + e.link);\r\n});\r\n\r\n// as part of a plugin\r\nconst event = new Event(datasette.events.AddMenuItem, {link: '/foo/bar', title: 'Go somewhere'});\r\nDocument.dispatchEvent(event)\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-753587963", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 753587963, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU4Nzk2Mw==", "user": {"value": 154364, "label": "dracos"}, "created_at": "2021-01-03T09:02:50Z", "updated_at": "2021-01-03T10:00:05Z", "author_association": "NONE", "body": "> but I'm already commited to requiring support for () => {} arrow functions\r\n\r\nDon't think you are :) (e.g. gzipped, using arrow functions in my example saves 2 bytes over spelling out function). On FMS, past month, looking at popular browsers, looks like we'd have 95.41% arrow support, 94.19% module support, and 4.58% (mostly IE9/IE11/Safari 9) supporting neither.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-753570710", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 753570710, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU3MDcxMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T05:29:56Z", "updated_at": "2021-01-03T05:29:56Z", "author_association": "OWNER", "body": "I thought about using browser events, but they don't quite match the API that I'm looking to provide. In particular, the great thing about Pluggy is that if you have multiple handlers registered for a specific plugin hook each of those handlers can return a value, and Pluggy will combine those values into a list of replies.\r\n\r\nThis is great for things like plugin hooks that add extra menu items - each plugin can return a menu item (maybe as a label/URL/click-callback object) and the calling code can then add all of those items to the menu. See https://docs.datasette.io/en/stable/plugin_hooks.html#table-actions-datasette-actor-database-table for a Python example.\r\n\r\nI'm on the fence about relying on JavaScript modules. 
I need to think about browser compatibility for them - but I'm already commited to requiring support for `() => {}` arrow functions so maybe I'm committed to module support too already?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1160#issuecomment-753568428", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1160", "id": 753568428, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2ODQyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T05:02:32Z", "updated_at": "2021-01-03T05:02:32Z", "author_association": "OWNER", "body": "Should this command include a `--fts` option for configuring full-text search on one-or-more columns?\r\n\r\nI thought about doing that for `sqlite-utils insert` in https://github.com/simonw/sqlite-utils/issues/202 and decided not to because of the need to include extra options covering the FTS version, porter stemming options and whether or not to create triggers.\r\n\r\nBut maybe I can set sensible defaults for that with `datasette insert ... -f title -f body`? Worth thinking about a bit more.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 775666296, "label": "\"datasette insert\" command and plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/202#issuecomment-753568264", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/202", "id": 753568264, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2ODI2NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T05:00:24Z", "updated_at": "2021-01-03T05:00:24Z", "author_association": "OWNER", "body": "I'm not going to implement this, because it actually needs several additional options that already exist on `sqlite-utils enable-fts`:\r\n```\r\n --fts4 Use FTS4\r\n --fts5 Use FTS5\r\n --tokenize TEXT Tokenizer to use, e.g. 
porter\r\n --create-triggers Create triggers to update the FTS tables when the\r\n parent table changes.\r\n```\r\nI'd rather not add all four of those options to `sqlite-utils insert` just to support this shortcut.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 738514367, "label": "sqlite-utils insert -f colname - for configuring full-text search"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/202#issuecomment-753567969", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/202", "id": 753567969, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2Nzk2OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T04:55:17Z", "updated_at": "2021-01-03T04:55:43Z", "author_association": "OWNER", "body": "The long version of this can be `--fts`, same as in `csvs-to-sqlite`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 738514367, "label": "sqlite-utils insert -f colname - for configuring full-text search"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567932", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/203", "id": 753567932, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2NzkzMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T04:54:43Z", "updated_at": "2021-01-03T04:54:43Z", "author_association": "OWNER", "body": "Another option: expand the `ForeignKey` object to have `.columns` and `.other_columns` properties in addition to the existing `.column` and `.other_column` properties. These new plural properties would always return a tuple, which would be a one-item tuple for a non-compound-foreign-key.\r\n\r\nThe question then is what should `.column` and `.other_column` return for compound foreign keys?\r\n\r\nI'd be inclined to say they should return `None` - which would trigger errors in code that encounters a compound foreign key for the first time, but those errors would at least be a strong indicator as to what had gone wrong.\r\n\r\nWe can label `.column` and `.other_column` as deprecated and then remove them in `sqlite-utils 4.0`.\r\n\r\nSince this would still be a breaking change in some minor edge-cases I'm thinking maybe 4.0 needs to happen in order to land this feature. 
I'm not opposed to doing that, I was just hoping it might be avoidable.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 743384829, "label": "changes to allow for compound foreign keys"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567744", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/203", "id": 753567744, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2Nzc0NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T04:51:44Z", "updated_at": "2021-01-03T04:51:44Z", "author_association": "OWNER", "body": "One way that this could avoid a breaking change would be to have `fk.column` and `fk.other_column` remain as strings for non-compound-foreign-keys, but turn into tuples for a compound foreign key.\r\n\r\nThis is a bit of an ugly API design, and it could still break existing code that encounters a compound foreign key for the first time - but it would leave code working for the more common case of a non-compound-foreign-key.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 743384829, "label": "changes to allow for compound foreign keys"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567508", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/203", "id": 753567508, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2NzUwOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T04:48:17Z", "updated_at": "2021-01-03T04:48:17Z", "author_association": "OWNER", "body": "Sorry for taking so long to review this!\r\n\r\nThis approach looks great to me - being able to optionally pass a tuple anywhere the API currently expects a column is smart, and it's consistent with how the `pk=` parameter works elsewhere.\r\n\r\nThere's just one problem I can see with this: the way it changes the `ForeignKey(...)` interface to always return a tuple for `.column` and `.other_column`, even if that tuple only contains a single item.\r\n\r\nThis represents a breaking change to the existing API - any code that expects `ForeignKey.column` to be a single string (which is any code that has been written against that) will break.\r\n\r\nAs such, I'd have to bump the major version of `sqlite-utils` to `4.0` in order to ship this.\r\n\r\nIdeally I'd like to make this change in a way that doesn't represent an API compatibility break. 
I need to think a bit harder about how that might be achieved.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 743384829, "label": "changes to allow for compound foreign keys"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/217#issuecomment-753566184", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/217", "id": 753566184, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2NjE4NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T04:27:38Z", "updated_at": "2021-01-03T04:27:38Z", "author_association": "OWNER", "body": "Documented here: https://sqlite-utils.datasette.io/en/latest/python-api.html#quoting-strings-for-use-in-sql", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777543336, "label": "Rename .escape() to .quote()"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/216#issuecomment-753566156", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/216", "id": 753566156, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2NjE1Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T04:27:14Z", "updated_at": "2021-01-03T04:27:14Z", "author_association": "OWNER", "body": "Documented here: https://sqlite-utils.datasette.io/en/latest/python-api.html#introspection", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777540352, "label": "database.triggers_dict introspection property"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/218#issuecomment-753563757", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/218", "id": 753563757, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU2Mzc1Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T03:49:51Z", "updated_at": "2021-01-03T03:49:51Z", "author_association": "OWNER", "body": "Documentation: https://sqlite-utils.datasette.io/en/latest/cli.html#listing-triggers", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777560474, "label": "\"sqlite-utils triggers\" command"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753545757", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753545757, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU0NTc1Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T23:58:07Z", "updated_at": "2021-01-02T23:58:07Z", "author_association": "OWNER", "body": "Thought: maybe there should be a `.reset_counts()` method too, for if the table gets out of date with the triggers.\r\n\r\nOne way that could happen is if a table is dropped and recreated - the counts in the `_counts` table would likely no longer match the number of rows in that table.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753545381", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/215", "id": 753545381, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU0NTM4MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T23:52:52Z", "updated_at": "2021-01-02T23:52:52Z", "author_association": "OWNER", "body": "Idea: a `db.cached_counts()` method that returns a dictionary of data from the `_counts` table. Call it with a list of tables to get back the counts for just those tables.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777535402, "label": "Use _counts to speed up counts"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/217#issuecomment-753544914", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/217", "id": 753544914, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzU0NDkxNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T23:47:42Z", "updated_at": "2021-01-02T23:47:42Z", "author_association": "OWNER", "body": "https://github.com/simonw/sqlite-utils/blob/9a5c92b63e7917c93cc502478493c51c781b2ecc/sqlite_utils/db.py#L231-L239", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777543336, "label": "Rename .escape() to .quote()"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/213#issuecomment-753535488", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/213", "id": 753535488, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzUzNTQ4OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T22:03:48Z", "updated_at": "2021-01-02T22:03:48Z", "author_association": "OWNER", "body": "I got this error while prototyping this:\r\n\r\n too many levels of trigger recursion\r\n\r\nIt looks like that's because SQLite doesn't like triggers on a table that themselves then update that table - so I'm going to exclude the `_counts` table from this mechanism.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777529979, "label": "db.enable_counts() method"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/213#issuecomment-753533775", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/213", "id": 753533775, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzUzMzc3NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T21:47:10Z", "updated_at": "2021-01-02T21:47:10Z", "author_association": "OWNER", "body": "I'm going to skip virtual tables, which I can identify using this property: https://github.com/simonw/sqlite-utils/blob/1cad7fad3e7a5b734088f5cc545b69a055e636da/sqlite_utils/db.py#L720-L726", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777529979, "label": "db.enable_counts() method"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1012#issuecomment-753531657", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1012", "id": 753531657, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzUzMTY1Nw==", 
"user": {"value": 45380, "label": "bollwyvl"}, "created_at": "2021-01-02T21:25:36Z", "updated_at": "2021-01-02T21:25:36Z", "author_association": "CONTRIBUTOR", "body": "Actually, on more research, I found out this is handled by the [trove-classifiers package](https://github.com/pypa/trove-classifiers/blob/master/src/trove_classifiers/__init__.py#L2) now, so it's just a one-liner pr instead of fire-up-a-docker-container-and-do-some-migrations", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 718540751, "label": "For 1.0 update trove classifier in setup.py"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753524779", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753524779, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzUyNDc3OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T20:19:26Z", "updated_at": "2021-01-02T20:19:26Z", "author_association": "OWNER", "body": "Idea: version the metadata scheme. If the table is called `_metadata_v1` it gives me a clear path to designing a new scheme in the future.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/212#issuecomment-753422324", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/212", "id": 753422324, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQyMjMyNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T03:00:34Z", "updated_at": "2021-01-02T03:00:34Z", "author_association": "OWNER", "body": "Here's a prototype:\r\n```python\r\nwith db.conn:\r\n db.conn.executescript(\"\"\"\r\nCREATE TABLE IF NOT EXISTS [_counts] ([table] TEXT PRIMARY KEY, [count] INTEGER DEFAULT 0);\r\nCREATE TRIGGER IF NOT EXISTS [Street_Tree_List_counts_ai] AFTER INSERT ON [Street_Tree_List] BEGIN\r\n INSERT OR REPLACE INTO _counts\r\n VALUES ('Street_Tree_List', COALESCE(\r\n (SELECT count FROM _counts\r\n WHERE [table]='Street_Tree_List'),\r\n 0) + 1);\r\nEND;\r\nCREATE TRIGGER IF NOT EXISTS [Street_Tree_List_counts_ad] AFTER DELETE ON [Street_Tree_List] BEGIN\r\n INSERT OR REPLACE INTO _counts\r\n VALUES ('Street_Tree_List', COALESCE(\r\n (SELECT count FROM _counts\r\n WHERE [table]='Street_Tree_List'),\r\n 0) - 1);\r\nEND;\r\nINSERT OR REPLACE INTO _counts VALUES ('Street_Tree_List', (select count(*) from [Street_Tree_List]));\r\n\"\"\")\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777392020, "label": "Mechanism for maintaining cache of table counts using triggers"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/210#issuecomment-753406744", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/210", "id": 753406744, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQwNjc0NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-02T00:02:39Z", "updated_at": "2021-01-02T00:02:39Z", "author_association": "OWNER", "body": "It looks like https://github.com/ofajardo/pyreadr is a good library for this.\r\n\r\nI won't add this to 
`sqlite-utils` because it's quite a bulky dependency for a relatively small feature.\r\n\r\nNormally I'd write a `rdata-to-sqlite` tool similar to https://pypi.org/project/dbf-to-sqlite/ - but I'm actually working on a new plugin hook for Datasette that might be an even better fit for this. The idea is to allow Datasette plugins to define input formats - such as RData - which would then result in being able to import them on the command-line with `datasette insert my.db file.rdata` or by uploading a file through the Datasette web interface.\r\n\r\nThat work is happening over here: https://github.com/simonw/datasette/issues/1160 - I'll close this issue in favour of a sometime-in-the-future `datasette-import-rdata` plugin.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 767685961, "label": "Support of RData files"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/209#issuecomment-753405835", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/209", "id": 753405835, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQwNTgzNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T23:52:06Z", "updated_at": "2021-01-01T23:52:06Z", "author_association": "OWNER", "body": "I just hit this one too. Such a weird bug!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 766156875, "label": "Test failure with sqlite 3.34 in test_cli.py::test_optimize"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753402423", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753402423, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQwMjQyMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T23:16:05Z", "updated_at": "2021-01-01T23:16:05Z", "author_association": "OWNER", "body": "One catch: solving the \"show me all metadata for everything in this Datasette instance\" problem.\r\n\r\nIdeally there would be a SQLite table that can be queried for this. But the need to resolve the potentially complex set of precedence rules means that table would be difficult if not impossible to provide at run-time.\r\n\r\nIdeally a denormalized table would be available that featured the results of running those precedence rule calculations. But how to handle keeping this up-to-date? It would need to be recalculated any time a `_metadata` table in any of the attached databases had an update.\r\n\r\nThis is a much larger problem - but one potential fix would be to use triggers to maintain a \"version number\" for the `_metadata` table - similar to SQLite's own built-in `schema_version` mechanism. Triggers could increment a counter any time a record in that table was added, deleted or updated.\r\n\r\nSuch a mechanism would have applications outside of just this `_metadata` system. 
The ability to attach a version number to any table and have it automatically incremented when that table changes (via triggers) could help with all kinds of other Datasette-at-scale problems, including things like cached table counts.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753401001", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753401001, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQwMTAwMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T23:01:45Z", "updated_at": "2021-01-01T23:01:45Z", "author_association": "OWNER", "body": "I need to prototype this. Could I do that as a plugin? I think so - I could try out the algorithm for loading metadata and display it on pages using some custom templates.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753400420", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753400420, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQwMDQyMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:53:58Z", "updated_at": "2021-01-01T22:53:58Z", "author_association": "OWNER", "body": "Precedence idea:\r\n- First priority is non-_internal metadata from other databases - if those conflict then the alphabetically-ordered-first database name wins\r\n- Next priority: `_internal` metadata, which should have been loaded from `metadata.json`\r\n- Last priority: the `_metadata` table from that database itself, i.e. 
the default \"baked in\" metadata", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753400306", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753400306, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQwMDMwNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:52:44Z", "updated_at": "2021-01-01T22:52:44Z", "author_association": "OWNER", "body": "Also: probably load column metadata as part of the table metadata rather than loading column metadata individually, since it's going to be rare to want the metadata for a single column rather than for an entire table full of columns.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753400265", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753400265, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzQwMDI2NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:52:09Z", "updated_at": "2021-01-01T22:52:09Z", "author_association": "OWNER", "body": "From an implementation perspective, I think the way this works is SQL queries read the relevant metadata from ALL available metadata tables, then Python code solves the precedence rules to produce the final, combined metadata for a database/table/column.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753399635", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753399635, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM5OTYzNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:45:21Z", "updated_at": "2021-01-01T22:50:21Z", "author_association": "OWNER", "body": "Would also need to figure out the precedence rules:\r\n\r\n- What happens if the database has a `_metadata` table with data that conflicts with a remote metadata record from another database? I think the other database should win, because that allows plugins to over-ride the default metadata for something.\r\n- Do JSON values get merged together? So if one table provides a description and another provides a title do both values get returned?\r\n- If a database has a `license`, does that \"cascade\" down to the tables? What about `source` and `about`?\r\n- What if there are two databases (or more) that provide conflicting metadata for a table in some other database? 
Also, `_internal` may have loaded data from `metadata.json` that conflicts with some other remote table metadata definition.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753399428", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753399428, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM5OTQyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:43:14Z", "updated_at": "2021-01-01T22:43:22Z", "author_association": "OWNER", "body": "Could this use a compound primary key on `database, table, column`? Does that work with null values?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753399366", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753399366, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM5OTM2Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:42:37Z", "updated_at": "2021-01-01T22:42:37Z", "author_association": "OWNER", "body": "So what would the database schema for this look like?\r\n\r\nI'm leaning towards a single table called `_metadata`, because that's a neater fit for baking the metadata into the database file along with the data that it is describing. Alternatively I could have multiple tables sharing that prefix - `_metadata_database` and `_metadata_tables` and `_metadata_columns` perhaps.\r\n\r\nIf it's just a single `_metadata` table, the schema could look like this:\r\n\r\n| database | table | column | metadata |\r\n| --- | --- | --- | --- |\r\n| | mytable | | {\"title\": \"My Table\" } |\r\n| | mytable | mycolumn | {\"description\": \"Column description\" } |\r\n| otherdb | othertable | | {\"description\": \"Table in another DB\" } |\r\n\r\nIf the `database` column is `null` it means \"this is describing a table in the same database file as this `_metadata` table\".\r\n\r\nThe alternative to the `metadata` JSON column would be separate columns for each potential metadata value - `license`, `source`, `about`, `about_url` etc. But that makes it harder for people to create custom metadata fields.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753398542", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753398542, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM5ODU0Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:37:24Z", "updated_at": "2021-01-01T22:37:24Z", "author_association": "OWNER", "body": "The direction I'm leaning in now is the following:\r\n\r\n- Metadata always lives in SQLite tables\r\n- These tables can be co-located with the database they describe (same DB file)\r\n- ... 
or they can be in a different DB file and reference the other database that they are describing\r\n- Metadata provided on startup in a `metadata.json` file is loaded into an in-memory metadata table using that same mechanism\r\n\r\nPlugins that want to provide metadata can do so by populating a table. They could even maintain their own in-memory database for this, or they could write to the `_internal` in-memory database, or they could write to a table in a database on disk.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753392102", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753392102, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM5MjEwMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:06:33Z", "updated_at": "2021-01-01T22:06:33Z", "author_association": "OWNER", "body": "Some SQLite databases include SQL comments in the schema definition which tell you what each column means:\r\n\r\n```sql\r\nCREATE TABLE User\r\n -- A table comment\r\n(\r\n uid INTEGER, -- A field comment\r\n flags INTEGER -- Another field comment\r\n);\r\n```\r\nThe problem with these is that they're not exposed to SQLite in any mechanism other than parsing the `CREATE TABLE` statement from the `sqlite_master` table to extract those columns.\r\n\r\nI had an idea to build a plugin that could return these. That would be easy with a \"get metadata for this column\" plugin hook - in the absence of one a plugin could still run that reads the schemas on startup and uses them to populate a metadata database table somewhere.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753391869", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753391869, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM5MTg2OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:04:30Z", "updated_at": "2021-01-01T22:04:30Z", "author_association": "OWNER", "body": "The sticking point here seems to be the plugin hook. 
Allowing plugins to over-ride the way the question \"give me the metadata for this database/table/column\" is answered makes the database-backed metadata mechanisms much more complicated to think about.\r\n\r\nWhat if plugins didn't get to over-ride metadata in this way, but could instead update the metadata in a persistent Datasette-managed storage mechanism?\r\n\r\nThen maybe Datasette could do the following:\r\n\r\n- Maintain metadata in `_internal` that has been loaded from `metadata.json`\r\n- Know how to check a database for baked-in metadata (maybe in a `_metadata` table)\r\n- Know how to fall back on the `_internal` metadata if no baked-in metadata is available\r\n\r\nIf database files were optionally allowed to store metadata about tables that live in another database file this could perhaps solve the plugin needs - since an \"edit metadata\" plugin would be able to edit records in a separate, dedicated `metadata.db` database to store new information about tables in other files.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753390791", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753390791, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM5MDc5MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T22:00:42Z", "updated_at": "2021-01-01T22:00:42Z", "author_association": "OWNER", "body": "Here are the requirements I'm currently trying to satisfy:\r\n\r\n- It should be possible to query the metadata for ALL attached tables in one place, potentially with pagination and filtering\r\n- Metadata should be able to exist in the current `metadata.json` file\r\n- It should also be possible to bundle metadata in a table in the SQLite database files themselves\r\n- Plugins should be able to define their own special mechanisms for metadata. This is particularly interesting for providing a UI that allows users to edit the metadata for their existing tables.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753390262", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753390262, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM5MDI2Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T21:58:11Z", "updated_at": "2021-01-01T21:58:11Z", "author_association": "OWNER", "body": "One possibility: plugins could write directly to that in-memory database table. But how would they know to write again should the server restart? Maybe they would write to it once when called by the `startup` plugin hook, and then update it (and their own backing store) when metadata changes for some reason. Feels a bit messy though.\r\n\r\nAlso: if I want to support metadata optionally living in a `_metadata` table colocated with the data in a SQLite database file itself, how would that affect the `metadata` columns in `_internal`? 
How often would Datasette denormalize and copy data across from the on-disk `_metadata` tables to the `_internal` in-memory columns?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753389938", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753389938, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM4OTkzOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T21:54:15Z", "updated_at": "2021-01-01T21:54:15Z", "author_association": "OWNER", "body": "So what if the `databases`, `tables` and `columns` tables in `_internal` each grew a new `metadata` text column?\r\n\r\nThese columns could be populated by Datasette on startup through reading the `metadata.json` file. But how would plugins interact with them?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753389477", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753389477, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM4OTQ3Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T21:49:57Z", "updated_at": "2021-01-01T21:49:57Z", "author_association": "OWNER", "body": "What if metadata was stored in a JSON text column in the existing `_internal` tables? This would allow for users to invent additional metadata fields in the future beyond the current `license`, `license_url` etc fields - without needing a schema change.\r\n\r\nThe downside of JSON columns generally is that they're harder to run indexed queries against. For metadata I don't think that matters - even with 10,000 tables each with their own metadata a SQL query asking for e.g. \"everything that has Apache 2 as the license\" would return in just a few ms.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753388809", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753388809, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM4ODgwOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T21:47:51Z", "updated_at": "2021-01-01T21:47:51Z", "author_association": "OWNER", "body": "A database that exposes metadata will have the same restriction as the new `_internal` database that exposes columns and tables, in that it needs to take permissions into account. 
A user should not be able to view metadata for tables that they are not able to see.\r\n\r\nAs such, I'd rather bundle any metadata tables into the existing `_internal` database so I don't have to solve that permissions problem in two places.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1168#issuecomment-753366024", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1168", "id": 753366024, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzM2NjAyNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-01T18:48:34Z", "updated_at": "2021-01-01T18:48:34Z", "author_association": "OWNER", "body": "Also: in #188 I proposed bundling metadata in the SQLite database itself alongside the data. This is a great way of ensuring metadata travels with the data when it is downloaded as a SQLite `.db` file. But how would that play with the idea of an in-memory `_metadata` table? Could that table perhaps offer views that join data across multiple attached physical databases?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777333388, "label": "Mechanism for storing metadata in _metadata tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-753224999", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 753224999, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzIyNDk5OQ==", "user": {"value": 11941245, "label": "jussiarpalahti"}, "created_at": "2020-12-31T23:29:36Z", "updated_at": "2020-12-31T23:29:36Z", "author_association": "NONE", "body": "I have yet to build Datasette plugin and am unfamiliar with Pluggy. Since browsers have event handling builtin Datasette could communicate with plugins through it. Handlers register as listeners for custom Datasette events and Datasette's JS can then trigger said events.\r\n\r\nI was also wondering if you had looked at Javascript Modules for JS plugins? With services like Skypack (https://www.skypack.dev) NPM libraries can be loaded directly into browser, no build step needed. Same goes for local JS if you adhere to ES Module spec. \r\n\r\nIf minification is required then tools such as Snowpack (https://www.snowpack.dev) could fit better. 
It uses https://github.com/evanw/esbuild for bundling and minification.\r\n\r\nOn plugins you'd simply:\r\n\r\n```javascript\r\nimport {register} from '/assets/js/datasette'\r\nregister.on({'click' : my_func})\r\n```\r\n\r\nIn Datasette HTML pages' head you'd merely import these files as modules one by one.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1166#issuecomment-753224351", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1166", "id": 753224351, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzIyNDM1MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-31T23:23:29Z", "updated_at": "2020-12-31T23:23:29Z", "author_association": "OWNER", "body": "I should configure the action to only run if changes have been made within the `datasette/static` directory.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777140799, "label": "Adopt Prettier for JavaScript code formatting"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-753221646", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 753221646, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzIyMTY0Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-31T22:58:47Z", "updated_at": "2020-12-31T22:58:47Z", "author_association": "OWNER", "body": "https://github.com/mishoo/UglifyJS/issues/1905#issuecomment-300485490 says:\r\n\r\n> `sourceMappingURL` aren't added by default in `3.x` due to one of the feature requests not to - some users are putting them within HTTP response headers instead.\r\n> \r\n> So the command line for that would be:\r\n> \r\n> ```js\r\n> $ uglifyjs main.js -cmo main.min.js --source-map url=main.min.js.map\r\n> ```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1164#issuecomment-753221362", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1164", "id": 753221362, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzIyMTM2Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-31T22:55:57Z", "updated_at": "2020-12-31T22:55:57Z", "author_association": "OWNER", "body": "I had to add this as the first line in `table.min.js` for the source mapping to work:\r\n```\r\n//# sourceMappingURL=/-/static/table.min.js.map\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 776634318, "label": "Mechanism for minifying JavaScript that ships with Datasette"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1164#issuecomment-753220665", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1164", "id": 753220665, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzIyMDY2NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": 
"2020-12-31T22:49:36Z", "updated_at": "2020-12-31T22:49:36Z", "author_association": "OWNER", "body": "I started with a 7K `table.js` file.\r\n\r\n`npx uglifyjs table.js --source-map -o table.min.js` gave me a 5.6K `table.min.js` file. \r\n\r\n`npx uglifyjs table.js --source-map -o table.min.js --compress --mangle` gave me 4.5K.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 776634318, "label": "Mechanism for minifying JavaScript that ships with Datasette"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1164#issuecomment-753220412", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1164", "id": 753220412, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzIyMDQxMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-31T22:47:36Z", "updated_at": "2020-12-31T22:47:36Z", "author_association": "OWNER", "body": "I'm trying to minify `table.js` and I ran into a problem:\r\n\r\n Uglification failed. Unexpected character '`'\r\n\r\nIt turns out `uglify-js` doesn't support ES6 syntax!\r\n\r\nBut `uglify-es` does:\r\n\r\n npm install uglify-es\r\n\r\nAnnoyingly it looks like `uglify-es` uses the same CLI command, `uglifyjs`. So after installing it this seemed to work:\r\n\r\n npx uglifyjs table.js --source-map -o table.min.js\r\n\r\nI really don't like how `npx uglifyjs` could mean different things depending on which package was installed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 776634318, "label": "Mechanism for minifying JavaScript that ships with Datasette"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-753219521", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 753219521, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzIxOTUyMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-31T22:39:52Z", "updated_at": "2020-12-31T22:39:52Z", "author_association": "OWNER", "body": "For inlining the `plugins.min.js` file into the Jinja templates I could use the trick described here: https://stackoverflow.com/a/41404611 - which adds a `{{ include_file('file.txt') }}` function to Jinja.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712260429, "label": "JavaScript plugin hooks mechanism similar to pluggy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/983#issuecomment-753219407", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/983", "id": 753219407, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzIxOTQwNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-31T22:38:45Z", "updated_at": "2020-12-31T22:39:10Z", "author_association": "OWNER", "body": "You'll be able to add JavaScript plugins using a bunch of different mechanisms:\r\n\r\n- In a custom template, dropping the code in to a `