{"html_url": "https://github.com/simonw/datasette/issues/1880#issuecomment-1317420812", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1880", "id": 1317420812, "node_id": "IC_kwDOBm6k_c5Ohj8M", "user": {"value": 525934, "label": "amitkoth"}, "created_at": "2022-11-16T17:50:29Z", "updated_at": "2022-11-16T17:50:29Z", "author_association": "NONE", "body": "I appreciate your response @simonw - thanks!\r\n\r\nI'll clarify what we need further - let's imagine we have 2000 SQLLite databases (for 2000 tenants), but we only want to run _one_ datasette instance for each of those tenants to query/use datasette against their _own_ database only. This means the \"connection\" between datasette and the SQLLite database would be dynamic, based on the tenantID that's required on an incoming request. \r\n\r\nIs there any specific config or other considerations in this use case, to minimize memory use on a single, efficient VM and serve queries to all these tenants? \r\n\r\ncc @muadham", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1433576351, "label": "Datasette with many and large databases > Memory use"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1880#issuecomment-1311273063", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1880", "id": 1311273063, "node_id": "IC_kwDOBm6k_c5OKHBn", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-11-11T06:15:28Z", "updated_at": "2022-11-11T06:15:28Z", "author_association": "OWNER", "body": "The `_internal` database is intended to help Datasette handle much larger attached databases. Right now Datasette attempts to show every database on the https://latest.datasette.io/ index page and every table on the https://latest.datasette.io/fixtures database index page - but these are not paginated. If you had a database containing 1,000 tables the database index page would get pretty slow.\r\n\r\nSo I want to be able to paginate (and search) those. But to paginate them it's useful to have them in a database table itself, since then I can paginate using SQL.\r\n\r\nMy plan for `_internal` is to use it to implement those advanced browsing features. I've not completed this work yet though. See this issue for more details on that:\r\n\r\n- #417", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1433576351, "label": "Datasette with many and large databases > Memory use"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1880#issuecomment-1311271298", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1880", "id": 1311271298, "node_id": "IC_kwDOBm6k_c5OKGmC", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-11-11T06:12:29Z", "updated_at": "2022-11-11T06:12:29Z", "author_association": "OWNER", "body": "I think you may have misunderstood this feature. This is talking about the `_internal` in-memory database, which maintains a set of tables that list the databases and tables that are attached to Datasette.\r\n\r\nThey're not a copy of the data itself - just a list of table names, column names and database names.\r\n\r\nYou can see what that database looks like by signing in as root - running `datasette --root` and clicking the link. 
Or you can see an example here:\r\n\r\n- Click the button on https://latest.datasette.io/login-as-root\r\n- Now visit https://latest.datasette.io/_internal\r\n\r\nFor the example instance, that looks like this:\r\n\r\n[screenshot of the _internal database index page]\r\n\r\nThe two most interesting tables in there are these ones:\r\n\r\n[screenshots of those two tables]\r\n\r\nAs you can see, it's just the table schemas themselves and the columns that make up the tables. Even if you have hundreds of databases connected, each with hundreds of tables, this should still only add up to a few MB of RAM.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1433576351, "label": "Datasette with many and large databases > Memory use"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1880#issuecomment-1301043042", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1880", "id": 1301043042, "node_id": "IC_kwDOBm6k_c5NjFdi", "user": {"value": 525934, "label": "amitkoth"}, "created_at": "2022-11-02T18:20:14Z", "updated_at": "2022-11-02T18:20:14Z", "author_association": "NONE", "body": "Follow-on question @simonw - is all memory use, for both Datasette and SQLite, confined to the \"query time\" itself, i.e. is memory only used for a particular transaction or query and then subsequently released?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1433576351, "label": "Datasette with many and large databases > Memory use"}, "performed_via_github_app": null}
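To make the memory point above concrete, here is a minimal Python sketch of the idea behind `_internal` as described in these comments: build an in-memory catalog that records only database, table and column names from a set of attached SQLite files, never the row data. This is not Datasette's actual implementation - the `catalog_tables`/`catalog_columns` schema and the `*.db` glob are invented purely for illustration.

```python
import glob
import sqlite3

# In-memory catalog holding only names, never row data.
# Illustrative schema only - NOT Datasette's real _internal layout.
catalog = sqlite3.connect(":memory:")
catalog.executescript("""
    CREATE TABLE catalog_tables (database_name TEXT, table_name TEXT);
    CREATE TABLE catalog_columns (database_name TEXT, table_name TEXT, column_name TEXT);
""")

# Assumes a directory of tenant databases, e.g. tenant-0001.db ... tenant-2000.db
for path in sorted(glob.glob("*.db")):
    database_name = path.rsplit(".", 1)[0]
    conn = sqlite3.connect(f"file:{path}?mode=ro", uri=True)  # open read-only
    try:
        table_names = [row[0] for row in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'")]
        for table_name in table_names:
            catalog.execute("INSERT INTO catalog_tables VALUES (?, ?)",
                            (database_name, table_name))
            # pragma_table_info() returns one row per column of the table
            for (column_name,) in conn.execute(
                    "SELECT name FROM pragma_table_info(?)", (table_name,)):
                catalog.execute("INSERT INTO catalog_columns VALUES (?, ?, ?)",
                                (database_name, table_name, column_name))
    finally:
        conn.close()

# The catalog is itself a SQLite table, so it can be paginated and searched with SQL:
for row in catalog.execute(
        "SELECT database_name, table_name FROM catalog_tables "
        "ORDER BY database_name, table_name LIMIT 20"):
    print(row)
```

Even with 2,000 tenant databases, a catalog like this holds only short strings, which is why the comment above expects it to add up to a few MB of RAM; the tenants' data itself is never loaded into memory.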