html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,issue,performed_via_github_app https://github.com/simonw/sqlite-utils/issues/595#issuecomment-1733312349,https://api.github.com/repos/simonw/sqlite-utils/issues/595,1733312349,IC_kwDOCGYnMM5nUD9d,123451970,2023-09-25T09:38:13Z,2023-09-25T09:38:57Z,NONE,"Never mind When I created the connection using `sqlite_utils.Database(path)` I just needed to add the following statement right after and it did the trick `self.db.conn.execute(""PRAGMA foreign_keys = ON"")` Hope this helps people in the future 👍 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1907281675, https://github.com/simonw/sqlite-utils/issues/425#issuecomment-1129332959,https://api.github.com/repos/simonw/sqlite-utils/issues/425,1129332959,IC_kwDOCGYnMM5DUEDf,102771161,2022-05-17T21:27:02Z,2022-05-17T21:27:02Z,NONE,"Hi, I'm trying to deploy my site using elasticbeanstalk and I keep getting this same error : deterministic=True requires SQLite 3.8.3 or higher I saw your previous solution that involves editing sqlite-utils/sqlite_utils/db.py file, but I'm curious as to how that will work in production.","{""total_count"": 5, ""+1"": 5, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1203842656, https://github.com/simonw/datasette/issues/1426#issuecomment-985982668,https://api.github.com/repos/simonw/datasette/issues/1426,985982668,IC_kwDOBm6k_c46xObM,95520595,2021-12-04T07:11:29Z,2021-12-04T07:11:29Z,NONE,You can generate xml site map from the online tools using https://tools4seo.site/xml-sitemap-generator. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",964322136, https://github.com/simonw/sqlite-utils/issues/425#issuecomment-1509951952,https://api.github.com/repos/simonw/sqlite-utils/issues/425,1509951952,IC_kwDOCGYnMM5aAAnQ,89400147,2023-04-15T20:14:58Z,2023-04-15T20:14:58Z,NONE,is this change released ? Because when we run docker containers issue still persists for production deployments.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1203842656, https://github.com/simonw/datasette/pull/2031#issuecomment-1483248966,https://api.github.com/repos/simonw/datasette/issues/2031,1483248966,IC_kwDOBm6k_c5YaJVG,82332573,2023-03-24T18:35:24Z,2023-03-24T18:35:24Z,NONE,I've rebased my patch on the latest main. It should be ready to merge.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1605481359, https://github.com/simonw/datasette/pull/2031#issuecomment-1457243738,https://api.github.com/repos/simonw/datasette/issues/2031,1457243738,IC_kwDOBm6k_c5W28Za,82332573,2023-03-07T00:05:25Z,2023-03-07T00:12:09Z,NONE,"I've implemented the test (thanks for pointing me in the right direction!). At [tmcl-it/datasette:0.64.1+row-view-expand-labels](https://github.com/tmcl-it/datasette/tree/0.64.1%2Brow-view-expand-labels) I also have a variant of this patch that applies to the 0.64.x branch. 
Please let me know if you'd be interested in merging that as well and I'll open another PR.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1605481359, https://github.com/simonw/datasette/issues/2023#issuecomment-1425988018,https://api.github.com/repos/simonw/datasette/issues/2023,1425988018,IC_kwDOBm6k_c5U_tmy,80409402,2023-02-10T15:39:59Z,2023-02-10T15:39:59Z,NONE,"Thanks for confirming my doubts! I removed it after opening this issue, yup, then had another issue with `default_cache_ttl_hashed` which I assume was removed at the same time. Sorry for not trying that before opening the issue.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1579695809, https://github.com/simonw/sqlite-utils/issues/433#issuecomment-1640826795,https://api.github.com/repos/simonw/sqlite-utils/issues/433,1640826795,IC_kwDOCGYnMM5hzQer,76528036,2023-07-18T19:08:50Z,2023-07-18T19:08:50Z,NONE,"Came here to report this, but instead I'll confirm the issue across two terminal emulators (Gnome Terminal and Alacritty) on Pop_OS! 22.04 (currently based on Ubuntu/Gnome). Also messes up the formatting of the terminal. Can also confirm that reset fixes it until the next sqlite-utils command. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1239034903, https://github.com/simonw/datasette/issues/1479#issuecomment-930071625,https://api.github.com/repos/simonw/datasette/issues/1479,930071625,IC_kwDOBm6k_c43b8RJ,76450761,2021-09-29T11:01:30Z,2021-09-29T11:01:30Z,NONE,"Thanks, but this one has a different error type. Unfortunately, still not working.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1010112818, https://github.com/simonw/datasette/issues/370#issuecomment-1261930179,https://api.github.com/repos/simonw/datasette/issues/370,1261930179,IC_kwDOBm6k_c5LN4bD,72577720,2022-09-29T08:17:46Z,2022-09-29T08:17:46Z,CONTRIBUTOR,"Just watched this video which demonstrates the integration of *any* webapp into JupyterLab: https://youtu.be/FH1dKKmvFtc Maybe this is the answer?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377155320, https://github.com/simonw/datasette/issues/1396#issuecomment-946467547,https://api.github.com/repos/simonw/datasette/issues/1396,946467547,IC_kwDOBm6k_c44afLb,72577720,2021-10-19T08:10:26Z,2021-10-19T08:10:26Z,CONTRIBUTOR,"Now that 0.59 has excellent annotated release notes, you can re-confirm this is fixed by updating the published Docker image and checking that these fixes still work ;-)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",944903881, https://github.com/simonw/datasette/issues/1826#issuecomment-1260373403,https://api.github.com/repos/simonw/datasette/issues/1826,1260373403,IC_kwDOBm6k_c5LH8Wb,66709385,2022-09-28T04:30:27Z,2022-09-28T04:30:27Z,NONE,"I'm glad the bug report served some purpose. Frankly I just needed the method signature, that is why the documentation you mention wasn't read. On Tue, Sep 27, 2022, 9:05 PM Simon Willison ***@***.***> wrote: > Though now I notice that the copy right there needs to be updated to > reflect the new row parameter to render_cell! 
> > — > Reply to this email directly, view it on GitHub > , > or unsubscribe > > . > You are receiving this because you authored the thread.Message ID: > ***@***.***> > ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1388631785, https://github.com/simonw/sqlite-utils/issues/319#issuecomment-905024066,https://api.github.com/repos/simonw/sqlite-utils/issues/319,905024066,IC_kwDOCGYnMM418ZJC,66709385,2021-08-24T22:41:39Z,2021-08-24T22:41:39Z,NONE,"I'm happy with this functionality left the way you describe. In my case the data is homogeneous but other cases would work just by being consistent on the encoding. Thanks a lot, Simon!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",976399638, https://github.com/simonw/sqlite-utils/issues/319#issuecomment-905021010,https://api.github.com/repos/simonw/sqlite-utils/issues/319,905021010,IC_kwDOCGYnMM418YZS,66709385,2021-08-24T22:33:42Z,2021-08-24T22:33:42Z,NONE,"Oh, I misread. Yes some files will not be valid UTF-8, I'd throw a warning and continue (not adding that file) but if you want to get more elaborate you could allow to define a policy on what to do. Not adding the file, index binary content or use a conversion policy like the ones available on Python's decode. From https://stackoverflow.com/questions/24616678/unicodedecodeerror-in-python-when-reading-a-file-how-to-ignore-the-error-and-ju : - 'ignore' ignores errors. Note that ignoring encoding errors can lead to data loss. - 'replace' causes a replacement marker (such as '?') to be inserted where there is malformed data. - 'surrogateescape' will represent any incorrect bytes as code points in the Unicode Private Use Area ranging from U+DC80 to U+DCFF. These private code points will then be turned back into the same bytes when the surrogateescape error handler is used when writing data. This is useful for processing files in an unknown encoding. - 'xmlcharrefreplace' is only supported when writing to a file. Characters not supported by the encoding are replaced with the appropriate XML character reference &#nnn;. - 'backslashreplace' (also only supported when writing) replaces unsupported characters with Python’s backslashed escape sequences.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",976399638, https://github.com/simonw/sqlite-utils/issues/319#issuecomment-905003381,https://api.github.com/repos/simonw/sqlite-utils/issues/319,905003381,IC_kwDOCGYnMM418UF1,66709385,2021-08-24T21:56:49Z,2021-08-24T21:56:49Z,NONE,"I was thinking that an approach could be making FILE_COLUMNS a generator (_get_file_columns(mode)) or you can just have a different set of columns (is there something else that makes sense to be changed on the text scenario?). About UTF-8 I was referring to the encoding to use when reading files. 
This can be difficult to auto-detect but I believe that UTF-8 is pretty much the standard for text files.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",976399638, https://github.com/simonw/datasette/issues/777#issuecomment-635513983,https://api.github.com/repos/simonw/datasette/issues/777,635513983,MDEyOklzc3VlQ29tbWVudDYzNTUxMzk4Mw==,63653929,2020-05-28T18:16:49Z,2020-05-28T18:16:49Z,NONE," think, because the given URL of the CSS file doesn't have any complete parameters after query Try to complete the parameter ``","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",626171242, https://github.com/simonw/sqlite-utils/issues/26#issuecomment-1170595021,https://api.github.com/repos/simonw/sqlite-utils/issues/26,1170595021,IC_kwDOCGYnMM5FxdzN,60892516,2022-06-29T23:35:29Z,2022-06-29T23:35:29Z,NONE,"Have you seen [MakeTypes](https://github.com/jvilk/MakeTypes)? Not the exact same thing but it may be relevant. And it's inspired by the paper [""Types from Data: Making Structured Data First-Class Citizens in F#""](https://dl.acm.org/citation.cfm?id=2908115).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",455486286, https://github.com/simonw/datasette/issues/394#issuecomment-642522285,https://api.github.com/repos/simonw/datasette/issues/394,642522285,MDEyOklzc3VlQ29tbWVudDY0MjUyMjI4NQ==,58298410,2020-06-11T09:15:19Z,2020-06-11T09:15:19Z,NONE,"Hi @wragge, This looks great, thanks for the share! I refactored it into a self-contained function, binding on a random available TCP port (multi-user context). I am using subprocess API directly since the `%run` magic was leaving defunct process behind :/ ![image](https://user-images.githubusercontent.com/58298410/84367566-b5d0d500-abd4-11ea-96e2-f5c05a28e506.png) ```python import socket from signal import SIGINT from subprocess import Popen, PIPE from IPython.display import display, HTML from notebook.notebookapp import list_running_servers def get_free_tcp_port(): """""" Get a free TCP port. """""" tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM) tcp.bind(('', 0)) _, port = tcp.getsockname() tcp.close() return port def datasette(database): """""" Run datasette on an SQLite database. """""" # Get current running servers servers = list_running_servers() # Get the current base url base_url = next(servers)['base_url'] # Get a free port port = get_free_tcp_port() # Create a base url for Datasette suing the proxy path proxy_url = f'{base_url}proxy/absolute/{port}/' # Display a link to Datasette display(HTML(f'

View Datasette (Click on the stop button to close the Datasette server)

')) # Launch Datasette with Popen( [ 'python', '-m', 'datasette', '--', database, '--port', str(port), '--config', f'base_url:{proxy_url}' ], stdout=PIPE, stderr=PIPE, bufsize=1, universal_newlines=True ) as p: print(p.stdout.readline(), end='') while True: try: line = p.stderr.readline() if not line: break print(line, end='') exit_code = p.poll() except KeyboardInterrupt: p.send_signal(SIGINT) ``` Ideally, I'd like some extra magic to notify users when they are leaving the closing the notebook tab and make them terminate the running datasette processes. I'll be looking for it.","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",396212021, https://github.com/simonw/datasette/issues/394#issuecomment-641889565,https://api.github.com/repos/simonw/datasette/issues/394,641889565,MDEyOklzc3VlQ29tbWVudDY0MTg4OTU2NQ==,58298410,2020-06-10T09:49:34Z,2020-06-10T09:49:34Z,NONE,"Hi, I came across this issue while looking for a way to spawn Datasette as a SQLite files viewer in JupyterLab. I found https://github.com/simonw/jupyterserverproxy-datasette-demo which seems to be the most up to date proof of concept, but it seems to be failing to list the available db (at least in the Binder demo, https://hub.gke.mybinder.org/user/simonw-jupyters--datasette-demo-uw4dmlnn/datasette/, I only have `:memory`). Does anyone tried to improve on this proof of concept to have a Datasette visualization for SQLite files? Thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",396212021, https://github.com/simonw/datasette/issues/676#issuecomment-590209074,https://api.github.com/repos/simonw/datasette/issues/676,590209074,MDEyOklzc3VlQ29tbWVudDU5MDIwOTA3NA==,58088336,2020-02-24T08:20:15Z,2020-02-24T08:20:15Z,NONE,"Awesome, thank you so much. I’ll try it out and let you know. On Sun, Feb 23, 2020 at 1:44 PM Simon Willison wrote: > You can try this right now like so: > > pip install https://github.com/simonw/datasette/archive/search-raw.zip > > Then use the following: > > ?_search=foo*&_searchmode=raw` > > — > You are receiving this because you authored the thread. > Reply to this email directly, view it on GitHub > , > or unsubscribe > > . > ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",568091133, https://github.com/simonw/datasette/issues/676#issuecomment-589922016,https://api.github.com/repos/simonw/datasette/issues/676,589922016,MDEyOklzc3VlQ29tbWVudDU4OTkyMjAxNg==,58088336,2020-02-22T05:50:10Z,2020-02-22T05:50:10Z,NONE,"Thanks Simon, My use case is using Datasette for full text search type ahead. That was working pretty well. The _search_wildcard= option will be awesome. Thanks ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",568091133, https://github.com/simonw/datasette/issues/858#issuecomment-858831895,https://api.github.com/repos/simonw/datasette/issues/858,858831895,MDEyOklzc3VlQ29tbWVudDg1ODgzMTg5NQ==,56045588,2021-06-10T17:44:09Z,2021-06-10T17:44:09Z,NONE,"any fixes for that recursive issue with temp file? 
I get it using both heroku and cloudrun, although it seems to still publish and deploy fine","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",642388564, https://github.com/simonw/datasette/issues/858#issuecomment-858813675,https://api.github.com/repos/simonw/datasette/issues/858,858813675,MDEyOklzc3VlQ29tbWVudDg1ODgxMzY3NQ==,56045588,2021-06-10T17:27:46Z,2021-06-10T17:27:46Z,NONE,shell=True is added to line 56 (I guess it used to be 54) of heroku.py as detailed in the original issue. (for posterity),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",642388564, https://github.com/simonw/datasette/issues/1241#issuecomment-1692322342,https://api.github.com/repos/simonw/datasette/issues/1241,1692322342,IC_kwDOBm6k_c5k3som,52261150,2023-08-24T19:56:15Z,2023-08-24T20:09:52Z,NONE,"Something to think about, but I hate how long the url is when sharing a custom SQL query. Would it be possible to hash the query and state of a page instead so the url is more manageable? The mapping from hash to query would have to be stored in order to recover/lookup the page after sharing. It's not uncommon to have things like this currently: ```https://global-power-plants.datasettes.com/global-power-plants?sql=select+rowid%2C+country%2C+country_long%2C+name%2C+gppd_idnr%2C+capacity_mw%2C+latitude%2C+longitude%2C+primary_fuel%2C+other_fuel1%2C+other_fuel2%2C+other_fuel3%2C+commissioning_year%2C+owner%2C+source%2C+url%2C+geolocation_source%2C+wepp_id%2C+year_of_capacity_data%2C+generation_gwh_2013%2C+generation_gwh_2014%2C+generation_gwh_2015%2C+generation_gwh_2016%2C+generation_gwh_2017%2C+generation_gwh_2018%2C+generation_gwh_2019%2C+generation_data_source%2C+estimated_generation_gwh_2013%2C+estimated_generation_gwh_2014%2C+estimated_generation_gwh_2015%2C+estimated_generation_gwh_2016%2C+estimated_generation_gwh_2017%2C+estimated_generation_note_2013%2C+estimated_generation_note_2014%2C+estimated_generation_note_2015%2C+estimated_generation_note_2016%2C+estimated_generation_note_2017+from+%5Bglobal-power-plants%5D+order+by+rowid+limit+101``` I'm thinking a plugin like [https://datasette.io/plugins/datasette-query-files](https://datasette.io/plugins/datasette-query-files), but could be created and managed from the UI (with the right permissions).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",814595021, https://github.com/simonw/datasette/issues/670#issuecomment-848425056,https://api.github.com/repos/simonw/datasette/issues/670,848425056,MDEyOklzc3VlQ29tbWVudDg0ODQyNTA1Ng==,52261150,2021-05-26T03:22:32Z,2021-05-26T03:22:32Z,NONE,"I've also been investigating serving postgresql databases over postgrest. I like the idea of hosting some static html + js on github, but having it backed by datasets I can update and control on the database server. I started from SQLite + datasette but would like to host larger datasets (with smaller materialized views exposed publicly). I think the postgrest model where all the authorization and ownership is defined in database role grants is really powerful. But I really miss being able to define an ad-hoc query in sql, then instantly link to a json representation of it like datasette does. P.S.: I've been sort of following along as you pop up in hacker news here and there. It's been great! 
Thanks for doing this all out in the open!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",564833696, https://github.com/simonw/sqlite-utils/issues/477#issuecomment-1238815924,https://api.github.com/repos/simonw/sqlite-utils/issues/477,1238815924,IC_kwDOCGYnMM5J1tS0,49702524,2022-09-07T01:46:24Z,2022-09-07T01:46:24Z,NONE,Will do!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1353441389, https://github.com/simonw/datasette/pull/2202#issuecomment-1801876943,https://api.github.com/repos/simonw/datasette/issues/2202,1801876943,IC_kwDOBm6k_c5rZnXP,49699333,2023-11-08T13:19:00Z,2023-11-08T13:19:00Z,CONTRIBUTOR,Superseded by #2206.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1959278971,"{""id"": 29110, ""slug"": ""dependabot"", ""node_id"": ""MDM6QXBwMjkxMTA="", ""owner"": {""login"": ""github"", ""id"": 9919, ""node_id"": ""MDEyOk9yZ2FuaXphdGlvbjk5MTk="", ""avatar_url"": ""https://avatars.githubusercontent.com/u/9919?v=4"", ""gravatar_id"": """", ""url"": ""https://api.github.com/users/github"", ""html_url"": ""https://github.com/github"", ""followers_url"": ""https://api.github.com/users/github/followers"", ""following_url"": ""https://api.github.com/users/github/following{/other_user}"", ""gists_url"": ""https://api.github.com/users/github/gists{/gist_id}"", ""starred_url"": ""https://api.github.com/users/github/starred{/owner}{/repo}"", ""subscriptions_url"": ""https://api.github.com/users/github/subscriptions"", ""organizations_url"": ""https://api.github.com/users/github/orgs"", ""repos_url"": ""https://api.github.com/users/github/repos"", ""events_url"": ""https://api.github.com/users/github/events{/privacy}"", ""received_events_url"": ""https://api.github.com/users/github/received_events"", ""type"": ""Organization"", ""site_admin"": false}, ""name"": ""Dependabot"", ""description"": """", ""external_url"": ""https://dependabot-api.githubapp.com"", ""html_url"": ""https://github.com/apps/dependabot"", ""created_at"": ""2019-04-16T22:34:25Z"", ""updated_at"": ""2023-10-12T13:35:09Z"", ""permissions"": {""checks"": ""write"", ""contents"": ""write"", ""issues"": ""write"", ""members"": ""read"", ""metadata"": ""read"", ""pull_requests"": ""write"", ""statuses"": ""read"", ""vulnerability_alerts"": ""read"", ""workflows"": ""write""}, ""events"": [""check_suite"", ""issues"", ""issue_comment"", ""label"", ""pull_request"", ""pull_request_review"", ""pull_request_review_comment"", ""repository""]}" https://github.com/simonw/datasette/pull/2200#issuecomment-1777228352,https://api.github.com/repos/simonw/datasette/issues/2200,1777228352,IC_kwDOBm6k_c5p7lpA,49699333,2023-10-24T13:40:25Z,2023-10-24T13:40:25Z,CONTRIBUTOR,Superseded by #2202.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1949756141, https://github.com/simonw/datasette/pull/2182#issuecomment-1719451803,https://api.github.com/repos/simonw/datasette/issues/2182,1719451803,IC_kwDOBm6k_c5mfMCb,49699333,2023-09-14T13:27:26Z,2023-09-14T13:27:26Z,CONTRIBUTOR,"Looks like these dependencies are updatable in another way, so this is no longer needed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1890593563, 
https://github.com/simonw/datasette/pull/2148#issuecomment-1696591957,https://api.github.com/repos/simonw/datasette/issues/2148,1696591957,IC_kwDOBm6k_c5lH_BV,49699333,2023-08-29T00:15:29Z,2023-08-29T00:15:29Z,CONTRIBUTOR,This pull request was built based on a group rule. Closing it will not ignore any of these versions in future pull requests.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1859415334, https://github.com/simonw/datasette/pull/2152#issuecomment-1695736691,https://api.github.com/repos/simonw/datasette/issues/2152,1695736691,IC_kwDOBm6k_c5lEuNz,49699333,2023-08-28T13:49:35Z,2023-08-28T13:49:35Z,CONTRIBUTOR,Superseded by #2160.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1865174661, https://github.com/simonw/datasette/pull/2148#issuecomment-1689198413,https://api.github.com/repos/simonw/datasette/issues/2148,1689198413,IC_kwDOBm6k_c5krx9N,49699333,2023-08-23T02:57:55Z,2023-08-23T02:57:55Z,CONTRIBUTOR,"Looks like this PR has been edited by someone other than Dependabot. That means Dependabot can't rebase it - sorry! If you're happy for Dependabot to recreate it from scratch, overwriting any edits, you can request `@dependabot recreate`. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1859415334, https://github.com/simonw/datasette/pull/2144#issuecomment-1686366557,https://api.github.com/repos/simonw/datasette/issues/2144,1686366557,IC_kwDOBm6k_c5kg-ld,49699333,2023-08-21T13:48:15Z,2023-08-21T13:48:15Z,CONTRIBUTOR,Superseded by #2148.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1856760386, https://github.com/simonw/datasette/pull/2142#issuecomment-1683950031,https://api.github.com/repos/simonw/datasette/issues/2142,1683950031,IC_kwDOBm6k_c5kXwnP,49699333,2023-08-18T13:49:24Z,2023-08-18T13:49:24Z,CONTRIBUTOR,"Looks like these dependencies are updatable in another way, so this is no longer needed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1854970601, https://github.com/simonw/datasette/pull/2141#issuecomment-1682256251,https://api.github.com/repos/simonw/datasette/issues/2141,1682256251,IC_kwDOBm6k_c5kRTF7,49699333,2023-08-17T13:07:43Z,2023-08-17T13:07:43Z,CONTRIBUTOR,"Looks like blacken-docs is updatable in another way, so this is no longer needed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1853289039, https://github.com/simonw/datasette/pull/2125#issuecomment-1668187546,https://api.github.com/repos/simonw/datasette/issues/2125,1668187546,IC_kwDOBm6k_c5jboWa,49699333,2023-08-07T16:20:26Z,2023-08-07T16:20:26Z,CONTRIBUTOR,"Looks like sphinx is up-to-date now, so this is no longer needed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1833193570, https://github.com/simonw/datasette/pull/2121#issuecomment-1668186872,https://api.github.com/repos/simonw/datasette/issues/2121,1668186872,IC_kwDOBm6k_c5jboL4,49699333,2023-08-07T16:20:19Z,2023-08-07T16:20:19Z,CONTRIBUTOR,"Looks like furo is up-to-date now, so this is no longer needed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, 
""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1824399610, https://github.com/simonw/datasette/pull/2098#issuecomment-1668186815,https://api.github.com/repos/simonw/datasette/issues/2098,1668186815,IC_kwDOBm6k_c5jboK_,49699333,2023-08-07T16:20:18Z,2023-08-07T16:20:18Z,CONTRIBUTOR,"Looks like blacken-docs is up-to-date now, so this is no longer needed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1796830110, https://github.com/simonw/datasette/pull/2124#issuecomment-1662215579,https://api.github.com/repos/simonw/datasette/issues/2124,1662215579,IC_kwDOBm6k_c5jE2Wb,49699333,2023-08-02T13:28:43Z,2023-08-02T13:28:43Z,CONTRIBUTOR,Superseded by #2125.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1826424151, https://github.com/simonw/datasette/pull/2107#issuecomment-1655678215,https://api.github.com/repos/simonw/datasette/issues/2107,1655678215,IC_kwDOBm6k_c5ir6UH,49699333,2023-07-28T13:23:16Z,2023-07-28T13:23:16Z,CONTRIBUTOR,Superseded by #2124.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1820346348, https://github.com/simonw/datasette/pull/2077#issuecomment-1653652665,https://api.github.com/repos/simonw/datasette/issues/2077,1653652665,IC_kwDOBm6k_c5ikLy5,49699333,2023-07-27T13:40:52Z,2023-07-27T13:40:52Z,CONTRIBUTOR,Superseded by #2121.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1719759468, https://github.com/simonw/datasette/pull/2075#issuecomment-1649849249,https://api.github.com/repos/simonw/datasette/issues/2075,1649849249,IC_kwDOBm6k_c5iVrOh,49699333,2023-07-25T13:28:35Z,2023-07-25T13:28:35Z,CONTRIBUTOR,Superseded by #2107.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1710164693, https://github.com/simonw/datasette/pull/2068#issuecomment-1547911570,https://api.github.com/repos/simonw/datasette/issues/2068,1547911570,IC_kwDOBm6k_c5cQ0GS,49699333,2023-05-15T13:59:35Z,2023-05-15T13:59:35Z,CONTRIBUTOR,Superseded by #2075.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1690842199, https://github.com/simonw/datasette/pull/2064#issuecomment-1529737426,https://api.github.com/repos/simonw/datasette/issues/2064,1529737426,IC_kwDOBm6k_c5bLfDS,49699333,2023-05-01T13:58:50Z,2023-05-01T13:58:50Z,CONTRIBUTOR,Superseded by #2068.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1683229834, https://github.com/simonw/datasette/pull/2063#issuecomment-1521837780,https://api.github.com/repos/simonw/datasette/issues/2063,1521837780,IC_kwDOBm6k_c5atWbU,49699333,2023-04-25T13:57:52Z,2023-04-25T13:57:52Z,CONTRIBUTOR,Superseded by #2064.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1681339696, https://github.com/simonw/datasette/pull/2014#issuecomment-1487999503,https://api.github.com/repos/simonw/datasette/issues/2014,1487999503,IC_kwDOBm6k_c5YsRIP,49699333,2023-03-29T06:09:11Z,2023-03-29T06:09:11Z,CONTRIBUTOR,Superseded by #2047.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, 
""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1566081801, https://github.com/simonw/datasette/pull/2043#issuecomment-1486944644,https://api.github.com/repos/simonw/datasette/issues/2043,1486944644,IC_kwDOBm6k_c5YoPmE,49699333,2023-03-28T13:58:20Z,2023-03-28T13:58:20Z,CONTRIBUTOR,Superseded by #2046.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1639446870, https://github.com/simonw/datasette/pull/1982#issuecomment-1376620851,https://api.github.com/repos/simonw/datasette/issues/1982,1376620851,IC_kwDOBm6k_c5SDZEz,49699333,2023-01-10T02:03:18Z,2023-01-10T02:03:18Z,CONTRIBUTOR,"Looks like sphinx is up-to-date now, so this is no longer needed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1525560504, https://github.com/simonw/datasette/pull/1977#issuecomment-1375596856,https://api.github.com/repos/simonw/datasette/issues/1977,1375596856,IC_kwDOBm6k_c5R_fE4,49699333,2023-01-09T13:06:14Z,2023-01-09T13:06:14Z,CONTRIBUTOR,Superseded by #1982.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1522552817, https://github.com/simonw/datasette/pull/1976#issuecomment-1373592231,https://api.github.com/repos/simonw/datasette/issues/1976,1373592231,IC_kwDOBm6k_c5R31qn,49699333,2023-01-06T13:02:15Z,2023-01-06T13:02:15Z,CONTRIBUTOR,Superseded by #1977.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1520712722, https://github.com/simonw/datasette/pull/1974#issuecomment-1372188571,https://api.github.com/repos/simonw/datasette/issues/1974,1372188571,IC_kwDOBm6k_c5Rye-b,49699333,2023-01-05T13:02:40Z,2023-01-05T13:02:40Z,CONTRIBUTOR,Superseded by #1976.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1516376583, https://github.com/simonw/datasette/pull/1685#issuecomment-1237381620,https://api.github.com/repos/simonw/datasette/issues/1685,1237381620,IC_kwDOBm6k_c5JwPH0,49699333,2022-09-05T18:36:47Z,2022-09-05T18:36:47Z,CONTRIBUTOR,"Looks like jinja2 is no longer updatable, so this is no longer needed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1180778860, https://github.com/simonw/datasette/pull/1799#issuecomment-1237381569,https://api.github.com/repos/simonw/datasette/issues/1799,1237381569,IC_kwDOBm6k_c5JwPHB,49699333,2022-09-05T18:36:42Z,2022-09-05T18:36:42Z,CONTRIBUTOR,"Looks like aiofiles is no longer updatable, so this is no longer needed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1362242558, https://github.com/simonw/datasette/pull/1693#issuecomment-1168704157,https://api.github.com/repos/simonw/datasette/issues/1693,1168704157,IC_kwDOBm6k_c5FqQKd,49699333,2022-06-28T13:11:36Z,2022-06-28T13:11:36Z,CONTRIBUTOR,Superseded by #1763.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1184850337, 
https://github.com/simonw/datasette/pull/1753#issuecomment-1163091750,https://api.github.com/repos/simonw/datasette/issues/1753,1163091750,IC_kwDOBm6k_c5FU18m,49699333,2022-06-22T13:22:34Z,2022-06-22T13:22:34Z,CONTRIBUTOR,Superseded by #1760.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1261826957, https://github.com/simonw/datasette/pull/1593#issuecomment-1031455498,https://api.github.com/repos/simonw/datasette/issues/1593,1031455498,IC_kwDOBm6k_c49esMK,49699333,2022-02-07T13:13:22Z,2022-02-07T13:13:22Z,CONTRIBUTOR,Superseded by #1631.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1101705012, https://github.com/simonw/datasette/pull/1514#issuecomment-972852184,https://api.github.com/repos/simonw/datasette/issues/1514,972852184,IC_kwDOBm6k_c45_IvY,49699333,2021-11-18T13:11:15Z,2021-11-18T13:11:15Z,CONTRIBUTOR,Superseded by #1516.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1056117435, https://github.com/simonw/datasette/pull/1500#issuecomment-971568829,https://api.github.com/repos/simonw/datasette/issues/1500,971568829,IC_kwDOBm6k_c456Pa9,49699333,2021-11-17T13:13:58Z,2021-11-17T13:13:58Z,CONTRIBUTOR,Superseded by #1514.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1041158024, https://github.com/simonw/datasette/pull/1489#issuecomment-943594738,https://api.github.com/repos/simonw/datasette/issues/1489,943594738,IC_kwDOBm6k_c44Phzy,49699333,2021-10-14T18:04:13Z,2021-10-14T18:04:13Z,CONTRIBUTOR,"OK, I won't notify you again about this release, but will get in touch when a new version is available. If you'd rather skip all updates until the next major or minor version, let me know by commenting `@dependabot ignore this major version` or `@dependabot ignore this minor version`. You can also ignore all major, minor, or patch releases for a dependency by adding an [`ignore` condition](https://docs.github.com/en/code-security/supply-chain-security/configuration-options-for-dependency-updates#ignore) with the desired `update_types` to your config file. If you change your mind, just re-open this PR and I'll resolve any conflicts on it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1026379132, https://github.com/simonw/datasette/pull/1489#issuecomment-943594735,https://api.github.com/repos/simonw/datasette/issues/1489,943594735,IC_kwDOBm6k_c44Phzv,49699333,2021-10-14T18:04:12Z,2021-10-14T18:04:12Z,CONTRIBUTOR,Looks like this PR is closed. 
If you re-open it I'll rebase it as long as no-one else has edited it (you can use `@dependabot reopen` if the branch has been deleted).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1026379132, https://github.com/simonw/datasette/pull/1453#issuecomment-919135732,https://api.github.com/repos/simonw/datasette/issues/1453,919135732,IC_kwDOBm6k_c42yOX0,49699333,2021-09-14T13:10:38Z,2021-09-14T13:10:38Z,CONTRIBUTOR,Superseded by #1471.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",982780906, https://github.com/simonw/datasette/pull/1318#issuecomment-838449572,https://api.github.com/repos/simonw/datasette/issues/1318,838449572,MDEyOklzc3VlQ29tbWVudDgzODQ0OTU3Mg==,49699333,2021-05-11T13:12:30Z,2021-05-11T13:12:30Z,CONTRIBUTOR,Superseded by #1321.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",876431852, https://github.com/simonw/datasette/issues/658#issuecomment-583177728,https://api.github.com/repos/simonw/datasette/issues/658,583177728,MDEyOklzc3VlQ29tbWVudDU4MzE3NzcyOA==,49656826,2020-02-07T00:28:55Z,2020-02-07T00:29:50Z,NONE,"Simon, Yes, there is an ""app.css"" on static folder, however, anyone modification I do on this .css, doesn't apply on the datasette. I'm using this command: datasette publish heroku _""databases folder""_ -n _""herokuapp name""_ --extra-options=""--config sql_time_limit_ms:60000 --config max_returned_rows:10000 --config force_https_urls:1"" --template-dir _""templates folder""_ -m _""metadata.json folder""_","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",550293770, https://github.com/simonw/datasette/issues/616#issuecomment-551872999,https://api.github.com/repos/simonw/datasette/issues/616,551872999,MDEyOklzc3VlQ29tbWVudDU1MTg3Mjk5OQ==,49656826,2019-11-08T15:31:33Z,2019-11-08T15:31:33Z,NONE,"Thank you so much, Simon! Now, I'm contacting Heroku's support team to find a way to update the Datasette version on bases.vortex.media. Do you know how to do it?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",518506242, https://github.com/simonw/sqlite-utils/issues/456#issuecomment-1190449764,https://api.github.com/repos/simonw/sqlite-utils/issues/456,1190449764,IC_kwDOCGYnMM5G9NJk,45919695,2022-07-20T15:45:54Z,2022-07-20T15:45:54Z,NONE,"> hadley wickham's melt and reshape could be good inspo: http://had.co.nz/reshape/introduction.pdf Note that Hadley has since implemented `pivot_longer` and `pivot_wider` instead of the previous verbs/functions that he used. Those can be found in the tidyr package and are probably the best reference which includes all of the learnings from years of user feedback. https://tidyr.tidyverse.org/articles/pivot.html","{""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1310243385, https://github.com/simonw/datasette/issues/483#issuecomment-495034774,https://api.github.com/repos/simonw/datasette/issues/483,495034774,MDEyOklzc3VlQ29tbWVudDQ5NTAzNDc3NA==,45919695,2019-05-23T01:38:32Z,2019-05-23T01:43:04Z,NONE,"I think that location information is one of the other common pieces of hierarchical data. 
At least one that is general enough that extra dimensions could be auto-generated. Also, I think this is an awesome project. Thank you for creating this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",447408527, https://github.com/simonw/datasette/issues/1415#issuecomment-1793787454,https://api.github.com/repos/simonw/datasette/issues/1415,1793787454,IC_kwDOBm6k_c5q6wY-,45269373,2023-11-05T16:44:49Z,2023-11-05T16:46:59Z,NONE,"thanks for documenting this @bendnorman! got stuck at exactly the same point `gcloud builds submit ... returned non-zero exit status 1`, without a clue why this was happening. i now managed to get the github action to deploy datasette by assigning the following roles to the service account: `roles/run.admin`, `roles/storage.admin`, `roles/cloudbuild.builds.builder`, `roles/viewer`, `roles/iam.serviceAccountUser`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",959137143, https://github.com/dogsheep/healthkit-to-sqlite/issues/14#issuecomment-1629123734,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/14,1629123734,IC_kwDOC8tyDs5hGnSW,44622670,2023-07-10T14:46:52Z,2023-07-10T14:46:52Z,NONE,@simonw any chance to get this fixed soon? ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771608692, https://github.com/dogsheep/twitter-to-sqlite/issues/62#issuecomment-1049775451,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62,1049775451,IC_kwDODEm0Qs4-kk1b,43036882,2022-02-24T11:43:31Z,2022-02-24T11:43:31Z,NONE,i seem to have fixed this issue by applying for [elevated API access](https://developer.twitter.com/en/portal/products/elevated),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1088816961, https://github.com/dogsheep/twitter-to-sqlite/issues/56#issuecomment-772408273,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56,772408273,MDEyOklzc3VlQ29tbWVudDc3MjQwODI3Mw==,42315895,2021-02-03T10:36:36Z,2021-02-03T10:36:36Z,NONE,"I figured it out. Those tweets are in database, because somebody quote tweeted them, or retweeted them. And if you grab quoted tweet or reweeted tweet from other tweet json, It doesn't grab all of the details. So if someone quote tweeted a quote tweet, the second quote tweet won't have `quoted_status`. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",796736607, https://github.com/dogsheep/twitter-to-sqlite/issues/56#issuecomment-769973212,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56,769973212,MDEyOklzc3VlQ29tbWVudDc2OTk3MzIxMg==,42315895,2021-01-29T18:29:02Z,2021-01-29T18:31:55Z,NONE,"I think it was with `twitter-to-sqlite home-timeline home.db -a auth.json --since` and Im using only this command to grab tweets from cron tab `2,7,12,17,22,27,32,37,42,47,52,57 * * * * run-one /home/gsajko/miniconda3/bin/twitter-to-sqlite home-timeline /home/gsajko/work/custom_twitter_feed/home.db -a /home/gsajko/work/custom_twitter_feed/auth/auth.json --since` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",796736607, https://github.com/dogsheep/apple-notes-to-sqlite/issues/8#issuecomment-1468898285,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/8,1468898285,IC_kwDOJHON9s5XjZvt,41546558,2023-03-14T22:00:21Z,2023-03-14T22:00:21Z,NONE,"Well that's embarrassing. I made a fork using macnotesapp and it's actually slower. This is because the Scripting Bridge sometimes fails to return the folder and thus macnotesapp resorts to AppleScript in this situation. The repeated AppleScript calls on a large library are slower than your ""slurp it all in"" approach. I've got some ideas about how to improve this--will make another attempt if I can fix the issues.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1617823309, https://github.com/dogsheep/dogsheep-photos/issues/3#issuecomment-934372104,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/3,934372104,IC_kwDOD079W843sWMI,41546558,2021-10-05T12:38:24Z,2021-10-05T12:38:24Z,CONTRIBUTOR,"As dogsheep-photos already uses [osxphotos](https://github.com/RhetTbull/osxphotos) to load photos you can access the EXIF data via osxphotos. Apple Photos imports a small subset of EXIF data at the time the photo is imported and osxphotos provides this via the [exif_info](https://github.com/RhetTbull/osxphotos#exifinfo) property. If you want the full EXIF data, osxphotos also provides a wrapper around [exiftool](https://github.com/RhetTbull/osxphotos#exiftool).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",602533481, https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778246347,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33,778246347,MDEyOklzc3VlQ29tbWVudDc3ODI0NjM0Nw==,41546558,2021-02-12T15:00:43Z,2021-02-12T15:00:43Z,CONTRIBUTOR,"Yes, Big Sur Photos database doesn't have `ZGENERICASSET` table. 
PR #31 will fix this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",803338729, https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-748562330,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31,748562330,MDEyOklzc3VlQ29tbWVudDc0ODU2MjMzMA==,41546558,2020-12-20T04:45:08Z,2020-12-20T04:45:08Z,CONTRIBUTOR,Fixes the issue mentioned here: https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-748436115,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771511344, https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-748562288,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15,748562288,MDEyOklzc3VlQ29tbWVudDc0ODU2MjI4OA==,41546558,2020-12-20T04:44:22Z,2020-12-20T04:44:22Z,CONTRIBUTOR,"@nickvazz @simonw I opened a [PR](https://github.com/dogsheep/dogsheep-photos/pull/31) that replaces the SQL for `ZCOMPUTEDASSETATTRIBUTES` to use osxphotos which now exposes all this data and has been updated for Big Sur. I did regression tests to confirm the extracted data is identical, with one exception which should not affect operation: the old code pulled data from `ZCOMPUTEDASSETATTRIBUTES` for missing photos while the main loop ignores missing photos and does not add them to `apple_photos`. The new code does not add rows to the `apple_photos_scores` table for missing photos.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",612151767, https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-748436779,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15,748436779,MDEyOklzc3VlQ29tbWVudDc0ODQzNjc3OQ==,41546558,2020-12-19T07:49:00Z,2020-12-19T07:49:00Z,CONTRIBUTOR,@nickvazz ZGENERICASSET changed to ZASSET in Big Sur. Here's a list of other changes to the schema in Big Sur: https://github.com/RhetTbull/osxphotos/wiki/Changes-in-Photos-6---Big-Sur,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",612151767, https://github.com/dogsheep/dogsheep-photos/issues/22#issuecomment-628405453,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/22,628405453,MDEyOklzc3VlQ29tbWVudDYyODQwNTQ1Mw==,41546558,2020-05-14T05:59:53Z,2020-05-14T05:59:53Z,CONTRIBUTOR,"I've added support for the above exif data to [v0.28.17](https://github.com/RhetTbull/osxphotos/releases/tag/v0.28.17) of osxphotos. `PhotoInfo.exif_info` will return an `ExifInfo` [dataclass](https://docs.python.org/3/library/dataclasses.html) object with the following properties: ```python flash_fired: bool iso: int metering_mode: int sample_rate: int track_format: int white_balance: int aperture: float bit_rate: float duration: float exposure_bias: float focal_length: float fps: float latitude: float longitude: float shutter_speed: float camera_make: str camera_model: str codec: str lens_model: str ``` It's not all the EXIF data available in most files but is the data Photos deems important to save. Of course, you can get all the exif_data Note: this only works in Photos 5. As best as I can tell, EXIF data is not stored in the database for earlier versions. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",615626118, https://github.com/dogsheep/dogsheep-photos/issues/22#issuecomment-627007458,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/22,627007458,MDEyOklzc3VlQ29tbWVudDYyNzAwNzQ1OA==,41546558,2020-05-11T22:51:52Z,2020-05-11T22:52:26Z,CONTRIBUTOR,"I'm not familiar with `ExifReader`. I wrote my own wrapper around `exiftool` because I wanted a simple way to write EXIF data when exporting photos (e.g. writing out to PersonInImage and keywords to IPTC:Keywords) and the existing python packages like [pyexiftool](https://github.com/smarnach/pyexiftool) didn't do quite what I wanted. If all you're after is the camera and shot info, that's available in `ZEXTENDEDATTRIBUTES` table. I've got an open issue [#11](https://github.com/RhetTbull/osxphotos/issues/11) to add this to osxphotos but it hasn't bubbled to the top of my backlog yet. osxphotos will give you the location info: `PhotoInfo.location` returns a tuple of (lat, lon) though this info is in ZEXTENDEDATTRIBUTES too (though it might not be correct as I believe Photos creates this table at import and the user might have changed the location of a photo, e.g. if camera didn't have GPS). ```sql CREATE TABLE ZEXTENDEDATTRIBUTES ( Z_PK INTEGER PRIMARY KEY, Z_ENT INTEGER, Z_OPT INTEGER, ZFLASHFIRED INTEGER, ZISO INTEGER, ZMETERINGMODE INTEGER, ZSAMPLERATE INTEGER, ZTRACKFORMAT INTEGER, ZWHITEBALANCE INTEGER, ZASSET INTEGER, ZAPERTURE FLOAT, ZBITRATE FLOAT, ZDURATION FLOAT, ZEXPOSUREBIAS FLOAT, ZFOCALLENGTH FLOAT, ZFPS FLOAT, ZLATITUDE FLOAT, ZLONGITUDE FLOAT, ZSHUTTERSPEED FLOAT, ZCAMERAMAKE VARCHAR, ZCAMERAMODEL VARCHAR, ZCODEC VARCHAR, ZLENSMODEL VARCHAR ); ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",615626118, https://github.com/dogsheep/dogsheep-photos/issues/22#issuecomment-626667235,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/22,626667235,MDEyOklzc3VlQ29tbWVudDYyNjY2NzIzNQ==,41546558,2020-05-11T12:20:34Z,2020-05-11T12:20:34Z,CONTRIBUTOR,"@simonw FYI, osxphotos includes a built in ExifTool class that uses [exiftool](https://exiftool.org/) to read and write exif data. It's not exposed yet in the docs because I really only use it right now in the osphotos command line interface to write tags when exporting. In v0.28.16 (just pushed) I added an ExifTool.as_dict() method which will give you a dict with all the exif tags in a file. For example: ```python import osxphotos photos = osxphotos.PhotosDB().photos() exiftool = osxphotos.exiftool.ExifTool(photos[0].path) exifdata = exiftool.as_dict() tags = exifdata[""IPTC:Keywords""] ``` Not as elegant perhaps as a python only implementation because ExifTool has to make subprocess calls to an external tool but exiftool is by far the best tool available for reading and writing EXIF data and it does support HEIC. As for implementation, ExifTool uses a singleton pattern so the first time you instantiate it, it spawns an IPC to exiftool but then keeps it open and uses the same process for any subsequent calls (even on different files). 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",615626118, https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-626396379,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21,626396379,MDEyOklzc3VlQ29tbWVudDYyNjM5NjM3OQ==,41546558,2020-05-10T22:01:48Z,2020-05-10T22:01:48Z,CONTRIBUTOR,"Frustrates me when package authors create a ""drop in"" replacement with the same import name...this kind of thing has bitten me more than once! Would've been nicer I think for bpylist2 to do ""import bpylist2 as bpylist""","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",615474990, https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-626395641,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21,626395641,MDEyOklzc3VlQ29tbWVudDYyNjM5NTY0MQ==,41546558,2020-05-10T21:55:54Z,2020-05-10T21:55:54Z,CONTRIBUTOR,Did removing old bpylist solve the original problem or do you still have a photo that throws circular reference?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",615474990, https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-626395507,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21,626395507,MDEyOklzc3VlQ29tbWVudDYyNjM5NTUwNw==,41546558,2020-05-10T21:54:45Z,2020-05-10T21:54:45Z,CONTRIBUTOR,"@simonw does Photos show valid reverse geolocation info? Are you sure you're using [bpylist2](https://github.com/xa4a/bpylist2) and not bpylist? They're both unfortunately imported as ""bpylist"" so if you somehow got the wrong (original bpylist) version installed, it could be the issue. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",615474990, https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-626390317,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21,626390317,MDEyOklzc3VlQ29tbWVudDYyNjM5MDMxNw==,41546558,2020-05-10T21:11:24Z,2020-05-10T21:50:58Z,CONTRIBUTOR,"Ugh....Yeah, I think easiest is to catch the exception and return no place as you suggest. This particular bit of code involves un-archiving a serialized NSKeyedArchiver which uses an object table and it is certainly possible to create a circular reference that way. Because this is happening in the decode, the circular reference must be in the original data. Does Photos show valid reverse geolocation info for the photo in question? If so, Photos may be doing something beyond a simple decode of the binary plist. For now, I'll push a patch to catch the exception.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",615474990, https://github.com/dogsheep/dogsheep-photos/issues/17#issuecomment-624284539,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/17,624284539,MDEyOklzc3VlQ29tbWVudDYyNDI4NDUzOQ==,41546558,2020-05-05T20:20:05Z,2020-05-05T20:20:05Z,CONTRIBUTOR,"FYI, I've got an [issue](https://github.com/RhetTbull/osxphotos/issues/25) to make osxphotos cross-platform but it's low on my priority list. About 90% of the functionality could be done cross-platform but right now the MacOS specific stuff is embedded throughout and would take some work. 
Though I try to minimize it, there's sprinklings of ObjC & Applescript throughout osxphotos.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",612860531, https://github.com/dogsheep/dogsheep-photos/issues/16#issuecomment-623845014,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/16,623845014,MDEyOklzc3VlQ29tbWVudDYyMzg0NTAxNA==,41546558,2020-05-05T03:55:14Z,2020-05-05T03:56:24Z,CONTRIBUTOR,"I'm traveling w/o access to my Mac so can't help with any code right now. I suspected ZSCENEIDENTIFIER was a foreign key into one of these psi.sqlite tables. But looks like you're on to something connecting groups to assets. As for the UUID, I think there's two ints because each is 64-bits but UUIDs are 128-bits. Thus they need to be combined to get the 128 bit UUID. You might be able to use Apple's [NSUUID](https://developer.apple.com/documentation/foundation/nsuuid?language=objc), for example, by wrapping with pyObjC. Here's one [example](https://github.com/ronaldoussoren/pyobjc/blob/881c82a7ba90f193934b52b44143360c80dce5e5/pyobjc-framework-Cocoa/PyObjCTest/test_nsuuid.py) of using this in PyObjC's test suite. Interesting it's stored this way instead of a UUIDString as in Photos.sqlite. Perhaps it for faster indexing. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",612287234, https://github.com/simonw/datasette/issues/2067#issuecomment-1532304714,https://api.github.com/repos/simonw/datasette/issues/2067,1532304714,IC_kwDOBm6k_c5bVR1K,39538958,2023-05-03T00:16:03Z,2023-05-03T00:16:03Z,NONE,"Curiously, after running commands on the database that was litestream-restored, datasette starts to work again, e.g. ```sh litestream restore -o data/db.sqlite s3://mytestbucketxx/db datasette data/db.sqlite # fails (OperationalError described above) ``` ```sh litestream restore -o data/db.sqlite s3://mytestbucketxx/db sqlite-utils enable-wal data/db.sqlite datasette data/db.sqlite # works ``` ```sh litestream restore -o data/db.sqlite s3://mytestbucketxx/db sqlite-utils optimize data/db.sqlite datasette data/db.sqlite # works ``` ```sh litestream restore -o data/db.sqlite s3://mytestbucketxx/db sqlite3 data/db.sqlite "".clone test.db"" datasette test.db # works ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1690765434, https://github.com/simonw/sqlite-utils/issues/496#issuecomment-1294408928,https://api.github.com/repos/simonw/sqlite-utils/issues/496,1294408928,IC_kwDOCGYnMM5NJxzg,39538958,2022-10-28T03:36:56Z,2022-10-28T03:37:50Z,NONE,"With respect to the typing of Table class itself, my interim solution: ```python from sqlite_utils.db import Table def tbl(self, table_name: str) -> Table: tbl = self.db[table_name] if isinstance(tbl, Table): return tbl raise Exception(f""Missing {table_name=}"") ``` With respect to @chapmanjacobd concern on the `DEFAULT` being an empty class, have also been using `# type: ignore`, e.g. 
```python
@classmethod
def insert_list(cls, areas: list[str]):
    return meta.tbl(meta.Areas).insert_all(
        ({""area"": a} for a in areas), ignore=True # type: ignore
    )
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1393202060,
https://github.com/simonw/datasette/issues/1082#issuecomment-721547177,https://api.github.com/repos/simonw/datasette/issues/1082,721547177,MDEyOklzc3VlQ29tbWVudDcyMTU0NzE3Nw==,39538958,2020-11-04T06:52:30Z,2020-11-04T06:53:16Z,NONE,"I think I tried the same db size on the following scenarios in Digital Ocean:
1. Basic ($5/month) with 512MB RAM
2. Basic ($10/month) with 1GB RAM
3. Pro ($12/month) with 1GB RAM
All such attempts conked out with ""out of memory"" errors","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735852274,
https://github.com/simonw/datasette/issues/858#issuecomment-792230560,https://api.github.com/repos/simonw/datasette/issues/858,792230560,MDEyOklzc3VlQ29tbWVudDc5MjIzMDU2MA==,39445562,2021-03-07T07:14:58Z,2021-03-07T07:14:58Z,NONE,"To get it to work I had to:
- add `shell=true` to the various commands in datasette
- use the name argument of the publish command. (https://docs.datasette.io/en/stable/publish.html) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",642388564,
https://github.com/simonw/datasette/issues/858#issuecomment-699690034,https://api.github.com/repos/simonw/datasette/issues/858,699690034,MDEyOklzc3VlQ29tbWVudDY5OTY5MDAzNA==,39445562,2020-09-27T21:23:04Z,2020-09-27T21:23:04Z,NONE,"Hi Simon, Thanks so much for all your work on datasette, it's an excellent project and I wish you all the best with it. I particularly enjoyed your talk at the Django London Meetup a short while back. I've been trying to publish to Heroku from Windows 10 and I was running into this error. I'm not sure why it can't be run without `shell=True` on Windows but this seems to help. With this change, I am able to publish if I pass in a `name` to the `publish` command. When a `name` is not passed the default of `datasette` is used and therefore this line here fails (as datasette at heroku already exists) and causes the recursion error mentioned above. https://github.com/simonw/datasette/blob/9a6d0dce282e7fb58c5610e24c74098c923abfdc/datasette/publish/heroku.py#L126 I tried to write a patch for this but I am really struggling with being on Windows (many of the tests seem to fail anyway?), and my lack of knowledge of Mock, so sorry for this. Hope this is of some help. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",642388564,
https://github.com/simonw/sqlite-utils/issues/246#issuecomment-801816980,https://api.github.com/repos/simonw/sqlite-utils/issues/246,801816980,MDEyOklzc3VlQ29tbWVudDgwMTgxNjk4MA==,37962604,2021-03-18T10:40:32Z,2021-03-18T10:43:04Z,NONE,"I have found a similar problem, but only when using that type of query (with `*` for doing a prefix search). I'm also building something on top of FTS5/sqlite-utils, and the way I decided to handle it was creating a specific function for prefixes. According to [the docs](https://www2.sqlite.org/fts5.html#fts5_prefix_queries), the query can be done in these 2 ways:
```sql
... MATCH '""one two thr"" * '
... MATCH 'one + two + thr*'
```
I thought I could build a query like the first one using this function:
```python
def prefix(query: str):
    return f'""{query}"" *'
```
And then I use the output of that function as the query parameter for the standard `.search()` method in sqlite-utils. However, my use case is different because I'm the one ""deciding"" when to use a prefix search, not the end user. I also haven't done many tests, but maybe you'll find that useful. One thing I could think of is checking if the query has an `*` at the end, removing it and building the prefix query using the function above. This is just for prefix queries, I think having the escaping function is still useful for other use cases.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",831751367,
https://github.com/simonw/sqlite-utils/issues/242#issuecomment-787150276,https://api.github.com/repos/simonw/sqlite-utils/issues/242,787150276,MDEyOklzc3VlQ29tbWVudDc4NzE1MDI3Ng==,37962604,2021-02-27T21:27:26Z,2021-02-27T21:27:26Z,NONE,"I had this resource by Seth Michael Larson saved https://github.com/sethmlarson/pycon-async-sync-poster I haven't had a look at it, but it may contain useful info. On twitter, I mentioned passing an aiosqlite connection during the `Database` creation. I'm not 100% familiar with the `sqlite-utils` codebase, so I may be wrong here, but maybe decorating internal functions could be an option? Then they are awaited or not inside the decorator depending on how they are called.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",817989436,
https://github.com/simonw/sqlite-utils/pull/224#issuecomment-765678057,https://api.github.com/repos/simonw/sqlite-utils/issues/224,765678057,MDEyOklzc3VlQ29tbWVudDc2NTY3ODA1Nw==,37962604,2021-01-22T20:53:06Z,2021-01-23T20:13:27Z,NONE,"I'm using the FTS methods in sqlite-utils for this website: [drwn.io](https://drwn.io/). I wanted to get pagination to have some kind of infinite scrolling in the landing page, and I ended up using that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792297010,
https://github.com/simonw/datasette/issues/1272#issuecomment-1199115002,https://api.github.com/repos/simonw/datasette/issues/1272,1199115002,IC_kwDOBm6k_c5HeQr6,37748899,2022-07-29T10:22:58Z,2022-07-29T10:22:58Z,NONE,"> test_dockerfile.py
> I'm skipping this for the moment because the new Dockerfile shape introduced in [#1249 (comment)](https://github.com/simonw/datasette/issues/1249#issuecomment-804404544) isn't compatible with this technique, since it installs Datasette from PyPI rather than directly from the repo.
>
> Will need to change that if I want to do this unit tests thing.
Hello, can't you copy in a later step directly in the output directory, e.g. COPY test_dockerfile.py /usr/lib/python*/.. or something like that ? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",838245338,
https://github.com/simonw/sqlite-utils/issues/98#issuecomment-928790381,https://api.github.com/repos/simonw/sqlite-utils/issues/98,928790381,IC_kwDOCGYnMM43XDdt,36834097,2021-09-28T04:38:44Z,2021-09-28T04:38:44Z,NONE,"Hi @simonw - wondering if you might be able to shed some light here. I seem to have reproduced this issue. Here's the stacktrace:
```
...
db[""potholes""].insert(pothole, pk='id', alter=True, replace=True) ... Traceback (most recent call last): File """", line 3, in File ""/Users/patricktrainer/.pyenv/versions/3.9.0/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2481, in insert return self.insert_all( File ""/Users/patricktrainer/.pyenv/versions/3.9.0/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2596, in insert_all self.insert_chunk( File ""/Users/patricktrainer/.pyenv/versions/3.9.0/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2424, in insert_chunk row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0] IndexError: list index out of range ``` Interesting enough, I found that omitting the `pk` param does not throw the error. Let me know how I can help out! ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",597671518, https://github.com/simonw/datasette/issues/415#issuecomment-473217334,https://api.github.com/repos/simonw/datasette/issues/415,473217334,MDEyOklzc3VlQ29tbWVudDQ3MzIxNzMzNA==,36796532,2019-03-15T09:30:57Z,2019-03-15T09:30:57Z,NONE,"Awesome, thanks! 😁 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",418329842, https://github.com/simonw/datasette/issues/120#issuecomment-439421164,https://api.github.com/repos/simonw/datasette/issues/120,439421164,MDEyOklzc3VlQ29tbWVudDQzOTQyMTE2NA==,36796532,2018-11-16T15:05:18Z,2018-11-16T15:05:18Z,NONE,This would be an awesome feature ❤️ ,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275087397, https://github.com/simonw/datasette/issues/2126#issuecomment-1674242356,https://api.github.com/repos/simonw/datasette/issues/2126,1674242356,IC_kwDOBm6k_c5jyuk0,36199671,2023-08-11T05:52:29Z,2023-08-11T05:52:29Z,NONE,"I see :) yeah, I’m on the stable version installed from homebrew on macOS","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1838266862, https://github.com/simonw/datasette/issues/2126#issuecomment-1666912107,https://api.github.com/repos/simonw/datasette/issues/2126,1666912107,IC_kwDOBm6k_c5jWw9r,36199671,2023-08-06T16:27:34Z,2023-08-06T16:27:34Z,NONE,"And in similar fashion, how can I assign the `edit-tiddlywiki` permission to my user `myuser` in `metadata.yml` / `metadata.json`?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1838266862, https://github.com/simonw/datasette/issues/1479#issuecomment-1114601882,https://api.github.com/repos/simonw/datasette/issues/1479,1114601882,IC_kwDOBm6k_c5Cb3ma,32839123,2022-05-02T08:10:27Z,2022-05-02T11:54:49Z,NONE,"Also ran into this issue today using `datasette package`. The stack trace takes up my whole PowerShell history, though (recursionerror), but it also concerns the temporary directory. Our development machines have a very zealous scanner that appears to insert itself between every call to the filesystem. I suspected that was causing some racing, but this turned out not to be the case: inserting `time.sleep(3)` on line 451 of `datasette/datasette/utils/__init__.py` does not make the problem go away. Commenting out the `tmp.cleanup()` line does. 
The next error I get is docker-specific, so that probably does resolve the Datasette error here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1010112818,