issues
17 rows where comments = 0, repo = 140912432 and state = "open" sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | pull_request | body | repo | type | active_lock_reason | performed_via_github_app | reactions | draft | state_reason |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1978603203 | I_kwDOCGYnMM517xbD | 602 | `sqlite-utils transform` removes the `AUTOINCREMENT` keyword | ArsTapatun 4472046 | open | 0 | 0 | 2023-11-06T08:48:43Z | 2023-11-06T08:48:43Z | NONE | Context: we ran into this bug randomly, noticing that deleted … Reproducible example. Original database: ```sql $ sqlite3 test.db << EOF CREATE TABLE mytable ( col1 INTEGER PRIMARY KEY AUTOINCREMENT, col2 TEXT NOT NULL ) EOF $ sqlite3 test.db ".schema mytable" CREATE TABLE mytable ( col1 INTEGER PRIMARY KEY AUTOINCREMENT, col2 TEXT NOT NULL ); ``` Modified database after `sqlite-utils transform` (the `AUTOINCREMENT` keyword is gone): ```sql $ sqlite-utils transform test.db mytable --rename col2 renamedcol2 $ sqlite3 test.db "SELECT sql FROM sqlite_master WHERE name = 'mytable';" CREATE TABLE IF NOT EXISTS "mytable" ( [col1] INTEGER PRIMARY KEY, [renamedcol2] TEXT NOT NULL ); ``` |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/602/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
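The loss reported in #602 can also be reproduced through the Python API rather than the CLI. A minimal sketch, assuming the same hypothetical `test.db` / `mytable` names as the report; it uses `table.transform(rename=...)` and the `table.schema` property from the Python library:

```python
from sqlite_utils import Database

db = Database("test.db")
db.execute(
    "CREATE TABLE IF NOT EXISTS mytable ("
    " col1 INTEGER PRIMARY KEY AUTOINCREMENT,"
    " col2 TEXT NOT NULL)"
)
print(db["mytable"].schema)   # AUTOINCREMENT is present in the original schema

db["mytable"].transform(rename={"col2": "renamedcol2"})
print(db["mytable"].schema)   # after transform() the keyword is no longer there
```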
1977155641 | I_kwDOCGYnMM512QA5 | 601 | Move plugin directory into documentation | simonw 9599 | open | 0 | 0 | 2023-11-04T04:07:52Z | 2023-11-04T04:07:52Z | OWNER | https://github.com/simonw/sqlite-utils-plugins should be in the official documentation. I can use the same pattern as https://llm.datasette.io/en/stable/plugins/directory.html |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/601/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1920416843 | I_kwDOCGYnMM5ydzxL | 597 | sqlite-utils insert-files should be able to convert fields | grimnight 1737541 | open | 0 | 0 | 2023-09-30T22:20:47Z | 2023-09-30T22:20:47Z | NONE | Currently I am using both `insert-files` and `convert`, as shown below: ```shell ~ ❯ cat test.py import os class Example: def __init__(self, arg1, arg2): self.arg1 = arg1 ~ ❯ sqlite-utils insert-files test.sqlar sqlar test.py -c name:name -c data:content -c mode:mode -c mtime:mtime -c sz:size --pk=name [####################################] 100% ~ ❯ sqlite-utils convert test.sqlar sqlar data "zlib.compress(value)" --import=zlib --where "name = 'test.py'" [####################################] 100% ~ ❯ cat test.py | sqlite-utils convert test.sqlar sqlar data "zlib.compress(sys.stdin.buffer.read())" --import=zlib --import=sys --where "name = 'test.py'" # Alternative way [####################################] 100% ~ ❯ sqlite3 test.sqlar "SELECT hex(data) FROM sqlar WHERE name = 'test.py';" | python3 -c "import sys, zlib; sys.stdout.buffer.write(zlib.decompress(bytes.fromhex(sys.stdin.read())))" import os class Example: def __init__(self, arg1, arg2): self.arg1 = arg1 ~ ❯ rm test.py ~ ❯ sqlar -l test.sqlar test.py ~ ❯ sqlar -x test.sqlar ~ ❯ cat test.py import os class Example: def __init__(self, arg1, arg2): self.arg1 = arg1 ``` |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/597/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
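Until `insert-files` grows a conversion option, the compression step requested in #597 can be done in one pass from Python. A minimal sketch, assuming the same `test.sqlar` archive layout as the issue (`name`, `mode`, `mtime`, `sz`, `data` columns); the file name is hypothetical:

```python
import pathlib
import zlib

from sqlite_utils import Database

db = Database("test.sqlar")
path = pathlib.Path("test.py")
raw = path.read_bytes()
stat = path.stat()

db["sqlar"].insert(
    {
        "name": path.name,
        "mode": stat.st_mode,
        "mtime": int(stat.st_mtime),
        "sz": len(raw),              # the sqlar convention stores the *uncompressed* size
        "data": zlib.compress(raw),  # compressed blob, ready for the sqlar tool to extract
    },
    pk="name",
    replace=True,
)
```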
1754174496 | I_kwDOCGYnMM5ojpQg | 558 | Ability to define unique columns when creating a table | aguinane 1910303 | open | 0 | 0 | 2023-06-13T06:56:19Z | 2023-08-18T01:06:03Z | NONE | When creating a new table, it would be good to have an option to set unique columns, similar to how `not_null` is set. ```python from sqlite_utils import Database columns = {"mRID": str, "name": str} db = Database("example.db") db["ExampleTable"].create(columns, pk="mRID", not_null=["mRID"], if_not_exists=True) db["ExampleTable"].create_index(["mRID"], unique=True, if_not_exists=True) ``` An option like that would add the UNIQUE flag directly to the table definition instead of requiring a separate index (see the sketch after this row).
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/558/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
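A sketch of what the option requested in #558 might look like. It is purely hypothetical, since `create()` does not accept a `unique=` parameter today; the idea is that it would mirror the existing `not_null=` parameter, with the current two-step approach from the issue shown as the working fallback:

```python
from sqlite_utils import Database

db = Database("example.db")
columns = {"mRID": str, "name": str}

# Hypothetical future API: unique= would emit UNIQUE in the column definition
# db["ExampleTable"].create(columns, pk="mRID", not_null={"mRID"}, unique={"mRID"})

# What works today: create the table, then add a unique index separately
db["ExampleTable"].create(columns, pk="mRID", not_null={"mRID"}, if_not_exists=True)
db["ExampleTable"].create_index(["mRID"], unique=True, if_not_exists=True)
```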
1839344979 | I_kwDOCGYnMM5toi1T | 582 | Handling CSV/file input that contains NUL bytes | betatim 1448859 | open | 0 | 0 | 2023-08-07T12:24:14Z | 2023-08-07T12:24:14Z | NONE | I was using sqlite-utils to create a DB from a CSV, and it turns out the CSV contains a NUL byte. When the processing reaches the line that contains the NUL, an exception is raised. I'm wondering if there is something that can be done in `sqlite-utils` about this. Concretely, the file is … This is the command and output: …
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/582/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
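A possible workaround sketch for #582 while the behaviour is discussed: strip NUL bytes from the text stream before the CSV parser sees them. File and table names here are hypothetical, and this goes through the Python API rather than the CLI command from the report:

```python
import csv

from sqlite_utils import Database

db = Database("data.db")

with open("input.csv", "r", encoding="utf-8", errors="replace") as f:
    # The csv module raises "line contains NUL" on \x00, so filter it out first
    cleaned = (line.replace("\x00", "") for line in f)
    db["rows"].insert_all(csv.DictReader(cleaned))
```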
1822918995 | I_kwDOCGYnMM5sp4lT | 580 | Add way to export to a csv file using the Python library | kevinlinxc 44324811 | open | 0 | 0 | 2023-07-26T18:09:26Z | 2023-07-26T18:09:26Z | NONE | According to the documentation, we can make a csv output using the CLI tool, but not the Python library. Could we have the latter? |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/580/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
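Until the library grows a CSV export helper, the standard-library `csv` module covers the use case in #580. A minimal sketch, assuming an `items` table exists in `data.db` (both names are hypothetical); `db.query()` yields dicts, which `csv.DictWriter` can write directly:

```python
import csv

from sqlite_utils import Database

db = Database("data.db")
rows = db.query("select * from items")

first = next(rows, None)
if first is not None:
    with open("export.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(first.keys()))
        writer.writeheader()
        writer.writerow(first)   # the row we pulled to learn the column names
        writer.writerows(rows)   # the rest of the query results
```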
1821108702 | I_kwDOCGYnMM5si-ne | 579 | Special handling for SQLite column of type `JSON` | asg017 15178711 | open | 0 | 0 | 2023-07-25T20:37:23Z | 2023-07-25T20:37:23Z | CONTRIBUTOR |
Automatic nesting: according to "Nested JSON Values", sqlite-utils will only expand JSON if the … Instead, … I'm sure there are other ways … |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/579/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
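For context on #579: SQLite keeps whatever type name a column was declared with, so a declared type of `JSON` is detectable even though the engine attaches no special meaning to it. A small sketch of that detection (the table and column names are made up, and this is not an existing sqlite-utils feature):

```python
from sqlite_utils import Database

db = Database(memory=True)
db.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, meta JSON)")

# The declared type survives in the schema, so tooling could special-case it
for column in db["docs"].columns:
    print(column.name, column.type)   # id INTEGER / meta JSON
```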
1550536442 | I_kwDOCGYnMM5ca076 | 521 | Custom JSON encoder | janrito 31504 | open | 0 | 0 | 2023-01-20T09:19:40Z | 2023-01-20T09:19:40Z | NONE | It would be nice if we could specify a custom encoder (and decoder) for types that will need extra deserialisation – e.g., sets, enums or sparse matrices – or even project-specific types |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/521/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
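A workaround sketch for #521 until an encoder hook exists: serialise the awkward types yourself with a custom `json.JSONEncoder` before handing the row to `insert()`. Table and column names are hypothetical:

```python
import json

from sqlite_utils import Database


class SetEncoder(json.JSONEncoder):
    """Encode sets as sorted lists so they survive the round trip to a TEXT column."""

    def default(self, obj):
        if isinstance(obj, set):
            return sorted(obj)
        return super().default(obj)


db = Database("data.db")
record = {"id": 1, "tags": {"python", "sqlite"}}
record["tags"] = json.dumps(record["tags"], cls=SetEncoder)  # pre-encode the custom type
db["items"].insert(record, pk="id", replace=True)
```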
1453134846 | I_kwDOCGYnMM5WnRP- | 513 | Add or document streamlined workflow for importing Datasette csv / json exports | henry501 19328961 | open | 0 | 0 | 2022-11-17T10:54:47Z | 2022-11-17T10:54:47Z | NONE | I'm working on some small front-end enhancements to the laion-aesthetic-datasette project, and I wanted to partially populate a database directly using exports from the existing Datasette instance instead of downloading the parquet files and creating my own multi-GB database. There have been a number of small issues that are certainly related to my relative lack of familiarity with the toolkit, but that are still surprising. For example: a CSV export of the images table (http://laion-aesthetic.datasette.io/laion-aesthetic-6pls.csv?sql=select+rowid%2C+url%2C+text%2C+domain_id%2C+width%2C+height%2C+similarity%2C+punsafe%2C+pwatermark%2C+aesthetic%2C+hash%2C+index_level_0+from+images+order+by+random%28%29+limit+100) has nested single quotes, double quotes, and commas that aren't handled by rows_from_file. Similarly, the json output has to be manually transformed to add the column names and remove extraneous information before sqlite_utils can import it. I was able to work through these issues, but as an enhancement it would be really helpful to create or document a clear workflow that avoids the friction of this data transformation. |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/513/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
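One workable path for #513 today is to pull the CSV export and let the standard-library `csv` module handle the nested quoting, then hand the dicts to `insert_all()`. A sketch only: the URL is the export linked in the issue, the local database name is hypothetical, and every value arrives as a string unless you convert it afterwards:

```python
import csv
import io
import urllib.request

from sqlite_utils import Database

# The Datasette CSV export URL from the issue; any Datasette CSV export works the same way
url = "http://laion-aesthetic.datasette.io/laion-aesthetic-6pls.csv?sql=select+rowid%2C+url%2C+text%2C+domain_id%2C+width%2C+height%2C+similarity%2C+punsafe%2C+pwatermark%2C+aesthetic%2C+hash%2C+index_level_0+from+images+order+by+random%28%29+limit+100"
with urllib.request.urlopen(url) as response:
    text = response.read().decode("utf-8")

db = Database("local.db")
db["images"].insert_all(csv.DictReader(io.StringIO(text)))
```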
1359604075 | I_kwDOCGYnMM5RCelr | 481 | Idea: `sqlite-utils create-table tablename --sql "select ..."` | simonw 9599 | open | 0 | 0 | 2022-09-02T01:41:24Z | 2022-09-02T01:42:08Z | OWNER | Could offer syntactic sugar for creating a table from the results of a `select` query (a sketch of the underlying SQL follows this row): …
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/481/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
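For #481, the underlying operation such a flag would presumably wrap is SQLite's `CREATE TABLE ... AS SELECT ...`, which is already reachable from the Python API. A sketch with hypothetical table names:

```python
from sqlite_utils import Database

db = Database("data.db")
db["issues"].insert_all(
    [
        {"id": 1, "title": "First", "state": "open"},
        {"id": 2, "title": "Second", "state": "closed"},
    ],
    pk="id",
    replace=True,
)

# Roughly what `sqlite-utils create-table data.db open_issues --sql "select ..."` might do:
db.execute(
    "create table if not exists open_issues as "
    "select id, title from issues where state = 'open'"
)
print(list(db["open_issues"].rows))
```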
1355193529 | I_kwDOCGYnMM5Qxpy5 | 479 | OperationalError: cannot VACUUM from within a transaction | chapmanjacobd 7908073 | open | 0 | 0 | 2022-08-30T05:34:24Z | 2022-08-30T05:34:24Z | CONTRIBUTOR | Maybe when calling … ``` 46 db["media"].optimize() # type: ignore ---> 47 db.vacuum() File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:1047, in Database.vacuum(self) 1045 def vacuum(self): 1046 "Run a SQLite … File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:470, in Database.execute(self, sql, parameters) 468 return self.conn.execute(sql, parameters) 469 else: --> 470 return self.conn.execute(sql) OperationalError: cannot VACUUM from within a transaction ``` It might also be nice to add a sentence or two to the docs about how transactions are committed. When I was swapping out my sqlite3 code for this library, it was nice that everything was pretty much a drop-in replacement, but I was/am unsure what to do about the places where I explicitly call `commit()`. Related to https://github.com/simonw/sqlite-utils/issues/121 |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/479/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
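A workaround sketch for #479 while the transaction behaviour is clarified: make sure any open transaction is committed before calling `vacuum()`. `db.conn` is the underlying `sqlite3` connection, so its `commit()` is available; the table and file names below are hypothetical:

```python
from sqlite_utils import Database

db = Database("media.db")
db["media"].insert({"id": 1, "path": "clip.mp4"}, pk="id", replace=True)

# VACUUM cannot run inside a transaction, so close out any pending writes first
db.conn.commit()
db.vacuum()
```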
1324659241 | I_kwDOCGYnMM5O9LIp | 459 | Single quoted transform recipes on Windows do not work as expected | shakeel 19921 | open | 0 | 0 | 2022-08-01T16:14:54Z | 2022-08-01T16:14:54Z | CONTRIBUTOR | Trying to follow the tutorial for sqlite-utils and datasette https://datasette.io/tutorials/clean-data on Windows 11 OS
In the step to transform dates into ISO dates, the single-quoted recipe is passed through as a literal string rather than being evaluated as Python code: ``` sqlite-utils convert manatees.db locations \ REPDATE created_date last_edited_date \ 'r.parsedatetime(value)' --dry-run 1975/01/31 00:00:00+00 --- becomes: r.parsedatetime(value) Would affect 13568 rows ``` However, if I change the code from single quotes to double quotes, it works as expected. ``` sqlite-utils convert manatees.db locations \ REPDATE created_date last_edited_date \ "r.parsedatetime(value)" --dry-run 1975/01/31 00:00:00+00 --- becomes: 1975-01-31T00:00:00+00:00 Would affect 13568 rows ``` Specifying the transform code recipe should work with single quotes on Windows. |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/459/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1090798237 | I_kwDOCGYnMM5BBEKd | 359 | Use RETURNING if available to populate last_pk | simonw 9599 | open | 0 | 0 | 2021-12-29T23:43:23Z | 2021-12-29T23:43:23Z | OWNER | Inspired by this: https://news.ycombinator.com/item?id=29729283
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/359/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
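For #359, the feature in question is SQLite's `RETURNING` clause, available from SQLite 3.35 (released March 2021). A standalone sketch of how an insert could report its primary key without a follow-up `last_insert_rowid()` call; the table and column names are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table items (id integer primary key, name text)")

# Requires the underlying SQLite library to be 3.35 or newer
row = conn.execute(
    "insert into items (name) values (?) returning id", ("example",)
).fetchone()
print(row[0])  # the primary key of the row that was just inserted
```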
818684978 | MDU6SXNzdWU4MTg2ODQ5Nzg= | 243 | How can i use this utils to deal with fts on column meta of tables ? | svjack 27874014 | open | 0 | 0 | 2021-03-01T09:45:05Z | 2021-03-01T09:45:05Z | NONE | Thank you for releasing this excellent project. When I use it on a database with multiple tables, I want to implement convenient search over the column names of the different tables. I want to build a meta table that stores metadata about the columns of the different tables, search that meta table, and then fetch rows from the data tables it describes. Does this project provide a simple function for that? You can think of it as a knowledge graph about the tables in the db, which I save into the db with FTS enabled. |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/243/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
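There is no built-in helper for what #243 asks, but the meta-table idea can be assembled from existing pieces: `db.tables` and `table.columns` for introspection, and `enable_fts()` for the search index. A sketch with assumed table and column names; the `description` field is whatever knowledge-graph text you want to attach to each column:

```python
from sqlite_utils import Database

db = Database("multi.db")

# One row per column of every table in the database
meta_rows = [
    {"table_name": table.name, "column_name": column.name, "description": ""}
    for table in db.tables
    for column in table.columns
]
db["column_meta"].insert_all(meta_rows)
db["column_meta"].enable_fts(
    ["table_name", "column_name", "description"], create_triggers=True
)

# Full-text search over the metadata, then fetch rows from the matching table
for hit in db["column_meta"].search("price"):
    print(hit["table_name"], hit["column_name"])
```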
816601354 | MDExOlB1bGxSZXF1ZXN0NTgwMjM1NDI3 | 241 | Extract expand - work in progress | simonw 9599 | open | 0 | 0 | 2021-02-25T16:36:38Z | 2021-02-25T16:36:38Z | OWNER | simonw/sqlite-utils/pulls/241 | Refs #239. Still needs documentation and CLI implementation. |
sqlite-utils 140912432 | pull | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/241/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1 | ||||||
581795570 | MDU6SXNzdWU1ODE3OTU1NzA= | 93 | Support more string values for types in .add_column() | simonw 9599 | open | 0 | 0 | 2020-03-15T19:32:49Z | 2020-09-24T20:36:46Z | OWNER | https://sqlite-utils.readthedocs.io/en/2.4.2/python-api.html#adding-columns says: … As discovered in #92, this isn't the right list of values. I should expand this to match https://www.sqlite.org/datatype3.html |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/93/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
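For reference on #93, `.add_column()` accepts Python types plus a small set of string type names today; the issue is about widening the string names to the full set documented at https://www.sqlite.org/datatype3.html. A small sketch with hypothetical table and column names:

```python
from sqlite_utils import Database

db = Database("data.db")
db["items"].insert({"id": 1, "name": "example"}, pk="id", replace=True)

# Python types work, and common string names such as "TEXT" do too;
# the issue asks for the wider range of SQLite type names to be accepted as well
db["items"].add_column("price", float)
db["items"].add_column("notes", "TEXT")
print(db["items"].columns_dict)
```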
688352145 | MDU6SXNzdWU2ODgzNTIxNDU= | 141 | insert-files support for compressed values | simonw 9599 | open | 0 | 0 | 2020-08-28T20:59:46Z | 2020-09-24T20:36:08Z | OWNER | The `insert-files` command could offer support for storing compressed values. |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/141/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT,
   [active_lock_reason] TEXT,
   [performed_via_github_app] TEXT,
   [reactions] TEXT,
   [draft] INTEGER,
   [state_reason] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);