issues

14 rows where user = 7908073
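The filter above is just a SQL `where` clause over the `issues` table. A minimal sketch of reproducing the same row set with the sqlite-utils Python API, assuming a local copy of this database saved as `github.db` (the filename is an assumption):

```python
import sqlite_utils

# Hypothetical local copy of the database behind this Datasette page
db = sqlite_utils.Database("github.db")

# Equivalent of the page's filter: issues where user = 7908073
rows = list(db.query("select * from issues where user = :user", {"user": 7908073}))
print(len(rows))  # 14 at the time this page was rendered
```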

id node_id number title user state locked assignee milestone comments created_at updated_at closed_at author_association pull_request body repo type active_lock_reason performed_via_github_app reactions draft state_reason
1239034903 I_kwDOCGYnMM5J2iwX 433 CLI eats my cursor chapmanjacobd 7908073 closed 0     10 2022-05-17T18:52:52Z 2023-11-04T00:46:30Z 2023-11-04T00:46:30Z CONTRIBUTOR   I'm not sure why this happens, but `sqlite-utils` makes my terminal cursor disappear after running commands like `sqlite-utils insert`. I've only noticed this behavior in `sqlite-utils`, not in any other CLI tools. I can still type commands after it runs, but the text cursor is invisible. (A workaround sketch follows the table.) sqlite-utils 140912432 issue     {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/433/reactions", "total_count": 5, "+1": 5, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}   completed
1355193529 I_kwDOCGYnMM5Qxpy5 479 OperationalError: cannot VACUUM from within a transaction chapmanjacobd 7908073 open 0     0 2022-08-30T05:34:24Z 2022-08-30T05:34:24Z   CONTRIBUTOR   Maybe when calling `.vacuum()` and other DB-level write-lock operations, `sqlite_utils` could guard against this error by automatically committing first? (A sketch follows the table.) ``` 46 db["media"].optimize() # type: ignore ---> 47 db.vacuum() File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:1047, in Database.vacuum(self) 1045 def vacuum(self): 1046 "Run a SQLite ``VACUUM`` against the database." -> 1047 self.execute("VACUUM;") File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:470, in Database.execute(self, sql, parameters) 468 return self.conn.execute(sql, parameters) 469 else: --> 470 return self.conn.execute(sql) OperationalError: cannot VACUUM from within a transaction ``` It might also be nice to add a sentence or two about how transactions are committed on the [docs page](https://sqlite-utils.datasette.io/en/latest/python-api.html#detect-fts). When I was swapping out my sqlite3 code for this library it was nice that everything was pretty much drop-in, but I was/am unsure what to do about the places I explicitly call `.commit()` in my code. Related to https://github.com/simonw/sqlite-utils/issues/121 sqlite-utils 140912432 issue     {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/479/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
1355433619 PR_kwDOCGYnMM4-B7Mc 480 search_sql add include_rank option chapmanjacobd 7908073 closed 0     4 2022-08-30T09:10:29Z 2022-08-31T03:40:35Z 2022-08-31T03:40:35Z CONTRIBUTOR simonw/sqlite-utils/pulls/480 I haven't tested this yet but wanted to get a heads-up whether this kind of change would be useful or if I should just duplicate the function and tweak it within my code <!-- readthedocs-preview sqlite-utils start --> ---- :books: Documentation preview :books:: https://sqlite-utils--480.org.readthedocs.build/en/480/ <!-- readthedocs-preview sqlite-utils end --> sqlite-utils 140912432 pull     {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/480/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 0  
1361355564 I_kwDOCGYnMM5RJKMs 482 balanced table default column_order chapmanjacobd 7908073 closed 0     1 2022-09-05T03:00:18Z 2022-10-10T17:43:02Z 2022-09-06T20:17:27Z CONTRIBUTOR   Is there any performance or size difference with column order in SQLite? Similar to this: https://www.cybertec-postgresql.com/en/column-order-in-postgresql-does-matter/ It might be interesting to have an option to create tables with an optimized column order. I'm assuming this would look something like INTEGER columns, REAL columns, BLOB columns, TEXT columns, NULL columns. NULL columns go at the end because they are more likely to be TEXT and it is impossible to know if they will become INTEGER. (Of course, any schema evolution would reduce the optimization, but maybe column order could also be re-evaluated when the schema changes.) Edit: this is easy to accomplish with the existing `transform` method (a fuller sketch follows the table): ``` int_columns = [k for k, v in table_columns.items() if v == int] db[table].transform(column_order=[*int_columns]) ``` sqlite-utils 140912432 issue     {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/482/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}   completed
1393202060 I_kwDOCGYnMM5TCpOM 496 devrel/python api: Pylance type hinting chapmanjacobd 7908073 open 0     4 2022-10-01T03:03:34Z 2023-05-03T05:53:27Z   CONTRIBUTOR   Pylance is generally pretty good at figuring things out, but `sqlite-utils` has some quirks which make type hinting kinda useless. Maybe you don't care, but I thought I would bring it to your attention. For example: ``` db["subs"].insert_all(subs, pk="index") ``` ``` Cannot access member "insert_all" for type "View" Member "insert_all" is unknown ``` `insert_all` and all the other methods show up as type issues because the program can't know whether something is a View or a Table. Fair enough. But that basically throws all type checking out the window. `pk="index"` also shows up as a type issue: ``` Argument of type "Literal['index']" cannot be assigned to parameter "pk" of type "Default" in function "insert_all" "Literal['index']" is incompatible with "Default" ``` I think this is because DEFAULT is an empty class? Maybe a few small changes could be made to make the library more type-friendly. The interim solution is of course to turn off type checking completely for the line (a `typing.cast` alternative is sketched after the table): ``` db["subs"].insert_all(subs, pk="index") # type: ignore ``` sqlite-utils 140912432 issue     {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/496/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
1393212964 I_kwDOCGYnMM5TCr4k 497 column_names chapmanjacobd 7908073 closed 0     1 2022-10-01T03:34:21Z 2022-10-25T21:09:28Z 2022-10-25T21:09:28Z CONTRIBUTOR   It would be nice to have a `column_names` property, similar to `table_names`. Or if you could get one or all of the following syntax to work for both Database and Table, that might be even better: Style 1 - `if 'table1' in db` - `if 'col1' in db['table1']` Style 2 - `if 'table1' in db.tables` - `if 'col1' in db['table1'].columns` Maybe the table ones actually work, but I'm too lazy to check. I just know that I have to do: `[c.name for c in db['table1'].columns]` Edit: This is possible with `columns_dict` (a sketch follows the table). I have actually used that before but I forgot about it. Feel free to close, but I do think accessing this data could be more consistent and intuitive. sqlite-utils 140912432 issue     {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/497/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}   completed
1404013495 PR_kwDOCGYnMM5AicIh 498 fix: enable-fts permanently save triggers chapmanjacobd 7908073 closed 0     2 2022-10-11T05:10:51Z 2022-10-15T04:33:08Z 2022-10-11T06:34:31Z CONTRIBUTOR simonw/sqlite-utils/pulls/498 I was wondering why all my databases were giving wild search results. Turns out create_trigger was not sticking! Running `sqlite-utils triggers x.db` shows `[]` after running `enable-fts` using the Python API. Looking at the counts trigger, it seems that is the right way to save triggers. Triggers show up now. (A verification sketch follows the table.) <!-- readthedocs-preview sqlite-utils start --> ---- :books: Documentation preview :books:: https://sqlite-utils--498.org.readthedocs.build/en/498/ <!-- readthedocs-preview sqlite-utils end --> sqlite-utils 140912432 pull     {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/498/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 0
1405196044 PR_kwDOCGYnMM5AmYzy 499 feat: recreate fts triggers after table transform chapmanjacobd 7908073 open 0     2 2022-10-11T20:35:39Z 2022-10-26T17:54:51Z   CONTRIBUTOR simonw/sqlite-utils/pulls/499 https://github.com/simonw/sqlite-utils/pull/498 <!-- readthedocs-preview sqlite-utils start --> ---- :books: Documentation preview :books:: https://sqlite-utils--499.org.readthedocs.build/en/499/ <!-- readthedocs-preview sqlite-utils end --> alternatively, `self.disable_fts()` sqlite-utils 140912432 pull     {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/499/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 0  
1430325103 I_kwDOCGYnMM5VQQdv 507 conn.execute: UnicodeEncodeError: 'utf-8' codec can't encode character chapmanjacobd 7908073 closed 0     1 2022-10-31T18:49:51Z 2022-11-01T00:40:17Z 2022-11-01T00:40:16Z CONTRIBUTOR   I'm not really sure what caused this and it happened in the middle of my program (after running for 35775 seconds). ``` Extracting metadata 49.9% (chunk 9893 of 19831) ... File "/home/xk/.local/lib/python3.10/site-packages/xklb/fs_extract.py", line 90, in extract_chunk args.db["media"].insert_all(utils.list_dict_filter_bool(media), pk="path", alter=True, replace=True) File "/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py", line 3107, in insert_all self.insert_chunk( File "/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py", line 2872, in insert_chunk result = self.db.execute(query, params) File "/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py", line 483, in execute return self.conn.execute(sql, parameters) UnicodeEncodeError: 'utf-8' codec can't encode character '\udcc3' in position 62: surrogates not allowed ``` This might be relevant: https://stackoverflow.com/questions/31898353/python-cant-encode-with-surrogateescape I'm going to try re-running with ```py def execute( self, sql: str, parameters: Optional[Union[Iterable, dict]] = None ) -> sqlite3.Cursor: """ Execute SQL query and return a ``sqlite3.Cursor``. :param sql: SQL query to execute :param parameters: Parameters to use in that query - an iterable for ``where id = ?`` parameters, or a dictionary for ``where id = :id`` """ try: if self._tracer: self._tracer(sql, parameters) if parameters is not None: return self.conn.execute(sql, parameters) else: return self.conn.execute(sql) except UnicodeEncodeError: sql = sql.encode('utf-8', 'surrogatepass').decode('utf-8') if parameters is not None: parameters = parameters.encode('utf-8', 'surrogatepass').decode('utf-8') return self.execute(sql, parameters) ``` sqlite-utils 140912432 issue     {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/507/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}   completed
1430563092 PR_kwDOCGYnMM5B6_6K 508 Allow surrogates in parameters chapmanjacobd 7908073 closed 0     2 2022-10-31T22:11:49Z 2022-11-17T15:11:16Z 2022-10-31T22:55:36Z CONTRIBUTOR simonw/sqlite-utils/pulls/508 closes #507 https://dwheeler.com/essays/fixing-unix-linux-filenames.html <!-- readthedocs-preview sqlite-utils start --> ---- :books: Documentation preview :books:: https://sqlite-utils--508.org.readthedocs.build/en/508/ <!-- readthedocs-preview sqlite-utils end --> sqlite-utils 140912432 pull     {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/508/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} 0  
1436539554 I_kwDOCGYnMM5Vn9qi 511 [insert_all, upsert_all] IntegrityError: constraint failed chapmanjacobd 7908073 closed 0     2 2022-11-04T19:21:48Z 2022-11-04T22:59:54Z 2022-11-04T22:54:09Z CONTRIBUTOR   My understanding is that `INSERT OR IGNORE` will ignore inserts that would cause duplicate keys, so I'm not sure exactly why the error is raised from `sqlite3`. (A sketch of one possible cause follows the table.) ``` import argparse from pathlib import Path from xklb import db, utils from xklb.utils import log def parse_args() -> argparse.Namespace: parser = argparse.ArgumentParser() parser.add_argument("database") parser.add_argument("dbs", nargs="*") parser.add_argument("--upsert") parser.add_argument("--db", "-db", help=argparse.SUPPRESS) parser.add_argument("--verbose", "-v", action="count", default=0) args = parser.parse_args() if args.db: args.database = args.db Path(args.database).touch() args.db = db.connect(args) log.info(utils.dict_filter_bool(args.__dict__)) return args def merge_db(args, source_db): source_db = str(Path(source_db).resolve()) s_db = db.connect(argparse.Namespace(database=source_db, verbose=args.verbose)) for table in [s for s in s_db.table_names() if not "_fts" in s and not s.startswith("sqlite_")]: log.info("[%s]: %s", source_db, table) with s_db.conn: data = s_db[table].rows with args.db.conn: if args.upsert: args.db[table].upsert_all(data, pk=args.upsert.split(","), alter=True) else: args.db[table].insert_all(data, alter=True, replace=True) def merge_dbs(): args = parse_args() for s_db in args.dbs: merge_db(args, s_db) if __name__ == "__main__": merge_dbs() ``` ``` $ lb-dev merge video.db tube_71.db --upsert path -vv SQL: INSERT OR IGNORE INTO [media]([path]) VALUES(?); - params: ['https://archive.org/details/088ghostofachanceroygetssackedrevengeofthelivinglunchdvdripxvidphz'] ... File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:3122, in Table.insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, hash_id_columns, alter, ignore, re… sqlite-utils 140912432 issue     {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/511/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}   completed
1581090327 I_kwDOCGYnMM5ePYYX 529 Microsoft line endings chapmanjacobd 7908073 closed 0     1 2023-02-12T02:20:48Z 2023-06-14T23:12:12Z 2023-06-14T23:11:47Z CONTRIBUTOR   sqlite-utils prints `\r\n` but [it should probably](https://devblogs.microsoft.com/commandline/extended-eol-in-notepad/) print `\n` (unless the platform is detected as Windows?) It has tripped me up a few times when piping the output of sqlite-utils to other programs: ``` $ sqlite-utils --no-headers --csv ~/lb/fs/d.db 'select path from media limit 1' | cat -A /mnt/d7/file^M$ $ sqlite-utils --no-headers --csv ~/lb/fs/d.db 'select path from media limit 1' | tr -d '\r' | cat -A /mnt/d7/file$ ``` sqlite-utils 140912432 issue     {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/529/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}   completed
1655860104 I_kwDOCGYnMM5ismuI 535 rows: --transpose or psql extended view-like functionality chapmanjacobd 7908073 closed 0     2 2023-04-05T15:37:33Z 2023-06-15T08:39:49Z 2023-06-14T22:05:28Z CONTRIBUTOR   It would be nice if the rows subcommand had a flag, perhaps called `--transpose` which would print in long form instead of wide. Similar to extended display mode in psql (`\x`) In other words instead of this: ``` sqlite-utils rows --limit 5 --fmt github track_metadata.db songs ``` | track_id | title | song_id | release | artist_id | artist_mbid | artist_name | duration | artist_familiarity | artist_hotttnesss | year | track_7digitalid | shs_perf | shs_work | |--------------------|-------------------|--------------------|--------------------------------------|--------------------|--------------------------------------|------------------|------------|----------------------|---------------------|--------|--------------------|------------|------------| | TRMMMYQ128F932D901 | Silent Night | SOQMMHC12AB0180CB8 | Monster Ballads X-Mas | ARYZTJS1187B98C555 | 357ff05d-848a-44cf-b608-cb34b5701ae5 | Faster Pussy cat | 252.055 | 0.649822 | 0.394032 | 2003 | 7032331 | -1 | 0 | | TRMMMKD128F425225D | Tanssi vaan | SOVFVAK12A8C1350D9 | Karkuteillä | ARMVN3U1187FB3A1EB | 8d7ef530-a6fd-4f8f-b2e2-74aec765e0f9 | Karkkiautomaatti | 156.551 | 0.439604 | 0.356992 | 1995 | 1514808 | -1 | 0 | | TRMMMRX128F93187D9 | No One Could Ever | SOGTUKN12AB017F4F1 | Butter | ARGEKB01187FB50750 | 3d403d44-36ce-465c-ad43-ae877e65adc4 | Hudson Mohawke | 138.971 | 0.643681 | 0.437504 | 2006 | 6945353 | -1 | 0 | | TRMMMCH128F425532C | Si Vos Querés | SOBNYVR12A8C13558C | De Culo | ARNWYLR1187B9B2F9C | 12be7648-7094-495f-90e6-df4189d68615 | Yerba Brava | 145.058 | 0.448501 | 0.372349 | 2003 | 2168257 |… sqlite-utils 140912432 issue     {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/535/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}   completed
1740150327 I_kwDOCGYnMM5nuJY3 557 Aliased ROWID option for tables created from alter=True commands chapmanjacobd 7908073 closed 0     2 2023-06-04T05:29:28Z 2023-06-14T06:09:21Z 2023-06-05T19:26:26Z CONTRIBUTOR   > If you use INTEGER PRIMARY KEY column, the VACUUM does not change the values of that column. However, if you use unaliased rowid, the VACUUM command will reset the rowid values. ROWID should never be used with foreign keys, but the simple act of aliasing rowid to id (which is what happens when one does `id integer primary key` DDL) makes it OK. (A sketch follows the table.) It would be convenient if there were more options to use a string column (e.g. filepath) as the PK and be able to use it during upserts, but, when creating a foreign key, to create an integer column which aliases rowid. I made an attempt to switch to integer primary keys here, but it is not going well... In my use case the path column is a business key. Yes, it should be as simple as including the `id` column in any select statement where I plan on using `upsert`, but it would be nice if this could be abstracted away somehow. https://github.com/chapmanjacobd/library/commit/788cd125be01d76f0fe2153335d9f6b21db1343c https://github.com/chapmanjacobd/library/actions/runs/5173602136/jobs/9319024777 sqlite-utils 140912432 issue     {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/557/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}   completed
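Issue #433 (CLI eats my cursor): a hidden cursor is what a terminal shows after receiving the ANSI "hide cursor" escape without the matching "show cursor" one. A minimal recovery sketch, independent of sqlite-utils itself:

```python
import sys

# "\x1b[?25l" hides the terminal cursor; "\x1b[?25h" shows it again.
# If a command exits or crashes while the cursor is hidden, printing the
# "show" sequence restores it -- equivalent to running `tput cnorm`.
sys.stdout.write("\x1b[?25h")
sys.stdout.flush()
```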
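Issue #479 (cannot VACUUM from within a transaction): a minimal caller-side workaround sketch, committing any open transaction on the underlying connection before calling `vacuum()`; the database path and the earlier write are assumptions:

```python
import sqlite_utils

db = sqlite_utils.Database("media.db")  # hypothetical database
db["media"].insert({"path": "/mnt/d7/file"}, pk="path", replace=True)  # some earlier write

db.conn.commit()  # close the implicit transaction, in case earlier writes left one open
db.vacuum()       # VACUUM now runs outside a transaction
```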
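Issue #482 (balanced table default column_order): a sketch extending the snippet in the report to order every column by type rather than just the INTEGER ones, using `columns_dict` (column name to Python type) and `transform(column_order=...)`; the database and table names are assumptions:

```python
import sqlite_utils

db = sqlite_utils.Database("example.db")  # hypothetical database
table = db["media"]                       # hypothetical table

# columns_dict maps column names to Python types (int, float, bytes, str)
priority = {int: 0, float: 1, bytes: 2, str: 3}
ordered = sorted(table.columns_dict, key=lambda name: priority.get(table.columns_dict[name], 4))

# Rebuild the table with INTEGER columns first and TEXT columns last
table.transform(column_order=ordered)
```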
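Issue #496 (Pylance type hinting): short of changes in the library, one interim alternative to `# type: ignore` is narrowing the `Table | View` union yourself with `typing.cast`; a sketch with hypothetical data:

```python
from typing import cast

from sqlite_utils import Database
from sqlite_utils.db import Table

db = Database("subs.db")                   # hypothetical database
subs = [{"index": 1, "title": "example"}]  # hypothetical records

# db["subs"] is typed as a Table/View union; cast() tells the checker it is a Table
subs_table = cast(Table, db["subs"])
subs_table.insert_all(subs, pk="index")
```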
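Issue #497 (column_names): the membership checks the report asks for can be written today with `table_names()` and `columns_dict`; a sketch with hypothetical table and column names:

```python
import sqlite_utils

db = sqlite_utils.Database("example.db")  # hypothetical database

if "table1" in db.table_names():           # table membership
    # columns_dict is keyed by column name, so "in" works for columns too
    if "col1" in db["table1"].columns_dict:
        print("table1.col1 exists")
```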
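PR #498 (enable-fts permanently save triggers): a sketch of the check described in the PR, enabling FTS with triggers from the Python API and then listing the triggers that were stored; the table and column names are assumptions. Per the PR, before the fix this listing came back empty:

```python
import sqlite_utils

db = sqlite_utils.Database("x.db")
db["media"].insert({"path": "/mnt/d7/file", "title": "example"}, pk="path")  # hypothetical table

# Ask for triggers that keep the FTS index in sync with the table
db["media"].enable_fts(["title"], create_triggers=True)

# The FTS sync triggers should now be listed (and persist in the database)
print(db["media"].triggers_dict)
```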
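Issue #507 (UnicodeEncodeError: surrogates not allowed): lone surrogates such as `'\udcc3'` typically come from filesystem paths that are not valid UTF-8 and were decoded with `surrogateescape`. A sketch of how such a value arises and one way to sanitise it before inserting; the path bytes are an invented example:

```python
# A filename byte string that is not valid UTF-8
raw = b"/mnt/d7/caf\xc3"

# os.listdir() and friends decode such names with surrogateescape,
# producing a lone surrogate that sqlite3 later refuses to encode
bad_path = raw.decode("utf-8", "surrogateescape")
print(repr(bad_path))    # '/mnt/d7/caf\udcc3'

# One way to sanitise before insert: round-trip back to bytes, then replace the bad byte
clean_path = bad_path.encode("utf-8", "surrogateescape").decode("utf-8", "replace")
print(repr(clean_path))  # '/mnt/d7/caf\ufffd'
```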
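Issue #511 (IntegrityError: constraint failed): `INSERT OR IGNORE` suppresses UNIQUE, NOT NULL, CHECK and PRIMARY KEY conflicts, but SQLite's conflict-resolution clause does not apply to FOREIGN KEY constraints, which is one way an `IntegrityError` can still surface. A standalone sketch with an invented schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE playlists (id INTEGER PRIMARY KEY)")
conn.execute(
    "CREATE TABLE media (path TEXT PRIMARY KEY, playlist_id INTEGER REFERENCES playlists(id))"
)
conn.execute("INSERT INTO media (path) VALUES ('a')")

# Duplicate primary key: OR IGNORE silently skips the row
conn.execute("INSERT OR IGNORE INTO media (path) VALUES ('a')")

# Foreign key violation: OR IGNORE does not apply, so IntegrityError is raised anyway
try:
    conn.execute("INSERT OR IGNORE INTO media (path, playlist_id) VALUES ('b', 999)")
except sqlite3.IntegrityError as exc:
    print(exc)  # FOREIGN KEY constraint failed
```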
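Issue #529 (Microsoft line endings): the `\r\n` most likely comes from Python's `csv` module, whose writers default to `\r\n` terminators regardless of platform; passing `lineterminator="\n"` is the usual fix. A small standalone illustration, not sqlite-utils' actual code:

```python
import csv
import io

buf = io.StringIO()
csv.writer(buf).writerow(["/mnt/d7/file"])            # default terminator
print(repr(buf.getvalue()))                           # '/mnt/d7/file\r\n'

buf = io.StringIO()
csv.writer(buf, lineterminator="\n").writerow(["/mnt/d7/file"])
print(repr(buf.getvalue()))                           # '/mnt/d7/file\n'
```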
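Issue #535 (--transpose): until such a flag exists, long-form output can be produced from the Python API by printing each row's columns vertically; a sketch against the database and table named in the report:

```python
import sqlite_utils

db = sqlite_utils.Database("track_metadata.db")

# Print each of the first five songs in long form, one "column: value" pair per line
for row in db.query("select * from songs limit 5"):
    for column, value in row.items():
        print(f"{column}: {value}")
    print()
```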
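Issue #557 (aliased ROWID): the aliasing the report refers to is standard SQLite behaviour, where a column declared `INTEGER PRIMARY KEY` becomes an alias for `rowid` and therefore keeps its values across VACUUM. A standalone sketch with an invented schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# "id INTEGER PRIMARY KEY" aliases the rowid, so VACUUM will not renumber it,
# which makes it safe to reference from other tables
conn.execute("CREATE TABLE media (id INTEGER PRIMARY KEY, path TEXT UNIQUE)")
conn.execute("CREATE TABLE captions (media_id INTEGER REFERENCES media(id), text TEXT)")

conn.execute("INSERT INTO media (path) VALUES ('/mnt/d7/file')")
print(conn.execute("SELECT id, rowid FROM media").fetchone())  # (1, 1) -- same value
```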

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT,
   [active_lock_reason] TEXT,
   [performed_via_github_app] TEXT,
   [reactions] TEXT,
   [draft] INTEGER,
   [state_reason] TEXT
);
CREATE INDEX [idx_issues_repo]
                ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
                ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
                ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
                ON [issues] ([user]);
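For reference, a schema like the one above (abbreviated to a few columns here) can be created, with indexes on its foreign-key columns, through the sqlite-utils Python API. A sketch, not the code that actually built this database:

```python
import sqlite_utils

db = sqlite_utils.Database("github.db")  # hypothetical database

# Referenced tables first, reduced to a couple of columns each
db["users"].create({"id": int, "login": str}, pk="id")
db["repos"].create({"id": int, "full_name": str}, pk="id")

# A cut-down issues table with the same kind of foreign keys as above
db["issues"].create(
    {"id": int, "number": int, "title": str, "user": int, "repo": int, "body": str},
    pk="id",
    foreign_keys=[("user", "users", "id"), ("repo", "repos", "id")],
)

# Add an index on every foreign-key column that lacks one,
# analogous to the idx_issues_* indexes shown above
db.index_foreign_keys()
```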