issues
10 rows where user = 193185
Columns: id, node_id, number, title, user, state, locked, assignee, milestone, comments, created_at, updated_at, closed_at, author_association, pull_request, body, repo, type, active_lock_reason, performed_via_github_app, reactions, draft, state_reason

number: 719 | title: asgi: check raw_path is not None
id: 594553553 | node_id: MDExOlB1bGxSZXF1ZXN0Mzk5MTY2NDMz | type: pull | repo: datasette 107914493
user: cldellow 193185 | state: closed | locked: 0 | comments: 1 | author_association: CONTRIBUTOR | draft: 0
created_at: 2020-04-05T16:53:58Z | updated_at: 2020-05-04T17:14:26Z | closed_at: 2020-05-04T17:14:26Z
pull_request: simonw/datasette/pulls/719
reactions: {"url": "https://api.github.com/repos/simonw/datasette/issues/719/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
body:
The ASGI spec (https://asgi.readthedocs.io/en/latest/specs/www.html#http) seems to imply that `None` is a valid value, so we need to check the value itself, not just whether the key is present. In particular, the [mangum](https://github.com/erm/mangum) adapter passes `None` for this key's value. This change permits mangum to be used to front datasette in Amazon API Gateway + AWS Lambda deployments.
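
A minimal sketch of the pattern this pull request describes, assuming an ASGI `scope` dict; this is an illustration, not the actual patch:

```python
def get_raw_path(scope):
    # Some ASGI servers omit "raw_path"; adapters such as mangum may pass None.
    # Checking the value, not just key presence, covers both cases.
    raw_path = scope.get("raw_path")
    if raw_path is None:
        # Fall back to the decoded path, re-encoded as bytes.
        raw_path = scope["path"].encode("utf-8")
    return raw_path
```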

number: 514 | title: upsert of new row with check constraints fails
id: 1465194249 | node_id: I_kwDOCGYnMM5XVRcJ | type: issue | repo: sqlite-utils 140912432
user: cldellow 193185 | state: closed | locked: 0 | comments: 5 | author_association: NONE | state_reason: completed
created_at: 2022-11-26T16:12:23Z | updated_at: 2023-05-08T21:50:52Z | closed_at: 2023-05-08T21:50:51Z
reactions: {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/514/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
body:
(I originally opened this in https://github.com/simonw/datasette-insert/issues/20, but I see that that library depends on sqlite-utils.)

In the case of a new row, upsert first adds the row, specifying only its pkeys: https://github.com/simonw/sqlite-utils/blob/965ca0d5f5bffe06cc02cd7741344d1ddddf9d56/sqlite_utils/db.py#L2783-L2787

This means that a table with NOT NULL (or other constraint) columns that aren't part of the pkey can't have new rows upserted.
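
A small repro sketch of the failure mode described above, using Python's `sqlite3` directly; the table and column names are invented for illustration, and this is not the exact statement sqlite-utils issues:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")

try:
    # Supplying only the primary key, as the pre-fix upsert effectively did,
    # violates the NOT NULL constraint on the other column.
    conn.execute("INSERT INTO people (id) VALUES (1)")
except sqlite3.IntegrityError as exc:
    print(exc)  # NOT NULL constraint failed: people.name
```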

number: 515 | title: upsert new rows with constraints, fixes #514
id: 1465194930 | node_id: PR_kwDOCGYnMM5DvZxa | type: pull | repo: sqlite-utils 140912432
user: cldellow 193185 | state: closed | locked: 0 | comments: 1 | author_association: NONE | draft: 0
created_at: 2022-11-26T16:15:21Z | updated_at: 2023-05-08T21:27:11Z | closed_at: 2023-05-08T21:27:10Z
pull_request: simonw/sqlite-utils/pulls/515
reactions: {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/515/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
body:
This fixes #514 by making the initial insert for upserts include all columns, so that new rows can be added to tables with non-pkey columns that have constraints.

(aside: I'm not a python programmer. `pip`? `pipenv`? `venv`? These are mystical incantations to me. The process to set up this repo for local development and testing was _so easy_. Thank you for the excellent contributing documentation!)

<!-- readthedocs-preview sqlite-utils start -->
----
:books: Documentation preview :books:: https://sqlite-utils--515.org.readthedocs.build/en/515/
<!-- readthedocs-preview sqlite-utils end -->
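
For context, a sketch of the sqlite-utils usage this change is about, assuming a release that includes the fix; the table and column names are invented:

```python
from sqlite_utils import Database

db = Database(memory=True)
db["people"].create({"id": int, "name": str}, pk="id", not_null={"name"})

# Upserting a brand-new row into a table whose non-pkey column carries a
# NOT NULL constraint; before this fix, the initial pkey-only insert failed.
db["people"].upsert({"id": 1, "name": "Colin"}, pk="id")
print(list(db["people"].rows))  # [{'id': 1, 'name': 'Colin'}]
```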

number: 1973 | title: render_cell plugin hook's row object is not a sqlite.Row
id: 1515815014 | node_id: I_kwDOBm6k_c5aWYBm | type: issue | repo: datasette 107914493
user: cldellow 193185 | state: open | locked: 0 | comments: 4 | author_association: CONTRIBUTOR
created_at: 2023-01-01T20:27:46Z | updated_at: 2023-01-29T00:40:31Z
reactions: {"url": "https://api.github.com/repos/simonw/datasette/issues/1973/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
body:
From https://docs.datasette.io/en/stable/plugin_hooks.html#render-cell-row-value-column-table-database-datasette:

> row - sqlite.Row
> The SQLite row object that the value being rendered is part of

This appears to actually be a [CustomRow](https://github.com/simonw/datasette/blob/f0fadc28ddb9f82e5cc1ecaa51e8a342eb6dc528/datasette/utils/__init__.py#L773-L789), but I think that's unrelated to my issue.

I have a table:

```sql
CREATE TABLE IF NOT EXISTS "dss_job_stats"(
  job_id integer not null references dss_job(id) on delete cascade,
  host text not null,
  -- other columns elided as irrelevant
  primary key (job_id, host)
);
```

On datasette 0.63.2, the `render_cell` hook receives a `row` value that looks like:

```
CustomRow([('job_id', {'value': 2, 'label': '2'}), ('host', 'cldellow.com')])
```

I expected the `job_id` value to be `2`, but it's actually `{'value': 2, 'label': '2'}`. I can work around this, but was wondering if this was intended behaviour?
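
A workaround sketch for the behaviour described above, written as a hypothetical `render_cell` plugin; it simply unwraps the `{'value': ..., 'label': ...}` dict that foreign-key cells arrive as:

```python
from datasette import hookimpl


@hookimpl
def render_cell(row, value, column, table, database, datasette):
    # Foreign-key cells may arrive as {'value': ..., 'label': ...} dicts rather
    # than plain scalars, so normalize before using them.
    job_id = row["job_id"]
    if isinstance(job_id, dict):
        job_id = job_id["value"]
    if column == "host":
        return f"{value} (job {job_id})"
    return None  # let Datasette render every other cell normally
```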

number: 2000 | title: rewrite_sql hook
id: 1552368054 | node_id: I_kwDOBm6k_c5ch0G2 | type: issue | repo: datasette 107914493
user: cldellow 193185 | state: open | locked: 0 | comments: 1 | author_association: CONTRIBUTOR
created_at: 2023-01-23T01:02:52Z | updated_at: 2023-01-23T06:08:01Z
reactions: {"url": "https://api.github.com/repos/simonw/datasette/issues/2000/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
body:
I'm not sold that this is a good idea, but thought it'd be worth writing up a ticket. Proposal: add a hook like

```python
def rewrite_sql(datasette, database, request, fn, sql, params)
```

It would be called from Database.execute, Database.execute_write, Database.execute_write_script, Database.execute_write_many before running the user's SQL. `fn` would indicate which method was being used, in case that's relevant for the SQL inspection -- for example `execute` only permits a single statement.

The hook could return a SQL statement to be executed instead, or an async function to be awaited on that returned the SQL to be executed.

Plugins that could be written with this hook:

- https://github.com/cldellow/datasette-ersatz-table-valued-functions would use this to avoid monkey-patching
- a plugin to inspect and reject unsafe Spatialite function calls (reported by [Simon in Discord](https://discord.com/channels/823971286308356157/823971286941302908/1066438832293159004))
- a plugin to do more general rewrites of queries to enforce table or row-level security, for example, based on the currently logged in actor's ID
- a plugin to maintain audit tables when users write to a table
- a plugin to cache expensive queries (eg the queries that drive facets) - these could allow stale reads if previously cached, then refresh them in an offline queue

Flaws with this idea: `execute_fn` and `execute_write_fn` would not go through this hook, which limits the guarantees you can make about it for security purposes.
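
A sketch of what a plugin built on the proposed hook might look like, taking the "reject unsafe Spatialite function calls" use case from the list above; `rewrite_sql` is hypothetical and does not exist in Datasette today, so the semantics are assumed from the proposal text:

```python
# Hypothetical plugin: rewrite_sql is the hook proposed in this issue, not a real
# Datasette hook; its behaviour here is assumed from the proposal.
from datasette import hookimpl

BLOCKED = ("importdbf(", "importshp(", "importxls(")  # example SpatiaLite calls to refuse


@hookimpl
def rewrite_sql(datasette, database, request, fn, sql, params):
    lowered = sql.lower()
    if any(call in lowered for call in BLOCKED):
        # Per the proposal, the hook returns the SQL statement to execute instead.
        return "SELECT 'blocked: unsafe SpatiaLite import call' AS error"
    return sql
```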

number: 2004 | title: use single quotes for string literals, fixes #2001
id: 1556065335 | node_id: PR_kwDOBm6k_c5Ie5nA | type: pull | repo: datasette 107914493
user: cldellow 193185 | state: open | locked: 0 | comments: 1 | author_association: CONTRIBUTOR | draft: 0
created_at: 2023-01-25T05:08:46Z | updated_at: 2023-02-01T06:37:18Z
pull_request: simonw/datasette/pulls/2004
reactions: {"url": "https://api.github.com/repos/simonw/datasette/issues/2004/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
body:
This modernizes some uses of double quotes for string literals to use only single quotes, fixes simonw/datasette#2001

While developing it, I manually enabled the stricter mode by using the code snippet at https://gist.github.com/cldellow/85bba507c314b127f85563869cd94820

I think that code snippet isn't generally safe/portable, so I haven't tried to automate it in the tests.

<!-- readthedocs-preview datasette start -->
----
:books: Documentation preview :books:: https://datasette--2004.org.readthedocs.build/en/2004/
<!-- readthedocs-preview datasette end -->
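
A quick demonstration of the SQLite quirk this PR works around; the table and values are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY)")
conn.execute("INSERT INTO docs (id) VALUES (1)")

# In builds with SQLite's legacy double-quoted string behaviour enabled (the
# default), "hello" is not a column of docs, so it is silently treated as the
# string literal 'hello'. Stricter builds reject it, hence the move to single quotes.
print(conn.execute('SELECT id, "hello" FROM docs').fetchall())  # [(1, 'hello')]
```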

number: 2008 | title: array facet: don't materialize unnecessary columns
id: 1560982210 | node_id: PR_kwDOBm6k_c5IvYKw | type: pull | repo: datasette 107914493
user: cldellow 193185 | state: open | locked: 0 | comments: 8 | author_association: CONTRIBUTOR | draft: 0
created_at: 2023-01-28T19:33:40Z | updated_at: 2023-01-29T18:17:40Z
pull_request: simonw/datasette/pulls/2008
reactions: {"url": "https://api.github.com/repos/simonw/datasette/issues/2008/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
body:
The presence of `inner.*` causes SQLite to materialize a row with all the columns. Those columns will be discarded later.

Instead, we can select only the column we'll use. This lets SQLite's optimizer realize that the other columns in the CTE definition aren't needed.

On a test table with 278K rows, 98K of which had an array, this speeds up the facet calculation from 4 sec to 1 sec.

<!-- readthedocs-preview datasette start -->
----
:books: Documentation preview :books:: https://datasette--2008.org.readthedocs.build/en/2008/
<!-- readthedocs-preview datasette end -->
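
An illustrative sketch of the query shape being discussed, not Datasette's actual facet SQL: the narrow variant projects only the column that feeds `json_each`, so the other columns of the CTE never need to be materialized. Table, column, and CTE names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, body TEXT, tags TEXT)")
conn.execute("""INSERT INTO posts VALUES (1, 'a long body...', '["sql", "python"]')""")

# Wide version: the CTE carries every column along, only to discard most of them.
wide = """
WITH inner_rows AS (SELECT * FROM posts)
SELECT j.value AS value, count(*) AS count
FROM inner_rows, json_each(inner_rows.tags) AS j
GROUP BY j.value
"""

# Narrow version: project only the column the facet actually needs.
narrow = """
WITH inner_rows AS (SELECT tags FROM posts)
SELECT j.value AS value, count(*) AS count
FROM inner_rows, json_each(inner_rows.tags) AS j
GROUP BY j.value
"""

print(conn.execute(narrow).fetchall())  # tag counts, e.g. [('python', 1), ('sql', 1)]
```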

number: 2013 | title: Datasette uses non-standard quoting for identifiers
id: 1565179870 | node_id: I_kwDOBm6k_c5dSr_e | type: issue | repo: datasette 107914493
user: cldellow 193185 | state: open | locked: 0 | comments: 0 | author_association: CONTRIBUTOR
created_at: 2023-02-01T00:05:39Z | updated_at: 2023-02-01T00:06:30Z
reactions: {"url": "https://api.github.com/repos/simonw/datasette/issues/2013/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
body:
Related to #2001, but where #2001 was about literals, this is about identifiers.

From https://www.sqlite.org/lang_keywords.html:

> "keyword" A keyword in double-quotes is an identifier.
> [keyword] A keyword enclosed in square brackets is an identifier. This is not standard SQL. This quoting mechanism is used by MS Access and SQL Server and is included in SQLite for compatibility.

Datasette uses this quoting here -- https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/__init__.py#L345-L349, in some of the other DB access code, and in some of the test fixtures.

Migrating to standard double-quoted identifiers would make it easier to get Datasette working with alternative backends.
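
A sketch of what standard identifier quoting looks like, as a hypothetical helper separate from the Datasette utility linked above: standard SQL wraps identifiers in double quotes and doubles any embedded quote character.

```python
def escape_identifier(name: str) -> str:
    # Standard SQL identifier quoting: wrap in double quotes and
    # double any embedded double-quote characters.
    return '"' + name.replace('"', '""') + '"'


# [select] is the SQLite/MS Access/SQL Server extension; "select" is standard SQL.
print(escape_identifier("select"))      # "select"
print(escape_identifier('odd"name'))    # "odd""name"
```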

number: 2018 | title: `check_visibility` gives confusing (wrong?) results if permission is `None`
id: 1571711808 | node_id: I_kwDOBm6k_c5drmtA | type: issue | repo: datasette 107914493
user: cldellow 193185 | state: open | locked: 0 | comments: 0 | author_association: CONTRIBUTOR
created_at: 2023-02-06T01:03:08Z | updated_at: 2023-02-06T01:03:46Z
reactions: {"url": "https://api.github.com/repos/simonw/datasette/issues/2018/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
body:
I'm trying to gate access to an edit UI on the user having `update-row` on the underlying view or table.

I expected [datasette.check_visibility](https://docs.datasette.io/en/latest/internals.html#await-check-visibility-actor-action-none-resource-none-permissions-none) to be a good way to do this:

```python
visible, private = await datasette.check_visibility(
    request.actor,
    permissions=[
        ("update-row", (database, table)),
    ],
)
if not visible:
    return None
```

But `visible` is returning true, even when there is no explicit `update-row` permission. (In this case, `request.actor` is `None`.)

Based on [the update-row permissions docs](https://docs.datasette.io/en/latest/authentication.html#update-row), I expected this to be default deny, and so no explicit permission would result in false.

I think the root cause is that `check_visibility` calls `ensure_permissions` and expects it to throw if the permission is not available. But `ensure_permissions` does not throw when `permission_allowed` returns None: https://github.com/simonw/datasette/blob/1.0a2/datasette/app.py#L825-L829
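
A possible workaround sketch for the behaviour described above: ask the permission system directly and treat an unresolved answer as a deny by passing an explicit default. This is an illustration of the idea, not a documented recommendation:

```python
async def can_update_row(datasette, actor, database, table):
    # Treat "no plugin expressed an opinion" (None) as a deny by
    # supplying an explicit default of False.
    return await datasette.permission_allowed(
        actor, "update-row", resource=(database, table), default=False
    )
```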

number: 2032 | title: datasette errors when foreign key integrity is enabled
id: 1605959201 | node_id: I_kwDOBm6k_c5fuP4h | type: issue | repo: datasette 107914493
user: cldellow 193185 | state: open | locked: 0 | comments: 0 | author_association: CONTRIBUTOR
created_at: 2023-03-02T01:27:51Z | updated_at: 2023-03-02T01:31:58Z
reactions: {"url": "https://api.github.com/repos/simonw/datasette/issues/2032/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
body:
By default, [SQLite does not enforce foreign key constraints](https://www.sqlite.org/foreignkeys.html#fk_enable). I typically enable these checks by running:

```sql
PRAGMA foreign_keys = ON;
```

inside of a `prepare_connection` hook.

If a plugin causes the schema to change (eg datasette-scraper creating a new table, or datasette-edit-schema changing a column), then https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/internal_db.py#L71-L77 will fail with:

```
FOREIGN KEY constraint failed
```

This could be resolved by either:

- deleting from the `tables` column last
- changing the schema so that the foreign keys have [ON DELETE CASCADE](https://www.sqlite.org/foreignkeys.html#fk_actions)

Let me know if you'd be open to a PR that addresses this -- since foreign key constraints aren't enabled by default, I guess it's questionable whether this is a bug.

I think I can work around this by inspecting the database parameter in `prepare_connection` and trying not to enable fkey checks on the `_internal` database.
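
A minimal sketch of the workaround mentioned in the last paragraph, as a `prepare_connection` plugin; it assumes the internal catalog database is the one named `_internal`:

```python
from datasette import hookimpl


@hookimpl
def prepare_connection(conn, database, datasette):
    # Skip Datasette's internal catalog database so its own bookkeeping
    # writes are not subject to the stricter foreign key enforcement.
    if database == "_internal":
        return
    conn.execute("PRAGMA foreign_keys = ON;")
```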

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT,
   [active_lock_reason] TEXT,
   [performed_via_github_app] TEXT,
   [reactions] TEXT,
   [draft] INTEGER,
   [state_reason] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
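
A small sketch of querying this schema to reproduce the filter at the top of the page, assuming a local SQLite copy of the database (the filename `github.db` is an assumption):

```python
import sqlite3

conn = sqlite3.connect("github.db")  # assumed local copy of this database
conn.row_factory = sqlite3.Row

rows = conn.execute(
    "SELECT number, title, state, repo FROM issues WHERE [user] = ? ORDER BY id",
    (193185,),
).fetchall()
for row in rows:
    print(dict(row))
```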