{"html_url": "https://github.com/simonw/datasette/issues/1720#issuecomment-1109158903", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1720", "id": 1109158903, "node_id": "IC_kwDOBm6k_c5CHGv3", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T00:11:42Z", "updated_at": "2022-04-26T00:11:42Z", "author_association": "OWNER", "body": "Places this plugin hook (or hooks?) should be able to affect:\r\n\r\n- JSON for a table/view\r\n- JSON for a row\r\n- JSON for a canned query\r\n- JSON for a custom arbitrary query\r\n\r\nI'm going to combine those last two, which means there are three places. But maybe I can combine the table one and the row one as well?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215174094, "label": "Design plugin hook for extras"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1720#issuecomment-1109159307", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1720", "id": 1109159307, "node_id": "IC_kwDOBm6k_c5CHG2L", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T00:12:28Z", "updated_at": "2022-04-26T00:12:28Z", "author_association": "OWNER", "body": "I'm going to keep table and row separate. 
So I think I need to add three new plugin hooks:\r\n\r\n- `table_extras()`\r\n- `row_extras()`\r\n- `query_extras()`", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215174094, "label": "Design plugin hook for extras"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1720#issuecomment-1109160226", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1720", "id": 1109160226, "node_id": "IC_kwDOBm6k_c5CHHEi", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T00:14:11Z", "updated_at": "2022-04-26T00:14:11Z", "author_association": "OWNER", "body": "There are four existing plugin hooks that include the word \"extra\" but use it to mean something else - to mean additional CSS/JS/variables to be injected into the page:\r\n\r\n- `def extra_css_urls(...)`\r\n- `def extra_js_urls(...)`\r\n- `def extra_body_script(...)`\r\n- `def extra_template_vars(...)`\r\n\r\nI think `extra_*` and `*_extras` are different enough that they won't be confused with each other.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215174094, "label": "Design plugin hook for extras"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1720#issuecomment-1109162123", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1720", "id": 1109162123, "node_id": "IC_kwDOBm6k_c5CHHiL", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T00:16:42Z", "updated_at": "2022-04-26T00:16:51Z", "author_association": "OWNER", "body": "Actually I'm going to imitate the existing `register_*` hooks:\r\n\r\n- `def register_output_renderer(datasette)`\r\n- `def register_facet_classes()`\r\n- `def register_routes(datasette)`\r\n- `def 
register_commands(cli)`\r\n- `def register_magic_parameters(datasette)`\r\n\r\nSo I'm going to call the new hooks:\r\n\r\n- `register_table_extras(datasette)`\r\n- `register_row_extras(datasette)`\r\n- `register_query_extras(datasette)`\r\n\r\nThey'll return a list of `async def` functions. The names of those functions will become the names of the extras.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215174094, "label": "Design plugin hook for extras"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1720#issuecomment-1109164803", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1720", "id": 1109164803, "node_id": "IC_kwDOBm6k_c5CHIMD", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T00:21:40Z", "updated_at": "2022-04-26T00:21:40Z", "author_association": "OWNER", "body": "What would the existing https://latest.datasette.io/fixtures/simple_primary_key/1.json?_extras=foreign_key_tables feature look like if it was re-imagined as a `register_row_extras()` plugin?\r\n\r\nRough sketch, copying most of the code from https://github.com/simonw/datasette/blob/579f59dcec43a91dd7d404e00b87a00afd8515f2/datasette/views/row.py#L98\r\n\r\n```python\r\nfrom datasette import hookimpl\r\n\r\n@hookimpl\r\ndef register_row_extras(datasette):\r\n return [foreign_key_tables]\r\n\r\nasync def foreign_key_tables(datasette, database, table, pk_values):\r\n if len(pk_values) != 1:\r\n return []\r\n db = datasette.get_database(database)\r\n all_foreign_keys = await db.get_all_foreign_keys()\r\n foreign_keys = all_foreign_keys[table][\"incoming\"]\r\n if len(foreign_keys) == 0:\r\n return []\r\n\r\n sql = \"select \" + \", \".join(\r\n [\r\n \"(select count(*) from {table} where {column}=:id)\".format(\r\n table=escape_sqlite(fk[\"other_table\"]),\r\n 
column=escape_sqlite(fk[\"other_column\"]),\r\n )\r\n for fk in foreign_keys\r\n ]\r\n )\r\n try:\r\n rows = list(await db.execute(sql, {\"id\": pk_values[0]}))\r\n except QueryInterrupted:\r\n # Almost certainly hit the timeout\r\n return []\r\n\r\n foreign_table_counts = dict(\r\n zip(\r\n [(fk[\"other_table\"], fk[\"other_column\"]) for fk in foreign_keys],\r\n list(rows[0]),\r\n )\r\n )\r\n foreign_key_tables = []\r\n for fk in foreign_keys:\r\n count = (\r\n foreign_table_counts.get((fk[\"other_table\"], fk[\"other_column\"])) or 0\r\n )\r\n key = fk[\"other_column\"]\r\n if key.startswith(\"_\"):\r\n key += \"__exact\"\r\n link = \"{}?{}={}\".format(\r\n datasette.urls.table(database, fk[\"other_table\"]),\r\n key,\r\n \",\".join(pk_values),\r\n )\r\n foreign_key_tables.append({**fk, **{\"count\": count, \"link\": link}})\r\n return foreign_key_tables\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215174094, "label": "Design plugin hook for extras"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1720#issuecomment-1109165411", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1720", "id": 1109165411, "node_id": "IC_kwDOBm6k_c5CHIVj", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T00:22:42Z", "updated_at": "2022-04-26T00:22:42Z", "author_association": "OWNER", "body": "Passing `pk_values` to the plugin hook feels odd. 
I think I'd pass a `row` object instead and let the code look up the primary key values on that row (by introspecting the primary keys for the table).", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215174094, "label": "Design plugin hook for extras"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1720#issuecomment-1109171871", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1720", "id": 1109171871, "node_id": "IC_kwDOBm6k_c5CHJ6f", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T00:34:48Z", "updated_at": "2022-04-26T00:34:48Z", "author_association": "OWNER", "body": "Let's try sketching out a `register_table_extras` plugin for something new.\r\n\r\nThe first idea I came up with suggests adding new fields to the individual row records that come back - my mental model for extras so far has been that they add new keys to the root object.\r\n\r\nSo if a table result looked like this:\r\n\r\n```json\r\n{\r\n \"rows\": [\r\n {\"id\": 1, \"name\": \"Cleo\"},\r\n {\"id\": 2, \"name\": \"Suna\"}\r\n ],\r\n \"next_url\": null\r\n}\r\n```\r\nI was initially thinking that `?_extra=facets` would add a `\"facets\": {...}` key to that root object.\r\n\r\nHere's a plugin idea I came up with that would probably justify adding to the individual row objects instead:\r\n\r\n- `?_extra=check404s` - does an async `HEAD` request against every column value that looks like a URL and checks if it returns a 404\r\n\r\nThis could also work by adding a `\"check404s\": {\"url-here\": 200}` key to the root object though.\r\n\r\nI think I need some better plugin concepts before committing to this new hook. 
There's overlap between this and how I want the enrichments mechanism ([see here](https://simonwillison.net/2021/Jan/17/weeknotes-still-pretty-distracted/)) to work.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215174094, "label": "Design plugin hook for extras"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1720#issuecomment-1109174715", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1720", "id": 1109174715, "node_id": "IC_kwDOBm6k_c5CHKm7", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T00:40:13Z", "updated_at": "2022-04-26T00:43:33Z", "author_association": "OWNER", "body": "Some of the things I'd like to use `?_extra=` for, that may or may not make sense as plugins:\r\n\r\n- Performance breakdown information, maybe including explain output for a query/table\r\n- Information about the tables that were consulted in a query - imagine pulling in additional table metadata\r\n- Statistical aggregates against the full set of results. This may well be a Datasette core feature at some point in the future, but being able to provide it early as a plugin would be really cool.\r\n- For tables, what are the other tables they can join against?\r\n- Suggested facets\r\n- Facet results themselves\r\n- New custom facets I haven't thought of - though the `register_facet_classes` hook covers that already\r\n- Table schema\r\n- Table metadata\r\n- Analytics - how many times has this table been queried? Would be a plugin thing\r\n- For geospatial data, how about a GeoJSON polygon that represents the bounding box for all returned results? 
Effectively this is an extra aggregation.\r\n\r\nLooking at https://github-to-sqlite.dogsheep.net/github/commits.json?_labels=on&_shape=objects for inspiration.\r\n\r\nI think there's a separate potential mechanism in the future that lets you add custom columns to a table. This would affect `.csv` and the HTML presentation too, which makes it a different concept from the `?_extra=` hook that affects the JSON export (and the context that is fed to the HTML templates).", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215174094, "label": "Design plugin hook for extras"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/428#issuecomment-1109190401", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/428", "id": 1109190401, "node_id": "IC_kwDOCGYnMM5CHOcB", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T01:05:29Z", "updated_at": "2022-04-26T01:05:29Z", "author_association": "OWNER", "body": "Django makes extensive use of savepoints for nested transactions: https://docs.djangoproject.com/en/4.0/topics/db/transactions/#savepoints", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215216249, "label": "Research adding support for savepoints"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1720#issuecomment-1109200335", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1720", "id": 1109200335, "node_id": "IC_kwDOBm6k_c5CHQ3P", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T01:24:47Z", "updated_at": "2022-04-26T01:24:47Z", "author_association": "OWNER", "body": "Sketching out a `?_extra=statistics` table plugin:\r\n\r\n```python\r\nfrom datasette import 
hookimpl\r\n\r\n@hookimpl\r\ndef register_table_extras(datasette):\r\n return [statistics]\r\n\r\nasync def statistics(datasette, query, columns, sql):\r\n # ... need to figure out which columns are integer/floats\r\n # then build and execute a SQL query that calculates sum/avg/etc for each column\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215174094, "label": "Design plugin hook for extras"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1720#issuecomment-1109200774", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1720", "id": 1109200774, "node_id": "IC_kwDOBm6k_c5CHQ-G", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T01:25:43Z", "updated_at": "2022-04-26T01:26:15Z", "author_association": "OWNER", "body": "Had a thought: if a custom HTML template is going to make use of stuff generated using these extras, it will need a way to tell Datasette to execute those extras even in the absence of the `?_extra=...` URL parameters.\r\n\r\nIs that necessary? 
Or should those kinds of plugins use the existing `extra_template_vars` hook instead?\r\n\r\nOr maybe the `extra_template_vars` hook gets redesigned so it can depend on other `extras` in some way?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215174094, "label": "Design plugin hook for extras"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1720#issuecomment-1109305184", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1720", "id": 1109305184, "node_id": "IC_kwDOBm6k_c5CHqdg", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T04:03:35Z", "updated_at": "2022-04-26T04:03:35Z", "author_association": "OWNER", "body": "I bet there's all kinds of interesting potential extras that could be calculated by loading the results of the query into a Pandas DataFrame.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215174094, "label": "Design plugin hook for extras"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1720#issuecomment-1109306070", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1720", "id": 1109306070, "node_id": "IC_kwDOBm6k_c5CHqrW", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T04:05:20Z", "updated_at": "2022-04-26T04:05:20Z", "author_association": "OWNER", "body": "The proposed plugin for annotations - allowing users to attach comments to database tables, columns and rows - would be a great application for all three of those `?_extra=` plugin hooks.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215174094, "label": "Design plugin hook 
for extras"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1720#issuecomment-1109309683", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1720", "id": 1109309683, "node_id": "IC_kwDOBm6k_c5CHrjz", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T04:12:39Z", "updated_at": "2022-04-26T04:12:39Z", "author_association": "OWNER", "body": "I think the rough shape of the three plugin hooks is right. The detailed decisions that are needed concern what the parameters should be, which I think will mainly happen as part of:\r\n\r\n- #1715", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215174094, "label": "Design plugin hook for extras"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1720#issuecomment-1110212021", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1720", "id": 1110212021, "node_id": "IC_kwDOBm6k_c5CLH21", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T20:20:27Z", "updated_at": "2022-04-26T20:20:27Z", "author_association": "OWNER", "body": "Closing this because I have a good enough idea of the design for now - the details of the parameters can be figured out when I implement this.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1215174094, "label": "Design plugin hook for extras"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1715#issuecomment-1110219185", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1715", "id": 1110219185, "node_id": "IC_kwDOBm6k_c5CLJmx", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T20:28:40Z", "updated_at": "2022-04-26T20:56:48Z", 
"author_association": "OWNER", "body": "The refactor I did in #1719 pretty much clashes with all of the changes in https://github.com/simonw/datasette/commit/5053f1ea83194ecb0a5693ad5dada5b25bf0f7e6 so I'll probably need to start my `api-extras` branch again from scratch.\r\n\r\nUsing a new `tableview-asyncinject` branch.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1212823665, "label": "Refactor TableView to use asyncinject"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1715#issuecomment-1110229319", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1715", "id": 1110229319, "node_id": "IC_kwDOBm6k_c5CLMFH", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T20:41:32Z", "updated_at": "2022-04-26T20:44:38Z", "author_association": "OWNER", "body": "This time I'm not going to bother with the `filter_args` thing - I'm going to just try to use `asyncinject` to execute some big high level things in parallel - facets, suggested facets, counts, the query - and then combine it with the `extras` mechanism I'm trying to introduce too.\r\n\r\nMost importantly: I want that `extra_template()` function that adds more template context for the HTML to be executed as part of an `asyncinject` flow!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1212823665, "label": "Refactor TableView to use asyncinject"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1715#issuecomment-1110238896", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1715", "id": 1110238896, "node_id": "IC_kwDOBm6k_c5CLOaw", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T20:53:59Z", "updated_at": 
"2022-04-26T20:53:59Z", "author_association": "OWNER", "body": "I'm going to rename `database` to `database_name` and `table` to `table_name` to avoid confusion with the `Database` object as opposed to the string name for the database.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1212823665, "label": "Refactor TableView to use asyncinject"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1715#issuecomment-1110239536", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1715", "id": 1110239536, "node_id": "IC_kwDOBm6k_c5CLOkw", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T20:54:53Z", "updated_at": "2022-04-26T20:54:53Z", "author_association": "OWNER", "body": "`pytest tests/test_table_*` runs the tests quickly.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1212823665, "label": "Refactor TableView to use asyncinject"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1715#issuecomment-1110246593", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1715", "id": 1110246593, "node_id": "IC_kwDOBm6k_c5CLQTB", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T21:03:56Z", "updated_at": "2022-04-26T21:03:56Z", "author_association": "OWNER", "body": "Well this is fun... 
I applied this change:\r\n\r\n```diff\r\ndiff --git a/datasette/views/table.py b/datasette/views/table.py\r\nindex d66adb8..85f9e44 100644\r\n--- a/datasette/views/table.py\r\n+++ b/datasette/views/table.py\r\n@@ -1,3 +1,4 @@\r\n+import asyncio\r\n import itertools\r\n import json\r\n \r\n@@ -5,6 +6,7 @@ import markupsafe\r\n \r\n from datasette.plugins import pm\r\n from datasette.database import QueryInterrupted\r\n+from datasette import tracer\r\n from datasette.utils import (\r\n await_me_maybe,\r\n CustomRow,\r\n@@ -174,8 +176,11 @@ class TableView(DataView):\r\n write=bool(canned_query.get(\"write\")),\r\n )\r\n \r\n- is_view = bool(await db.get_view_definition(table_name))\r\n- table_exists = bool(await db.table_exists(table_name))\r\n+ with tracer.trace_child_tasks():\r\n+ is_view, table_exists = map(bool, await asyncio.gather(\r\n+ db.get_view_definition(table_name),\r\n+ db.table_exists(table_name)\r\n+ ))\r\n \r\n # If table or view not found, return 404\r\n if not is_view and not table_exists:\r\n```\r\nAnd now using https://datasette.io/plugins/datasette-pretty-traces I get this:\r\n\r\n![CleanShot 2022-04-26 at 14 03 33@2x](https://user-images.githubusercontent.com/9599/165392009-84c4399d-3e94-46d4-ba7b-a64a116cac5c.png)\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1212823665, "label": "Refactor TableView to use asyncinject"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1715#issuecomment-1110265087", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1715", "id": 1110265087, "node_id": "IC_kwDOBm6k_c5CLUz_", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T21:26:17Z", "updated_at": "2022-04-26T21:26:17Z", "author_association": "OWNER", "body": "Running facets and facet suggestions in parallel using `asyncio.gather()` turns out to be a lot 
less hassle than I had thought - maybe I don't need `asyncinject` for this at all?\r\n\r\n```diff\r\n if not nofacet:\r\n- for facet in facet_instances:\r\n- (\r\n- instance_facet_results,\r\n- instance_facets_timed_out,\r\n- ) = await facet.facet_results()\r\n+ # Run them in parallel\r\n+ facet_awaitables = [facet.facet_results() for facet in facet_instances]\r\n+ facet_awaitable_results = await asyncio.gather(*facet_awaitables)\r\n+ for (\r\n+ instance_facet_results,\r\n+ instance_facets_timed_out,\r\n+ ) in facet_awaitable_results:\r\n for facet_info in instance_facet_results:\r\n base_key = facet_info[\"name\"]\r\n key = base_key\r\n@@ -522,8 +540,10 @@ class TableView(DataView):\r\n and not nofacet\r\n and not nosuggest\r\n ):\r\n- for facet in facet_instances:\r\n- suggested_facets.extend(await facet.suggest())\r\n+ # Run them in parallel\r\n+ facet_suggest_awaitables = [facet.suggest() for facet in facet_instances]\r\n+ for suggest_result in await asyncio.gather(*facet_suggest_awaitables):\r\n+ suggested_facets.extend(suggest_result)\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1212823665, "label": "Refactor TableView to use asyncinject"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1723#issuecomment-1110278182", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1723", "id": 1110278182, "node_id": "IC_kwDOBm6k_c5CLYAm", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T21:43:34Z", "updated_at": "2022-04-26T21:43:34Z", "author_association": "OWNER", "body": "Here's the diff I'm using:\r\n```diff\r\ndiff --git a/datasette/views/table.py b/datasette/views/table.py\r\nindex d66adb8..f15ef1e 100644\r\n--- a/datasette/views/table.py\r\n+++ b/datasette/views/table.py\r\n@@ -1,3 +1,4 @@\r\n+import asyncio\r\n import itertools\r\n import json\r\n \r\n@@ 
-5,6 +6,7 @@ import markupsafe\r\n \r\n from datasette.plugins import pm\r\n from datasette.database import QueryInterrupted\r\n+from datasette import tracer\r\n from datasette.utils import (\r\n await_me_maybe,\r\n CustomRow,\r\n@@ -150,6 +152,16 @@ class TableView(DataView):\r\n default_labels=False,\r\n _next=None,\r\n _size=None,\r\n+ ):\r\n+ with tracer.trace_child_tasks():\r\n+ return await self._data_traced(request, default_labels, _next, _size)\r\n+\r\n+ async def _data_traced(\r\n+ self,\r\n+ request,\r\n+ default_labels=False,\r\n+ _next=None,\r\n+ _size=None,\r\n ):\r\n database_route = tilde_decode(request.url_vars[\"database\"])\r\n table_name = tilde_decode(request.url_vars[\"table\"])\r\n@@ -159,6 +171,20 @@ class TableView(DataView):\r\n raise NotFound(\"Database not found: {}\".format(database_route))\r\n database_name = db.name\r\n \r\n+ # For performance profiling purposes, ?_parallel=1 turns on asyncio.gather\r\n+ async def _gather_parallel(*args):\r\n+ return await asyncio.gather(*args)\r\n+\r\n+ async def _gather_sequential(*args):\r\n+ results = []\r\n+ for fn in args:\r\n+ results.append(await fn)\r\n+ return results\r\n+\r\n+ gather = (\r\n+ _gather_parallel if request.args.get(\"_parallel\") else _gather_sequential\r\n+ )\r\n+\r\n # If this is a canned query, not a table, then dispatch to QueryView instead\r\n canned_query = await self.ds.get_canned_query(\r\n database_name, table_name, request.actor\r\n@@ -174,8 +200,12 @@ class TableView(DataView):\r\n write=bool(canned_query.get(\"write\")),\r\n )\r\n \r\n- is_view = bool(await db.get_view_definition(table_name))\r\n- table_exists = bool(await db.table_exists(table_name))\r\n+ is_view, table_exists = map(\r\n+ bool,\r\n+ await gather(\r\n+ db.get_view_definition(table_name), db.table_exists(table_name)\r\n+ ),\r\n+ )\r\n \r\n # If table or view not found, return 404\r\n if not is_view and not table_exists:\r\n@@ -497,33 +527,44 @@ class TableView(DataView):\r\n )\r\n )\r\n \r\n- if not 
nofacet:\r\n- for facet in facet_instances:\r\n- (\r\n+ async def execute_facets():\r\n+ if not nofacet:\r\n+ # Run them in parallel\r\n+ facet_awaitables = [facet.facet_results() for facet in facet_instances]\r\n+ facet_awaitable_results = await gather(*facet_awaitables)\r\n+ for (\r\n instance_facet_results,\r\n instance_facets_timed_out,\r\n- ) = await facet.facet_results()\r\n- for facet_info in instance_facet_results:\r\n- base_key = facet_info[\"name\"]\r\n- key = base_key\r\n- i = 1\r\n- while key in facet_results:\r\n- i += 1\r\n- key = f\"{base_key}_{i}\"\r\n- facet_results[key] = facet_info\r\n- facets_timed_out.extend(instance_facets_timed_out)\r\n-\r\n- # Calculate suggested facets\r\n+ ) in facet_awaitable_results:\r\n+ for facet_info in instance_facet_results:\r\n+ base_key = facet_info[\"name\"]\r\n+ key = base_key\r\n+ i = 1\r\n+ while key in facet_results:\r\n+ i += 1\r\n+ key = f\"{base_key}_{i}\"\r\n+ facet_results[key] = facet_info\r\n+ facets_timed_out.extend(instance_facets_timed_out)\r\n+\r\n suggested_facets = []\r\n- if (\r\n- self.ds.setting(\"suggest_facets\")\r\n- and self.ds.setting(\"allow_facet\")\r\n- and not _next\r\n- and not nofacet\r\n- and not nosuggest\r\n- ):\r\n- for facet in facet_instances:\r\n- suggested_facets.extend(await facet.suggest())\r\n+\r\n+ async def execute_suggested_facets():\r\n+ # Calculate suggested facets\r\n+ if (\r\n+ self.ds.setting(\"suggest_facets\")\r\n+ and self.ds.setting(\"allow_facet\")\r\n+ and not _next\r\n+ and not nofacet\r\n+ and not nosuggest\r\n+ ):\r\n+ # Run them in parallel\r\n+ facet_suggest_awaitables = [\r\n+ facet.suggest() for facet in facet_instances\r\n+ ]\r\n+ for suggest_result in await gather(*facet_suggest_awaitables):\r\n+ suggested_facets.extend(suggest_result)\r\n+\r\n+ await gather(execute_facets(), execute_suggested_facets())\r\n \r\n # Figure out columns and rows for the query\r\n columns = [r[0] for r in results.description]\r\n```", "reactions": "{\"total_count\": 0, 
\"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1216508080, "label": "Research running SQL in table view in parallel using `asyncio.gather()`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1723#issuecomment-1110278577", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1723", "id": 1110278577, "node_id": "IC_kwDOBm6k_c5CLYGx", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T21:44:04Z", "updated_at": "2022-04-26T21:44:04Z", "author_association": "OWNER", "body": "And some simple benchmarks with `ab` - using the `?_parallel=1` hack to try it with and without a parallel `asyncio.gather()`:\r\n\r\n```\r\n~ % ab -n 100 'http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2' \r\nThis is ApacheBench, Version 2.3 <$Revision: 1879490 $>\r\nCopyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/\r\nLicensed to The Apache Software Foundation, http://www.apache.org/\r\n\r\nBenchmarking 127.0.0.1 (be patient).....done\r\n\r\n\r\nServer Software: uvicorn\r\nServer Hostname: 127.0.0.1\r\nServer Port: 8001\r\n\r\nDocument Path: /global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2\r\nDocument Length: 314187 bytes\r\n\r\nConcurrency Level: 1\r\nTime taken for tests: 68.279 seconds\r\nComplete requests: 100\r\nFailed requests: 13\r\n (Connect: 0, Receive: 0, Length: 13, Exceptions: 0)\r\nTotal transferred: 31454937 bytes\r\nHTML transferred: 31418437 bytes\r\nRequests per second: 1.46 [#/sec] (mean)\r\nTime per request: 682.787 [ms] (mean)\r\nTime per request: 682.787 [ms] (mean, across all concurrent requests)\r\nTransfer rate: 449.89 [Kbytes/sec] received\r\n\r\nConnection Times (ms)\r\n min mean[+/-sd] median max\r\nConnect: 0 0 0.0 0 0\r\nProcessing: 
621 683 68.0 658 993\r\nWaiting: 620 682 68.0 657 992\r\nTotal: 621 683 68.0 658 993\r\n\r\nPercentage of the requests served within a certain time (ms)\r\n 50% 658\r\n 66% 678\r\n 75% 687\r\n 80% 711\r\n 90% 763\r\n 95% 879\r\n 98% 926\r\n 99% 993\r\n 100% 993 (longest request)\r\n\r\n\r\n----\r\n\r\nIn parallel:\r\n\r\n~ % ab -n 100 'http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=1'\r\nThis is ApacheBench, Version 2.3 <$Revision: 1879490 $>\r\nCopyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/\r\nLicensed to The Apache Software Foundation, http://www.apache.org/\r\n\r\nBenchmarking 127.0.0.1 (be patient).....done\r\n\r\n\r\nServer Software: uvicorn\r\nServer Hostname: 127.0.0.1\r\nServer Port: 8001\r\n\r\nDocument Path: /global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=1\r\nDocument Length: 315703 bytes\r\n\r\nConcurrency Level: 1\r\nTime taken for tests: 34.763 seconds\r\nComplete requests: 100\r\nFailed requests: 11\r\n (Connect: 0, Receive: 0, Length: 11, Exceptions: 0)\r\nTotal transferred: 31607988 bytes\r\nHTML transferred: 31570288 bytes\r\nRequests per second: 2.88 [#/sec] (mean)\r\nTime per request: 347.632 [ms] (mean)\r\nTime per request: 347.632 [ms] (mean, across all concurrent requests)\r\nTransfer rate: 887.93 [Kbytes/sec] received\r\n\r\nConnection Times (ms)\r\n min mean[+/-sd] median max\r\nConnect: 0 0 0.0 0 0\r\nProcessing: 311 347 28.0 338 450\r\nWaiting: 311 347 28.0 338 450\r\nTotal: 312 348 28.0 338 451\r\n\r\nPercentage of the requests served within a certain time (ms)\r\n 50% 338\r\n 66% 348\r\n 75% 361\r\n 80% 367\r\n 90% 396\r\n 95% 408\r\n 98% 436\r\n 99% 451\r\n 100% 451 (longest request)\r\n\r\n----\r\n\r\nWith concurrency 10, not parallel:\r\n\r\n~ % ab -c 10 -n 100 
'http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=' \r\nThis is ApacheBench, Version 2.3 <$Revision: 1879490 $>\r\nCopyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/\r\nLicensed to The Apache Software Foundation, http://www.apache.org/\r\n\r\nBenchmarking 127.0.0.1 (be patient).....done\r\n\r\n\r\nServer Software: uvicorn\r\nServer Hostname: 127.0.0.1\r\nServer Port: 8001\r\n\r\nDocument Path: /global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=\r\nDocument Length: 314346 bytes\r\n\r\nConcurrency Level: 10\r\nTime taken for tests: 38.408 seconds\r\nComplete requests: 100\r\nFailed requests: 93\r\n (Connect: 0, Receive: 0, Length: 93, Exceptions: 0)\r\nTotal transferred: 31471333 bytes\r\nHTML transferred: 31433733 bytes\r\nRequests per second: 2.60 [#/sec] (mean)\r\nTime per request: 3840.829 [ms] (mean)\r\nTime per request: 384.083 [ms] (mean, across all concurrent requests)\r\nTransfer rate: 800.18 [Kbytes/sec] received\r\n\r\nConnection Times (ms)\r\n min mean[+/-sd] median max\r\nConnect: 0 0 0.1 0 1\r\nProcessing: 685 3719 354.0 3774 4096\r\nWaiting: 684 3707 353.7 3750 4095\r\nTotal: 685 3719 354.0 3774 4096\r\n\r\nPercentage of the requests served within a certain time (ms)\r\n 50% 3774\r\n 66% 3832\r\n 75% 3855\r\n 80% 3878\r\n 90% 3944\r\n 95% 4006\r\n 98% 4057\r\n 99% 4096\r\n 100% 4096 (longest request)\r\n\r\n\r\n----\r\n\r\nConcurrency 10 parallel:\r\n\r\n~ % ab -c 10 -n 100 'http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=1'\r\nThis is ApacheBench, Version 2.3 <$Revision: 1879490 $>\r\nCopyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/\r\nLicensed to The Apache Software Foundation, http://www.apache.org/\r\n\r\nBenchmarking 127.0.0.1 
(be patient).....done\r\n\r\n\r\nServer Software: uvicorn\r\nServer Hostname: 127.0.0.1\r\nServer Port: 8001\r\n\r\nDocument Path: /global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=1\r\nDocument Length: 315703 bytes\r\n\r\nConcurrency Level: 10\r\nTime taken for tests: 36.762 seconds\r\nComplete requests: 100\r\nFailed requests: 89\r\n (Connect: 0, Receive: 0, Length: 89, Exceptions: 0)\r\nTotal transferred: 31606516 bytes\r\nHTML transferred: 31568816 bytes\r\nRequests per second: 2.72 [#/sec] (mean)\r\nTime per request: 3676.182 [ms] (mean)\r\nTime per request: 367.618 [ms] (mean, across all concurrent requests)\r\nTransfer rate: 839.61 [Kbytes/sec] received\r\n\r\nConnection Times (ms)\r\n min mean[+/-sd] median max\r\nConnect: 0 0 0.1 0 0\r\nProcessing: 381 3602 419.6 3609 4458\r\nWaiting: 381 3586 418.7 3607 4457\r\nTotal: 381 3603 419.6 3609 4458\r\n\r\nPercentage of the requests served within a certain time (ms)\r\n 50% 3609\r\n 66% 3741\r\n 75% 3791\r\n 80% 3821\r\n 90% 3972\r\n 95% 4074\r\n 98% 4386\r\n 99% 4458\r\n 100% 4458 (longest request)\r\n\r\n\r\nTrying -c 3 instead. 
Non parallel:\r\n\r\n~ % ab -c 3 -n 100 'http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel='\r\nThis is ApacheBench, Version 2.3 <$Revision: 1879490 $>\r\nCopyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/\r\nLicensed to The Apache Software Foundation, http://www.apache.org/\r\n\r\nBenchmarking 127.0.0.1 (be patient).....done\r\n\r\n\r\nServer Software: uvicorn\r\nServer Hostname: 127.0.0.1\r\nServer Port: 8001\r\n\r\nDocument Path: /global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=\r\nDocument Length: 314346 bytes\r\n\r\nConcurrency Level: 3\r\nTime taken for tests: 39.365 seconds\r\nComplete requests: 100\r\nFailed requests: 83\r\n (Connect: 0, Receive: 0, Length: 83, Exceptions: 0)\r\nTotal transferred: 31470808 bytes\r\nHTML transferred: 31433208 bytes\r\nRequests per second: 2.54 [#/sec] (mean)\r\nTime per request: 1180.955 [ms] (mean)\r\nTime per request: 393.652 [ms] (mean, across all concurrent requests)\r\nTransfer rate: 780.72 [Kbytes/sec] received\r\n\r\nConnection Times (ms)\r\n min mean[+/-sd] median max\r\nConnect: 0 0 0.0 0 0\r\nProcessing: 731 1153 126.2 1189 1359\r\nWaiting: 730 1151 125.9 1188 1358\r\nTotal: 731 1153 126.2 1189 1359\r\n\r\nPercentage of the requests served within a certain time (ms)\r\n 50% 1189\r\n 66% 1221\r\n 75% 1234\r\n 80% 1247\r\n 90% 1296\r\n 95% 1309\r\n 98% 1343\r\n 99% 1359\r\n 100% 1359 (longest request)\r\n\r\n----\r\n\r\nParallel:\r\n\r\n~ % ab -c 3 -n 100 'http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=1'\r\nThis is ApacheBench, Version 2.3 <$Revision: 1879490 $>\r\nCopyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/\r\nLicensed to The Apache Software Foundation, 
http://www.apache.org/\r\n\r\nBenchmarking 127.0.0.1 (be patient).....done\r\n\r\n\r\nServer Software: uvicorn\r\nServer Hostname: 127.0.0.1\r\nServer Port: 8001\r\n\r\nDocument Path: /global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel1&_facet=other_fuel3&_facet=other_fuel2&_parallel=1\r\nDocument Length: 315703 bytes\r\n\r\nConcurrency Level: 3\r\nTime taken for tests: 34.530 seconds\r\nComplete requests: 100\r\nFailed requests: 18\r\n (Connect: 0, Receive: 0, Length: 18, Exceptions: 0)\r\nTotal transferred: 31606179 bytes\r\nHTML transferred: 31568479 bytes\r\nRequests per second: 2.90 [#/sec] (mean)\r\nTime per request: 1035.902 [ms] (mean)\r\nTime per request: 345.301 [ms] (mean, across all concurrent requests)\r\nTransfer rate: 893.87 [Kbytes/sec] received\r\n\r\nConnection Times (ms)\r\n min mean[+/-sd] median max\r\nConnect: 0 0 0.0 0 0\r\nProcessing: 412 1020 104.4 1018 1280\r\nWaiting: 411 1018 104.1 1014 1275\r\nTotal: 412 1021 104.4 1018 1280\r\n\r\nPercentage of the requests served within a certain time (ms)\r\n 50% 1018\r\n 66% 1041\r\n 75% 1061\r\n 80% 1079\r\n 90% 1136\r\n 95% 1176\r\n 98% 1251\r\n 99% 1280\r\n 100% 1280 (longest request)\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1216508080, "label": "Research running SQL in table view in parallel using `asyncio.gather()`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1723#issuecomment-1110279869", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1723", "id": 1110279869, "node_id": "IC_kwDOBm6k_c5CLYa9", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T21:45:39Z", "updated_at": "2022-04-26T21:45:39Z", "author_association": "OWNER", "body": "Getting some nice traces out of this:\r\n\r\n[screenshot: trace output]\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 
0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1216508080, "label": "Research running SQL in table view in parallel using `asyncio.gather()`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1723#issuecomment-1110305790", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1723", "id": 1110305790, "node_id": "IC_kwDOBm6k_c5CLev-", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T22:19:04Z", "updated_at": "2022-04-26T22:19:04Z", "author_association": "OWNER", "body": "I realized that seeing the total time spent in queries wasn't enough to understand this, because whether the queries execute serially or in parallel, their individual times should still sum to roughly the same amount of SQL time.\r\n\r\nInstead I need to know how long the page took to render. But that's hard to display on the page since you can't measure it until rendering has finished!\r\n\r\nSo I built an ASGI plugin to handle that measurement: https://github.com/simonw/datasette-total-page-time\r\n\r\nAnd with that plugin installed, `http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel2&_facet=other_fuel1&_parallel=1` (the parallel version) takes 377ms:\r\n\r\n[screenshot]\r\n\r\nWhile `http://127.0.0.1:8001/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=other_fuel2&_facet=other_fuel1` (the serial version) takes 762ms:\r\n\r\n[screenshot]\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1216508080, "label": "Research running SQL in table view in parallel using `asyncio.gather()`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1723#issuecomment-1110330554", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1723", "id": 1110330554, 
"node_id": "IC_kwDOBm6k_c5CLky6", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-26T23:06:20Z", "updated_at": "2022-04-26T23:06:20Z", "author_association": "OWNER", "body": "Deployed here: https://latest-with-plugins.datasette.io/github/commits?_facet=repo&_trace=1&_facet=committer", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1216508080, "label": "Research running SQL in table view in parallel using `asyncio.gather()`"}, "performed_via_github_app": null}
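The serial-versus-parallel pattern these benchmarks probe can be sketched in a few lines of Python. This is a minimal illustration of the `asyncio.gather()` approach discussed in the issue, not Datasette's actual implementation; the `run_query`, `serial`, and `parallel` names are invented for the example.

```python
import asyncio
import sqlite3


def run_query(sql):
    # Open the connection inside the worker thread, mirroring how
    # Datasette runs SQLite queries off the event loop in threads.
    conn = sqlite3.connect(":memory:")
    try:
        return conn.execute(sql).fetchall()
    finally:
        conn.close()


async def serial(queries):
    # One query at a time: total wall-clock time is roughly the sum.
    return [await asyncio.to_thread(run_query, q) for q in queries]


async def parallel(queries):
    # asyncio.gather() starts every query before awaiting any of them,
    # so independent queries (e.g. separate facet counts) can overlap;
    # wall-clock time approaches the longest single query instead.
    return await asyncio.gather(
        *(asyncio.to_thread(run_query, q) for q in queries)
    )


if __name__ == "__main__":
    queries = ["select 1", "select 2", "select 3"]
    results = asyncio.run(parallel(queries))
    print([rows[0][0] for rows in results])  # prints [1, 2, 3]
```

With queries that each take measurable time, such as the four facet queries in the benchmarks above, the gathered version's wall-clock time tends toward the slowest single query rather than the sum, which is consistent with the ~762ms serial versus ~377ms parallel page times reported for this page.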