{"html_url": "https://github.com/simonw/datasette/issues/292#issuecomment-392343839", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/292", "id": 392343839, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MjM0MzgzOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-27T16:10:09Z", "updated_at": "2018-06-04T17:38:04Z", "author_association": "OWNER", "body": "The more efficient way of doing this kind of count would be to provide a mechanism which can also add extra fragments to a `GROUP BY` clause used for the `SELECT`.\r\n\r\nOr... how about a mechanism similar to Django's `prefetch_related` which lets you define extra queries that will be called with a list of primary keys (or values from other columns) and used to populate a new column? A little unconventional but could be extremely useful and efficient.\r\n\r\nRelated to that: since the per-query overhead in SQLite is tiny, could even define an extra query to be run once-per-row before returning results.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 326800219, "label": "Mechanism for customizing the SQL used to select specific columns in the table view"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/304#issuecomment-394400419", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/304", "id": 394400419, "node_id": "MDEyOklzc3VlQ29tbWVudDM5NDQwMDQxOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-06-04T15:39:03Z", "updated_at": "2018-06-04T15:39:03Z", "author_association": "OWNER", "body": "In the interest of getting this shipped, I'm going to ignore the `3.7.10` issue.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 328229224, "label": "Ability to configure SQLite cache_size"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/304#issuecomment-394412217", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/304", "id": 394412217, "node_id": "MDEyOklzc3VlQ29tbWVudDM5NDQxMjIxNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-06-04T16:13:32Z", "updated_at": "2018-06-04T16:13:32Z", "author_association": "OWNER", "body": "Docs: http://datasette.readthedocs.io/en/latest/config.html#cache-size-kb", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 328229224, "label": "Ability to configure SQLite cache_size"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/302#issuecomment-394412784", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/302", "id": 394412784, "node_id": "MDEyOklzc3VlQ29tbWVudDM5NDQxMjc4NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-06-04T16:15:22Z", "updated_at": "2018-06-04T16:15:22Z", "author_association": "OWNER", "body": "I think this is related to #303", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 328171513, "label": "test-2.3.sqlite database filename throws a 404"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/266#issuecomment-394417567", 
"issue_url": "https://api.github.com/repos/simonw/datasette/issues/266", "id": 394417567, "node_id": "MDEyOklzc3VlQ29tbWVudDM5NDQxNzU2Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-06-04T16:30:48Z", "updated_at": "2018-06-04T16:32:55Z", "author_association": "OWNER", "body": "When serving streaming responses, I need to check that a large CSV file doesn't completely max out the CPU in a way that is harmful to the rest of the instance.\r\n\r\nIf it does, one option may be to insert an async sleep call in between each chunk that is streamed back. This could be controlled by a `csv_pause_ms` config setting, defaulting to maybe 5 but can be disabled entirely by setting to 0.\r\n\r\nThat's only if testing proves that this is a necessary mechanism.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323681589, "label": "Export to CSV"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/272#issuecomment-394431323", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/272", "id": 394431323, "node_id": "MDEyOklzc3VlQ29tbWVudDM5NDQzMTMyMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-06-04T17:17:37Z", "updated_at": "2018-06-04T17:17:37Z", "author_association": "OWNER", "body": "I built this ASGI debugging tool to help with this migration: https://asgi-scope.now.sh/fivethirtyeight-34d6604/most-common-name%2Fsurnames.json?foo=bar&bazoeuto=onetuh&a=.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324188953, "label": "Port Datasette to ASGI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/272#issuecomment-394503399", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/272", "id": 394503399, "node_id": "MDEyOklzc3VlQ29tbWVudDM5NDUwMzM5OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-06-04T21:20:14Z", "updated_at": "2018-06-04T21:20:14Z", "author_association": "OWNER", "body": "Results of an extremely simple micro-benchmark comparing the two shows that uvicorn is at least as fast as Sanic (benchmarks a little faster with a very simple payload): https://gist.github.com/simonw/418950af178c01c416363cc057420851", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324188953, "label": "Port Datasette to ASGI"}, "performed_via_github_app": null}