html_url,issue_url,id,node_id,user,user_label,created_at,updated_at,author_association,body,reactions,issue,issue_label,performed_via_github_app
https://github.com/simonw/datasette/issues/304#issuecomment-394400419,https://api.github.com/repos/simonw/datasette/issues/304,394400419,MDEyOklzc3VlQ29tbWVudDM5NDQwMDQxOQ==,9599,simonw,2018-06-04T15:39:03Z,2018-06-04T15:39:03Z,OWNER,"In the interest of getting this shipped, I'm going to ignore the `3.7.10` issue.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328229224,Ability to configure SQLite cache_size,
https://github.com/simonw/datasette/issues/304#issuecomment-394412217,https://api.github.com/repos/simonw/datasette/issues/304,394412217,MDEyOklzc3VlQ29tbWVudDM5NDQxMjIxNw==,9599,simonw,2018-06-04T16:13:32Z,2018-06-04T16:13:32Z,OWNER,Docs: http://datasette.readthedocs.io/en/latest/config.html#cache-size-kb,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328229224,Ability to configure SQLite cache_size,
https://github.com/simonw/datasette/issues/302#issuecomment-394412784,https://api.github.com/repos/simonw/datasette/issues/302,394412784,MDEyOklzc3VlQ29tbWVudDM5NDQxMjc4NA==,9599,simonw,2018-06-04T16:15:22Z,2018-06-04T16:15:22Z,OWNER,I think this is related to #303,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328171513,test-2.3.sqlite database filename throws a 404,
https://github.com/simonw/datasette/issues/266#issuecomment-394417567,https://api.github.com/repos/simonw/datasette/issues/266,394417567,MDEyOklzc3VlQ29tbWVudDM5NDQxNzU2Nw==,9599,simonw,2018-06-04T16:30:48Z,2018-06-04T16:32:55Z,OWNER,"When serving streaming responses, I need to check that a large CSV file doesn't completely max out the CPU in a way that is harmful to the rest of the instance. If it does, one option may be to insert an async sleep call in between each chunk that is streamed back. This could be controlled by a `csv_pause_ms` config setting, defaulting to maybe 5 but can be disabled entirely by setting to 0.
That's only if testing proves that this is a necessary mechanism.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV,
https://github.com/simonw/datasette/issues/272#issuecomment-394431323,https://api.github.com/repos/simonw/datasette/issues/272,394431323,MDEyOklzc3VlQ29tbWVudDM5NDQzMTMyMw==,9599,simonw,2018-06-04T17:17:37Z,2018-06-04T17:17:37Z,OWNER,I built this ASGI debugging tool to help with this migration: https://asgi-scope.now.sh/fivethirtyeight-34d6604/most-common-name%2Fsurnames.json?foo=bar&bazoeuto=onetuh&a=.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI,
https://github.com/simonw/datasette/issues/272#issuecomment-394503399,https://api.github.com/repos/simonw/datasette/issues/272,394503399,MDEyOklzc3VlQ29tbWVudDM5NDUwMzM5OQ==,9599,simonw,2018-06-04T21:20:14Z,2018-06-04T21:20:14Z,OWNER,Results of an extremely simple micro-benchmark comparing the two shows that uvicorn is at least as fast as Sanic (benchmarks a little faster with a very simple payload): https://gist.github.com/simonw/418950af178c01c416363cc057420851,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI,