issue_comments
6 rows where "created_at" is on date 2018-06-04
id | html_url | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
394400419 | https://github.com/simonw/datasette/issues/304#issuecomment-394400419 | https://api.github.com/repos/simonw/datasette/issues/304 | MDEyOklzc3VlQ29tbWVudDM5NDQwMDQxOQ== | simonw 9599 | 2018-06-04T15:39:03Z | 2018-06-04T15:39:03Z | OWNER | In the interest of getting this shipped, I'm going to ignore the `3.7.10` issue. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ability to configure SQLite cache_size 328229224 | |
394412217 | https://github.com/simonw/datasette/issues/304#issuecomment-394412217 | https://api.github.com/repos/simonw/datasette/issues/304 | MDEyOklzc3VlQ29tbWVudDM5NDQxMjIxNw== | simonw 9599 | 2018-06-04T16:13:32Z | 2018-06-04T16:13:32Z | OWNER | Docs: http://datasette.readthedocs.io/en/latest/config.html#cache-size-kb | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ability to configure SQLite cache_size 328229224 | |
394412784 | https://github.com/simonw/datasette/issues/302#issuecomment-394412784 | https://api.github.com/repos/simonw/datasette/issues/302 | MDEyOklzc3VlQ29tbWVudDM5NDQxMjc4NA== | simonw 9599 | 2018-06-04T16:15:22Z | 2018-06-04T16:15:22Z | OWNER | I think this is related to #303 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | test-2.3.sqlite database filename throws a 404 328171513 | |
394417567 | https://github.com/simonw/datasette/issues/266#issuecomment-394417567 | https://api.github.com/repos/simonw/datasette/issues/266 | MDEyOklzc3VlQ29tbWVudDM5NDQxNzU2Nw== | simonw 9599 | 2018-06-04T16:30:48Z | 2018-06-04T16:32:55Z | OWNER | When serving streaming responses, I need to check that a large CSV file doesn't completely max out the CPU in a way that is harmful to the rest of the instance. If it does, one option may be to insert an async sleep call in between each chunk that is streamed back. This could be controlled by a `csv_pause_ms` config setting, defaulting to maybe 5 but can be disabled entirely by setting to 0. That's only if testing proves that this is a necessary mechanism. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Export to CSV 323681589 | |
394431323 | https://github.com/simonw/datasette/issues/272#issuecomment-394431323 | https://api.github.com/repos/simonw/datasette/issues/272 | MDEyOklzc3VlQ29tbWVudDM5NDQzMTMyMw== | simonw 9599 | 2018-06-04T17:17:37Z | 2018-06-04T17:17:37Z | OWNER | I built this ASGI debugging tool to help with this migration: https://asgi-scope.now.sh/fivethirtyeight-34d6604/most-common-name%2Fsurnames.json?foo=bar&bazoeuto=onetuh&a=. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Port Datasette to ASGI 324188953 | |
394503399 | https://github.com/simonw/datasette/issues/272#issuecomment-394503399 | https://api.github.com/repos/simonw/datasette/issues/272 | MDEyOklzc3VlQ29tbWVudDM5NDUwMzM5OQ== | simonw 9599 | 2018-06-04T21:20:14Z | 2018-06-04T21:20:14Z | OWNER | Results of an extremely simple micro-benchmark comparing the two shows that uvicorn is at least as fast as Sanic (benchmarks a little faster with a very simple payload): https://gist.github.com/simonw/418950af178c01c416363cc057420851 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Port Datasette to ASGI 324188953 | |
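The comment on issue #266 above proposes inserting an async sleep between streamed CSV chunks, controlled by a `csv_pause_ms` setting (default 5, disabled at 0). As a minimal sketch of that idea — `csv_pause_ms`, `stream_csv`, and the row data here are illustrative assumptions, not Datasette's actual implementation:

```python
import asyncio
import csv
import io

# Hypothetical setting name taken from the comment on issue #266;
# this sketch only illustrates the proposed mechanism.
CSV_PAUSE_MS = 5

async def stream_csv(rows, pause_ms=CSV_PAUSE_MS):
    """Yield one CSV-encoded line per row, pausing between chunks.

    A pause_ms of 0 disables the sleep entirely, matching the proposal.
    """
    for row in rows:
        buf = io.StringIO()
        csv.writer(buf).writerow(row)
        yield buf.getvalue()
        if pause_ms:
            # Brief async sleep so one large export cannot monopolize the CPU
            await asyncio.sleep(pause_ms / 1000)

async def main():
    chunks = [
        chunk
        async for chunk in stream_csv(
            [("id", "body"), (394417567, "Export to CSV")], pause_ms=0
        )
    ]
    return "".join(chunks)

print(asyncio.run(main()))
```

Whether such a pause is needed at all was, per the comment, contingent on testing showing that large CSV streams actually starve the rest of the instance.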
```sql
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
```
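The page's "where created_at is on date 2018-06-04" view can be reproduced against this schema with SQLite's `date()` function, since `created_at` is stored as an ISO 8601 text string. A minimal sketch (the foreign-key `REFERENCES` clauses are omitted because the `users` and `issues` tables are not created here, and the inserted row is abbreviated sample data):

```python
import sqlite3

# Build the issue_comments table in memory from the schema above
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE issue_comments ("
    " [html_url] TEXT, [issue_url] TEXT, [id] INTEGER PRIMARY KEY,"
    " [node_id] TEXT, [user] INTEGER, [created_at] TEXT, [updated_at] TEXT,"
    " [author_association] TEXT, [body] TEXT, [reactions] TEXT,"
    " [issue] INTEGER, [performed_via_github_app] TEXT)"
)
conn.execute(
    "INSERT INTO issue_comments (id, created_at, author_association) VALUES (?, ?, ?)",
    (394400419, "2018-06-04T15:39:03Z", "OWNER"),
)
# date() truncates the ISO 8601 timestamp to its YYYY-MM-DD day component
rows = conn.execute(
    "SELECT id FROM issue_comments WHERE date(created_at) = '2018-06-04'"
).fetchall()
print(rows)
```

This is the same day-level filter Datasette's date facet applies when you narrow a table to a single `created_at` date.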