issues: 403624090
| column | value |
| --- | --- |
| id | 403624090 |
| node_id | MDU6SXNzdWU0MDM2MjQwOTA= |
| number | 6 |
| title | "sqlite-utils insert" should support newline-delimited JSON |
| user | 9599 |
| state | closed |
| locked | 0 |
| assignee | |
| milestone | |
| comments | 1 |
| created_at | 2019-01-28T02:00:02Z |
| updated_at | 2019-01-28T02:17:45Z |
| closed_at | 2019-01-28T02:17:45Z |
| author_association | OWNER |
| pull_request | |
| repo | 140912432 |
| type | issue |
| active_lock_reason | |
| performed_via_github_app | |
| reactions | {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} |
| draft | |
| state_reason | completed |

body:

We can already export newline-delimited JSON. We should learn to import it as well.

The neat thing about importing it is that you can import GBs of data without having to read the whole lot into memory in order to decode the wrapping JSON array.

Datasette can export it now: https://github.com/simonw/datasette/issues/405

Demo: https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on

It should be possible to do this:

```
$ curl "https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on" \
    | sqlite-utils insert data.db facetable - --nl
```
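The streaming behaviour the body describes could be sketched in Python like this. This is an illustrative sketch only, not the implementation that closed the issue; it assumes the `sqlite_utils` Python library is installed and uses a hypothetical `facetable.ndjson` input file and `stream_ndjson` helper.

```python
# Sketch: stream newline-delimited JSON into SQLite without loading
# the whole file into memory. File and table names are illustrative.
import json
import sqlite_utils

def stream_ndjson(fp):
    """Yield one decoded record per non-blank line, never buffering the whole file."""
    for line in fp:
        line = line.strip()
        if line:
            yield json.loads(line)

db = sqlite_utils.Database("data.db")
with open("facetable.ndjson") as fp:
    # insert_all() accepts any iterable, so records are consumed lazily in batches
    db["facetable"].insert_all(stream_ndjson(fp), batch_size=100)
```

Because the records are supplied as a generator and inserted in batches, memory use stays roughly constant regardless of how large the input file is, which is the point made in the issue body.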