issues: 1426080014
| column | value |
|---|---|
| id | 1426080014 |
| node_id | I_kwDOBm6k_c5VAEEO |
| number | 1867 |
| title | /db/table/-/rename API (also allows atomic replace) |
| user | 9599 |
| state | open |
| locked | 0 |
| assignee | |
| milestone | 8755003 |
| comments | 1 |
| created_at | 2022-10-27T18:13:23Z |
| updated_at | 2023-01-09T15:34:12Z |
| closed_at | |
| author_association | OWNER |
| pull_request | |
| body | _(quoted below)_ |
| repo | 107914493 |
| type | issue |
| active_lock_reason | |
| performed_via_github_app | |
| reactions | {"url": "https://api.github.com/repos/simonw/datasette/issues/1867/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} |
| draft | |
| state_reason | |

body:

> There's one catch with batched inserts: if your CLI tool fails half way through you could end up with a partially populated table - since a bunch of batches will have succeeded first.
>
> ...
>
> If people care about that kind of thing they could always push all of their inserts to a table called `_tablename` and then atomically rename that once they've uploaded all of the data (assuming I provide an atomic-rename-this-table mechanism).

_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1866#issuecomment-1293893789_
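The workaround quoted above - push inserts into a `_tablename` staging table, then atomically swap it into place once the upload completes - can be sketched with plain `sqlite3`. The `atomic_replace()` helper, its parameters, and the table/column names below are hypothetical illustrations, not Datasette's proposed `/db/table/-/rename` API:

```python
import sqlite3


def atomic_replace(conn, table, columns, rows):
    # Hypothetical helper: load rows into a staging table named
    # "_<table>", then swap it in. Everything runs in one transaction,
    # so a failure part way through leaves the original table untouched.
    staging = f"_{table}"
    col_list = ", ".join(f"[{c}]" for c in columns)
    placeholders = ", ".join("?" for _ in columns)
    with conn:  # commits on success, rolls back on any exception
        conn.execute(f"DROP TABLE IF EXISTS [{staging}]")
        conn.execute(f"CREATE TABLE [{staging}] ({col_list})")
        conn.executemany(
            f"INSERT INTO [{staging}] ({col_list}) VALUES ({placeholders})",
            rows,
        )
        # Atomic replace: drop the old table and rename the staging
        # table to take its place.
        conn.execute(f"DROP TABLE IF EXISTS [{table}]")
        conn.execute(f"ALTER TABLE [{staging}] RENAME TO [{table}]")


# Example usage with made-up data:
conn = sqlite3.connect("data.db")
atomic_replace(conn, "tablename", ["id", "value"], [(1, "a"), (2, "b")])
```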