issue_comments: 1295200988
field | value
---|---
html_url | https://github.com/simonw/datasette/issues/1866#issuecomment-1295200988
issue_url | https://api.github.com/repos/simonw/datasette/issues/1866
id | 1295200988
node_id | IC_kwDOBm6k_c5NMzLc
user | 9599
created_at | 2022-10-28T16:29:55Z
updated_at | 2022-10-28T16:29:55Z
author_association | OWNER
body | I wonder if there's something clever I could do here within a transaction? Start a transaction. Write out a temporary in-memory table with all of the existing primary keys in the table. Run the bulk insert. Then run `select pk from table where pk not in (select pk from old_pks)` to see what has changed. I don't think that's going to work well for large tables. I'm going to go with not returning inserted rows by default, unless you pass a special option requesting that.
reactions | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue | 1426001541
performed_via_github_app |
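The body above describes one possible way to detect which rows a bulk insert created: inside a transaction, snapshot the existing primary keys into a temporary table, run the insert, then diff. The sketch below shows that idea using Python's `sqlite3` module. It is not Datasette's actual implementation; the table name `docs`, its `pk` and `title` columns, and the helper name `insert_and_return_new_pks` are all hypothetical stand-ins.

```python
import sqlite3


def insert_and_return_new_pks(conn: sqlite3.Connection, rows):
    """Sketch of the approach from the comment above: snapshot existing
    primary keys, bulk insert, then diff to find the newly inserted rows.
    The docs table and its pk/title columns are hypothetical."""
    with conn:  # commit on success, roll back on error
        # 1. Record the primary keys that exist before the insert
        conn.execute("CREATE TEMP TABLE old_pks AS SELECT pk FROM docs")
        # 2. Run the bulk insert
        conn.executemany("INSERT INTO docs (pk, title) VALUES (?, ?)", rows)
        # 3. Any pk not in the snapshot must have just been inserted
        new_pks = [
            pk
            for (pk,) in conn.execute(
                "SELECT pk FROM docs WHERE pk NOT IN (SELECT pk FROM old_pks)"
            )
        ]
        # Clean up the temporary snapshot so the helper can be called again
        conn.execute("DROP TABLE old_pks")
    return new_pks


# Example usage against a throwaway in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (pk INTEGER PRIMARY KEY, title TEXT)")
conn.execute("INSERT INTO docs VALUES (1, 'already there')")
print(insert_and_return_new_pks(conn, [(2, "new"), (3, "also new")]))  # [2, 3]
```

As the comment notes, copying every existing primary key into the `old_pks` snapshot does work proportional to the size of the whole table, not just the inserted batch, which is why the comment concludes that inserted rows should not be returned by default and only when a special option requests them.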