issue_comments: 1008233910
| column | value |
|---|---|
| html_url | https://github.com/simonw/sqlite-utils/issues/364#issuecomment-1008233910 |
| issue_url | https://api.github.com/repos/simonw/sqlite-utils/issues/364 |
| id | 1008233910 |
| node_id | IC_kwDOCGYnMM48GG22 |
| user | 9599 |
| created_at | 2022-01-09T05:32:53Z |
| updated_at | 2022-01-09T05:35:45Z |
| author_association | OWNER |
| reactions | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} |
| issue | 1095570074 |
| performed_via_github_app | |

body:

This is strange. The following:

```pycon
>>> import subprocess
>>> p = subprocess.Popen(["sqlite-utils", "insert", "/tmp/stream.db", "stream", "-", "--nl"], stdin=subprocess.PIPE)
>>> p.stdin.write(b'\n'.join(b'{"id": %s}' % str(i).encode("utf-8") for i in range(1000)))
11889
>>> # At this point /tmp/stream.db is still 0 bytes - but if I then run this:
>>> p.stdin.close()
>>> # /tmp/stream.db is now 20K and contains the written data
```

No wait, mystery solved - I can add `p.stdin.flush()` instead of `p.stdin.close()` and the file suddenly jumps up in size.
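
For reference, here is a minimal script-style sketch of the flush-based observation described in the comment. It assumes `sqlite-utils` is installed and on `PATH`; the `/tmp/stream.db` path and `stream` table name are carried over from the example above.

```python
# Sketch of the behaviour described in the comment: flushing the pipe is
# enough for the written newline-delimited JSON to reach sqlite-utils.
# Assumes sqlite-utils is installed and available on PATH.
import subprocess

p = subprocess.Popen(
    ["sqlite-utils", "insert", "/tmp/stream.db", "stream", "-", "--nl"],
    stdin=subprocess.PIPE,
)

# Write ~1000 newline-delimited JSON records into the subprocess's stdin.
p.stdin.write(
    b"\n".join(b'{"id": %s}' % str(i).encode("utf-8") for i in range(1000))
)

# Per the comment, flushing (rather than closing) the pipe is the point at
# which /tmp/stream.db grows and contains the written data.
p.stdin.flush()

# Close stdin and wait so the subprocess can finish cleanly.
p.stdin.close()
p.wait()
```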