issue_comments: 956041692
| field | value |
|---|---|
| html_url | https://github.com/simonw/sqlite-utils/issues/173#issuecomment-956041692 |
| issue_url | https://api.github.com/repos/simonw/sqlite-utils/issues/173 |
| id | 956041692 |
| node_id | IC_kwDOCGYnMM44_Anc |
| user | 2118708 |
| created_at | 2021-11-01T08:42:24Z |
| updated_at | 2021-11-01T08:42:24Z |
| author_association | NONE |
| reactions | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} |
| issue | 707478649 |
| performed_via_github_app | |

body:

> I know how to build this for CSV and TSV - I can read them via a file wrapper that counts how many bytes it has seen.
>
> Not sure how to do it for JSON though. Maybe I could provide it just for newline-delimited JSON? Again I can measure progress based on how many bytes have been read.

I was thinking about this while inserting a stream of ~40M line-delimited JSON docs. Wouldn't a `--total-expected` flag work? That's [how tqdm does it](https://github.com/tqdm/tqdm/blob/fc69d5dcf578f7c7986fa76841a6b793f813df35/tqdm/std.py#L366).
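The comment combines two ideas: a file wrapper that counts bytes as they are read, and telling the progress bar the expected total up front (tqdm's `total` keyword, which is what the proposed `--total-expected` flag would map onto). A minimal sketch of how that could look for newline-delimited JSON follows; the helper name `iter_lines_with_progress` is hypothetical and not part of sqlite-utils.

```python
# Minimal sketch, not sqlite-utils code: count bytes as lines are read and
# feed them to a tqdm bar whose total is known up front (the file size here,
# or an explicit value such as the proposed --total-expected).
import os
from tqdm import tqdm


def iter_lines_with_progress(path, total=None):
    """Yield lines from a newline-delimited JSON file, updating a byte-based bar."""
    total = total if total is not None else os.path.getsize(path)
    with tqdm(total=total, unit="B", unit_scale=True) as bar:
        with open(path, "rb") as fp:
            for line in fp:
                bar.update(len(line))  # advance the bar by the bytes just consumed
                yield line


# Usage: stream documents while showing progress.
# for line in iter_lines_with_progress("docs.ndjson"):
#     doc = json.loads(line)
#     ...  # insert doc into the database
```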