issue_comments: 1270923537
html_url | issue_url | id | node_id | user | created_at | updated_at | author_association | reactions | issue | performed_via_github_app
---|---|---|---|---|---|---|---|---|---|---
https://github.com/simonw/datasette/issues/1836#issuecomment-1270923537 | https://api.github.com/repos/simonw/datasette/issues/1836 | 1270923537 | IC_kwDOBm6k_c5LwMER | 536941 | 2022-10-07T00:46:08Z | 2022-10-07T00:46:08Z | CONTRIBUTOR | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | 1400374908 |

body:

I thought it was maybe to do with reading through all the files, but that does not seem to be the case if I make a little test file like:

```python
# test_read.py
import hashlib
import sys
import pathlib

HASH_BLOCK_SIZE = 1024 * 1024


def inspect_hash(path):
    """Calculate the hash of a database, efficiently."""
    m = hashlib.sha256()
    with path.open("rb") as fp:
        while True:
            data = fp.read(HASH_BLOCK_SIZE)
            if not data:
                break
            m.update(data)
    return m.hexdigest()


inspect_hash(pathlib.Path(sys.argv[1]))
```

then a line in the Dockerfile like

```docker
RUN python test_read.py nlrb.db && echo "[]" > /etc/inspect.json
```

just produces a layer of `3B`.
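The tiny layer is expected: only files a `RUN` step writes end up in its layer, and `echo "[]"` writes just three bytes (`[]` plus a newline), while reading `nlrb.db` adds nothing. Below is a minimal sketch, not part of the quoted comment, for confirming per-layer sizes with `docker history` from Python; the image tag `datasette-test` is a hypothetical placeholder for whatever image the Dockerfile above was built into.

```python
# Sketch: print the size and creating command of each layer in a built image,
# so you can see which Dockerfile steps actually contribute bytes.
import subprocess

IMAGE = "datasette-test"  # hypothetical tag; substitute your own built image

# `docker history` reports one row per layer; --format keeps only size + command.
result = subprocess.run(
    [
        "docker", "history", "--no-trunc",
        "--format", "{{.Size}}\t{{.CreatedBy}}",
        IMAGE,
    ],
    capture_output=True,
    text=True,
    check=True,
)

for line in result.stdout.splitlines():
    size, created_by = line.split("\t", 1)
    # Truncate the command for readability; sizes like "3B" match the
    # observation in the comment above.
    print(f"{size:>8}  {created_by[:80]}")
```

Running this against the test image should show the `RUN python test_read.py ...` step as a layer of only a few bytes, confirming that reading the database during the build does not inflate the image.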