html_url,issue_url,id,node_id,user,user_label,created_at,updated_at,author_association,body,reactions,issue,issue_label,performed_via_github_app https://github.com/dogsheep/healthkit-to-sqlite/issues/20#issuecomment-902356871,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/20,902356871,IC_kwDOC8tyDs41yN-H,9599,simonw,2021-08-20T01:12:48Z,2021-08-20T01:12:48Z,MEMBER,Also on `workout_points.workout_id` to speed up queries to show all points in a specific workout.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",975166271,Add index on workout_points.date, https://github.com/dogsheep/healthkit-to-sqlite/issues/20#issuecomment-902355471,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/20,902355471,IC_kwDOC8tyDs41yNoP,9599,simonw,2021-08-20T01:09:07Z,2021-08-20T01:09:07Z,MEMBER,"Workaround: sqlite-utils create-index healthkit.db workout_points -- -date See https://sqlite-utils.datasette.io/en/stable/cli.html#creating-indexes","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",975166271,Add index on workout_points.date, https://github.com/dogsheep/swarm-to-sqlite/issues/13#issuecomment-1502543165,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/13,1502543165,IC_kwDODD6af85Zjv09,9599,simonw,2023-04-11T01:10:36Z,2023-04-11T01:11:47Z,MEMBER,"I just had that error myself on macOS while running the tests: ``` ERROR tests/test_save_checkin.py::test_tables - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_venue - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_event - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_sticker - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_likes - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_with_ - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_users - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_photos - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_posts - sqlite3.OperationalError: table sqlite_master may not be modified ERROR tests/test_save_checkin.py::test_view - sqlite3.OperationalError: table sqlite_master may not be modified ``` `pytest --pdb` shows it happening in the bit that adds foreign keys: ``` > /Users/simon/.local/share/virtualenvs/swarm-to-sqlite-daPW7yIJ/lib/python3.9/site-packages/sqlite_utils/db.py(1096)add_foreign_keys() -> cursor.execute( (Pdb) list 1096 >> cursor.execute( 1097 ""UPDATE sqlite_master SET sql = ? WHERE name = ?"", 1098 (new_sql, table_name), 1099 ) 1100 cursor.execute(""PRAGMA schema_version = %d"" % (schema_version + 1)) 1101 -> cursor.execute(""PRAGMA writable_schema = 0"") 1102 # Have to VACUUM outside the transaction to ensure .foreign_keys property 1103 # can see the newly created foreign key. 1104 self.vacuum() ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1373210675,fails before generating views. 
ERR: table sqlite_master may not be modified, https://github.com/dogsheep/swarm-to-sqlite/issues/13#issuecomment-1502546045,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/13,1502546045,IC_kwDODD6af85Zjwh9,9599,simonw,2023-04-11T01:14:50Z,2023-04-11T01:14:50Z,MEMBER,"Related: - https://github.com/simonw/sqlite-utils/issues/235","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1373210675,fails before generating views. ERR: table sqlite_master may not be modified, https://github.com/dogsheep/swarm-to-sqlite/issues/13#issuecomment-1502629219,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/13,1502629219,IC_kwDODD6af85ZkE1j,9599,simonw,2023-04-11T03:15:26Z,2023-04-11T03:15:26Z,MEMBER,"OK, I figured this out. Unfortunately it's an error that occurs on Python versions that have defensive mode turned on, and it doesn't look like there's a way to turn that mode off. See notes above.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1373210675,fails before generating views. ERR: table sqlite_master may not be modified, https://github.com/dogsheep/swarm-to-sqlite/issues/13#issuecomment-1502629404,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/13,1502629404,IC_kwDODD6af85ZkE4c,9599,simonw,2023-04-11T03:15:47Z,2023-04-11T03:46:17Z,MEMBER,"I think `swarm-to-sqlite` needs to avoid this error, maybe by setting up foreign keys in another way - or even by skipping foreign keys entirely on databases that don't support this kind of operation.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1373210675,fails before generating views. ERR: table sqlite_master may not be modified, https://github.com/dogsheep/twitter-to-sqlite/issues/57#issuecomment-902327457,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/57,902327457,IC_kwDODEm0Qs41yGyh,9599,simonw,2021-08-19T23:53:25Z,2021-08-19T23:53:25Z,MEMBER,I'm getting this too. 
Looking into it now.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",907645813,"Error: Use either --since or --since_id, not both", https://github.com/dogsheep/twitter-to-sqlite/issues/57#issuecomment-902328369,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/57,902328369,IC_kwDODEm0Qs41yHAx,9599,simonw,2021-08-19T23:56:26Z,2021-08-19T23:56:26Z,MEMBER,"https://developer.twitter.com/en/docs/twitter-api/v1/tweets/timelines/api-reference/get-statuses-user_timeline says the API has been replaced by the new v2 one, but it should still work - and the `since_id` parameter is still documented on that page.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",907645813,"Error: Use either --since or --since_id, not both", https://github.com/dogsheep/twitter-to-sqlite/issues/57#issuecomment-902328760,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/57,902328760,IC_kwDODEm0Qs41yHG4,9599,simonw,2021-08-19T23:57:41Z,2021-08-19T23:57:41Z,MEMBER,"Weird, added debug code and got this: `{'screen_name': 'simonw', 'count': 200, 'since_id': 'False', 'tweet_mode': 'extended'}` - so maybe it's a `twitter-to-sqlite` bug where somehow the string `False` is being passed somewhere.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",907645813,"Error: Use either --since or --since_id, not both", https://github.com/dogsheep/twitter-to-sqlite/issues/57#issuecomment-902329455,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/57,902329455,IC_kwDODEm0Qs41yHRv,9599,simonw,2021-08-19T23:59:56Z,2021-08-19T23:59:56Z,MEMBER,"This looks like the bug to me: https://github.com/dogsheep/twitter-to-sqlite/blob/197e69cec40052c423a5ed071feb5f7cccea41b9/twitter_to_sqlite/cli.py#L239-L241 `type=str, default=False`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",907645813,"Error: Use either --since or --since_id, not both", https://github.com/dogsheep/twitter-to-sqlite/issues/57#issuecomment-902329884,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/57,902329884,IC_kwDODEm0Qs41yHYc,9599,simonw,2021-08-20T00:01:05Z,2021-08-20T00:01:05Z,MEMBER,Maybe Click changed something which meant that this broke things when it didn't used to?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",907645813,"Error: Use either --since or --since_id, not both", https://github.com/dogsheep/twitter-to-sqlite/pull/49#issuecomment-902330301,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/49,902330301,IC_kwDODEm0Qs41yHe9,9599,simonw,2021-08-20T00:01:56Z,2021-08-20T00:01:56Z,MEMBER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",681575714,"Document the use of --stop_after with favorites, refs #20", https://github.com/dogsheep/twitter-to-sqlite/pull/59#issuecomment-924209583,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/59,924209583,IC_kwDODEm0Qs43FlGv,9599,simonw,2021-09-21T17:37:34Z,2021-09-21T17:37:34Z,MEMBER,Thanks for this!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",984942782,"Fix for 
since_id bug, closes #58", https://github.com/dogsheep/github-to-sqlite/issues/69#issuecomment-985928838,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/69,985928838,IC_kwDODFdgUs46xBSG,9599,simonw,2021-12-04T00:34:52Z,2021-12-04T00:34:52Z,MEMBER,"First attempt at this: ```sql select 'issues' as ""table"", id, node_id, title, user, created_at, body, repo from issues union all select 'issue_comments' as ""table"", issue_comments.id, issue_comments.node_id, '' as title, issue_comments.user, issue_comments.created_at, issue_comments.body, issues.repo from issue_comments join issues on issues.id = issue_comments.issue order by created_at desc ``` https://github-to-sqlite.dogsheep.net/github?sql=select+%27issues%27+as+%22table%22%2C+id%2C+node_id%2C+title%2C+user%2C+created_at%2C+body%2C+repo%0D%0Afrom+issues%0D%0Aunion+all%0D%0Aselect+%27issue_comments%27+as+%22table%22%2C+issue_comments.id%2C+issue_comments.node_id%2C+%27%27+as+title%2C+issue_comments.user%2C+issue_comments.created_at%2C+issue_comments.body%2C+issues.repo%0D%0Afrom+issue_comments+join+issues+on+issues.id+%3D+issue_comments.issue%0D%0Aorder+by+created_at+desc","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1071071397,View that combines issues and issue comments, https://github.com/dogsheep/github-to-sqlite/issues/72#issuecomment-1105474232,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/72,1105474232,IC_kwDODFdgUs5B5DK4,9599,simonw,2022-04-21T17:02:15Z,2022-04-21T17:02:15Z,MEMBER,"That's interesting - yeah it looks like the number of pages can be derived from the `Link` header, which is enough information to show a progress bar, probably using Click just to avoid adding another dependency. 
https://docs.github.com/en/rest/guides/traversing-with-pagination","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1211283427,feature: display progress bar when downloading multi-page responses, https://github.com/dogsheep/github-to-sqlite/pull/73#issuecomment-1188223299,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/73,1188223299,IC_kwDODFdgUs5G0tlD,9599,simonw,2022-07-18T19:40:06Z,2022-07-18T19:40:06Z,MEMBER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1261884917,Fixing 'NoneType' object has no attribute 'items', https://github.com/dogsheep/github-to-sqlite/issues/74#issuecomment-1188223933,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/74,1188223933,IC_kwDODFdgUs5G0tu9,9599,simonw,2022-07-18T19:40:50Z,2022-07-18T19:42:41Z,MEMBER,"Here's how the demo is deployed: https://github.com/dogsheep/github-to-sqlite/blob/dbac2e5dd8a562b45d8255a265859cf8020ca22a/.github/workflows/deploy-demo.yml#L103-L119 I'm suspicious of `py-gfm`, which is used like this: https://github.com/dogsheep/github-to-sqlite/blob/dbac2e5dd8a562b45d8255a265859cf8020ca22a/demo-metadata.json#L49-L51","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1308461063,500 error in github-to-sqlite demo, https://github.com/dogsheep/github-to-sqlite/issues/74#issuecomment-1188228964,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/74,1188228964,IC_kwDODFdgUs5G0u9k,9599,simonw,2022-07-18T19:45:30Z,2022-07-18T19:47:35Z,MEMBER,"`pycmarkgfm` doesn't implement the Markdown plugin extension I was using. 
I'm going to drop the GFM rendering from the demo, and just treat it as regular markdown.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1308461063,500 error in github-to-sqlite demo, https://github.com/dogsheep/github-to-sqlite/issues/74#issuecomment-1188225625,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/74,1188225625,IC_kwDODFdgUs5G0uJZ,9599,simonw,2022-07-18T19:41:52Z,2022-07-18T19:41:52Z,MEMBER,"https://github.com/Zopieux/py-gfm says that library is no longer maintained, and suggests https://github.com/Zopieux/pycmarkgfm as an alternative.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1308461063,500 error in github-to-sqlite demo, https://github.com/dogsheep/github-to-sqlite/issues/74#issuecomment-1188233729,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/74,1188233729,IC_kwDODFdgUs5G0wIB,9599,simonw,2022-07-18T19:51:02Z,2022-07-18T19:51:02Z,MEMBER,Takes 30m to deploy the demo!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1308461063,500 error in github-to-sqlite demo, https://github.com/dogsheep/github-to-sqlite/issues/74#issuecomment-1188317682,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/74,1188317682,IC_kwDODFdgUs5G1Eny,9599,simonw,2022-07-18T21:14:22Z,2022-07-18T21:14:22Z,MEMBER,That fixed it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1308461063,500 error in github-to-sqlite demo, https://github.com/dogsheep/github-to-sqlite/issues/79#issuecomment-1498167714,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/79,1498167714,IC_kwDODFdgUs5ZTDmi,9599,simonw,2023-04-05T21:12:55Z,2023-04-05T21:12:55Z,MEMBER,"It's a rate limiting problem: https://github.com/dogsheep/github-to-sqlite/actions/runs/4622674009/jobs/8175633155 ``` File ""/home/runner/work/github-to-sqlite/github-to-sqlite/github_to_sqlite/cli.py"", line 407, in commits utils.save_commits(db, commits, repo_full[""id""]) File ""/home/runner/work/github-to-sqlite/github-to-sqlite/github_to_sqlite/utils.py"", line 635, in save_commits for commit in commits: File ""/home/runner/work/github-to-sqlite/github-to-sqlite/github_to_sqlite/utils.py"", line 417, in fetch_commits for commits in paginate(url, headers): File ""/home/runner/work/github-to-sqlite/github-to-sqlite/github_to_sqlite/utils.py"", line 478, in paginate raise GitHubError.from_response(response) github_to_sqlite.utils.GitHubError: ('API rate limit exceeded for user ID 9599.', 403) Error: Process completed with exit code 1. 
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1570375808,Deploy demo job is failing due to rate limit, https://github.com/dogsheep/pocket-to-sqlite/issues/10#issuecomment-1221623052,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/10,1221623052,IC_kwDODLZ_YM5I0H0M,9599,simonw,2022-08-21T21:20:33Z,2022-08-21T21:20:33Z,MEMBER,"That was clearly the intention from the description of this issue: - #4","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1246826792,"When running `auth` command, don't overwrite an existing auth.json file", https://github.com/dogsheep/pocket-to-sqlite/issues/11#issuecomment-1221621466,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11,1221621466,IC_kwDODLZ_YM5I0Hba,9599,simonw,2022-08-21T21:09:47Z,2022-08-21T21:09:47Z,MEMBER,"Great catch, thanks. I'm going to use it to mean `--auth` - since other tools in the Dogsheep family have the same convention. `--all` will be the only way to specify all.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1345452427,"-a option is used for ""--auth"" and for ""--all""", https://github.com/dogsheep/pocket-to-sqlite/issues/11#issuecomment-1221621529,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11,1221621529,IC_kwDODLZ_YM5I0HcZ,9599,simonw,2022-08-21T21:10:15Z,2022-08-21T21:11:26Z,MEMBER,"Just saw that's what's implemented here already! - #7","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1345452427,"-a option is used for ""--auth"" and for ""--all""", https://github.com/dogsheep/pocket-to-sqlite/pull/7#issuecomment-1221621700,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/7,1221621700,IC_kwDODLZ_YM5I0HfE,9599,simonw,2022-08-21T21:11:12Z,2022-08-21T21:11:12Z,MEMBER,I thought this might need a documentation update but `--all` is already covered: https://github.com/dogsheep/pocket-to-sqlite/blob/0.2.1/README.md,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",750141615,Fixed conflicting CLI flags, https://github.com/dogsheep/pocket-to-sqlite/issues/10#issuecomment-1221622873,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/10,1221622873,IC_kwDODLZ_YM5I0HxZ,9599,simonw,2022-08-21T21:19:25Z,2022-08-21T21:19:25Z,MEMBER,"Agreed, that would be a much better implementation.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1246826792,"When running `auth` command, don't overwrite an existing auth.json file", https://github.com/dogsheep/pocket-to-sqlite/issues/12#issuecomment-1627563202,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/12,1627563202,IC_kwDODLZ_YM5hAqTC,9599,simonw,2023-07-09T01:14:27Z,2023-07-09T01:14:27Z,MEMBER,I tested this locally with `python -m build` and then `pip install ...whl` in a fresh virtual environment.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1795187493,Switch to pyproject.toml, 
https://github.com/dogsheep/pocket-to-sqlite/issues/12#issuecomment-1627564127,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/12,1627564127,IC_kwDODLZ_YM5hAqhf,9599,simonw,2023-07-09T01:19:42Z,2023-07-09T01:19:42Z,MEMBER,https://github.com/dogsheep/pocket-to-sqlite/tree/0.2.3 and https://pypi.org/project/pocket-to-sqlite/0.2.3/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1795187493,Switch to pyproject.toml, https://github.com/dogsheep/hacker-news-to-sqlite/issues/2#issuecomment-886136224,https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/2,886136224,IC_kwDODtX3eM400V2g,9599,simonw,2021-07-25T02:08:29Z,2021-07-25T02:08:29Z,MEMBER,"Prototype: curl ""https://hn.algolia.com/api/v1/search_by_date?query=simonwillison.net&restrictSearchableAttributes=url&hitsPerPage=1000"" | \ jq .hits | sqlite-utils insert hn.db items - --pk objectID --alter","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",952179830,Command for fetching Hacker News threads from the search API, https://github.com/dogsheep/hacker-news-to-sqlite/issues/2#issuecomment-886135562,https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/2,886135562,IC_kwDODtX3eM400VsK,9599,simonw,2021-07-25T02:01:11Z,2021-07-25T02:01:11Z,MEMBER,"That page doesn't have an API but does look easy to scrape. The other option here is the HN Search API powered by Algolia, documented at https://hn.algolia.com/api","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",952179830,Command for fetching Hacker News threads from the search API, https://github.com/dogsheep/hacker-news-to-sqlite/issues/2#issuecomment-886135922,https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/2,886135922,IC_kwDODtX3eM400Vxy,9599,simonw,2021-07-25T02:06:20Z,2021-07-25T02:06:20Z,MEMBER,"https://hn.algolia.com/api/v1/search_by_date?query=simonwillison.net&restrictSearchableAttributes=url looks like it does what I want. https://hn.algolia.com/api/v1/search_by_date?query=simonwillison.net&restrictSearchableAttributes=url&hitsPerPage=1000 - returns 1000 at once. Otherwise you have to paginate using `&page=2` etc - up to `nbPages` pages. 
https://www.algolia.com/doc/api-reference/api-parameters/hitsPerPage/ says 1000 is the maximum.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",952179830,Command for fetching Hacker News threads from the search API, https://github.com/dogsheep/hacker-news-to-sqlite/issues/2#issuecomment-886140431,https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/2,886140431,IC_kwDODtX3eM400W4P,9599,simonw,2021-07-25T03:12:57Z,2021-07-25T03:12:57Z,MEMBER,"I'm going to build a general-purpose `hacker-new-to-sqlite search ...` command, where one of the options is to search within the URL.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",952179830,Command for fetching Hacker News threads from the search API, https://github.com/dogsheep/hacker-news-to-sqlite/issues/3#issuecomment-886142671,https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/3,886142671,IC_kwDODtX3eM400XbP,9599,simonw,2021-07-25T03:51:05Z,2021-07-25T03:51:05Z,MEMBER,"Prototype: curl 'https://hn.algolia.com/api/v1/items/27941108' \ | jq '[recurse(.children[]) | del(.children)]' \ | sqlite-utils insert hn.db items - --pk id ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",952189173,Use HN algolia endpoint to retrieve trees, https://github.com/dogsheep/hacker-news-to-sqlite/issues/3#issuecomment-886237834,https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/3,886237834,IC_kwDODtX3eM400uqK,9599,simonw,2021-07-25T18:05:32Z,2021-07-25T18:05:32Z,MEMBER,"If you hit the endpoint for a comment that's part of a thread you get that comment and its recursive children: https://hn.algolia.com/api/v1/items/27941552 You can tell that it's not the top-level because the `parent_id` isn't `null`. You can use `story_id` to figure out what the top-level item is. ```json { ""id"": 27941552, ""created_at"": ""2021-07-24T15:08:39.000Z"", ""created_at_i"": 1627139319, ""type"": ""comment"", ""author"": ""nine_k"", ""title"": null, ""url"": null, ""text"": ""

I wish ..."", ""points"": null, ""parent_id"": 27941108, ""story_id"": 27941108 } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",952189173,Use HN algolia endpoint to retrieve trees, https://github.com/dogsheep/hacker-news-to-sqlite/issues/3#issuecomment-886241674,https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/3,886241674,IC_kwDODtX3eM400vmK,9599,simonw,2021-07-25T18:41:17Z,2021-07-25T18:41:17Z,MEMBER,Got a TIL out of this: https://til.simonwillison.net/jq/extracting-objects-recursively,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",952189173,Use HN algolia endpoint to retrieve trees, https://github.com/dogsheep/evernote-to-sqlite/issues/13#issuecomment-905203570,https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/13,905203570,IC_kwDOEhK-wc419E9y,9599,simonw,2021-08-25T05:51:22Z,2021-08-25T05:53:27Z,MEMBER,"The debugger showed me that it broke on a string that looked like this: ```xml

Q3 2018 Reflection & Development

... ``` Yeah that is not valid XML!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",978743426,xml.etree.ElementTree.ParseError: not well-formed (invalid token), https://github.com/dogsheep/evernote-to-sqlite/issues/13#issuecomment-905206234,https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/13,905206234,IC_kwDOEhK-wc419Fna,9599,simonw,2021-08-25T05:58:42Z,2021-08-25T05:58:42Z,MEMBER,"https://github.com/dogsheep/evernote-to-sqlite/blob/36a466f142e5bad52719851c2fbda0c05cd35b99/evernote_to_sqlite/utils.py#L34-L42 Not sure why I was round-tripping the `content_xml` like that - I will try not doing that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",978743426,xml.etree.ElementTree.ParseError: not well-formed (invalid token), https://github.com/dogsheep/evernote-to-sqlite/issues/13#issuecomment-906635938,https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/13,906635938,IC_kwDOEhK-wc42Ciqi,9599,simonw,2021-08-26T18:18:27Z,2021-08-26T18:18:27Z,MEMBER,"It looks like I was using the round-trip to dump the `` and `` content - it's just HTML, not even trying to be XML.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",978743426,xml.etree.ElementTree.ParseError: not well-formed (invalid token), https://github.com/dogsheep/apple-notes-to-sqlite/issues/1#issuecomment-1461226402,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1,1461226402,IC_kwDOJHON9s5XGIui,9599,simonw,2023-03-09T03:47:11Z,2023-03-09T03:47:11Z,MEMBER,"Me: > Write applescript to loop through all of my Apple Notes and output their contents ChatGPT: ```applescript tell application ""Notes"" repeat with eachNote in every note set noteTitle to the name of eachNote set noteBody to the body of eachNote display dialog noteTitle & return & return & noteBody end repeat end tell ``` Me: > How can I turn that into a zsh script which will output the results to my terminal ChatGPT: ```zsh #!/usr/bin/zsh osascript -e 'tell application ""Notes"" repeat with eachNote in every note set noteTitle to the name of eachNote set noteBody to the body of eachNote set output to noteTitle & ""\n"" & noteBody & ""\n"" display dialog output log output end repeat end tell' ``` I had to change that first line to `#!/bin/zsh` - then when I ran it it spammed my screen with dialog boxes. 
Me: > I want to log the notes contents to the terminal, not display it in a dialog ChatGPT: ```zsh #!/bin/zsh osascript -e 'tell application ""Notes"" repeat with eachNote in every note set noteTitle to the name of eachNote set noteBody to the body of eachNote set output to noteTitle & ""\n"" & noteBody & ""\n"" log output end repeat end tell' ``` This really helps!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1616347574,Initial proof of concept with ChatGPT, https://github.com/dogsheep/apple-notes-to-sqlite/issues/1#issuecomment-1461230197,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1,1461230197,IC_kwDOJHON9s5XGJp1,9599,simonw,2023-03-09T03:51:36Z,2023-03-09T03:51:36Z,MEMBER,"After a few more rounds I got to this script, which outputs them to a `/tmp/notes.txt` file: ```zsh #!/bin/zsh osascript -e ' set notesFile to ""/tmp/notes.txt"" set fileRef to open for access notesFile with write permission tell application ""Notes"" repeat with eachNote in every note set noteId to the id of eachNote set noteTitle to the name of eachNote set noteBody to the body of eachNote write ""------------------------"" & ""\n"" to fileRef write noteId & ""\n"" to fileRef write noteTitle & ""\n\n"" to fileRef write noteBody & ""\n"" to fileRef end repeat end tell close access fileRef' ``` Then I wrote this little Python script to load them into a database: ```python import sqlite_utils split = b""------------------------\n"" s = open(""/tmp/notes.txt"", ""rb"").read() notes = [n.decode(""mac_roman"") for n in s.split(split) if n] cleaned_notes = [{ ""id"": n.split(""\n"")[0], ""title"": n.split(""\n"")[1], ""body"": ""\n"".join(n.split(""\n"")[2:]).strip() } for n in notes] db = sqlite_utils.Database(""/tmp/notes.db"") db[""notes""].insert_all(cleaned_notes) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1616347574,Initial proof of concept with ChatGPT, https://github.com/dogsheep/apple-notes-to-sqlite/issues/1#issuecomment-1461230436,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1,1461230436,IC_kwDOJHON9s5XGJtk,9599,simonw,2023-03-09T03:51:52Z,2023-03-09T03:51:52Z,MEMBER,This did the job! 
Next step is to turn that into a Python script.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1616347574,Initial proof of concept with ChatGPT, https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461232709,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2,1461232709,IC_kwDOJHON9s5XGKRF,9599,simonw,2023-03-09T03:54:28Z,2023-03-09T03:54:28Z,MEMBER,"I think the AppleScript I want to pass to `osascript` looks like this: ```applescript tell application ""Notes"" repeat with eachNote in every note set noteId to the id of eachNote set noteTitle to the name of eachNote set noteBody to the body of eachNote log ""------------------------"" & ""\n"" log noteId & ""\n"" log noteTitle & ""\n\n"" log noteBody & ""\n"" end repeat end tell ``` But there are a few more properties I'd like to get - created and updated date for example.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1616354999,First working version, https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461234311,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2,1461234311,IC_kwDOJHON9s5XGKqH,9599,simonw,2023-03-09T03:56:24Z,2023-03-09T03:56:24Z,MEMBER,"I opened the ""Script Editor"" app on my computer, used Window -> Library to open the Library panel, then clicked on the Notes app there. I got this: So the notes object has these properties: - name (text) : the name of the note (normally the first line of the body) - id (text, r/o) : the unique identifier of the note - container ([folder](applewebdata://621FA8D9-C995-4081-B3B3-149B0EA04C7F#Notes-Suite.folder), r/o) : the folder of the note - body (text) : the HTML content of the note - plaintext (text, r/o) : the plaintext content of the note - creation date (date, r/o) : the creation date of the note - modification date (date, r/o) : the modification date of the note - password protected (boolean, r/o) : Is the note password protected? - shared (boolean, r/o) : Is the note shared? I'm going to ignore the concept of attachments for the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1616354999,First working version, https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461234591,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2,1461234591,IC_kwDOJHON9s5XGKuf,9599,simonw,2023-03-09T03:56:45Z,2023-03-09T03:56:45Z,MEMBER,"My prototype showed that images embedded in notes come out in the HTML export as bas64 image URLs, which is neat.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1616354999,First working version, https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461259490,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2,1461259490,IC_kwDOJHON9s5XGQzi,9599,simonw,2023-03-09T04:24:27Z,2023-03-09T04:24:27Z,MEMBER,"Converting AppleScript date strings to ISO format is hard! https://forum.latenightsw.com/t/formatting-dates/841 has a recipe I'll try: set todayISO to (todayDate as «class isot» as string) Not clear to me how timezones work here. 
I'm going to ignore them for the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1616354999,First working version, https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461260978,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2,1461260978,IC_kwDOJHON9s5XGRKy,9599,simonw,2023-03-09T04:27:18Z,2023-03-09T04:27:18Z,MEMBER,"Before that conversion: Monday, March 6, 2023 at 11:55:15 AM After: 2023-03-06T11:55:15","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1616354999,First working version, https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461262577,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2,1461262577,IC_kwDOJHON9s5XGRjx,9599,simonw,2023-03-09T04:30:00Z,2023-03-09T04:30:00Z,MEMBER,It doesn't have tests yet. I guess I'll need to mock `subprocess` to test this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1616354999,First working version, https://github.com/dogsheep/apple-notes-to-sqlite/issues/2#issuecomment-1461285545,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2,1461285545,IC_kwDOJHON9s5XGXKp,9599,simonw,2023-03-09T05:06:24Z,2023-03-09T05:06:24Z,MEMBER,"OK, this works!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1616354999,First working version, https://github.com/dogsheep/apple-notes-to-sqlite/issues/4#issuecomment-1462554175,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/4,1462554175,IC_kwDOJHON9s5XLM4_,9599,simonw,2023-03-09T18:19:34Z,2023-03-09T18:19:34Z,MEMBER,It looks like the iteration order is most-recently-modified-first - I tried editing a note a bit further back in my notes app and it was the first one output by `apple-notes-to-sqlite --dump`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1616429236,Support incremental updates, https://github.com/dogsheep/apple-notes-to-sqlite/issues/4#issuecomment-1462556829,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/4,1462556829,IC_kwDOJHON9s5XLNid,9599,simonw,2023-03-09T18:20:56Z,2023-03-09T18:20:56Z,MEMBER,"In terms of the UI: I'm tempted to say that the default behaviour is for it to run until it sees a note that it already knows about AND that has matching update/created dates, and then stop. You can do a full import again ignoring that logic with `apple-notes-to-sqlite notes.db --full`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1616429236,Support incremental updates, https://github.com/dogsheep/apple-notes-to-sqlite/issues/7#issuecomment-1462562735,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7,1462562735,IC_kwDOJHON9s5XLO-v,9599,simonw,2023-03-09T18:23:56Z,2023-03-09T18:25:22Z,MEMBER,"From the Script Editor library docs: A note has a: > - `container` (folder), r/o) : the folder of the note Here's what a folder looks like: > folder n : a folder containing notes > elements: > > - contains folders, notes; contained by application, accounts, folders. 
> > properties: > > - `name` (text) : the name of the folder > - `id` (text, r/o) : the unique identifier of the folder > - `shared` (boolean, r/o) : Is the folder shared? > - `container` (account or folder, r/o) : the container of the folder ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1617769847,Folder support, https://github.com/dogsheep/apple-notes-to-sqlite/issues/7#issuecomment-1462564717,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7,1462564717,IC_kwDOJHON9s5XLPdt,9599,simonw,2023-03-09T18:25:39Z,2023-03-09T18:25:39Z,MEMBER,So it looks like folders can be hierarchical?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1617769847,Folder support, https://github.com/dogsheep/apple-notes-to-sqlite/issues/7#issuecomment-1462570187,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7,1462570187,IC_kwDOJHON9s5XLQzL,9599,simonw,2023-03-09T18:30:24Z,2023-03-09T18:30:24Z,MEMBER,"I used ChatGPT to write this: ``` osascript -e 'tell application ""Notes"" set allFolders to folders repeat with aFolder in allFolders set folderId to id of aFolder set folderName to name of aFolder set folderContainer to container of aFolder set folderContainerName to name of folderContainer log ""Folder ID: "" & folderId log ""Folder Name: "" & folderName log ""Folder Container: "" & folderContainerName log "" "" --check for nested folders if count of folders of aFolder > 0 then set nestedFolders to folders of aFolder repeat with aNestedFolder in nestedFolders set nestedFolderId to id of aNestedFolder set nestedFolderName to name of aNestedFolder set nestedFolderContainer to container of aNestedFolder set nestedFolderContainerName to name of nestedFolderContainer log "" Nested Folder ID: "" & nestedFolderId log "" Nested Folder Name: "" & nestedFolderName log "" Nested Folder Container: "" & nestedFolderContainerName log "" "" end repeat end if end repeat end tell ' ``` Which for my account output this: ``` Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p6113 Folder Name: Blog posts Folder Container: iCloud Nested Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p7995 Nested Folder Name: Nested inside blog posts Nested Folder Container: Blog posts Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p698 Folder Name: JSK Folder Container: iCloud Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p7995 Folder Name: Nested inside blog posts Folder Container: Blog posts Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p3526 Folder Name: New Folder Folder Container: iCloud Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p3839 Folder Name: New Folder 1 Folder Container: iCloud Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p2 Folder Name: Notes Folder Container: iCloud Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p6059 Folder Name: Quick Notes Folder Container: iCloud Folder ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p7283 Folder Name: UK Christmas 2022 Folder Container: iCloud ``` So I think the correct approach here is to run code at the start to list all of the folders (no need to do fancy recursion though, just a flat list with the parent containers is enough) and create a model of that hierarchy in SQLite. 
Then when I import notes I can foreign key reference them back to their containing folder. I'm tempted to use `rowid` for the foreign keys because the official IDs are pretty long.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1617769847,Folder support, https://github.com/dogsheep/apple-notes-to-sqlite/issues/7#issuecomment-1462682795,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7,1462682795,IC_kwDOJHON9s5XLsSr,9599,simonw,2023-03-09T19:52:20Z,2023-03-09T19:52:44Z,MEMBER,"Created through several rounds with ChatGPT (including hints like ""rewrite that using setdefault()""): ```python def topological_sort(nodes): children = {} for node in nodes: parent_id = node[""parent""] if parent_id is not None: children.setdefault(parent_id, []).append(node) def traverse(node, result): result.append(node) if node[""id""] in children: for child in children[node[""id""]]: traverse(child, result) sorted_data = [] for node in nodes: if node[""parent""] is None: traverse(node, sorted_data) return sorted_data ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1617769847,Folder support, https://github.com/dogsheep/apple-notes-to-sqlite/issues/7#issuecomment-1462693867,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7,1462693867,IC_kwDOJHON9s5XLu_r,9599,simonw,2023-03-09T20:01:39Z,2023-03-09T20:02:11Z,MEMBER,"My `folders` table will have: - `id` - rowid - `long_id` - that long unique string ID - `name` - the name - `parent` - foreign key to `id`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1617769847,Folder support, https://github.com/dogsheep/apple-notes-to-sqlite/issues/7#issuecomment-1462691466,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7,1462691466,IC_kwDOJHON9s5XLuaK,9599,simonw,2023-03-09T19:59:52Z,2023-03-09T19:59:52Z,MEMBER,"Improved script: ```zsh osascript -e 'tell application ""Notes"" set allFolders to folders repeat with aFolder in allFolders set folderId to id of aFolder set folderName to name of aFolder set folderContainer to container of aFolder if class of folderContainer is folder then set folderContainerId to id of folderContainer else set folderContainerId to """" end if log ""ID: "" & folderId log ""Name: "" & folderName log ""Container: "" & folderContainerId log "" "" end repeat end tell ' ``` ``` ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p6113 Name: Blog posts Container: ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p698 Name: JSK Container: ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p7995 Name: Nested inside blog posts Container: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p6113 ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p3526 Name: New Folder Container: ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p3839 Name: New Folder 1 Container: ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p2 Name: Notes Container: ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p6059 Name: Quick Notes Container: ID: x-coredata://D2D50498-BBD1-4097-B122-D15ABD32BDEC/ICFolder/p7283 Name: UK Christmas 2022 Container: ``` I filtered out things where the parent was an account and not a folder using `if class of folderContainer is folder 
then`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1617769847,Folder support, https://github.com/dogsheep/apple-notes-to-sqlite/issues/11#issuecomment-1462962682,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11,1462962682,IC_kwDOJHON9s5XMwn6,9599,simonw,2023-03-09T23:20:35Z,2023-03-09T23:22:41Z,MEMBER,"Here's a query that returns all notes in folder 1, including notes in descendant folders: ```sql with recursive nested_folders(folder_id, descendant_folder_id) as ( -- base case: select all immediate children of the root folder select id, id from folders where parent is null union all -- recursive case: select all children of the previous level of nested folders select nf.folder_id, f.id from nested_folders nf join folders f on nf.descendant_folder_id = f.parent ) -- Find notes within all descendants of folder 1 select * from notes where folder in ( select descendant_folder_id from nested_folders where folder_id = 1 ); ``` With assistance from ChatGPT. Prompts were: ``` SQLite schema: CREATE TABLE [folders] ( [id] INTEGER PRIMARY KEY, [long_id] TEXT, [name] TEXT, [parent] INTEGER, FOREIGN KEY([parent]) REFERENCES [folders]([id]) ); Write a recursive CTE that returns the following: folder_id | descendant_folder_id With a row for every nested child of every folder - so the top level folder has lots of rows ``` Then I tweaked it a bit, then ran this: ``` WITH RECURSIVE nested_folders(folder_id, descendant_folder_id) AS ( -- base case: select all immediate children of the root folder SELECT id, id FROM folders WHERE parent IS NULL UNION ALL -- recursive case: select all children of the previous level of nested folders SELECT nf.folder_id, f.id FROM nested_folders nf JOIN folders f ON nf.descendant_folder_id = f.parent ) -- select all rows from the recursive CTE SELECT * from notes where folder in (select descendant_folder_id FROM nested_folders where folder_id = 1) Convert all SQL keywords to lower case, and re-indent ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1618130434,Implement a SQL view to make it easier to query files in a nested folder, https://github.com/dogsheep/apple-notes-to-sqlite/issues/11#issuecomment-1462968053,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11,1462968053,IC_kwDOJHON9s5XMx71,9599,simonw,2023-03-09T23:24:01Z,2023-03-09T23:24:01Z,MEMBER,"I improved the readability by removing some unnecessary table aliases: ```sql with recursive nested_folders(folder_id, descendant_folder_id) as ( -- base case: select all immediate children of the root folder select id, id from folders where parent is null union all -- recursive case: select all children of the previous level of nested folders select nested_folders.folder_id, folders.id from nested_folders join folders on nested_folders.descendant_folder_id = folders.parent ) -- Find notes within all descendants of folder 1 select * from notes where folder in ( select descendant_folder_id from nested_folders where folder_id = 1 ); ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1618130434,Implement a SQL view to make it easier to query files in a nested folder, 
https://github.com/dogsheep/apple-notes-to-sqlite/issues/11#issuecomment-1462965256,https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11,1462965256,IC_kwDOJHON9s5XMxQI,9599,simonw,2023-03-09T23:22:12Z,2023-03-09T23:22:12Z,MEMBER,"Here's what the CTE from that looks like: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1618130434,Implement a SQL view to make it easier to query files in a nested folder, https://github.com/dogsheep/twitter-to-sqlite/issues/4#issuecomment-540879620,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4,540879620,MDEyOklzc3VlQ29tbWVudDU0MDg3OTYyMA==,9599,simonw,2019-10-11T02:59:16Z,2019-10-11T02:59:16Z,MEMBER,Also import ad preferences and all that other junk.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488835586,Command for importing data from a Twitter Export file, https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541141169,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18,541141169,MDEyOklzc3VlQ29tbWVudDU0MTE0MTE2OQ==,9599,simonw,2019-10-11T16:51:29Z,2019-10-11T16:51:29Z,MEMBER,Documented here: https://github.com/dogsheep/twitter-to-sqlite/blob/master/README.md#retrieving-tweets-from-your-home-timeline,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",505928530,Command to import home-timeline, https://github.com/dogsheep/twitter-to-sqlite/issues/17#issuecomment-541112108,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17,541112108,MDEyOklzc3VlQ29tbWVudDU0MTExMjEwOA==,9599,simonw,2019-10-11T15:30:15Z,2019-10-11T15:30:15Z,MEMBER,It should delete the tables entirely. That way it will work even if the table schema has changed.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",505674949,import command should empty all archive-* tables first, https://github.com/dogsheep/twitter-to-sqlite/issues/17#issuecomment-541112588,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17,541112588,MDEyOklzc3VlQ29tbWVudDU0MTExMjU4OA==,9599,simonw,2019-10-11T15:31:30Z,2019-10-11T15:31:30Z,MEMBER,"No need for an option: > This command will delete and recreate all of your `archive-*` tables every time you run it. 
If this is not what you want, run the command against a fresh SQLite database rather than running it again one that already exists.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",505674949,import command should empty all archive-* tables first, https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541118773,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18,541118773,MDEyOklzc3VlQ29tbWVudDU0MTExODc3Mw==,9599,simonw,2019-10-11T15:48:31Z,2019-10-11T15:48:31Z,MEMBER,https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-home_timeline,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",505928530,Command to import home-timeline, https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541118934,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18,541118934,MDEyOklzc3VlQ29tbWVudDU0MTExODkzNA==,9599,simonw,2019-10-11T15:48:54Z,2019-10-11T15:48:54Z,MEMBER,Rate limit is tight: 15 requests every 15 mins!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",505928530,Command to import home-timeline, https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541119834,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18,541119834,MDEyOklzc3VlQ29tbWVudDU0MTExOTgzNA==,9599,simonw,2019-10-11T15:51:22Z,2019-10-11T16:51:33Z,MEMBER,"In order to support multiple user timelines being saved in the same database, I'm going to import the tweets into the `tweets` table AND add a new `timeline_tweets` table recording that a specific tweet showed up in a specific user's timeline.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",505928530,Command to import home-timeline, https://github.com/dogsheep/twitter-to-sqlite/issues/19#issuecomment-541248629,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19,541248629,MDEyOklzc3VlQ29tbWVudDU0MTI0ODYyOQ==,9599,simonw,2019-10-11T22:48:56Z,2019-10-11T22:48:56Z,MEMBER,"`since_id` documented here: https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-home_timeline > Returns results with an ID greater than (that is, more recent than) the specified ID. There are limits to the number of Tweets which can be accessed through the API. 
If the limit of Tweets has occured since the since_id, the since_id will be forced to the oldest ID available.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",506087267,since_id support for home-timeline, https://github.com/dogsheep/github-to-sqlite/issues/6#issuecomment-541387822,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6,541387822,MDEyOklzc3VlQ29tbWVudDU0MTM4NzgyMg==,9599,simonw,2019-10-13T05:27:39Z,2019-10-13T05:27:39Z,MEMBER,This should be fixed by https://github.com/dogsheep/github-to-sqlite/commit/552543a74970f8a3a3f87f887be23a0c6eb1cb5b,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",504238461,sqlite3.OperationalError: table users has no column named bio, https://github.com/dogsheep/github-to-sqlite/issues/6#issuecomment-541387941,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6,541387941,MDEyOklzc3VlQ29tbWVudDU0MTM4Nzk0MQ==,9599,simonw,2019-10-13T05:30:19Z,2019-10-13T05:30:19Z,MEMBER,Fix released in 0.5: https://github.com/dogsheep/github-to-sqlite/releases/tag/0.5,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",504238461,sqlite3.OperationalError: table users has no column named bio, https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-541388038,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20,541388038,MDEyOklzc3VlQ29tbWVudDU0MTM4ODAzOA==,9599,simonw,2019-10-13T05:31:58Z,2019-10-13T05:31:58Z,MEMBER,For favourites a `--stop_after=200` option is probably good enough.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",506268945,--since support for various commands for refresh-by-cron, https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-541493242,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3,541493242,MDEyOklzc3VlQ29tbWVudDU0MTQ5MzI0Mg==,9599,simonw,2019-10-14T03:35:36Z,2019-10-14T03:35:36Z,MEMBER,"https://developer.twitter.com/en/docs/tweets/search/api-reference/get-search-tweets ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488833975,Command for running a search and saving tweets for that search, https://github.com/dogsheep/twitter-to-sqlite/issues/10#issuecomment-541748580,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10,541748580,MDEyOklzc3VlQ29tbWVudDU0MTc0ODU4MA==,9599,simonw,2019-10-14T15:30:44Z,2019-10-14T15:30:44Z,MEMBER,Had several recommendations for https://github.com/tqdm/tqdm which is what goodreads-to-sqlite uses.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",492297930,Rethink progress bars for various commands, https://github.com/dogsheep/github-to-sqlite/issues/7#issuecomment-541721437,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/7,541721437,MDEyOklzc3VlQ29tbWVudDU0MTcyMTQzNw==,9599,simonw,2019-10-14T14:44:12Z,2019-10-14T14:44:12Z,MEMBER,Docs: https://github.com/dogsheep/github-to-sqlite/blob/0.5/README.md#retrieving-issue-comments-for-a-repository,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",506276893,issue-comments command for importing 
issue comments, https://github.com/dogsheep/twitter-to-sqlite/issues/21#issuecomment-542333836,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/21,542333836,MDEyOklzc3VlQ29tbWVudDU0MjMzMzgzNg==,9599,simonw,2019-10-15T18:00:48Z,2019-10-15T18:00:48Z,MEMBER,I'll use `html.unescape()` for this: https://docs.python.org/3/library/html.html#html.unescape,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",506432572,Fix & escapes in tweet text, https://github.com/dogsheep/twitter-to-sqlite/issues/19#issuecomment-542849963,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19,542849963,MDEyOklzc3VlQ29tbWVudDU0Mjg0OTk2Mw==,9599,simonw,2019-10-16T19:13:06Z,2019-10-16T19:13:06Z,MEMBER,Updated documentation: https://github.com/dogsheep/twitter-to-sqlite/blob/fced2a9b67d2cbdf9817f1eb75f7c28e413c963b/README.md#retrieving-tweets-from-your-home-timeline,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",506087267,since_id support for home-timeline, https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-542854749,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20,542854749,MDEyOklzc3VlQ29tbWVudDU0Mjg1NDc0OQ==,9599,simonw,2019-10-16T19:26:01Z,2019-10-16T19:26:01Z,MEMBER,"I'm not going to do this for ""accounts that have followed me"" and ""new accounts that I have followed"" - instead I will recommend running the `friend_ids` and `followers_ids` commands on a daily basis since that data doesn't really change much by the hour. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",506268945,--since support for various commands for refresh-by-cron, https://github.com/dogsheep/twitter-to-sqlite/issues/12#issuecomment-542855081,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12,542855081,MDEyOklzc3VlQ29tbWVudDU0Mjg1NTA4MQ==,9599,simonw,2019-10-16T19:26:56Z,2019-10-16T19:26:56Z,MEMBER,This may be the first case where I want to be able to repair existing databases rather than discarding their contents.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",503053800,"Extract ""source"" into a separate lookup table", https://github.com/dogsheep/twitter-to-sqlite/issues/12#issuecomment-542855427,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12,542855427,MDEyOklzc3VlQ29tbWVudDU0Mjg1NTQyNw==,9599,simonw,2019-10-16T19:27:55Z,2019-10-16T19:27:55Z,MEMBER,I can do that by keeping `source` as a `TEXT` column but turning it into a non-enforced foreign key against a new `sources` table. Then I can run code that scans that column for any values beginning with a `<` and converts them.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",503053800,"Extract ""source"" into a separate lookup table", https://github.com/dogsheep/twitter-to-sqlite/issues/12#issuecomment-542858025,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12,542858025,MDEyOklzc3VlQ29tbWVudDU0Mjg1ODAyNQ==,9599,simonw,2019-10-16T19:35:31Z,2019-10-16T19:36:09Z,MEMBER,Maybe this means I need an `upgrade` command to apply these kinds of migrations? 
Total feature creep!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",503053800,"Extract ""source"" into a separate lookup table", https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-542875885,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3,542875885,MDEyOklzc3VlQ29tbWVudDU0Mjg3NTg4NQ==,9599,simonw,2019-10-16T20:23:08Z,2019-10-16T20:23:08Z,MEMBER,"https://developer.foursquare.com/docs/api/users/checkins documents `afterTimestamp`: > Retrieve the first results to follow these seconds since epoch. This should be useful for paging forward in time, or when polling for changes. To avoid missing results when polling, we recommend subtracting several seconds from the last poll time and then de-duplicating.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",487600595,Option to fetch only checkins more recent than the current max checkin, https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-542876047,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3,542876047,MDEyOklzc3VlQ29tbWVudDU0Mjg3NjA0Nw==,9599,simonw,2019-10-16T20:23:36Z,2019-10-16T20:23:36Z,MEMBER,I'm going to go with `--since=1d/2w/3h` for this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",487600595,Option to fetch only checkins more recent than the current max checkin, https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-542882604,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3,542882604,MDEyOklzc3VlQ29tbWVudDU0Mjg4MjYwNA==,9599,simonw,2019-10-16T20:41:23Z,2019-10-16T20:41:23Z,MEMBER,Documented here: https://github.com/dogsheep/swarm-to-sqlite/blob/0.2/README.md#usage,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",487600595,Option to fetch only checkins more recent than the current max checkin, https://github.com/dogsheep/twitter-to-sqlite/issues/19#issuecomment-542832952,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19,542832952,MDEyOklzc3VlQ29tbWVudDU0MjgzMjk1Mg==,9599,simonw,2019-10-16T18:30:11Z,2019-10-16T18:30:11Z,MEMBER,"The `--since` option will derive the `since_id` from the max ID in the `timeline_tweets` table: $ twitter-to-sqlite home-timeline --since The `--since_id=xxx` option lets you specify that ID directly.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",506087267,since_id support for home-timeline, https://github.com/dogsheep/twitter-to-sqlite/issues/25#issuecomment-543265058,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25,543265058,MDEyOklzc3VlQ29tbWVudDU0MzI2NTA1OA==,9599,simonw,2019-10-17T16:51:12Z,2019-10-17T16:51:12Z,MEMBER,"This migration function only runs if there is a table called `tweets` and the migration has not run before. 
I think this can happen if the database has just been freshly created (by a command that fetches the user's user timeline for example) and is then run a SECOND time.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",508578780,Ensure migrations don't accidentally create foreign key twice, https://github.com/dogsheep/twitter-to-sqlite/issues/25#issuecomment-543266947,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25,543266947,MDEyOklzc3VlQ29tbWVudDU0MzI2Njk0Nw==,9599,simonw,2019-10-17T16:56:06Z,2019-10-17T16:56:06Z,MEMBER,I wrote a test that proves that this is a problem. Should be an easy fix though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",508578780,Ensure migrations don't accidentally create foreign key twice, https://github.com/dogsheep/twitter-to-sqlite/issues/10#issuecomment-543269396,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10,543269396,MDEyOklzc3VlQ29tbWVudDU0MzI2OTM5Ng==,9599,simonw,2019-10-17T17:02:07Z,2019-10-17T17:02:07Z,MEMBER,A neat trick that Click does is detecting if an interactive terminal is attached and NOT showing a progress bar if there isn't one. Need to figure out how to do that with tqdm.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",492297930,Rethink progress bars for various commands, https://github.com/dogsheep/twitter-to-sqlite/issues/10#issuecomment-543270714,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10,543270714,MDEyOklzc3VlQ29tbWVudDU0MzI3MDcxNA==,9599,simonw,2019-10-17T17:05:16Z,2019-10-17T17:05:16Z,MEMBER,"https://github.com/pallets/click/blob/716a5be90f56ce6cd506bb53d5739d09374b1636/click/_termui_impl.py#L93 is how Click does this: ``` self.is_hidden = not isatty(self.file) ``` Where `isatty` is a Click utility function: `from ._compat import isatty`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",492297930,Rethink progress bars for various commands, https://github.com/dogsheep/twitter-to-sqlite/issues/10#issuecomment-543271000,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10,543271000,MDEyOklzc3VlQ29tbWVudDU0MzI3MTAwMA==,9599,simonw,2019-10-17T17:05:59Z,2019-10-17T17:05:59Z,MEMBER,Looks like tqdm already does a TTY check here: https://github.com/tqdm/tqdm/blob/89b73bdc30c099c5b53725806e7edf3a121c9b3a/tqdm/std.py#L889-L890,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",492297930,Rethink progress bars for various commands, https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-543273540,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3,543273540,MDEyOklzc3VlQ29tbWVudDU0MzI3MzU0MA==,9599,simonw,2019-10-17T17:12:51Z,2019-10-17T17:12:51Z,MEMBER,"Just importing tweets here isn't enough - how are we supposed to know which tweets were imported by which search? So I think the right thing to do here is to also create a `search_runs` table, which records each individual run of this tool (with a timestamp and the search terms used). 
Then have a `search_runs_tweets` m2m table which shows which Tweets were found by that search.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488833975,Command for running a search and saving tweets for that search, https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-543290744,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3,543290744,MDEyOklzc3VlQ29tbWVudDU0MzI5MDc0NA==,9599,simonw,2019-10-17T17:57:14Z,2019-10-17T17:57:14Z,MEMBER,I have a working command now. I'm going to ship it early because it could do with some other people trying it out.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488833975,Command for running a search and saving tweets for that search, https://github.com/dogsheep/twitter-to-sqlite/issues/23#issuecomment-543217890,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/23,543217890,MDEyOklzc3VlQ29tbWVudDU0MzIxNzg5MA==,9599,simonw,2019-10-17T15:03:10Z,2019-10-17T15:03:10Z,MEMBER,"Thinking about this further: the concept of migrations may end up being in direct conflict with the `sqlite-utils` concept of creating tables on demand the first time they are used - and of creating table schemas automatically to fit the shape of the JSON that is being inserted into them. I'm going to forge ahead anyway and build this because I think it will be an interesting exploration, but it's very likely this turns out to be a bad idea in the long run!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",508190730,Extremely simple migration system, https://github.com/dogsheep/twitter-to-sqlite/issues/23#issuecomment-543222239,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/23,543222239,MDEyOklzc3VlQ29tbWVudDU0MzIyMjIzOQ==,9599,simonw,2019-10-17T15:12:33Z,2019-10-17T15:12:33Z,MEMBER,"Migrations will run only if you open a database that previously existed (as opposed to opening a brand new empty database). This means that the first time you run a command against a fresh database, migrations will not run and the `migrations` table will not be created. The _second_ time you run any command against that database the migrations will execute and populate the `migrations` table. This also means that each migration needs to be able to sanity check the database to see if it should run or not. 
If it should NOT run, it will do nothing but still be marked as having executed by adding to the `migrations` table.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",508190730,Extremely simple migration system, https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-544335363,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20,544335363,MDEyOklzc3VlQ29tbWVudDU0NDMzNTM2Mw==,9599,simonw,2019-10-21T03:32:04Z,2019-10-21T03:32:04Z,MEMBER,"In case anyone is interested, here's an extract from the crontab I'm running these under at the moment: ``` 1,11,21,31,41,51 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite user-timeline /home/ubuntu/twitter.db -a /home/ubuntu/auth.json --since 2,7,12,17,22,27,32,37,42,47,52,57 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite home-timeline /home/ubuntu/timeline.db -a /home/ubuntu/auth.json --since 6,16,26,36,46,56 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite favorites /home/ubuntu/twitter.db -a /home/ubuntu/auth.json --stop_after=50 ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",506268945,--since support for various commands for refresh-by-cron, https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-544646516,https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1,544646516,MDEyOklzc3VlQ29tbWVudDU0NDY0NjUxNg==,9599,simonw,2019-10-21T18:30:14Z,2019-10-21T18:30:14Z,MEMBER,"Thanks to help from Dr. Laura Cantino at Science Hack Day San Francisco I've been able to pull together this query: ```sql select rsid, genotype, case genotype when 'AA' then 'brown eye color, 80% of the time' when 'AG' then 'brown eye color' when 'GG' then 'blue eye color, 99% of the time' end as interpretation from genome where rsid = 'rs12913832' ``` See also https://www.snpedia.com/index.php/Rs12913832 - in particular this table: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",496415321,Figure out some interesting example SQL queries, https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-544648863,https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1,544648863,MDEyOklzc3VlQ29tbWVudDU0NDY0ODg2Mw==,9599,simonw,2019-10-21T18:36:03Z,2019-10-21T18:36:03Z,MEMBER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",496415321,Figure out some interesting example SQL queries, https://github.com/dogsheep/twitter-to-sqlite/issues/26#issuecomment-547713287,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/26,547713287,MDEyOklzc3VlQ29tbWVudDU0NzcxMzI4Nw==,9599,simonw,2019-10-30T02:36:13Z,2019-10-30T02:36:13Z,MEMBER,"Shipped this in 0.13: https://github.com/dogsheep/twitter-to-sqlite/releases/tag/0.13 See also this Twitter thread: https://twitter.com/simonw/status/1189369677509623809","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",513074501,Command for importing mentions timeline, https://github.com/dogsheep/github-to-sqlite/pull/8#issuecomment-549094195,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8,549094195,MDEyOklzc3VlQ29tbWVudDU0OTA5NDE5NQ==,9599,simonw,2019-11-03T00:43:16Z,2019-11-03T00:43:28Z,MEMBER,"Also need to take #5 into account - if this 
command creates incomplete user records, how do we repair them? And make sure that if we run this command first any future commands that populate users don't break (probably just a case of using `alter=True` in a few places).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",516763727,"stargazers command, refs #4", https://github.com/dogsheep/github-to-sqlite/issues/5#issuecomment-549094229,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/5,549094229,MDEyOklzc3VlQ29tbWVudDU0OTA5NDIyOQ==,9599,simonw,2019-11-03T00:44:03Z,2019-11-03T00:44:03Z,MEMBER,Might not need an incomplete boolean - may be possible to handle this with `alter=True` and then by filtering for users with null values in certain columns.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",493671014,"Add ""incomplete"" boolean to users table for incomplete profiles", https://github.com/dogsheep/twitter-to-sqlite/issues/27#issuecomment-549095217,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27,549095217,MDEyOklzc3VlQ29tbWVudDU0OTA5NTIxNw==,9599,simonw,2019-11-03T01:06:25Z,2019-11-03T01:06:25Z,MEMBER,"Wow, that `retweets_of_me` endpoint is almost completely useless: ``` $ twitter-to-sqlite fetch https://api.twitter.com/1.1/statuses/retweets_of_me.json ``` It returns my own tweets that have been retweeted, but with no indication at all of who retweeted them. It looks like this needs to be combined with this API - https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-retweets-id - to fetch the details of up to 100 recent users who actually DID retweet an individual status. But that has a one-every-12-seconds rate limit on it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",514459062,retweets-of-me command, https://github.com/dogsheep/twitter-to-sqlite/issues/27#issuecomment-549095317,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27,549095317,MDEyOklzc3VlQ29tbWVudDU0OTA5NTMxNw==,9599,simonw,2019-11-03T01:08:10Z,2019-11-03T01:08:10Z,MEMBER,"Hmm... one thing that could be useful is that `retweets_of_me` can support a `--since` parameter - so if run frequently it should hopefully let us know which tweets we would need to run `statuses/retweets/:id.json` against. I'm not sure if the `--since` parameter would show me a tweet that was previously retweeted but has now been retweeted again. I'll have a bit of a test and see.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",514459062,retweets-of-me command, https://github.com/dogsheep/twitter-to-sqlite/issues/27#issuecomment-549095463,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27,549095463,MDEyOklzc3VlQ29tbWVudDU0OTA5NTQ2Mw==,9599,simonw,2019-11-03T01:10:52Z,2019-11-03T01:10:52Z,MEMBER,"I imagine it won't, since the data I would be recording and then passing to `since_id` would be the highest ID of my own tweets that have been retweeted at least once. 
So it won't be able to spot if I should check for fresh retweets of a given tweet.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",514459062,retweets-of-me command, https://github.com/dogsheep/twitter-to-sqlite/issues/27#issuecomment-549095641,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27,549095641,MDEyOklzc3VlQ29tbWVudDU0OTA5NTY0MQ==,9599,simonw,2019-11-03T01:12:58Z,2019-11-03T01:12:58Z,MEMBER,"It looks like Twitter really want you to subscribe to a premium API for this kind of thing and consume retweets via webhooks: https://developer.twitter.com/en/docs/accounts-and-users/subscribe-account-activity/api-reference I'm going to give up on this for the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",514459062,retweets-of-me command, https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-549096321,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3,549096321,MDEyOklzc3VlQ29tbWVudDU0OTA5NjMyMQ==,9599,simonw,2019-11-03T01:27:55Z,2019-11-03T01:28:17Z,MEMBER,"It would be neat if this could support `--since`, with that argument automatically finding the maximum tweet ID from a previous search that used the same exact arguments (using the `search_runs` table).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488833975,Command for running a search and saving tweets for that search, https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-549226399,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3,549226399,MDEyOklzc3VlQ29tbWVudDU0OTIyNjM5OQ==,9599,simonw,2019-11-04T05:11:57Z,2019-11-04T05:11:57Z,MEMBER,I'm going to add a `hash` column to `search_runs` to support that. It's going to be the sha1 hash of the key-ordered JSON of the search arguments used by that run. 
Then `--since` can look for an identical hash and use it to identify the highest last fetched tweet to use in `since_id`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488833975,Command for running a search and saving tweets for that search, https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-549228535,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3,549228535,MDEyOklzc3VlQ29tbWVudDU0OTIyODUzNQ==,9599,simonw,2019-11-04T05:31:55Z,2019-11-04T05:31:55Z,MEMBER,Documented here: https://github.com/dogsheep/twitter-to-sqlite/blob/801c0c2daf17d8abce9dcb5d8d610410e7e25dbe/README.md#running-searches,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488833975,Command for running a search and saving tweets for that search, https://github.com/dogsheep/github-to-sqlite/issues/10#issuecomment-549230337,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/10,549230337,MDEyOklzc3VlQ29tbWVudDU0OTIzMDMzNw==,9599,simonw,2019-11-04T05:47:18Z,2019-11-04T05:47:18Z,MEMBER,This definition isn't quite right - it's not pulling the identity of the user who starred the repo (`users.login` ends up being the owner login instead).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",516967682,Add this repos_starred view, https://github.com/dogsheep/github-to-sqlite/pull/8#issuecomment-549230583,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8,549230583,MDEyOklzc3VlQ29tbWVudDU0OTIzMDU4Mw==,9599,simonw,2019-11-04T05:49:26Z,2019-11-04T05:49:26Z,MEMBER,Adding the view from #10 would be useful here too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",516763727,"stargazers command, refs #4", https://github.com/dogsheep/github-to-sqlite/pull/8#issuecomment-549233778,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8,549233778,MDEyOklzc3VlQ29tbWVudDU0OTIzMzc3OA==,9599,simonw,2019-11-04T06:14:40Z,2019-11-04T06:14:40Z,MEMBER,"Spotted a tricky problem: running `github-to-sqlite starred stargazers.db` results in an incomplete `simonw` record. It creates a proper record for me thanks to this bit: https://github.com/dogsheep/github-to-sqlite/blob/ea07274667a08c67907e8bfbbccb6f0fb95ce817/github_to_sqlite/cli.py#L120-L126 But then... when it gets to the `datasette` repository which I have starred it over-writes my full user record with one that's missing most of the details, thanks to this bit: https://github.com/dogsheep/github-to-sqlite/blob/ea07274667a08c67907e8bfbbccb6f0fb95ce817/github_to_sqlite/utils.py#L117-L124 I need to find a way of NOT over-writing a good record with a thinner one. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",516763727,"stargazers command, refs #4", https://github.com/dogsheep/github-to-sqlite/issues/4#issuecomment-550388354,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4,550388354,MDEyOklzc3VlQ29tbWVudDU1MDM4ODM1NA==,9599,simonw,2019-11-06T16:26:55Z,2019-11-06T16:26:55Z,MEMBER,"Here's a query I figured out using a window function that shows cumulative stargazers over time: ```sql select yyyymmdd, sum(n) over ( order by yyyymmdd rows unbounded preceding ) as cumulative_count from ( select substr(starred_at, 0, 11) as yyyymmdd, count(*) as n from stars group by yyyymmdd ) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",493670730,Command to fetch stargazers for one or more repos, https://github.com/dogsheep/healthkit-to-sqlite/issues/10#issuecomment-550783316,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10,550783316,MDEyOklzc3VlQ29tbWVudDU1MDc4MzMxNg==,9599,simonw,2019-11-07T05:16:56Z,2019-11-07T05:34:29Z,MEMBER,"It looks like Apple changed the location of these in iOS 13 - they are now in separate `.gpx` files: ![2FF70E95-CDEE-4241-A5C5-EE95A862E519](https://user-images.githubusercontent.com/9599/68362042-be12e000-00da-11ea-8925-7397410332d8.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",519038979,Failed to import workout points, https://github.com/dogsheep/healthkit-to-sqlite/issues/10#issuecomment-550806302,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10,550806302,MDEyOklzc3VlQ29tbWVudDU1MDgwNjMwMg==,9599,simonw,2019-11-07T05:33:31Z,2019-11-07T05:33:31Z,MEMBER,"The XML now includes references to these new files: ![CBBA54FC-51FB-4BB3-927C-C2CA99237B04](https://user-images.githubusercontent.com/9599/68362716-121ec400-00dd-11ea-9846-387c7cd64c8b.jpeg) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",519038979,Failed to import workout points, https://github.com/dogsheep/healthkit-to-sqlite/issues/10#issuecomment-550824838,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10,550824838,MDEyOklzc3VlQ29tbWVudDU1MDgyNDgzOA==,9599,simonw,2019-11-07T05:47:07Z,2019-11-07T05:47:07Z,MEMBER,"Relevant code: https://github.com/dogsheep/healthkit-to-sqlite/blob/d16f45f06fbae6ec8a78cc9ca7b5b7db0413f139/healthkit_to_sqlite/utils.py#L58-L64","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",519038979,Failed to import workout points, https://github.com/dogsheep/healthkit-to-sqlite/issues/10#issuecomment-550828084,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10,550828084,MDEyOklzc3VlQ29tbWVudDU1MDgyODA4NA==,9599,simonw,2019-11-07T05:49:24Z,2019-11-07T05:49:24Z,MEMBER,"So the fix there is going to be to detect the new `FileReference` element and load the corresponding points data from it. This will be a little tricky because that function will need access to the zip file. 
It probably won't work at all for the mode where the `export.xml` file is passed directly using the `--xml` option.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",519038979,Failed to import workout points, https://github.com/dogsheep/twitter-to-sqlite/issues/29#issuecomment-552129686,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29,552129686,MDEyOklzc3VlQ29tbWVudDU1MjEyOTY4Ng==,9599,simonw,2019-11-09T19:27:39Z,2019-11-09T19:27:39Z,MEMBER,I think this is fixed by the latest version of `sqlite-utils` - https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-12-1 - I'll bump the dependency.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",518725064,`import` command fails on empty files, https://github.com/dogsheep/twitter-to-sqlite/issues/29#issuecomment-552129921,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29,552129921,MDEyOklzc3VlQ29tbWVudDU1MjEyOTkyMQ==,9599,simonw,2019-11-09T19:30:42Z,2019-11-09T19:30:42Z,MEMBER,"Confirmed, that seems to fix it: ``` (twitter-to-sqlite) ~/Dropbox/Development/twitter-to-sqlite $ twitter-to-sqlite import blah.db ~/Dropbox/dogsheep/twitter-2019-06-25-b31f246100821b551f2f9a23f21ac6fb565dab49dd23a35630cabbf2b94a1f03/account-suspension.js Traceback (most recent call last): File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite"", line 11, in load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 633, in import_ archive.import_from_file(db, path.name, open(path, ""rb"").read()) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/archive.py"", line 224, in import_from_file db[table_name].upsert_all(rows, hash_id=""pk"") File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1094, in upsert_all extracts=extracts, File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 961, in insert_all first_record = next(records) StopIteration (twitter-to-sqlite) ~/Dropbox/Development/twitter-to-sqlite $ pip install -U sqlite-utils Collecting sqlite-utils Using cached https://files.pythonhosted.org/packages/ee/a2/1b135010c7ac8e2d7545f659e9e6c6ede0f406f20b52e08d5817e1e31a9a/sqlite_utils-1.12.1-py3-none-any.whl Requirement already satisfied, skipping upgrade: click in 
/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages (from sqlite-utils) (7.0) Requirement already satisfied, skipping upgrade: tabulate in /Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages (from sqlite-utils) (0.8.5) Requirement already satisfied, skipping upgrade: click-default-group in /Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages (from sqlite-utils) (1.2.2) Installing collected packages: sqlite-utils Found existing installation: sqlite-utils 1.11 Uninstalling sqlite-utils-1.11: Successfully uninstalled sqlite-utils-1.11 Successfully installed sqlite-utils-1.12.1 (twitter-to-sqlite) ~/Dropbox/Development/twitter-to-sqlite $ twitter-to-sqlite import blah.db ~/Dropbox/dogsheep/twitter-2019-06-25-b31f246100821b551f2f9a23f21ac6fb565dab49dd23a35630cabbf2b94a1f03/account-suspension.js (twitter-to-sqlite) ~/Dropbox/Development/twitter-to-sqlite $ ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",518725064,`import` command fails on empty files, https://github.com/dogsheep/twitter-to-sqlite/issues/30#issuecomment-552131798,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30,552131798,MDEyOklzc3VlQ29tbWVudDU1MjEzMTc5OA==,9599,simonw,2019-11-09T19:54:45Z,2019-11-09T19:54:45Z,MEMBER,Good catch - not sure how that bug crept in. Removing line 116 looks like the right fix to me.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",518739697,`followers` fails because `transform_user` is called twice, https://github.com/dogsheep/twitter-to-sqlite/issues/29#issuecomment-552133449,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29,552133449,MDEyOklzc3VlQ29tbWVudDU1MjEzMzQ0OQ==,9599,simonw,2019-11-09T20:15:15Z,2019-11-09T20:15:15Z,MEMBER,Released: https://github.com/dogsheep/twitter-to-sqlite/releases/tag/0.15,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",518725064,`import` command fails on empty files, https://github.com/dogsheep/twitter-to-sqlite/issues/30#issuecomment-552133468,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30,552133468,MDEyOklzc3VlQ29tbWVudDU1MjEzMzQ2OA==,9599,simonw,2019-11-09T20:15:27Z,2019-11-09T20:15:27Z,MEMBER,Released: https://github.com/dogsheep/twitter-to-sqlite/releases/tag/0.15,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",518739697,`followers` fails because `transform_user` is called twice, https://github.com/dogsheep/twitter-to-sqlite/issues/28#issuecomment-552133488,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/28,552133488,MDEyOklzc3VlQ29tbWVudDU1MjEzMzQ4OA==,9599,simonw,2019-11-09T20:15:42Z,2019-11-09T20:15:42Z,MEMBER,Released: https://github.com/dogsheep/twitter-to-sqlite/releases/tag/0.15,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",515658861,Add indexes to followers table, https://github.com/dogsheep/twitter-to-sqlite/issues/31#issuecomment-552135263,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31,552135263,MDEyOklzc3VlQ29tbWVudDU1MjEzNTI2Mw==,9599,simonw,2019-11-09T20:38:35Z,2019-11-09T20:38:35Z,MEMBER,Command still needs documentation 
and a bit more testing.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",520508502,"""friends"" command (similar to ""followers"")", https://github.com/dogsheep/github-to-sqlite/issues/14#issuecomment-559883311,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14,559883311,MDEyOklzc3VlQ29tbWVudDU1OTg4MzMxMQ==,9599,simonw,2019-11-29T21:30:37Z,2019-11-29T21:30:37Z,MEMBER,"I should build the command to persist ETags and obey their polling guidelines: > Events are optimized for polling with the ""ETag"" header. If no new events have been triggered, you will see a ""304 Not Modified"" response, and your current rate limit will be untouched. There is also an ""X-Poll-Interval"" header that specifies how often (in seconds) you are allowed to poll. In times of high server load, the time may increase. Please obey the header.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",530491074,Command for importing events, https://github.com/dogsheep/github-to-sqlite/issues/14#issuecomment-559902818,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14,559902818,MDEyOklzc3VlQ29tbWVudDU1OTkwMjgxOA==,9599,simonw,2019-11-30T01:32:38Z,2019-11-30T01:32:38Z,MEMBER,"Prototype: ``` pip install sqlite-utils paginate-json paginate-json ""https://api.github.com/users/simonw/events"" | sqlite-utils insert /tmp/events.db events - --pk=id ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",530491074,Command for importing events, https://github.com/dogsheep/github-to-sqlite/issues/12#issuecomment-594151327,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/12,594151327,MDEyOklzc3VlQ29tbWVudDU5NDE1MTMyNw==,9599,simonw,2020-03-03T20:26:15Z,2020-03-03T20:32:23Z,MEMBER,"Better version (since this also includes JSON array of repository topics): ```sql CREATE VIEW recent_releases AS select repos.rowid as rowid, json_object(""label"", repos.full_name, ""href"", repos.html_url) as repo, json_object( ""href"", releases.html_url, ""label"", releases.name ) as release, substr(releases.published_at, 0, 11) as date, releases.body as body_markdown, releases.published_at, coalesce(repos.topics, '[]') as topics from releases join repos on repos.id = releases.repo order by releases.published_at desc ``` That `repos.rowid as rowid` bit is necessary because otherwise clicking on a link in facet-by-topic doesn't return any results.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",520756546,Add this view for seeing new releases, https://github.com/dogsheep/github-to-sqlite/pull/8#issuecomment-594154644,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8,594154644,MDEyOklzc3VlQ29tbWVudDU5NDE1NDY0NA==,9599,simonw,2020-03-03T20:33:57Z,2020-03-03T20:33:57Z,MEMBER,`sqlite-utils` supports proper upserts now so this problem should be easy to fix.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",516763727,"stargazers command, refs #4", 
https://github.com/dogsheep/github-to-sqlite/issues/12#issuecomment-594155249,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/12,594155249,MDEyOklzc3VlQ29tbWVudDU5NDE1NTI0OQ==,9599,simonw,2020-03-03T20:35:17Z,2020-03-03T20:35:17Z,MEMBER,"`swarm-to-sqlite` has an example of adding views here: https://github.com/dogsheep/swarm-to-sqlite/blob/f2c89dd613fb8a7f14e5267ccc2145463b996190/swarm_to_sqlite/utils.py#L141 I think that approach can be approved by first checking if the view exists, then dropping it, then recreating it. Could even try to see if the view exists and matches what we were going to set it to and do nothing if that is the case.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",520756546,Add this view for seeing new releases, https://github.com/dogsheep/github-to-sqlite/issues/17#issuecomment-597354514,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/17,597354514,MDEyOklzc3VlQ29tbWVudDU5NzM1NDUxNA==,9599,simonw,2020-03-10T22:37:45Z,2020-03-10T22:37:45Z,MEMBER,I should add an option to stop the moment you see a commit you have fetched before.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",578883725,Command for importing commits, https://github.com/dogsheep/github-to-sqlite/issues/17#issuecomment-597358364,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/17,597358364,MDEyOklzc3VlQ29tbWVudDU5NzM1ODM2NA==,9599,simonw,2020-03-10T22:50:20Z,2020-03-11T01:18:36Z,MEMBER,By default it will stop when it sees a commit that has already been stored. You will be able to over-ride that behaviour using `--all`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",578883725,Command for importing commits, https://github.com/dogsheep/healthkit-to-sqlite/issues/4#issuecomment-513440090,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/4,513440090,MDEyOklzc3VlQ29tbWVudDUxMzQ0MDA5MA==,9599,simonw,2019-07-20T06:11:50Z,2019-07-20T06:11:50Z,MEMBER,"Some examples: https://github.com/dogsheep/healthkit-to-sqlite/blob/d016e70c31cf84ba0f5ec3102546db54a51aaffb/tests/export.xml#L4-L13","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",470640505,Import Records, https://github.com/dogsheep/healthkit-to-sqlite/issues/1#issuecomment-513437463,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/1,513437463,MDEyOklzc3VlQ29tbWVudDUxMzQzNzQ2Mw==,9599,simonw,2019-07-20T05:19:59Z,2019-07-20T05:19:59Z,MEMBER,"I ran xml_analyser against the XML HealthKit `export.xml` file and got the following results: ```python { 'ActivitySummary': {'attr_counts': {'activeEnergyBurned': 980, 'activeEnergyBurnedGoal': 980, 'activeEnergyBurnedUnit': 980, 'appleExerciseTime': 980, 'appleExerciseTimeGoal': 980, 'appleStandHours': 980, 'appleStandHoursGoal': 980, 'dateComponents': 980}, 'child_counts': {}, 'count': 980, 'parent_counts': {'HealthData': 980}}, 'Correlation': {'attr_counts': {'creationDate': 1, 'endDate': 1, 'sourceName': 1, 'sourceVersion': 1, 'startDate': 1, 'type': 1}, 'child_counts': {'MetadataEntry': 1, 'Record': 2}, 'count': 1, 'parent_counts': {'HealthData': 1}}, 'ExportDate': {'attr_counts': {'value': 1}, 'child_counts': {}, 'count': 1, 'parent_counts': {'HealthData': 1}}, 'HealthData': {'attr_counts': {'locale': 1}, 
'child_counts': {'ActivitySummary': 980, 'Correlation': 1, 'ExportDate': 1, 'Me': 1, 'Record': 2672231, 'Workout': 663}, 'count': 1, 'parent_counts': {}}, 'HeartRateVariabilityMetadataList': {'attr_counts': {}, 'child_counts': {'InstantaneousBeatsPerMinute': 93653}, 'count': 2318, 'parent_counts': {'Record': 2318}}, 'InstantaneousBeatsPerMinute': {'attr_counts': {'bpm': 93653, 'time': 93653}, 'child_counts': {}, 'count': 93653, 'parent_counts': {'HeartRateVariabilityMetadataList': 93653}}, 'Location': {'attr_counts': {'altitude': 398683, 'course': 398683, 'date': 398683, 'horizontalAccuracy': 398683, 'latitude': 398683, 'longitude': 398683, 'speed': 398683, 'verticalAccuracy': 398683}, 'child_counts': {}, 'count': 398683, 'parent_counts': {'WorkoutRoute': 398683}}, 'Me': {'attr_counts': {'HKCharacteristicTypeIdentifierBiologicalSex': 1, 'HKCharacteristicTypeIdentifierBloodType': 1, 'HKCharacteristicTypeIdentifierDateOfBirth': 1, 'HKCharacteristicTypeIdentifierFitzpatrickSkinType': 1}, 'child_counts': {}, 'count': 1, 'parent_counts': {'HealthData': 1}}, 'MetadataEntry': {'attr_counts': {'key': 290449, 'value': 290449}, 'child_counts': {}, 'count': 290449, 'parent_counts': {'Correlation': 1, 'Record': 287974, 'Workout': 1928, 'WorkoutRoute': 546}}, 'Record': {'attr_counts': {'creationDate': 2672233, 'device': 2665111, 'endDate': 2672233, 'sourceName': 2672233, 'sourceVersion': 2671779, 'startDate': 2672233, 'type': 2672233, 'unit': 2650012, 'value': 2672232}, 'child_counts': {'HeartRateVariabilityMetadataList': 2318, 'MetadataEntry': 287974}, 'count': 2672233, 'parent_counts': {'Correlation': 2, 'HealthData': 2672231}}, 'Workout': {'attr_counts': {'creationDate': 663, 'device': 230, 'duration': 663, 'durationUnit': 663, 'endDate': 663, 'sourceName': 663, 'sourceVersion': 663, 'startDate': 663, 'totalDistance': 663, 'totalDistanceUnit': 663, 'totalEnergyBurned': 663, 'totalEnergyBurnedUnit': 663, 'workoutActivityType': 663}, 'child_counts': {'MetadataEntry': 1928, 'WorkoutEvent': 2094, 'WorkoutRoute': 340}, 'count': 663, 'parent_counts': {'HealthData': 663}}, 'WorkoutEvent': {'attr_counts': {'date': 2094, 'duration': 837, 'durationUnit': 837, 'type': 2094}, 'child_counts': {}, 'count': 2094, 'parent_counts': {'Workout': 2094}}, 'WorkoutRoute': {'attr_counts': {'creationDate': 340, 'endDate': 340, 'sourceName': 340, 'sourceVersion': 340, 'startDate': 340}, 'child_counts': {'Location': 398683, 'MetadataEntry': 546}, 'count': 340, 'parent_counts': {'Workout': 340}}} ``` The most interesting bit is this: ```python 'HealthData': {'attr_counts': {'locale': 1}, 'child_counts': {'ActivitySummary': 980, 'Correlation': 1, 'ExportDate': 1, 'Me': 1, 'Record': 2672231, 'Workout': 663}, ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",470637068,Use XML Analyser to figure out the structure of the export XML, https://github.com/dogsheep/healthkit-to-sqlite/issues/2#issuecomment-513439411,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/2,513439411,MDEyOklzc3VlQ29tbWVudDUxMzQzOTQxMQ==,9599,simonw,2019-07-20T05:58:57Z,2019-07-20T05:58:57Z,MEMBER,"```python 'Workout': {'attr_counts': {'creationDate': 663, 'device': 230, 'duration': 663, 'durationUnit': 663, 'endDate': 663, 'sourceName': 663, 'sourceVersion': 663, 'startDate': 663, 'totalDistance': 663, 'totalDistanceUnit': 663, 'totalEnergyBurned': 663, 'totalEnergyBurnedUnit': 663, 'workoutActivityType': 663}, 'child_counts': {'MetadataEntry': 1928, 
'WorkoutEvent': 2094, 'WorkoutRoute': 340}, 'count': 663, 'parent_counts': {'HealthData': 663}}, 'WorkoutEvent': {'attr_counts': {'date': 2094, 'duration': 837, 'durationUnit': 837, 'type': 2094}, 'child_counts': {}, 'count': 2094, 'parent_counts': {'Workout': 2094}}, 'WorkoutRoute': {'attr_counts': {'creationDate': 340, 'endDate': 340, 'sourceName': 340, 'sourceVersion': 340, 'startDate': 340}, 'child_counts': {'Location': 398683, 'MetadataEntry': 546}, 'count': 340, 'parent_counts': {'Workout': 340}}} ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",470637152,Import workouts, https://github.com/dogsheep/healthkit-to-sqlite/issues/5#issuecomment-513514978,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/5,513514978,MDEyOklzc3VlQ29tbWVudDUxMzUxNDk3OA==,9599,simonw,2019-07-21T02:55:12Z,2019-07-21T02:55:12Z,MEMBER,I'm going to show this by default. Users can pass `-s` or `--silent` to disable the progress bar.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",470691622,Add progress bar, https://github.com/dogsheep/healthkit-to-sqlite/issues/5#issuecomment-513625406,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/5,513625406,MDEyOklzc3VlQ29tbWVudDUxMzYyNTQwNg==,9599,simonw,2019-07-22T03:20:16Z,2019-07-22T03:20:16Z,MEMBER,"It now renders like this: ``` Importing from HealthKit [#-----------------------------------] 5% 00:01:33 ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",470691622,Add progress bar, https://github.com/dogsheep/healthkit-to-sqlite/issues/6#issuecomment-513626742,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/6,513626742,MDEyOklzc3VlQ29tbWVudDUxMzYyNjc0Mg==,9599,simonw,2019-07-22T03:28:55Z,2019-07-22T03:28:55Z,MEMBER,"Here's what it looks like now as separate tables: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",470856782,Break up records into different tables for each type, https://github.com/dogsheep/healthkit-to-sqlite/issues/7#issuecomment-514496725,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7,514496725,MDEyOklzc3VlQ29tbWVudDUxNDQ5NjcyNQ==,9599,simonw,2019-07-24T06:20:59Z,2019-07-24T06:20:59Z,MEMBER,"I'm using https://pypi.org/project/memory-profiler/ to explore this in more detail: ``` $ pip install memory-profiler matplotlib ``` Then: ``` $ mprof run healthkit-to-sqlite ~/Downloads/healthkit-export.zip healthkit.db $ mprof plot ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472097220,Script uses a lot of RAM, https://github.com/dogsheep/healthkit-to-sqlite/issues/7#issuecomment-514498221,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7,514498221,MDEyOklzc3VlQ29tbWVudDUxNDQ5ODIyMQ==,9599,simonw,2019-07-24T06:26:49Z,2019-07-24T06:26:49Z,MEMBER,"Adding `el.clear()` got me a huge improvement: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472097220,Script uses a lot of RAM, 
https://github.com/dogsheep/healthkit-to-sqlite/issues/7#issuecomment-514500253,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7,514500253,MDEyOklzc3VlQ29tbWVudDUxNDUwMDI1Mw==,9599,simonw,2019-07-24T06:34:28Z,2019-07-24T06:34:28Z,MEMBER,"Clearing the root element each time saved even more: ","{""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 2, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472097220,Script uses a lot of RAM, https://github.com/dogsheep/healthkit-to-sqlite/issues/9#issuecomment-515226724,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9,515226724,MDEyOklzc3VlQ29tbWVudDUxNTIyNjcyNA==,9599,simonw,2019-07-25T21:46:01Z,2019-07-25T21:46:01Z,MEMBER,I can work around this here (prior to the fix in sqlite-utils) by setting the batch size to something a bit lower here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472429048,Too many SQL variables, https://github.com/dogsheep/healthkit-to-sqlite/issues/9#issuecomment-515322294,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9,515322294,MDEyOklzc3VlQ29tbWVudDUxNTMyMjI5NA==,9599,simonw,2019-07-26T06:07:12Z,2019-07-26T06:07:12Z,MEMBER,@tholo this should be fixed in just-released version 0.3.2 - could you run a `pip install -U healthkit-to-sqlite` and let me know if it works for you now?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472429048,Too many SQL variables, https://github.com/dogsheep/swarm-to-sqlite/issues/2#issuecomment-526701674,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/2,526701674,MDEyOklzc3VlQ29tbWVudDUyNjcwMTY3NA==,9599,simonw,2019-08-30T18:24:26Z,2019-08-30T18:24:26Z,MEMBER,I renamed `--file` to `--load` in 0e5b6025c6f9823ff81aa8aae1cbff5c45e57baf,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",487598468,--save option to dump checkins to a JSON file on disk, https://github.com/dogsheep/swarm-to-sqlite/issues/4#issuecomment-526853542,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/4,526853542,MDEyOklzc3VlQ29tbWVudDUyNjg1MzU0Mg==,9599,simonw,2019-08-31T18:06:32Z,2019-08-31T18:06:32Z,MEMBER,"https://your-foursquare-oauth-token.glitch.me/ Source code: https://glitch.com/~your-foursquare-oauth-token","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",487601121,Online tool for getting a Foursquare OAuth token, https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-527200332,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3,527200332,MDEyOklzc3VlQ29tbWVudDUyNzIwMDMzMg==,9599,simonw,2019-09-02T16:32:20Z,2019-09-02T16:32:39Z,MEMBER,"Also needed: an option for ""fetch all checkins created within the last X days"". 
This should help provide support for that Swarm feature where you can retroactively checkin to places in the past.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",487600595,Option to fetch only checkins more recent than the current max checkin, https://github.com/dogsheep/twitter-to-sqlite/issues/4#issuecomment-527682713,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4,527682713,MDEyOklzc3VlQ29tbWVudDUyNzY4MjcxMw==,9599,simonw,2019-09-03T23:48:57Z,2019-09-03T23:48:57Z,MEMBER,"One interesting challenge here is that the JSON format for tweets in the archive is subtly different from the JSON format currently returned by the API. If we want to keep the tweets in the same database table (which feels like the right thing to me) we'll need to handle this. One thing we can do is have a column for `from_archive` which is set to 1 for tweets that were recovered from the archive. We can also ensure that tweets from the API always over-write the version that came from the archive (using `.upsert()`) while tweets from the archive use `.insert(..., ignore=True)` to avoid over-writing a better version that came from the API.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488835586,Command for importing data from a Twitter Export file, https://github.com/dogsheep/twitter-to-sqlite/issues/5#issuecomment-527684202,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/5,527684202,MDEyOklzc3VlQ29tbWVudDUyNzY4NDIwMg==,9599,simonw,2019-09-03T23:56:28Z,2019-09-03T23:56:28Z,MEMBER,I previously used betamax here: https://github.com/simonw/github-contents/blob/master/test_github_contents.py,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488874815,Write tests that simulate the Twitter API, https://github.com/dogsheep/twitter-to-sqlite/issues/2#issuecomment-527954898,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2,527954898,MDEyOklzc3VlQ29tbWVudDUyNzk1NDg5OA==,9599,simonw,2019-09-04T15:31:46Z,2019-09-04T15:31:46Z,MEMBER,I'm going to call this `twitter-to-sqlite user-timeline` to reflect the language used to describe the API endpoint: https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488833698,"""twitter-to-sqlite user-timeline"" command for pulling tweets by a specific user", https://github.com/dogsheep/twitter-to-sqlite/issues/2#issuecomment-527955302,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2,527955302,MDEyOklzc3VlQ29tbWVudDUyNzk1NTMwMg==,9599,simonw,2019-09-04T15:32:39Z,2019-09-04T15:32:39Z,MEMBER,Rate limit is 900 / 15 minutes which is 1 call per second.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488833698,"""twitter-to-sqlite user-timeline"" command for pulling tweets by a specific user", https://github.com/dogsheep/twitter-to-sqlite/issues/2#issuecomment-527990908,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2,527990908,MDEyOklzc3VlQ29tbWVudDUyNzk5MDkwOA==,9599,simonw,2019-09-04T16:57:24Z,2019-09-04T16:57:24Z,MEMBER,"I just tried this using `max_id=` pagination as described in [Working with 
timelines](https://developer.twitter.com/en/docs/tweets/timelines/guides/working-with-timelines) and I got back all 17,759 of my tweets.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488833698,"""twitter-to-sqlite user-timeline"" command for pulling tweets by a specific user", https://github.com/dogsheep/twitter-to-sqlite/issues/8#issuecomment-529240286,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8,529240286,MDEyOklzc3VlQ29tbWVudDUyOTI0MDI4Ng==,9599,simonw,2019-09-08T20:48:33Z,2019-09-08T20:48:33Z,MEMBER,"```ATTACH DATABASE ""file:blah.db?mode=ro"" AS foo```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",490803176,--sql and --attach options for feeding commands from SQL queries, https://github.com/dogsheep/twitter-to-sqlite/issues/8#issuecomment-529239307,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8,529239307,MDEyOklzc3VlQ29tbWVudDUyOTIzOTMwNw==,9599,simonw,2019-09-08T20:36:49Z,2019-09-08T20:36:49Z,MEMBER,"`--attach` can optionally take a name for the database connection alias like this: $ twitter-to-sqlite users-lookup users.db --attach foo:attending.db ... If you omit the `alias:` bit the stem of the database (without the file extension) will be used.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",490803176,--sql and --attach options for feeding commands from SQL queries, https://github.com/dogsheep/twitter-to-sqlite/issues/9#issuecomment-530028567,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/9,530028567,MDEyOklzc3VlQ29tbWVudDUzMDAyODU2Nw==,9599,simonw,2019-09-10T16:59:25Z,2019-09-10T16:59:25Z,MEMBER,"By default in SQLite foreign key constraints are not enforced (you need to run `PRAGMA foreign_keys = ON;` to enforce them). We will take advantage of this - even though the `following` table has foreign keys against user we will allow IDs to populate that table without a corresponding user record. 
In the future we may add a command that can backfill missing user records.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",491791152,followers-ids and friends-ids subcommands, https://github.com/dogsheep/twitter-to-sqlite/issues/8#issuecomment-530417631,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8,530417631,MDEyOklzc3VlQ29tbWVudDUzMDQxNzYzMQ==,9599,simonw,2019-09-11T14:52:44Z,2019-09-14T19:09:22Z,MEMBER,- [x] This needs documentation.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",490803176,--sql and --attach options for feeding commands from SQL queries, https://github.com/dogsheep/twitter-to-sqlite/issues/8#issuecomment-531404891,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8,531404891,MDEyOklzc3VlQ29tbWVudDUzMTQwNDg5MQ==,9599,simonw,2019-09-13T22:01:57Z,2019-09-13T22:01:57Z,MEMBER,I also wrote about this in https://simonwillison.net/2019/Sep/13/weeknotestwitter-sqlite-datasette-rure/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",490803176,--sql and --attach options for feeding commands from SQL queries, https://github.com/dogsheep/github-to-sqlite/issues/3#issuecomment-531516956,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/3,531516956,MDEyOklzc3VlQ29tbWVudDUzMTUxNjk1Ng==,9599,simonw,2019-09-14T21:56:31Z,2019-09-14T21:56:31Z,MEMBER,"https://api.github.com/users/simonw/repos It would be useful to be able to fetch stargazers, forks etc as well. Not sure if that should be a separate command or a `--stargazers` option to this command. Probably a separate command since `issues` is a separate command already.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",493670426,Command to fetch all repos belonging to a user or organization, https://github.com/dogsheep/github-to-sqlite/issues/3#issuecomment-531517083,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/3,531517083,MDEyOklzc3VlQ29tbWVudDUzMTUxNzA4Mw==,9599,simonw,2019-09-14T21:58:42Z,2019-09-14T21:58:42Z,MEMBER,Split stargazers into #4,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",493670426,Command to fetch all repos belonging to a user or organization, https://github.com/dogsheep/github-to-sqlite/issues/4#issuecomment-531517138,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4,531517138,MDEyOklzc3VlQ29tbWVudDUzMTUxNzEzOA==,9599,simonw,2019-09-14T21:59:59Z,2019-09-14T21:59:59Z,MEMBER,"Paginate through https://api.github.com/repos/simonw/datasette/stargazers Send `Accept: application/vnd.github.v3.star+json` to get the `starred_at` dates.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",493670730,Command to fetch stargazers for one or more repos, https://github.com/dogsheep/twitter-to-sqlite/issues/11#issuecomment-538711918,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/11,538711918,MDEyOklzc3VlQ29tbWVudDUzODcxMTkxOA==,9599,simonw,2019-10-06T04:54:17Z,2019-10-06T04:54:17Z,MEMBER,Shipped in 0.6. 
Here's the documentation: https://github.com/dogsheep/twitter-to-sqlite#capturing-tweets-in-real-time-with-track-and-follow,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",503045221,Commands for recording real-time tweets from the streaming API, https://github.com/dogsheep/pocket-to-sqlite/issues/1#issuecomment-538847446,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/1,538847446,MDEyOklzc3VlQ29tbWVudDUzODg0NzQ0Ng==,9599,simonw,2019-10-07T05:41:17Z,2019-10-07T05:41:17Z,MEMBER,"Prototype code: ```python offset = 0 fetched = [] size = 500 while True: page = requests.get(""https://getpocket.com/v3/get"", { ""consumer_key"": consumer_key, ""access_token"": access_token, ""sort"": ""oldest"", ""detailType"": ""complete"", ""count"": size, ""offset"": offset, }).json() print(offset) fetched.append(page) offset += size if not len(page[""list""]): break ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",503233021,Use better pagination (and implement progress bar), https://github.com/dogsheep/pocket-to-sqlite/issues/2#issuecomment-538847796,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/2,538847796,MDEyOklzc3VlQ29tbWVudDUzODg0Nzc5Ng==,9599,simonw,2019-10-07T05:43:30Z,2019-10-07T05:43:30Z,MEMBER,We can persist the `since` value in its own single-row table.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",503234169,Track and use the 'since' value, https://github.com/dogsheep/twitter-to-sqlite/issues/13#issuecomment-538804815,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/13,538804815,MDEyOklzc3VlQ29tbWVudDUzODgwNDgxNQ==,9599,simonw,2019-10-07T00:33:49Z,2019-10-07T00:33:49Z,MEMBER,Documentation: https://github.com/dogsheep/twitter-to-sqlite#retrieve-tweets-in-bulk,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",503085013,statuses-lookup command, https://github.com/dogsheep/github-to-sqlite/issues/40#issuecomment-643393506,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/40,643393506,MDEyOklzc3VlQ29tbWVudDY0MzM5MzUwNg==,9599,simonw,2020-06-12T17:21:14Z,2020-06-12T17:21:14Z,MEMBER,"I only install SQLite for this: https://github.com/dogsheep/github-to-sqlite/blob/c0d54e0260468be38152293df5abd775c068495d/.github/workflows/deploy-demo.yml#L77-L78 I'm going to remove the need to install sqlite3 by making this possible with sqlite-utils: https://github.com/simonw/sqlite-utils/issues/115","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",637899539,Demo deploy is broken, https://github.com/dogsheep/github-to-sqlite/issues/40#issuecomment-643414646,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/40,643414646,MDEyOklzc3VlQ29tbWVudDY0MzQxNDY0Ng==,9599,simonw,2020-06-12T18:06:48Z,2020-06-12T18:06:48Z,MEMBER,That fixed it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",637899539,Demo deploy is broken, 
https://github.com/dogsheep/twitter-to-sqlite/issues/47#issuecomment-645599881,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/47,645599881,MDEyOklzc3VlQ29tbWVudDY0NTU5OTg4MQ==,9599,simonw,2020-06-17T20:13:48Z,2020-06-17T20:13:48Z,MEMBER,"I've now figured out how to compile specific SQLite versions to help replicate this problem: https://github.com/simonw/til/blob/master/sqlite/ld-preload.md Next step: replicate the problem!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",639542974,Fall back to FTS4 if FTS5 is not available, https://github.com/dogsheep/twitter-to-sqlite/issues/47#issuecomment-645512127,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/47,645512127,MDEyOklzc3VlQ29tbWVudDY0NTUxMjEyNw==,9599,simonw,2020-06-17T17:24:22Z,2020-06-17T17:24:22Z,MEMBER,"That means your version of SQLite is old enough that it doesn't support the FTS5 extension. Could you share what operating system you're running, and what the output is that you get from running this? python -c 'import sqlite3; print(sqlite3.connect("":memory:"").execute(""select sqlite_version()"").fetchone()[0])' I can teach this tool to fall back on FTS4 if FTS5 isn't available.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",639542974,Fall back to FTS4 if FTS5 is not available, https://github.com/dogsheep/github-to-sqlite/issues/41#issuecomment-653947916,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/41,653947916,MDEyOklzc3VlQ29tbWVudDY1Mzk0NzkxNg==,9599,simonw,2020-07-05T22:40:47Z,2020-07-05T22:40:47Z,MEMBER,Might be that it's not got enough RAM. I'll try deploying to a larger instance.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",651159727,Demo is failing to deploy, https://github.com/dogsheep/github-to-sqlite/issues/41#issuecomment-653960989,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/41,653960989,MDEyOklzc3VlQ29tbWVudDY1Mzk2MDk4OQ==,9599,simonw,2020-07-06T00:32:34Z,2020-07-06T00:32:34Z,MEMBER,Same error.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",651159727,Demo is failing to deploy, https://github.com/dogsheep/github-to-sqlite/issues/41#issuecomment-653962418,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/41,653962418,MDEyOklzc3VlQ29tbWVudDY1Mzk2MjQxOA==,9599,simonw,2020-07-06T00:41:38Z,2020-07-06T00:41:38Z,MEMBER,"https://console.cloud.google.com/run/detail/us-central1/github-to-sqlite/logs?project=datasette-222320 has some clues. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",651159727,Demo is failing to deploy, https://github.com/dogsheep/github-to-sqlite/issues/41#issuecomment-653962530,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/41,653962530,MDEyOklzc3VlQ29tbWVudDY1Mzk2MjUzMA==,9599,simonw,2020-07-06T00:42:13Z,2020-07-06T00:42:13Z,MEMBER,So it looks like it's the ASGI lifespan change I made in https://github.com/simonw/datasette/commit/16f592247a2a0e140ada487e9972645406dcae69 - It must be incompatible with one of the plugins.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",651159727,Demo is failing to deploy, https://github.com/dogsheep/github-to-sqlite/issues/41#issuecomment-653962669,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/41,653962669,MDEyOklzc3VlQ29tbWVudDY1Mzk2MjY2OQ==,9599,simonw,2020-07-06T00:42:57Z,2020-07-06T00:42:57Z,MEMBER,"https://github-to-sqlite.dogsheep.net/-/plugins ```json [ { ""name"": ""datasette-json-html"", ""static"": false, ""templates"": false, ""version"": ""0.6"", ""hooks"": [ ""prepare_connection"", ""render_cell"" ] }, { ""name"": ""datasette-render-markdown"", ""static"": false, ""templates"": false, ""version"": ""1.1.2"", ""hooks"": [ ""extra_template_vars"", ""render_cell"" ] }, { ""name"": ""datasette-pretty-json"", ""static"": false, ""templates"": false, ""version"": ""0.2"", ""hooks"": [ ""render_cell"" ] }, { ""name"": ""datasette-search-all"", ""static"": false, ""templates"": true, ""version"": ""0.2.1"", ""hooks"": [ ""asgi_wrapper"", ""extra_template_vars"" ] }, { ""name"": ""datasette-vega"", ""static"": true, ""templates"": false, ""version"": ""0.6.2"", ""hooks"": [ ""extra_css_urls"", ""extra_js_urls"" ] } ] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",651159727,Demo is failing to deploy, https://github.com/dogsheep/github-to-sqlite/issues/41#issuecomment-653962708,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/41,653962708,MDEyOklzc3VlQ29tbWVudDY1Mzk2MjcwOA==,9599,simonw,2020-07-06T00:43:10Z,2020-07-06T00:43:10Z,MEMBER,I bet it's datasette-search-all.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",651159727,Demo is failing to deploy, https://github.com/dogsheep/github-to-sqlite/issues/41#issuecomment-653966670,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/41,653966670,MDEyOklzc3VlQ29tbWVudDY1Mzk2NjY3MA==,9599,simonw,2020-07-06T01:07:02Z,2020-07-06T01:07:02Z,MEMBER,OK that fix worked.https://github.com/dogsheep/github-to-sqlite/runs/839764768?check_suite_focus=true,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",651159727,Demo is failing to deploy, https://github.com/dogsheep/github-to-sqlite/issues/43#issuecomment-660547502,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/43,660547502,MDEyOklzc3VlQ29tbWVudDY2MDU0NzUwMg==,9599,simonw,2020-07-18T21:50:37Z,2020-07-18T21:50:37Z,MEMBER,"``` $ github-to-sqlite tags tags.db simonw/datasette dogsheep/github-to-sqlite $ sqlite-utils tables tags.db --counts [{""table"": ""users"", ""count"": 2}, {""table"": ""licenses"", ""count"": 1}, {""table"": ""repos"", ""count"": 2}, {""table"": ""tags"", ""count"": 76}, 
{""table"": ""licenses_fts"", ""count"": 1}, {""table"": ""licenses_fts_data"", ""count"": 3}, {""table"": ""licenses_fts_idx"", ""count"": 1}, {""table"": ""licenses_fts_docsize"", ""count"": 1}, {""table"": ""licenses_fts_config"", ""count"": 1}, {""table"": ""repos_fts"", ""count"": 2}, {""table"": ""repos_fts_data"", ""count"": 3}, {""table"": ""repos_fts_idx"", ""count"": 1}, {""table"": ""repos_fts_docsize"", ""count"": 2}, {""table"": ""repos_fts_config"", ""count"": 1}, {""table"": ""users_fts"", ""count"": 2}, {""table"": ""users_fts_data"", ""count"": 3}, {""table"": ""users_fts_idx"", ""count"": 1}, {""table"": ""users_fts_docsize"", ""count"": 2}, {""table"": ""users_fts_config"", ""count"": 1}] $ sqlite-utils rows tags.db tags [{""repo_id"": 107914493, ""name"": ""0.45"", ""sha"": ""f1f581b7ffcd5d8f3ae6c1c654d813a6641410eb""}, {""repo_id"": 107914493, ""name"": ""0.45a5"", ""sha"": ""676bb64c877d73f8ff496cef4632f5a8a5a9283c""}, {""repo_id"": 107914493, ""name"": ""0.45a4"", ""sha"": ""265483173bc8341dc02c8b782b9b59d2ce8bbedb""}, {""repo_id"": 107914493, ""name"": ""0.45a3"", ""sha"": ""1f55a4a2b68fa65e56a28baeb7f44122fdeca7e7""}, {""repo_id"": 107914493, ""name"": ""0.45a2"", ""sha"": ""1a5b7d318fa923edfcefd3df8f64dae2e9c49d3f""}, {""repo_id"": 107914493, ""name"": ""0.45a1"", ""sha"": ""b59b92b1b0517cf18fa748ff9d0a0bf86298dd43""}, {""repo_id"": 107914493, ""name"": ""0.45a0"", ""sha"": ""dda932d818b34ccab11730a76554f0a3748d8348""}, {""repo_id"": 107914493, ""name"": ""0.44"", ""sha"": ""b906030235efbdff536405d66078f4868ce0d3bd""}, {""repo_id"": 107914493, ""name"": ""0.43"", ""sha"": ""d56f402822df102f9cf1a9a056449d01a15e3aae""}, {""repo_id"": 107914493, ""name"": ""0.42"", ""sha"": ""af6c6c5d6f929f951c0e63bfd1c82e37a071b50f""}, {""repo_id"": 107914493, ""name"": ""0.41"", ""sha"": ""182e5c8745c94576718315f7596ccc81e5e2417b""}, {""repo_id"": 107914493, ""name"": ""0.40"", ""sha"": ""8da108193b08abf140716f8ac499f32309dfe9cf""}, {""repo_id"": 107914493, ""name"": ""0.39"", ""sha"": ""dedd775512daee49925882654f252df61a9e3b6d""}, {""repo_id"": 107914493, ""name"": ""0.38"", ""sha"": ""7e357abbc38dcc9d19a2f1df3252668a48e941e4""}, {""repo_id"": 107914493, ""name"": ""0.37.1"", ""sha"": ""be20e6991eac2baa9b43e9b26ae209bae805ede5""}, {""repo_id"": 107914493, ""name"": ""0.37"", ""sha"": ""c9e6841482b299fceadc5ad548c2dbf58a8f1227""}, {""repo_id"": 107914493, ""name"": ""0.36"", ""sha"": ""b031fe97636b80b05fec409ee1dffb7d044fd4e9""}, {""repo_id"": 107914493, ""name"": ""0.35"", ""sha"": ""30b6f71b306a43605c99bef79302ed5cb22d1924""}, {""repo_id"": 107914493, ""name"": ""0.34"", ""sha"": ""e7f60d2a9b59752e20de8412f7b0a3e9a5359a31""}, {""repo_id"": 107914493, ""name"": ""0.33"", ""sha"": ""59e7014c8a0f4102d7dc79f517540c55c49e1554""}, {""repo_id"": 107914493, ""name"": ""0.32"", ""sha"": ""a95bedb9c423fa6d772c93ef47bc40f13a5bea50""}, {""repo_id"": 107914493, ""name"": ""0.31.2"", ""sha"": ""b51f258d00bb3c3b401f15d46a1fbd50394dbe1c""}, {""repo_id"": 107914493, ""name"": ""0.31.1"", ""sha"": ""a22c7761b61baa61b8e3da7d30887468d61d6b83""}, {""repo_id"": 107914493, ""name"": ""0.31"", ""sha"": ""7f89928062b1a1fdb2625a946f7cd5161e597401""}, {""repo_id"": 107914493, ""name"": ""0.30.2"", ""sha"": ""2bf7ce5f517d772a16d7855a35a8a75d4456aad7""}, {""repo_id"": 107914493, ""name"": ""0.30.1"", ""sha"": ""3ca290e0db03bb4747e24203c445873f74512107""}, {""repo_id"": 107914493, ""name"": ""0.30"", ""sha"": ""8050f9e1ece9afd0236ad38c6458c12a4ad917e6""}, {""repo_id"": 107914493, ""name"": ""0.29.3"", 
""sha"": ""0fc8afde0eb5ef677f4ac31601540d6168c8208d""}, {""repo_id"": 107914493, ""name"": ""0.29.2"", ""sha"": ""6abe6faff6b035e9334dd05f8c741ae9b7a47440""}, {""repo_id"": 107914493, ""name"": ""0.29.1"", ""sha"": ""2a94f3719fb2c4335fcda374fa92f87272b02d34""}, {""repo_id"": 107914493, ""name"": ""0.29"", ""sha"": ""fb7ee8e0ad59a15083234a48e935525f6e7257dd""}, {""repo_id"": 107914493, ""name"": ""0.28"", ""sha"": ""e518f76c5f5dd0138032bfb26387f5bb91086a3f""}, {""repo_id"": 107914493, ""name"": ""0.27.1"", ""sha"": ""3f3f29ac9afe7c41ffc48a3bd2af473a53eecc8a""}, {""repo_id"": 107914493, ""name"": ""0.27"", ""sha"": ""436b8bc1d17c2ab415800ab209204f94e7f7929e""}, {""repo_id"": 107914493, ""name"": ""0.26.2"", ""sha"": ""a418c8b44f82d456be523c8690cf7236bb648c22""}, {""repo_id"": 107914493, ""name"": ""0.26.1"", ""sha"": ""4722acc73ce761556b18f5dcbe36b7fef2ee2c69""}, {""repo_id"": 107914493, ""name"": ""0.26"", ""sha"": ""424e146697309a54c05d5d1ba1f840849ddbafdc""}, {""repo_id"": 107914493, ""name"": ""0.25.2"", ""sha"": ""b5128fc53fce6a1bf3b16bad9f318451bc1d1263""}, {""repo_id"": 107914493, ""name"": ""0.25.1"", ""sha"": ""3dc0b3fa8c9b9bd81540ffe20c8b7e7a72465274""}, {""repo_id"": 107914493, ""name"": ""0.25"", ""sha"": ""57a71377c992753327a16b417daf79df7f506dd1""}, {""repo_id"": 107914493, ""name"": ""0.24"", ""sha"": ""28872a1fa789f314b0342f4e6182f1c78d6e2bca""}, {""repo_id"": 107914493, ""name"": ""0.23.2"", ""sha"": ""6df6f712b36f0fe75694174906e31242427a8d1d""}, {""repo_id"": 107914493, ""name"": ""0.23.1"", ""sha"": ""dea86b9fba78e032ad09673e884e764387daf209""}, {""repo_id"": 107914493, ""name"": ""0.23"", ""sha"": ""e04f5b0d348ef7275a0a5ab9eb53527105132885""}, {""repo_id"": 107914493, ""name"": ""0.22.1"", ""sha"": ""5d6252788230d168ba09f379d1d2af867e3302ab""}, {""repo_id"": 107914493, ""name"": ""0.22"", ""sha"": ""558d9d7bfef3dd633eb16389281b67d42c9bdeef""}, {""repo_id"": 107914493, ""name"": ""0.21"", ""sha"": ""403211de632cd15f0820cc9399305fc43c187b47""}, {""repo_id"": 107914493, ""name"": ""0.20"", ""sha"": ""3a5d7951ce8f35118ffdd7f8d86e09b909e1218c""}, {""repo_id"": 107914493, ""name"": ""0.19"", ""sha"": ""ba9bfa583179c25aaef94b1f44da7eba74620b9a""}, {""repo_id"": 107914493, ""name"": ""0.18"", ""sha"": ""43ae15c0d14b3e968e8d5bfef72ac0c39783c3a2""}, {""repo_id"": 107914493, ""name"": ""0.17"", ""sha"": ""fb988ace7c7e2bee5ac142a0eab22431d0675a77""}, {""repo_id"": 107914493, ""name"": ""0.16"", ""sha"": ""b6539ff04502536bd1fa96e3b1430bdafc456826""}, {""repo_id"": 107914493, ""name"": ""0.15"", ""sha"": ""7706fe0c67aba5cfe905c7906cae9e0c43cd75b2""}, {""repo_id"": 107914493, ""name"": ""0.14"", ""sha"": ""2edc652df6d786e4f2c3f073e3567002d248be09""}, {""repo_id"": 107914493, ""name"": ""0.13"", ""sha"": ""c160f15c3937f8fbe581276f811e8c58f9137bb1""}, {""repo_id"": 107914493, ""name"": ""0.12"", ""sha"": ""51bdd67691bd69082ae7690af8b905f06050ee80""}, {""repo_id"": 107914493, ""name"": ""0.11"", ""sha"": ""b0f3d4e375655f0764f3137dbcede324f9bbc0cb""}, {""repo_id"": 107914493, ""name"": ""0.10"", ""sha"": ""5928c11ee798a232aa4096706cd47e639d1c9fc2""}, {""repo_id"": 107914493, ""name"": ""0.9"", ""sha"": ""d75f423b6fcfc074b7c6f8f7679da8876f181edd""}, {""repo_id"": 107914493, ""name"": ""0.8"", ""sha"": ""fe279ab7b4ae99dab295d5cf4d39ad06d782997e""}, {""repo_id"": 107914493, ""name"": ""0.7"", ""sha"": ""6b3b05b6db0d2a7b7cec8b8dbb4ddc5e12a376b2""}, {""repo_id"": 207052882, ""name"": ""2.3"", ""sha"": ""7090e43d804724ef3b31ae5ca9efd6ac05f76cbc""}, {""repo_id"": 207052882, ""name"": ""2.2"", 
""sha"": ""4fe69783b55465e7692a807d3a02a710f69c9c42""}, {""repo_id"": 207052882, ""name"": ""2.1"", ""sha"": ""9d7aed336c8e62bf372caa800cb4aae3985cbae9""}, {""repo_id"": 207052882, ""name"": ""2.0"", ""sha"": ""44611df1524a03ce305405e5902c9615e3c73a72""}, {""repo_id"": 207052882, ""name"": ""1.1"", ""sha"": ""5cd34bd07d704487d48ac741ee5da5317afe88d2""}, {""repo_id"": 207052882, ""name"": ""1.0.1"", ""sha"": ""3b7ab5685de89fcb6fc92d320c0e24b17be05570""}, {""repo_id"": 207052882, ""name"": ""1.0"", ""sha"": ""1ea30c8fb1d080bd5e38c577e3ad20bb527a2fe6""}, {""repo_id"": 207052882, ""name"": ""0.7"", ""sha"": ""e35eec4343aa560c58c1634cc228d0d46c442304""}, {""repo_id"": 207052882, ""name"": ""0.6"", ""sha"": ""9eb737090fafd0e5a7e314be48402374d99e9828""}, {""repo_id"": 207052882, ""name"": ""0.5"", ""sha"": ""ae9035f8fe5aff1c54bff4c6b4c2e808a44f0f2a""}, {""repo_id"": 207052882, ""name"": ""0.4"", ""sha"": ""8c6251c31a05c58c2bfbef114247642d1b3dbb44""}, {""repo_id"": 207052882, ""name"": ""0.3"", ""sha"": ""f697f247468516aa4ee13b1862b59e0dba18d00f""}, {""repo_id"": 207052882, ""name"": ""0.2"", ""sha"": ""0fe96bc50fb3d7b1c7e4577db0ddf207eaeebbb9""}, {""repo_id"": 207052882, ""name"": ""0.1.1"", ""sha"": ""321e0284c64dc48b2143311009886293c05edb07""}, {""repo_id"": 207052882, ""name"": ""0.1"", ""sha"": ""7387c88a3f84704548e81d43b91615c02b61a957""}] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",660355904,github-to-sqlite tags command for fetching tags, https://github.com/dogsheep/github-to-sqlite/issues/43#issuecomment-660548780,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/43,660548780,MDEyOklzc3VlQ29tbWVudDY2MDU0ODc4MA==,9599,simonw,2020-07-18T22:02:37Z,2020-07-18T23:05:56Z,MEMBER,https://github-to-sqlite.dogsheep.net/github/tags?_facet=repo,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",660355904,github-to-sqlite tags command for fetching tags, https://github.com/dogsheep/github-to-sqlite/issues/43#issuecomment-660551397,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/43,660551397,MDEyOklzc3VlQ29tbWVudDY2MDU1MTM5Nw==,9599,simonw,2020-07-18T22:27:32Z,2020-07-18T23:05:45Z,MEMBER,"```sql with most_recent_releases as ( with ranked as ( select repo, tag_name, published_at, row_number() OVER ( partition BY repo ORDER BY published_at DESC ) rank FROM releases ) select * from ranked where rank = 1 ) select repos.full_name as repo, most_recent_releases.tag_name as release, commits.committer_date as release_commit_date, ( select count(*) from commits c2 where c2.repo = repos.id and c2.committer_date > commits.committer_date ) as commits_since_release, 'https://github.com/' || repos.full_name || '/compare/' || most_recent_releases.tag_name || '...' 
|| repos.default_branch as view_commits from most_recent_releases join repos on most_recent_releases.repo = repos.id join tags on tags.repo = repos.id and tags.name = most_recent_releases.tag_name join commits on tags.sha = commits.sha order by commits_since_release desc ``` repo | release | release_commit_date | commits_since_release | view_commits -- | -- | -- | -- | -- simonw/datasette | 0.45 | 2020-07-01T21:43:07Z | 9 | https://github.com/simonw/datasette/compare/0.45...master dogsheep/twitter-to-sqlite | 0.21.1 | 2020-04-30T18:20:43Z | 2 | https://github.com/dogsheep/twitter-to-sqlite/compare/0.21.1...master dogsheep/github-to-sqlite | 2.3 | 2020-07-09T23:26:34Z | 2 | https://github.com/dogsheep/github-to-sqlite/compare/2.3...master dogsheep/dogsheep-photos | 0.4.1 | 2020-05-25T20:11:20Z | 2 | https://github.com/dogsheep/dogsheep-photos/compare/0.4.1...master dogsheep/swarm-to-sqlite | 0.3.1 | 2020-03-28T02:29:41Z | 1 | https://github.com/dogsheep/swarm-to-sqlite/compare/0.3.1...master dogsheep/hacker-news-to-sqlite | 0.3.1 | 2020-03-21T22:39:34Z | 1 | https://github.com/dogsheep/hacker-news-to-sqlite/compare/0.3.1...master simonw/sqlite-utils | 2.11 | 2020-07-08T17:36:07Z | 0 | https://github.com/simonw/sqlite-utils/compare/2.11...master dogsheep/healthkit-to-sqlite | 0.5 | 2020-03-28T01:50:51Z | 0 | https://github.com/dogsheep/healthkit-to-sqlite/compare/0.5...master dogsheep/inaturalist-to-sqlite | 0.2 | 2020-03-24T00:35:44Z | 0 | https://github.com/dogsheep/inaturalist-to-sqlite/compare/0.2...master dogsheep/genome-to-sqlite | 0.1 | 2019-09-19T15:38:10Z | 0 | https://github.com/dogsheep/genome-to-sqlite/compare/0.1...master dogsheep/pocket-to-sqlite | 0.2 | 2020-03-27T22:23:16Z | 0 | https://github.com/dogsheep/pocket-to-sqlite/compare/0.2...master https://github-to-sqlite.dogsheep.net/github?sql=with+most_recent_releases+as+%28%0D%0A++with+ranked+as+%28%0D%0A++++select%0D%0A++++++repo%2C%0D%0A++++++tag_name%2C%0D%0A++++++published_at%2C%0D%0A++++++row_number%28%29+OVER+%28%0D%0A++++++++partition+BY+repo%0D%0A++++++++ORDER+BY%0D%0A++++++++++published_at+DESC%0D%0A++++++%29+rank%0D%0A++++FROM%0D%0A++++++releases%0D%0A++%29%0D%0A++select%0D%0A++++*%0D%0A++from%0D%0A++++ranked%0D%0A++where%0D%0A++++rank+%3D+1%0D%0A%29%0D%0Aselect%0D%0A++repos.full_name+as+repo%2C%0D%0A++most_recent_releases.tag_name+as+release%2C%0D%0A++commits.committer_date+as+release_commit_date%2C%0D%0A++%28%0D%0A++++select%0D%0A++++++count%28*%29%0D%0A++++from%0D%0A++++++commits+c2%0D%0A++++where%0D%0A++++++c2.repo+%3D+repos.id%0D%0A++++++and+c2.committer_date+%3E+commits.committer_date%0D%0A++%29+as+commits_since_release%2C%0D%0A++%27https%3A%2F%2Fgithub.com%2F%27+%7C%7C+repos.full_name+%7C%7C+%27%2Fcompare%2F%27+%7C%7C+most_recent_releases.tag_name+%7C%7C+%27...%27+%7C%7C+repos.default_branch+as+view_commits%0D%0Afrom%0D%0A++most_recent_releases%0D%0A++join+repos+on+most_recent_releases.repo+%3D+repos.id%0D%0A++join+tags+on+tags.repo+%3D+repos.id%0D%0A++and+tags.name+%3D+most_recent_releases.tag_name%0D%0A++join+commits+on+tags.sha+%3D+commits.sha%0D%0Aorder+by%0D%0A++commits_since_release+desc","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",660355904,github-to-sqlite tags command for fetching tags, 
https://github.com/dogsheep/github-to-sqlite/issues/45#issuecomment-660553646,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/45,660553646,MDEyOklzc3VlQ29tbWVudDY2MDU1MzY0Ng==,9599,simonw,2020-07-18T22:51:41Z,2020-07-18T22:51:41Z,MEMBER,"I could fix this by putting `REFRESH_DB` in a commit message: https://github.com/dogsheep/github-to-sqlite/blob/4ae4aa6f172344b19ff3513707195ee6d2654bd4/.github/workflows/deploy-demo.yml#L41-L46 But... doing so would lose the data I've collected in https://github-to-sqlite.dogsheep.net/github/dependents?_sort_desc=first_seen_utc concerning the first time each dependent repo was spotted.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",660429601,Fix the demo - it breaks because of the tags table change, https://github.com/dogsheep/github-to-sqlite/issues/45#issuecomment-660553711,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/45,660553711,MDEyOklzc3VlQ29tbWVudDY2MDU1MzcxMQ==,9599,simonw,2020-07-18T22:52:16Z,2020-07-18T22:52:16Z,MEMBER,"I think the best fix is to download the `github.db` database, manually fix it and then manually deploy it to Cloud Run from my laptop.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",660429601,Fix the demo - it breaks because of the tags table change, https://github.com/dogsheep/github-to-sqlite/issues/45#issuecomment-660554162,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/45,660554162,MDEyOklzc3VlQ29tbWVudDY2MDU1NDE2Mg==,9599,simonw,2020-07-18T22:56:58Z,2020-07-18T22:56:58Z,MEMBER,"Manually fixing the database: ``` $ wget 'https://github-to-sqlite.dogsheep.net/github.db' --2020-07-18 15:52:33-- https://github-to-sqlite.dogsheep.net/github.db Resolving github-to-sqlite.dogsheep.net (github-to-sqlite.dogsheep.net)... 172.217.5.115 Connecting to github-to-sqlite.dogsheep.net (github-to-sqlite.dogsheep.net)|172.217.5.115|:443... connected. HTTP request sent, awaiting response... 200 OK Length: 14626816 (14M) [application/octet-stream] Saving to: ‘github.db’ github.db 100%[============================================================================>] 13.95M 1.22MB/s in 18s 2020-07-18 15:52:53 (773 KB/s) - ‘github.db’ saved [14626816/14626816] $ sqlite3 github.db SQLite version 3.28.0 2019-04-15 14:49:49 Enter "".help"" for usage hints. sqlite> drop table tags; sqlite> ^D $ github-to-sqlite tags github.db simonw/datasette simonw/sqlite-utils dogsheep/healthkit-to-sqlite dogsheep/swarm-to-sqlite dogsheep/twitter-to-sqlite dogsheep/inaturalist-to-sqlite dogsheep/google-takeout-to-sqlite dogsheep/github-to-sqlite dogsheep/genome-to-sqlite dogsheep/pocket-to-sqlite dogsheep/hacker-news-to-sqlite dogsheep/dogsheep-photos $ sqlite-utils tables github.db --counts [{""table"": ""users"", ""count"": 4048}, {""table"": ""repos"", ""count"": 210}, ... {""table"": ""stars"", ""count"": 4140}, {""table"": ""tags"", ""count"": 188}] $ sqlite-utils rows github.db tags [{""repo"": 107914493, ""name"": ""0.45"", ""sha"": ""f1f581b7ffcd5d8f3ae6c1c654d813a6641410eb""}, {""repo"": 107914493, ""name"": ""0.45a5"", ""sha"": ""676bb64c877d73f8ff496cef4632f5a8a5a9283c""}, ... 
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",660429601,Fix the demo - it breaks because of the tags table change, https://github.com/dogsheep/github-to-sqlite/issues/45#issuecomment-660554299,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/45,660554299,MDEyOklzc3VlQ29tbWVudDY2MDU1NDI5OQ==,9599,simonw,2020-07-18T22:58:24Z,2020-07-18T23:02:52Z,MEMBER,"Deploying the fixed version like this: ``` $ gcloud config set run/region us-central1 $ gcloud config set project datasette-222320 $ datasette publish cloudrun /tmp/github.db \ -m demo-metadata.json \ --service github-to-sqlite \ --install=py-gfm \ --install='datasette-search-all>=0.3' \ --install='datasette-render-markdown>=1.1.2' \ --install=datasette-pretty-json \ --install=datasette-json-html \ --install=datasette-vega ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",660429601,Fix the demo - it breaks because of the tags table change, https://github.com/dogsheep/github-to-sqlite/issues/45#issuecomment-660554811,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/45,660554811,MDEyOklzc3VlQ29tbWVudDY2MDU1NDgxMQ==,9599,simonw,2020-07-18T23:03:13Z,2020-07-18T23:03:13Z,MEMBER,https://github-to-sqlite.dogsheep.net/github/tags now shows a `repo` column instead of a `repo_id` column.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",660429601,Fix the demo - it breaks because of the tags table change, https://github.com/dogsheep/github-to-sqlite/issues/43#issuecomment-660536265,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/43,660536265,MDEyOklzc3VlQ29tbWVudDY2MDUzNjI2NQ==,9599,simonw,2020-07-18T20:15:12Z,2020-07-18T20:15:12Z,MEMBER,"I want to create a SQL query which shows me all of my repositories that have commits that are NOT in the most recent release. 
The releases table doesn't have enough information for this because it doesn't tell you the commit hash associated with each release, just the tag: https://github-to-sqlite.dogsheep.net/github/releases","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",660355904,github-to-sqlite tags command for fetching tags, https://github.com/dogsheep/twitter-to-sqlite/issues/48#issuecomment-662626901,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/48,662626901,MDEyOklzc3VlQ29tbWVudDY2MjYyNjkwMQ==,9599,simonw,2020-07-22T18:54:53Z,2020-07-22T18:54:53Z,MEMBER,I'm going to use a GitHub Action to run `npx markdown-toc README.md -i`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663976976,Add a table of contents to the README, https://github.com/dogsheep/twitter-to-sqlite/issues/48#issuecomment-662630868,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/48,662630868,MDEyOklzc3VlQ29tbWVudDY2MjYzMDg2OA==,9599,simonw,2020-07-22T19:03:02Z,2020-07-22T19:03:02Z,MEMBER,Done!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663976976,Add a table of contents to the README, https://github.com/dogsheep/twitter-to-sqlite/issues/48#issuecomment-663143160,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/48,663143160,MDEyOklzc3VlQ29tbWVudDY2MzE0MzE2MA==,9599,simonw,2020-07-23T17:46:07Z,2020-07-23T17:46:07Z,MEMBER,"Frustratingly, these links don't work on PyPI: https://pypi.org/project/twitter-to-sqlite/ There's an issue about that here: https://github.com/pypa/readme_renderer/issues/169","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663976976,Add a table of contents to the README, https://github.com/dogsheep/swarm-to-sqlite/issues/8#issuecomment-669241341,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/8,669241341,MDEyOklzc3VlQ29tbWVudDY2OTI0MTM0MQ==,9599,simonw,2020-08-05T14:55:14Z,2020-08-05T14:55:14Z,MEMBER,"Looks like there's a column that's not consistently there, so the table got created without it. Easiest fix is to add `alter=True` to this line: https://github.com/dogsheep/swarm-to-sqlite/blob/f4a82633da927cde672c9d9af92930bfca2e3ddf/swarm_to_sqlite/utils.py#L94 That will cause `sqlite-utils` to notice if there's a missing column and add it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",648245071,Error thrown: table photos has no column named hasSticker, https://github.com/dogsheep/github-to-sqlite/issues/46#issuecomment-675259273,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/46,675259273,MDEyOklzc3VlQ29tbWVudDY3NTI1OTI3Mw==,9599,simonw,2020-08-18T05:28:32Z,2020-08-18T05:28:32Z,MEMBER,"Oh that's interesting - I didn't realize ""reviews"" were a separate concept. 
I'd definitely accept a pull request adding those!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",664485022,Feature: pull request reviews and comments, https://github.com/dogsheep/github-to-sqlite/issues/39#issuecomment-675509550,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/39,675509550,MDEyOklzc3VlQ29tbWVudDY3NTUwOTU1MA==,9599,simonw,2020-08-18T14:23:56Z,2020-08-18T14:23:56Z,MEMBER,I think this is fixed: https://github-to-sqlite.dogsheep.net/github/issues?_facet=repo,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",613777056,issues foreign key to repo isn't working, https://github.com/dogsheep/github-to-sqlite/issues/47#issuecomment-675523053,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/47,675523053,MDEyOklzc3VlQ29tbWVudDY3NTUyMzA1Mw==,9599,simonw,2020-08-18T14:45:53Z,2020-08-18T14:45:53Z,MEMBER,"``` % github-to-sqlite emojis emojis.db --fetch [########----------------------------] 397/1682 23% 00:03:43 ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",681086659,emojis command, https://github.com/dogsheep/dogsheep-beta/issues/3#issuecomment-684250044,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/3,684250044,MDEyOklzc3VlQ29tbWVudDY4NDI1MDA0NA==,9599,simonw,2020-09-01T05:01:09Z,2020-09-01T05:01:23Z,MEMBER,Maybe this starts out as a custom templated canned query.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",689810340,"Datasette plugin to provide custom page for running faceted, ranked searches", https://github.com/dogsheep/dogsheep-beta/issues/4#issuecomment-684395444,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/4,684395444,MDEyOklzc3VlQ29tbWVudDY4NDM5NTQ0NA==,9599,simonw,2020-09-01T06:00:03Z,2020-09-01T06:00:03Z,MEMBER,I ran `sqlite-utils optimize beta.db` against my test DB and the size reduced from 183M to 176M - and a 450ms search ran in 359ms. 
So not a huge improvement but still worthwhile.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",689839399,Optimize the FTS table, https://github.com/dogsheep/pocket-to-sqlite/issues/3#issuecomment-684424396,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/3,684424396,MDEyOklzc3VlQ29tbWVudDY4NDQyNDM5Ng==,9599,simonw,2020-09-01T06:17:45Z,2020-09-01T06:17:45Z,MEMBER,It looks like I could ignore the `image` column and synthesize a unique key from the data in the `images` column using `$item_id/$image_id`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",503243784,Extract images into separate tables, https://github.com/dogsheep/pocket-to-sqlite/issues/5#issuecomment-684425714,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/5,684425714,MDEyOklzc3VlQ29tbWVudDY4NDQyNTcxNA==,9599,simonw,2020-09-01T06:18:32Z,2020-09-01T06:18:32Z,MEMBER,"Good suggestion, I'll setup a demo somewhere.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",629473827,Set up a demo, https://github.com/dogsheep/dogsheep-beta/issues/2#issuecomment-685115519,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/2,685115519,MDEyOklzc3VlQ29tbWVudDY4NTExNTUxOQ==,9599,simonw,2020-09-01T20:31:57Z,2020-09-01T20:31:57Z,MEMBER,"Actually this doesn't work: you can't turn on stemming for specific tables, because all of the content goes into a single `search_index` table which is configured the same way. So stemming needs to be a global option.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",689809225,Apply porter stemming, https://github.com/dogsheep/dogsheep-beta/issues/2#issuecomment-685121074,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/2,685121074,MDEyOklzc3VlQ29tbWVudDY4NTEyMTA3NA==,9599,simonw,2020-09-01T20:42:00Z,2020-09-01T20:42:00Z,MEMBER,Documentation at the bottom of the Usage section here: https://github.com/dogsheep/dogsheep-beta/blob/0.2/README.md#usage,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",689809225,Apply porter stemming, https://github.com/dogsheep/dogsheep-beta/issues/7#issuecomment-685895540,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7,685895540,MDEyOklzc3VlQ29tbWVudDY4NTg5NTU0MA==,9599,simonw,2020-09-02T17:46:44Z,2020-09-02T17:46:44Z,MEMBER,"Some open questions about this: - Should I restrict to two exclusive categories here, or should I have a generic category mechanism that can be expanded to more than two? - Should an item be able to exist in more than one category? Do I want to be able to mark an indexed item as both by-me and liked-by-me for example? This question is more interesting if the number of categories is greater than two. - How should this be modeled? Single column, multiple boolean columns, JSON array, m2m against separate table? 
- What's the best way to make this performant?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691265198,"Mechanism for differentiating between ""by me"" and ""liked by me""", https://github.com/dogsheep/dogsheep-beta/issues/8#issuecomment-685960072,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/8,685960072,MDEyOklzc3VlQ29tbWVudDY4NTk2MDA3Mg==,9599,simonw,2020-09-02T19:50:47Z,2020-09-02T19:50:47Z,MEMBER,"This doesn't actually help, because the Datasette table view page doesn't then support adding the `where search_index_fts match :query` bit.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691369691,Create a view for running faceted searches, https://github.com/dogsheep/dogsheep-beta/issues/3#issuecomment-685961809,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/3,685961809,MDEyOklzc3VlQ29tbWVudDY4NTk2MTgwOQ==,9599,simonw,2020-09-02T19:54:24Z,2020-09-02T19:54:24Z,MEMBER,"This should implement search highlighting too, as seen on https://til.simonwillison.net/til/search?q=cloud ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",689810340,"Datasette plugin to provide custom page for running faceted, ranked searches", https://github.com/dogsheep/dogsheep-beta/issues/7#issuecomment-685962280,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7,685962280,MDEyOklzc3VlQ29tbWVudDY4NTk2MjI4MA==,9599,simonw,2020-09-02T19:55:26Z,2020-09-02T19:59:58Z,MEMBER,"Relevant: https://charlesleifer.com/blog/a-tour-of-tagging-schemas-many-to-many-bitmaps-and-more/ SQLite supports bitwise operators Binary AND (&) and Binary OR (|) - I could try those. Not sure how they interact with indexes though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691265198,"Mechanism for differentiating between ""by me"" and ""liked by me""", https://github.com/dogsheep/dogsheep-beta/issues/7#issuecomment-685965516,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7,685965516,MDEyOklzc3VlQ29tbWVudDY4NTk2NTUxNg==,9599,simonw,2020-09-02T20:01:54Z,2020-09-02T20:01:54Z,MEMBER,"Relevant post: https://sqlite.org/forum/forumpost/9f06fedaa5 - drh says: > Indexes are one-to-one. There is one entry in the index for each row in the table. > > You are asking for an index that is many-to-one - multiple index entries for each table row. > > A Full-Text Index is basically a many-to-one index. So if all of your array entries really are words, you could probably get this to work using a Full-Text Index.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691265198,"Mechanism for differentiating between ""by me"" and ""liked by me""", https://github.com/dogsheep/dogsheep-beta/issues/7#issuecomment-685966361,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7,685966361,MDEyOklzc3VlQ29tbWVudDY4NTk2NjM2MQ==,9599,simonw,2020-09-02T20:03:29Z,2020-09-02T20:03:41Z,MEMBER,"I'm going to implement the first version of this as an indexed integer `category` column which has 1 for ""about me"" and 2 for ""liked by me"" - and space for other category numerals in the future, albeit a row can only belong to one category. 
I'll think about a full tagging system separately.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691265198,"Mechanism for differentiating between ""by me"" and ""liked by me""", https://github.com/dogsheep/dogsheep-beta/issues/7#issuecomment-685966707,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7,685966707,MDEyOklzc3VlQ29tbWVudDY4NTk2NjcwNw==,9599,simonw,2020-09-02T20:04:08Z,2020-09-02T20:04:08Z,MEMBER,I'll make `category` a foreign key to a `categories` table so Datasette can automatically show the `name` column.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691265198,"Mechanism for differentiating between ""by me"" and ""liked by me""", https://github.com/dogsheep/dogsheep-beta/issues/7#issuecomment-685970384,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7,685970384,MDEyOklzc3VlQ29tbWVudDY4NTk3MDM4NA==,9599,simonw,2020-09-02T20:11:41Z,2020-09-02T20:11:59Z,MEMBER,"Default categories: - 1 = created - 2 = saved","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691265198,"Mechanism for differentiating between ""by me"" and ""liked by me""", https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686153967,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686153967,MDEyOklzc3VlQ29tbWVudDY4NjE1Mzk2Nw==,9599,simonw,2020-09-03T00:17:16Z,2020-09-03T00:17:55Z,MEMBER,"Maybe I can take advantage of https://sqlite.org/np1queryprob.html here - I could define a SQL query for fetching the ""display"" version of each item, and include a Jinja template fragment in the configuration as well. Maybe something like this: ```yaml photos.db: photos_with_apple_metadata: sql: |- select sha256 as key, 'Photo in ' || coalesce(place_city, 'unknown') as title, ( select group_concat(normalized_string, ' ') from labels where labels.uuid = photos_with_apple_metadata.uuid ) as search_1, date as timestamp, 1 as category from photos_with_apple_metadata display_sql: |- select sha256, place_city, date from photos_with_apple_metadata where sha256 = :key display: |-

Taken in {{ display.place_city }} on {{ display.date }}

```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,Mechanism for defining custom display of results, https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686154486,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686154486,MDEyOklzc3VlQ29tbWVudDY4NjE1NDQ4Ng==,9599,simonw,2020-09-03T00:18:54Z,2020-09-03T00:18:54Z,MEMBER,"`display_sql` could be optional. If it's not defined, a `row` object is passed to the template which is the row that's stored in `search_index`. If `display_sql` IS defined then it's executed and the result is made available as a `display` object in addition to the `row` object.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,Mechanism for defining custom display of results, https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686154627,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686154627,MDEyOklzc3VlQ29tbWVudDY4NjE1NDYyNw==,9599,simonw,2020-09-03T00:19:22Z,2020-09-03T00:19:22Z,MEMBER,If this performs well enough (100 displayed items will be 100 extra `display_sql` calls) then I'll go with this as the design for the feature.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,Mechanism for defining custom display of results, https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686158454,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686158454,MDEyOklzc3VlQ29tbWVudDY4NjE1ODQ1NA==,9599,simonw,2020-09-03T00:32:42Z,2020-09-03T00:32:42Z,MEMBER,"If this turns out to be too inefficient I could add a `display` text column to the `search_index` table which is designed to be populated with arbitrary JSON by the indexing query, which can then be used to render the template fragment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,Mechanism for defining custom display of results, https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686163754,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686163754,MDEyOklzc3VlQ29tbWVudDY4NjE2Mzc1NA==,9599,simonw,2020-09-03T00:46:21Z,2020-09-03T00:46:21Z,MEMBER,"Challenge: the `dogsheep-beta.yml` configuration file that is passed to the `dogsheep-beta index` command needs to also be made available to Datasette itself, so that it can read the configuration. 
Let's say it can either be duplicated in the `plugins` configuration block of the `metadata.yml` OR you can do this in `metadata.yml`: ```yaml plugins: dogsheep-beta: config_file: dogsheep-beta.yml ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,Mechanism for defining custom display of results, https://github.com/dogsheep/dogsheep-beta/issues/10#issuecomment-686238498,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/10,686238498,MDEyOklzc3VlQ29tbWVudDY4NjIzODQ5OA==,9599,simonw,2020-09-03T04:05:05Z,2020-09-03T04:05:05Z,MEMBER,Since the first two categories are `created` and `saved` this one should be called `received`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691557547,Category 3: received, https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686688963,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686688963,MDEyOklzc3VlQ29tbWVudDY4NjY4ODk2Mw==,9599,simonw,2020-09-03T18:42:59Z,2020-09-03T18:42:59Z,MEMBER,I'm pleased with how this works now.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,Mechanism for defining custom display of results, https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686689122,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686689122,MDEyOklzc3VlQ29tbWVudDY4NjY4OTEyMg==,9599,simonw,2020-09-03T18:43:20Z,2020-09-03T18:43:20Z,MEMBER,Needs documentation.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,Mechanism for defining custom display of results, https://github.com/dogsheep/dogsheep-beta/issues/5#issuecomment-686689366,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/5,686689366,MDEyOklzc3VlQ29tbWVudDY4NjY4OTM2Ng==,9599,simonw,2020-09-03T18:43:50Z,2020-09-03T18:43:50Z,MEMBER,No longer needed thanks to #9,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",689847361,Add a context column that's not searchable, https://github.com/dogsheep/dogsheep-beta/issues/3#issuecomment-686689612,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/3,686689612,MDEyOklzc3VlQ29tbWVudDY4NjY4OTYxMg==,9599,simonw,2020-09-03T18:44:20Z,2020-09-03T18:44:20Z,MEMBER,Facets are now displayed but selecting them doesn't work yet.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",689810340,"Datasette plugin to provide custom page for running faceted, ranked searches", https://github.com/dogsheep/dogsheep-beta/issues/11#issuecomment-686618669,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/11,686618669,MDEyOklzc3VlQ29tbWVudDY4NjYxODY2OQ==,9599,simonw,2020-09-03T16:47:34Z,2020-09-03T16:53:25Z,MEMBER,I think a `is_public` integer column which defaults to 0 would be good here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",692125110,Public / Private mechanism, 
https://github.com/dogsheep/dogsheep-beta/issues/9#issuecomment-686767208,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9,686767208,MDEyOklzc3VlQ29tbWVudDY4Njc2NzIwOA==,9599,simonw,2020-09-03T21:12:14Z,2020-09-03T21:12:14Z,MEMBER,Documentation: https://github.com/dogsheep/dogsheep-beta/blob/0.4/README.md#custom-results-display,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",691521965,Mechanism for defining custom display of results, https://github.com/dogsheep/dogsheep-beta/issues/13#issuecomment-686774592,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/13,686774592,MDEyOklzc3VlQ29tbWVudDY4Njc3NDU5Mg==,9599,simonw,2020-09-03T21:30:21Z,2020-09-03T21:30:21Z,MEMBER,"This is partially supported: the custom search SQL we run doesn't escape them, but the `?_search` used to calculate facet counts does. So this is a bug.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",692386625,Support advanced FTS queries, https://github.com/dogsheep/dogsheep-beta/issues/17#issuecomment-687880459,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/17,687880459,MDEyOklzc3VlQ29tbWVudDY4Nzg4MDQ1OQ==,9599,simonw,2020-09-06T19:36:32Z,2020-09-06T19:36:32Z,MEMBER,At some point I may even want to support search types which are indexed from (and inflated from) more than one database file. I'm going to ignore that for the moment though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",694500679,"Rename ""table"" to ""type""", https://github.com/dogsheep/dogsheep-beta/issues/18#issuecomment-688622995,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/18,688622995,MDEyOklzc3VlQ29tbWVudDY4ODYyMjk5NQ==,9599,simonw,2020-09-08T05:15:21Z,2020-09-08T05:15:21Z,MEMBER,"Alternatively it could run as it does now but add a `DELETE FROM index1.search_index WHERE key not in (select key from ...)`. I'm not sure which would be more efficient.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695553522,Deleted records stay in the search index, https://github.com/dogsheep/dogsheep-beta/issues/18#issuecomment-688623097,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/18,688623097,MDEyOklzc3VlQ29tbWVudDY4ODYyMzA5Nw==,9599,simonw,2020-09-08T05:15:51Z,2020-09-08T05:15:51Z,MEMBER,"I'm inclined to go with the first, simpler option. I have longer term plans for efficient incremental index updates based on clever trickery with triggers.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695553522,Deleted records stay in the search index, https://github.com/dogsheep/dogsheep-beta/issues/19#issuecomment-688625430,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/19,688625430,MDEyOklzc3VlQ29tbWVudDY4ODYyNTQzMA==,9599,simonw,2020-09-08T05:24:50Z,2020-09-08T05:24:50Z,MEMBER,"I thought about allowing tables to define an incremental indexing SQL query - maybe something that can return just records touched in the past hour, or records since a recorded ""last indexed record"" value. The problem with this is deletes - if you delete a record, how does the indexer know to remove it? 
See #18 - that's already caused problems.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695556681,Figure out incremental re-indexing, https://github.com/dogsheep/dogsheep-beta/issues/19#issuecomment-688626037,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/19,688626037,MDEyOklzc3VlQ29tbWVudDY4ODYyNjAzNw==,9599,simonw,2020-09-08T05:27:07Z,2020-09-08T05:27:07Z,MEMBER,"A really clever way to do this would be with triggers. The indexer script would add triggers to each of the database tables that it is indexing - each in their own database. Those triggers would then maintain a `_index_queue_` table. This table would record the primary key of rows that are added, modified or deleted. The indexer could then work by reading through the `_index_queue_` table, re-indexing (or deleting) just the primary keys listed there, and then emptying the queue once it has finished. This would add a small amount of overhead to insert/update/delete queries run against the table. My hunch is that the overhead would be miniscule, but I could still allow people to opt-out for tables that are so high traffic that this would matter.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",695556681,Figure out incremental re-indexing, https://github.com/dogsheep/dogsheep-beta/issues/17#issuecomment-689226390,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/17,689226390,MDEyOklzc3VlQ29tbWVudDY4OTIyNjM5MA==,9599,simonw,2020-09-09T00:36:07Z,2020-09-09T00:36:07Z,MEMBER,"Alternative names: - type - record_type - doctype I think `type` is right. It matches what Elasticsearch used to call their equivalent of this (before they removed the feature!). 
https://www.elastic.co/guide/en/elasticsearch/reference/current/removal-of-types.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",694500679,"Rename ""table"" to ""type""", https://github.com/dogsheep/github-to-sqlite/issues/50#issuecomment-693773191,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/50,693773191,MDEyOklzc3VlQ29tbWVudDY5Mzc3MzE5MQ==,9599,simonw,2020-09-17T02:39:26Z,2020-09-17T02:39:26Z,MEMBER,I'm going to start with `github-to-sqlite get` and `github-to-sqlite post` - I may add `put` and suchlike later on.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703218756,Commands for making authenticated API calls, https://github.com/dogsheep/github-to-sqlite/issues/50#issuecomment-693775622,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/50,693775622,MDEyOklzc3VlQ29tbWVudDY5Mzc3NTYyMg==,9599,simonw,2020-09-17T02:48:34Z,2020-09-17T02:48:34Z,MEMBER,I'd like a `--paginate` option that does the same thing as https://github.com/simonw/paginate-json,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703218756,Commands for making authenticated API calls, https://github.com/dogsheep/github-to-sqlite/issues/50#issuecomment-693788032,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/50,693788032,MDEyOklzc3VlQ29tbWVudDY5Mzc4ODAzMg==,9599,simonw,2020-09-17T03:35:22Z,2020-09-17T03:35:22Z,MEMBER,Documentation: https://github.com/dogsheep/github-to-sqlite/blob/b02bf135485c0a7a3768868967f45a6b5e515289/README.md#making-authenticated-api-calls,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703218756,Commands for making authenticated API calls, https://github.com/dogsheep/github-to-sqlite/issues/50#issuecomment-693788387,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/50,693788387,MDEyOklzc3VlQ29tbWVudDY5Mzc4ODM4Nw==,9599,simonw,2020-09-17T03:36:47Z,2020-09-17T03:36:58Z,MEMBER,"Fun demo of the `--nl` option: github-to-sqlite get /users/simonw/repos --paginate --nl | sqlite-utils insert simonw.db repos - --nl ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703218756,Commands for making authenticated API calls, https://github.com/dogsheep/github-to-sqlite/issues/50#issuecomment-693789129,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/50,693789129,MDEyOklzc3VlQ29tbWVudDY5Mzc4OTEyOQ==,9599,simonw,2020-09-17T03:40:01Z,2020-09-17T03:40:01Z,MEMBER,"Bug with endpoints that return dictionaries rather than arrays: ``` github-to-sqlite get /users/simonw [ ""login"", ""id"", ""node_id"", ""avatar_url"", ""gravatar_id"", ""url"", ""html_url"", ""followers_url"", ""following_url"", ""gists_url"", ""starred_url"", ""subscriptions_url"", ""organizations_url"", ""repos_url"", ""events_url"", ""received_events_url"", ""type"", ""site_admin"", ""name"", ""company"", ""blog"", ""location"", ""email"", ""hireable"", ""bio"", ""twitter_username"", ""public_repos"", ""public_gists"", ""followers"", ""following"", ""created_at"", ""updated_at"" ] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703218756,Commands for making authenticated API calls, 
https://github.com/dogsheep/github-to-sqlite/issues/50#issuecomment-693794700,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/50,693794700,MDEyOklzc3VlQ29tbWVudDY5Mzc5NDcwMA==,9599,simonw,2020-09-17T04:02:39Z,2020-09-17T04:02:39Z,MEMBER,It would be useful if you could pass an `--accept` option to this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703218756,Commands for making authenticated API calls, https://github.com/dogsheep/dogsheep-beta/issues/16#issuecomment-694548909,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/16,694548909,MDEyOklzc3VlQ29tbWVudDY5NDU0ODkwOQ==,9599,simonw,2020-09-17T23:15:09Z,2020-09-17T23:15:09Z,MEMBER,"I have sort by date now, #21.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",694493566,Timeline view, https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694551406,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24,694551406,MDEyOklzc3VlQ29tbWVudDY5NDU1MTQwNg==,9599,simonw,2020-09-17T23:22:07Z,2020-09-17T23:22:07Z,MEMBER,"Neat, I can debug this with the new `--pdb` option: datasette . --get '/-/beta?q=pycon&sort=oldest' --pdb ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703970814,"the JSON object must be str, bytes or bytearray, not 'Undefined'", https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694551646,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24,694551646,MDEyOklzc3VlQ29tbWVudDY5NDU1MTY0Ng==,9599,simonw,2020-09-17T23:22:48Z,2020-09-17T23:22:48Z,MEMBER,"Looks like its happening in a Jinja fragment template for one of the results: ``` /Users/simon/Dropbox/Development/dogsheep-beta/dogsheep_beta/__init__.py(169)process_results() -> output = compiled.render({**result, **{""json"": json}}) /Users/simon/.local/share/virtualenvs/dogsheep-beta-u_po4Rpj/lib/python3.8/site-packages/jinja2/asyncsupport.py(71)render() -> return original_render(self, *args, **kwargs) /Users/simon/.local/share/virtualenvs/dogsheep-beta-u_po4Rpj/lib/python3.8/site-packages/jinja2/environment.py(1090)render() -> self.environment.handle_exception() /Users/simon/.local/share/virtualenvs/dogsheep-beta-u_po4Rpj/lib/python3.8/site-packages/jinja2/environment.py(832)handle_exception() -> reraise(*rewrite_traceback_stack(source=source)) /Users/simon/.local/share/virtualenvs/dogsheep-beta-u_po4Rpj/lib/python3.8/site-packages/jinja2/_compat.py(28)reraise() -> raise value.with_traceback(tb)