{"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/1#issuecomment-513437463", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/1", "id": 513437463, "node_id": "MDEyOklzc3VlQ29tbWVudDUxMzQzNzQ2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-07-20T05:19:59Z", "updated_at": "2019-07-20T05:19:59Z", "author_association": "MEMBER", "body": "I ran xml_analyser against the XML HealthKit `export.xml` file and got the following results:\r\n\r\n```python\r\n{\r\n 'ActivitySummary': {'attr_counts': {'activeEnergyBurned': 980,\r\n 'activeEnergyBurnedGoal': 980,\r\n 'activeEnergyBurnedUnit': 980,\r\n 'appleExerciseTime': 980,\r\n 'appleExerciseTimeGoal': 980,\r\n 'appleStandHours': 980,\r\n 'appleStandHoursGoal': 980,\r\n 'dateComponents': 980},\r\n 'child_counts': {},\r\n 'count': 980,\r\n 'parent_counts': {'HealthData': 980}},\r\n 'Correlation': {'attr_counts': {'creationDate': 1,\r\n 'endDate': 1,\r\n 'sourceName': 1,\r\n 'sourceVersion': 1,\r\n 'startDate': 1,\r\n 'type': 1},\r\n 'child_counts': {'MetadataEntry': 1, 'Record': 2},\r\n 'count': 1,\r\n 'parent_counts': {'HealthData': 1}},\r\n 'ExportDate': {'attr_counts': {'value': 1},\r\n 'child_counts': {},\r\n 'count': 1,\r\n 'parent_counts': {'HealthData': 1}},\r\n 'HealthData': {'attr_counts': {'locale': 1},\r\n 'child_counts': {'ActivitySummary': 980,\r\n 'Correlation': 1,\r\n 'ExportDate': 1,\r\n 'Me': 1,\r\n 'Record': 2672231,\r\n 'Workout': 663},\r\n 'count': 1,\r\n 'parent_counts': {}},\r\n 'HeartRateVariabilityMetadataList': {'attr_counts': {},\r\n 'child_counts': {'InstantaneousBeatsPerMinute': 93653},\r\n 'count': 2318,\r\n 'parent_counts': {'Record': 2318}},\r\n 'InstantaneousBeatsPerMinute': {'attr_counts': {'bpm': 93653, 'time': 93653},\r\n 'child_counts': {},\r\n 'count': 93653,\r\n 'parent_counts': {'HeartRateVariabilityMetadataList': 93653}},\r\n 'Location': {'attr_counts': {'altitude': 398683,\r\n 'course': 398683,\r\n 'date': 398683,\r\n 'horizontalAccuracy': 398683,\r\n 'latitude': 398683,\r\n 'longitude': 398683,\r\n 'speed': 398683,\r\n 'verticalAccuracy': 398683},\r\n 'child_counts': {},\r\n 'count': 398683,\r\n 'parent_counts': {'WorkoutRoute': 398683}},\r\n 'Me': {'attr_counts': {'HKCharacteristicTypeIdentifierBiologicalSex': 1,\r\n 'HKCharacteristicTypeIdentifierBloodType': 1,\r\n 'HKCharacteristicTypeIdentifierDateOfBirth': 1,\r\n 'HKCharacteristicTypeIdentifierFitzpatrickSkinType': 1},\r\n 'child_counts': {},\r\n 'count': 1,\r\n 'parent_counts': {'HealthData': 1}},\r\n 'MetadataEntry': {'attr_counts': {'key': 290449, 'value': 290449},\r\n 'child_counts': {},\r\n 'count': 290449,\r\n 'parent_counts': {'Correlation': 1,\r\n 'Record': 287974,\r\n 'Workout': 1928,\r\n 'WorkoutRoute': 546}},\r\n 'Record': {'attr_counts': {'creationDate': 2672233,\r\n 'device': 2665111,\r\n 'endDate': 2672233,\r\n 'sourceName': 2672233,\r\n 'sourceVersion': 2671779,\r\n 'startDate': 2672233,\r\n 'type': 2672233,\r\n 'unit': 2650012,\r\n 'value': 2672232},\r\n 'child_counts': {'HeartRateVariabilityMetadataList': 2318,\r\n 'MetadataEntry': 287974},\r\n 'count': 2672233,\r\n 'parent_counts': {'Correlation': 2, 'HealthData': 2672231}},\r\n 'Workout': {'attr_counts': {'creationDate': 663,\r\n 'device': 230,\r\n 'duration': 663,\r\n 'durationUnit': 663,\r\n 'endDate': 663,\r\n 'sourceName': 663,\r\n 'sourceVersion': 663,\r\n 'startDate': 663,\r\n 'totalDistance': 663,\r\n 'totalDistanceUnit': 663,\r\n 'totalEnergyBurned': 663,\r\n 'totalEnergyBurnedUnit': 663,\r\n 
'workoutActivityType': 663},\r\n 'child_counts': {'MetadataEntry': 1928,\r\n 'WorkoutEvent': 2094,\r\n 'WorkoutRoute': 340},\r\n 'count': 663,\r\n 'parent_counts': {'HealthData': 663}},\r\n 'WorkoutEvent': {'attr_counts': {'date': 2094,\r\n 'duration': 837,\r\n 'durationUnit': 837,\r\n 'type': 2094},\r\n 'child_counts': {},\r\n 'count': 2094,\r\n 'parent_counts': {'Workout': 2094}},\r\n 'WorkoutRoute': {'attr_counts': {'creationDate': 340,\r\n 'endDate': 340,\r\n 'sourceName': 340,\r\n 'sourceVersion': 340,\r\n 'startDate': 340},\r\n 'child_counts': {'Location': 398683, 'MetadataEntry': 546},\r\n 'count': 340,\r\n 'parent_counts': {'Workout': 340}}}\r\n```\r\n\r\nThe most interesting bit is this:\r\n\r\n```python\r\n 'HealthData': {'attr_counts': {'locale': 1},\r\n 'child_counts': {'ActivitySummary': 980,\r\n 'Correlation': 1,\r\n 'ExportDate': 1,\r\n 'Me': 1,\r\n 'Record': 2672231,\r\n 'Workout': 663},\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 470637068, "label": "Use XML Analyser to figure out the structure of the export XML"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/2#issuecomment-513439411", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/2", "id": 513439411, "node_id": "MDEyOklzc3VlQ29tbWVudDUxMzQzOTQxMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-07-20T05:58:57Z", "updated_at": "2019-07-20T05:58:57Z", "author_association": "MEMBER", "body": "```python\r\n 'Workout': {'attr_counts': {'creationDate': 663,\r\n 'device': 230,\r\n 'duration': 663,\r\n 'durationUnit': 663,\r\n 'endDate': 663,\r\n 'sourceName': 663,\r\n 'sourceVersion': 663,\r\n 'startDate': 663,\r\n 'totalDistance': 663,\r\n 'totalDistanceUnit': 663,\r\n 'totalEnergyBurned': 663,\r\n 'totalEnergyBurnedUnit': 663,\r\n 'workoutActivityType': 663},\r\n 'child_counts': {'MetadataEntry': 1928,\r\n 'WorkoutEvent': 2094,\r\n 'WorkoutRoute': 340},\r\n 'count': 663,\r\n 'parent_counts': {'HealthData': 663}},\r\n 'WorkoutEvent': {'attr_counts': {'date': 2094,\r\n 'duration': 837,\r\n 'durationUnit': 837,\r\n 'type': 2094},\r\n 'child_counts': {},\r\n 'count': 2094,\r\n 'parent_counts': {'Workout': 2094}},\r\n 'WorkoutRoute': {'attr_counts': {'creationDate': 340,\r\n 'endDate': 340,\r\n 'sourceName': 340,\r\n 'sourceVersion': 340,\r\n 'startDate': 340},\r\n 'child_counts': {'Location': 398683, 'MetadataEntry': 546},\r\n 'count': 340,\r\n 'parent_counts': {'Workout': 340}}}\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 470637152, "label": "Import workouts"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/4#issuecomment-513440090", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/4", "id": 513440090, "node_id": "MDEyOklzc3VlQ29tbWVudDUxMzQ0MDA5MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-07-20T06:11:50Z", "updated_at": "2019-07-20T06:11:50Z", "author_association": "MEMBER", "body": "Some examples:\r\n\r\nhttps://github.com/dogsheep/healthkit-to-sqlite/blob/d016e70c31cf84ba0f5ec3102546db54a51aaffb/tests/export.xml#L4-L13", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, 
\"eyes\": 0}", "issue": {"value": 470640505, "label": "Import Records"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/5#issuecomment-513514978", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/5", "id": 513514978, "node_id": "MDEyOklzc3VlQ29tbWVudDUxMzUxNDk3OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-07-21T02:55:12Z", "updated_at": "2019-07-21T02:55:12Z", "author_association": "MEMBER", "body": "I'm going to show this by default. Users can pass `-s` or `--silent` to disable the progress bar.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 470691622, "label": "Add progress bar"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/5#issuecomment-513625406", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/5", "id": 513625406, "node_id": "MDEyOklzc3VlQ29tbWVudDUxMzYyNTQwNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-07-22T03:20:16Z", "updated_at": "2019-07-22T03:20:16Z", "author_association": "MEMBER", "body": "It now renders like this:\r\n```\r\nImporting from HealthKit [#-----------------------------------] 5% 00:01:33\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 470691622, "label": "Add progress bar"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/6#issuecomment-513626742", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/6", "id": 513626742, "node_id": "MDEyOklzc3VlQ29tbWVudDUxMzYyNjc0Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-07-22T03:28:55Z", "updated_at": "2019-07-22T03:28:55Z", "author_association": "MEMBER", "body": "Here's what it looks like now as separate tables:\r\n\r\n\"hello9_and_Populate__endpoint__key_in_ASGI_scope_\u00b7_Issue__537_\u00b7_simonw_datasette\"\r\n\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 470856782, "label": "Break up records into different tables for each type"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/7#issuecomment-514496725", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7", "id": 514496725, "node_id": "MDEyOklzc3VlQ29tbWVudDUxNDQ5NjcyNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-07-24T06:20:59Z", "updated_at": "2019-07-24T06:20:59Z", "author_association": "MEMBER", "body": "I'm using https://pypi.org/project/memory-profiler/ to explore this in more detail:\r\n\r\n```\r\n$ pip install memory-profiler matplotlib\r\n```\r\n\r\nThen:\r\n\r\n```\r\n$ mprof run healthkit-to-sqlite ~/Downloads/healthkit-export.zip healthkit.db\r\n$ mprof plot\r\n```\r\n\r\n\"Screen\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 472097220, "label": "Script uses a lot of RAM"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/7#issuecomment-514498221", "issue_url": 
"https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7", "id": 514498221, "node_id": "MDEyOklzc3VlQ29tbWVudDUxNDQ5ODIyMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-07-24T06:26:49Z", "updated_at": "2019-07-24T06:26:49Z", "author_association": "MEMBER", "body": "Adding `el.clear()` got me a huge improvement:\r\n\r\n\"Screen\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 472097220, "label": "Script uses a lot of RAM"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/7#issuecomment-514500253", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7", "id": 514500253, "node_id": "MDEyOklzc3VlQ29tbWVudDUxNDUwMDI1Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-07-24T06:34:28Z", "updated_at": "2019-07-24T06:34:28Z", "author_association": "MEMBER", "body": "Clearing the root element each time saved even more:\r\n\r\n\"Screen\r\n", "reactions": "{\"total_count\": 2, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 2, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 472097220, "label": "Script uses a lot of RAM"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/9#issuecomment-515226724", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9", "id": 515226724, "node_id": "MDEyOklzc3VlQ29tbWVudDUxNTIyNjcyNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-07-25T21:46:01Z", "updated_at": "2019-07-25T21:46:01Z", "author_association": "MEMBER", "body": "I can work around this here (prior to the fix in sqlite-utils) by setting the batch size to something a bit lower here.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 472429048, "label": "Too many SQL variables"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/9#issuecomment-515322294", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9", "id": 515322294, "node_id": "MDEyOklzc3VlQ29tbWVudDUxNTMyMjI5NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-07-26T06:07:12Z", "updated_at": "2019-07-26T06:07:12Z", "author_association": "MEMBER", "body": "@tholo this should be fixed in just-released version 0.3.2 - could you run a `pip install -U healthkit-to-sqlite` and let me know if it works for you now?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 472429048, "label": "Too many SQL variables"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/swarm-to-sqlite/issues/2#issuecomment-526701674", "issue_url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/2", "id": 526701674, "node_id": "MDEyOklzc3VlQ29tbWVudDUyNjcwMTY3NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-08-30T18:24:26Z", "updated_at": "2019-08-30T18:24:26Z", "author_association": "MEMBER", "body": "I renamed `--file` to `--load` in 0e5b6025c6f9823ff81aa8aae1cbff5c45e57baf", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, 
\"eyes\": 0}", "issue": {"value": 487598468, "label": "--save option to dump checkins to a JSON file on disk"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/swarm-to-sqlite/issues/4#issuecomment-526853542", "issue_url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/4", "id": 526853542, "node_id": "MDEyOklzc3VlQ29tbWVudDUyNjg1MzU0Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-08-31T18:06:32Z", "updated_at": "2019-08-31T18:06:32Z", "author_association": "MEMBER", "body": "https://your-foursquare-oauth-token.glitch.me/\r\n\r\nSource code: https://glitch.com/~your-foursquare-oauth-token", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 487601121, "label": "Online tool for getting a Foursquare OAuth token"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-527200332", "issue_url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3", "id": 527200332, "node_id": "MDEyOklzc3VlQ29tbWVudDUyNzIwMDMzMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-09-02T16:32:20Z", "updated_at": "2019-09-02T16:32:39Z", "author_association": "MEMBER", "body": "Also needed: an option for \"fetch all checkins created within the last X days\".\r\n\r\nThis should help provide support for that Swarm feature where you can retroactively checkin to places in the past.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 487600595, "label": "Option to fetch only checkins more recent than the current max checkin"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/4#issuecomment-527682713", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4", "id": 527682713, "node_id": "MDEyOklzc3VlQ29tbWVudDUyNzY4MjcxMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-09-03T23:48:57Z", "updated_at": "2019-09-03T23:48:57Z", "author_association": "MEMBER", "body": "One interesting challenge here is that the JSON format for tweets in the archive is subtly different from the JSON format currently returned by the API.\r\n\r\nIf we want to keep the tweets in the same database table (which feels like the right thing to me) we'll need to handle this.\r\n\r\nOne thing we can do is have a column for `from_archive` which is set to 1 for tweets that were recovered from the archive.\r\n\r\nWe can also ensure that tweets from the API always over-write the version that came from the archive (using `.upsert()`) while tweets from the archive use `.insert(..., ignore=True)` to avoid over-writing a better version that came from the API.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 488835586, "label": "Command for importing data from a Twitter Export file"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/5#issuecomment-527684202", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/5", "id": 527684202, "node_id": "MDEyOklzc3VlQ29tbWVudDUyNzY4NDIwMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-09-03T23:56:28Z", "updated_at": "2019-09-03T23:56:28Z", 
"author_association": "MEMBER", "body": "I previously used betamax here: https://github.com/simonw/github-contents/blob/master/test_github_contents.py", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 488874815, "label": "Write tests that simulate the Twitter API"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/2#issuecomment-527954898", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2", "id": 527954898, "node_id": "MDEyOklzc3VlQ29tbWVudDUyNzk1NDg5OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-09-04T15:31:46Z", "updated_at": "2019-09-04T15:31:46Z", "author_association": "MEMBER", "body": "I'm going to call this `twitter-to-sqlite user-timeline` to reflect the language used to describe the API endpoint: https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 488833698, "label": "\"twitter-to-sqlite user-timeline\" command for pulling tweets by a specific user"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/2#issuecomment-527955302", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2", "id": 527955302, "node_id": "MDEyOklzc3VlQ29tbWVudDUyNzk1NTMwMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-09-04T15:32:39Z", "updated_at": "2019-09-04T15:32:39Z", "author_association": "MEMBER", "body": "Rate limit is 900 / 15 minutes which is 1 call per second.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 488833698, "label": "\"twitter-to-sqlite user-timeline\" command for pulling tweets by a specific user"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/2#issuecomment-527990908", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2", "id": 527990908, "node_id": "MDEyOklzc3VlQ29tbWVudDUyNzk5MDkwOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-09-04T16:57:24Z", "updated_at": "2019-09-04T16:57:24Z", "author_association": "MEMBER", "body": "I just tried this using `max_id=` pagination as described in [Working with timelines](https://developer.twitter.com/en/docs/tweets/timelines/guides/working-with-timelines) and I got back all 17,759 of my tweets.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 488833698, "label": "\"twitter-to-sqlite user-timeline\" command for pulling tweets by a specific user"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/8#issuecomment-529239307", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8", "id": 529239307, "node_id": "MDEyOklzc3VlQ29tbWVudDUyOTIzOTMwNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-09-08T20:36:49Z", "updated_at": "2019-09-08T20:36:49Z", "author_association": "MEMBER", "body": "`--attach` can optionally take a name for the database connection alias like this:\r\n\r\n $ 
twitter-to-sqlite users-lookup users.db --attach foo:attending.db ...\r\n\r\nIf you omit the `alias:` bit the stem of the database (without the file extension) will be used.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 490803176, "label": "--sql and --attach options for feeding commands from SQL queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/8#issuecomment-529240286", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8", "id": 529240286, "node_id": "MDEyOklzc3VlQ29tbWVudDUyOTI0MDI4Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-09-08T20:48:33Z", "updated_at": "2019-09-08T20:48:33Z", "author_association": "MEMBER", "body": "```ATTACH DATABASE \"file:blah.db?mode=ro\" AS foo```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 490803176, "label": "--sql and --attach options for feeding commands from SQL queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/9#issuecomment-530028567", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/9", "id": 530028567, "node_id": "MDEyOklzc3VlQ29tbWVudDUzMDAyODU2Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-09-10T16:59:25Z", "updated_at": "2019-09-10T16:59:25Z", "author_association": "MEMBER", "body": "By default in SQLite foreign key constraints are not enforced (you need to run `PRAGMA foreign_keys = ON;` to enforce them).\r\n\r\nWe will take advantage of this - even though the `following` table has foreign keys against user we will allow IDs to populate that table without a corresponding user record.\r\n\r\nIn the future we may add a command that can backfill missing user records.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 491791152, "label": "followers-ids and friends-ids subcommands"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/8#issuecomment-530417631", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8", "id": 530417631, "node_id": "MDEyOklzc3VlQ29tbWVudDUzMDQxNzYzMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-09-11T14:52:44Z", "updated_at": "2019-09-14T19:09:22Z", "author_association": "MEMBER", "body": "- [x] This needs documentation.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 490803176, "label": "--sql and --attach options for feeding commands from SQL queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/8#issuecomment-531404891", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8", "id": 531404891, "node_id": "MDEyOklzc3VlQ29tbWVudDUzMTQwNDg5MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-09-13T22:01:57Z", "updated_at": "2019-09-13T22:01:57Z", "author_association": "MEMBER", "body": "I also wrote about this in https://simonwillison.net/2019/Sep/13/weeknotestwitter-sqlite-datasette-rure/", "reactions": "{\"total_count\": 0, 
\"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 490803176, "label": "--sql and --attach options for feeding commands from SQL queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/3#issuecomment-531516956", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/3", "id": 531516956, "node_id": "MDEyOklzc3VlQ29tbWVudDUzMTUxNjk1Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-09-14T21:56:31Z", "updated_at": "2019-09-14T21:56:31Z", "author_association": "MEMBER", "body": "https://api.github.com/users/simonw/repos\r\n\r\nIt would be useful to be able to fetch stargazers, forks etc as well. Not sure if that should be a separate command or a `--stargazers` option to this command.\r\n\r\nProbably a separate command since `issues` is a separate command already.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 493670426, "label": "Command to fetch all repos belonging to a user or organization"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/3#issuecomment-531517083", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/3", "id": 531517083, "node_id": "MDEyOklzc3VlQ29tbWVudDUzMTUxNzA4Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-09-14T21:58:42Z", "updated_at": "2019-09-14T21:58:42Z", "author_association": "MEMBER", "body": "Split stargazers into #4", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 493670426, "label": "Command to fetch all repos belonging to a user or organization"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/4#issuecomment-531517138", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4", "id": 531517138, "node_id": "MDEyOklzc3VlQ29tbWVudDUzMTUxNzEzOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-09-14T21:59:59Z", "updated_at": "2019-09-14T21:59:59Z", "author_association": "MEMBER", "body": "Paginate through https://api.github.com/repos/simonw/datasette/stargazers\r\n\r\nSend `Accept: application/vnd.github.v3.star+json` to get the `starred_at` dates.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 493670730, "label": "Command to fetch stargazers for one or more repos"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/11#issuecomment-538711918", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/11", "id": 538711918, "node_id": "MDEyOklzc3VlQ29tbWVudDUzODcxMTkxOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-06T04:54:17Z", "updated_at": "2019-10-06T04:54:17Z", "author_association": "MEMBER", "body": "Shipped in 0.6. 
Here's the documentation: https://github.com/dogsheep/twitter-to-sqlite#capturing-tweets-in-real-time-with-track-and-follow", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 503045221, "label": "Commands for recording real-time tweets from the streaming API"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/13#issuecomment-538804815", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/13", "id": 538804815, "node_id": "MDEyOklzc3VlQ29tbWVudDUzODgwNDgxNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-07T00:33:49Z", "updated_at": "2019-10-07T00:33:49Z", "author_association": "MEMBER", "body": "Documentation: https://github.com/dogsheep/twitter-to-sqlite#retrieve-tweets-in-bulk", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 503085013, "label": "statuses-lookup command"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/pocket-to-sqlite/issues/1#issuecomment-538847446", "issue_url": "https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/1", "id": 538847446, "node_id": "MDEyOklzc3VlQ29tbWVudDUzODg0NzQ0Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-07T05:41:17Z", "updated_at": "2019-10-07T05:41:17Z", "author_association": "MEMBER", "body": "Prototype code:\r\n```python\r\noffset = 0\r\nfetched = []\r\nsize = 500\r\nwhile True:\r\n page = requests.get(\"https://getpocket.com/v3/get\", {\r\n \"consumer_key\": consumer_key,\r\n \"access_token\": access_token,\r\n \"sort\": \"oldest\",\r\n \"detailType\": \"complete\",\r\n \"count\": size,\r\n \"offset\": offset,\r\n }).json()\r\n print(offset)\r\n fetched.append(page)\r\n offset += size\r\n if not len(page[\"list\"]):\r\n break\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 503233021, "label": "Use better pagination (and implement progress bar)"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/pocket-to-sqlite/issues/2#issuecomment-538847796", "issue_url": "https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/2", "id": 538847796, "node_id": "MDEyOklzc3VlQ29tbWVudDUzODg0Nzc5Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-07T05:43:30Z", "updated_at": "2019-10-07T05:43:30Z", "author_association": "MEMBER", "body": "We can persist the `since` value in its own single-row table.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 503234169, "label": "Track and use the 'since' value"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/4#issuecomment-540879620", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4", "id": 540879620, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MDg3OTYyMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-11T02:59:16Z", "updated_at": "2019-10-11T02:59:16Z", "author_association": "MEMBER", "body": "Also import ad preferences and all that other junk.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, 
\"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 488835586, "label": "Command for importing data from a Twitter Export file"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/17#issuecomment-541112108", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17", "id": 541112108, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MTExMjEwOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-11T15:30:15Z", "updated_at": "2019-10-11T15:30:15Z", "author_association": "MEMBER", "body": "It should delete the tables entirely. That way it will work even if the table schema has changed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 505674949, "label": "import command should empty all archive-* tables first"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/17#issuecomment-541112588", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17", "id": 541112588, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MTExMjU4OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-11T15:31:30Z", "updated_at": "2019-10-11T15:31:30Z", "author_association": "MEMBER", "body": "No need for an option:\r\n\r\n> This command will delete and recreate all of your `archive-*` tables every time you run it. If this is not what you want, run the command against a fresh SQLite database rather than running it again one that already exists.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 505674949, "label": "import command should empty all archive-* tables first"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541118773", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18", "id": 541118773, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MTExODc3Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-11T15:48:31Z", "updated_at": "2019-10-11T15:48:31Z", "author_association": "MEMBER", "body": "https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-home_timeline", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 505928530, "label": "Command to import home-timeline"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541118934", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18", "id": 541118934, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MTExODkzNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-11T15:48:54Z", "updated_at": "2019-10-11T15:48:54Z", "author_association": "MEMBER", "body": "Rate limit is tight: 15 requests every 15 mins!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 505928530, "label": "Command to import home-timeline"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541119834", "issue_url": 
"https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18", "id": 541119834, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MTExOTgzNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-11T15:51:22Z", "updated_at": "2019-10-11T16:51:33Z", "author_association": "MEMBER", "body": "In order to support multiple user timelines being saved in the same database, I'm going to import the tweets into the `tweets` table AND add a new `timeline_tweets` table recording that a specific tweet showed up in a specific user's timeline.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 505928530, "label": "Command to import home-timeline"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541141169", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18", "id": 541141169, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MTE0MTE2OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-11T16:51:29Z", "updated_at": "2019-10-11T16:51:29Z", "author_association": "MEMBER", "body": "Documented here: https://github.com/dogsheep/twitter-to-sqlite/blob/master/README.md#retrieving-tweets-from-your-home-timeline", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 505928530, "label": "Command to import home-timeline"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/19#issuecomment-541248629", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19", "id": 541248629, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MTI0ODYyOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-11T22:48:56Z", "updated_at": "2019-10-11T22:48:56Z", "author_association": "MEMBER", "body": "`since_id` documented here: https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-home_timeline\r\n\r\n> Returns results with an ID greater than (that is, more recent than) the specified ID. There are limits to the number of Tweets which can be accessed through the API. 
If the limit of Tweets has occured since the since_id, the since_id will be forced to the oldest ID available.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 506087267, "label": "since_id support for home-timeline"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/6#issuecomment-541387822", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6", "id": 541387822, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MTM4NzgyMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-13T05:27:39Z", "updated_at": "2019-10-13T05:27:39Z", "author_association": "MEMBER", "body": "This should be fixed by https://github.com/dogsheep/github-to-sqlite/commit/552543a74970f8a3a3f87f887be23a0c6eb1cb5b", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 504238461, "label": "sqlite3.OperationalError: table users has no column named bio"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/6#issuecomment-541387941", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6", "id": 541387941, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MTM4Nzk0MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-13T05:30:19Z", "updated_at": "2019-10-13T05:30:19Z", "author_association": "MEMBER", "body": "Fix released in 0.5: https://github.com/dogsheep/github-to-sqlite/releases/tag/0.5", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 504238461, "label": "sqlite3.OperationalError: table users has no column named bio"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-541388038", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20", "id": 541388038, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MTM4ODAzOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-13T05:31:58Z", "updated_at": "2019-10-13T05:31:58Z", "author_association": "MEMBER", "body": "For favourites a `--stop_after=200` option is probably good enough.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 506268945, "label": "--since support for various commands for refresh-by-cron"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-541493242", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3", "id": 541493242, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MTQ5MzI0Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-14T03:35:36Z", "updated_at": "2019-10-14T03:35:36Z", "author_association": "MEMBER", "body": "https://developer.twitter.com/en/docs/tweets/search/api-reference/get-search-tweets\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 488833975, "label": "Command for running a search and saving tweets for that search"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/dogsheep/github-to-sqlite/issues/7#issuecomment-541721437", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/7", "id": 541721437, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MTcyMTQzNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-14T14:44:12Z", "updated_at": "2019-10-14T14:44:12Z", "author_association": "MEMBER", "body": "Docs: https://github.com/dogsheep/github-to-sqlite/blob/0.5/README.md#retrieving-issue-comments-for-a-repository", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 506276893, "label": "issue-comments command for importing issue comments"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/10#issuecomment-541748580", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10", "id": 541748580, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MTc0ODU4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-14T15:30:44Z", "updated_at": "2019-10-14T15:30:44Z", "author_association": "MEMBER", "body": "Had several recommendations for https://github.com/tqdm/tqdm which is what goodreads-to-sqlite uses.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 492297930, "label": "Rethink progress bars for various commands"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/21#issuecomment-542333836", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/21", "id": 542333836, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MjMzMzgzNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-15T18:00:48Z", "updated_at": "2019-10-15T18:00:48Z", "author_association": "MEMBER", "body": "I'll use `html.unescape()` for this: https://docs.python.org/3/library/html.html#html.unescape", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 506432572, "label": "Fix & escapes in tweet text"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/19#issuecomment-542832952", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19", "id": 542832952, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MjgzMjk1Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-16T18:30:11Z", "updated_at": "2019-10-16T18:30:11Z", "author_association": "MEMBER", "body": "The `--since` option will derive the `since_id` from the max ID in the `timeline_tweets` table:\r\n\r\n $ twitter-to-sqlite home-timeline --since\r\n\r\nThe `--since_id=xxx` option lets you specify that ID directly.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 506087267, "label": "since_id support for home-timeline"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/19#issuecomment-542849963", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19", "id": 542849963, "node_id": "MDEyOklzc3VlQ29tbWVudDU0Mjg0OTk2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-16T19:13:06Z", "updated_at": 
"2019-10-16T19:13:06Z", "author_association": "MEMBER", "body": "Updated documentation: https://github.com/dogsheep/twitter-to-sqlite/blob/fced2a9b67d2cbdf9817f1eb75f7c28e413c963b/README.md#retrieving-tweets-from-your-home-timeline", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 506087267, "label": "since_id support for home-timeline"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-542854749", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20", "id": 542854749, "node_id": "MDEyOklzc3VlQ29tbWVudDU0Mjg1NDc0OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-16T19:26:01Z", "updated_at": "2019-10-16T19:26:01Z", "author_association": "MEMBER", "body": "I'm not going to do this for \"accounts that have followed me\" and \"new accounts that I have followed\" - instead I will recommend running the `friend_ids` and `followers_ids` commands on a daily basis since that data doesn't really change much by the hour.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 506268945, "label": "--since support for various commands for refresh-by-cron"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/12#issuecomment-542855081", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12", "id": 542855081, "node_id": "MDEyOklzc3VlQ29tbWVudDU0Mjg1NTA4MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-16T19:26:56Z", "updated_at": "2019-10-16T19:26:56Z", "author_association": "MEMBER", "body": "This may be the first case where I want to be able to repair existing databases rather than discarding their contents.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 503053800, "label": "Extract \"source\" into a separate lookup table"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/12#issuecomment-542855427", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12", "id": 542855427, "node_id": "MDEyOklzc3VlQ29tbWVudDU0Mjg1NTQyNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-16T19:27:55Z", "updated_at": "2019-10-16T19:27:55Z", "author_association": "MEMBER", "body": "I can do that by keeping `source` as a `TEXT` column but turning it into a non-enforced foreign key against a new `sources` table. 
Then I can run code that scans that column for any values beginning with a `<` and converts them.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 503053800, "label": "Extract \"source\" into a separate lookup table"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/12#issuecomment-542858025", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12", "id": 542858025, "node_id": "MDEyOklzc3VlQ29tbWVudDU0Mjg1ODAyNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-16T19:35:31Z", "updated_at": "2019-10-16T19:36:09Z", "author_association": "MEMBER", "body": "Maybe this means I need an `upgrade` command to apply these kinds of migrations? Total feature creep!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 503053800, "label": "Extract \"source\" into a separate lookup table"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-542875885", "issue_url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3", "id": 542875885, "node_id": "MDEyOklzc3VlQ29tbWVudDU0Mjg3NTg4NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-16T20:23:08Z", "updated_at": "2019-10-16T20:23:08Z", "author_association": "MEMBER", "body": "https://developer.foursquare.com/docs/api/users/checkins documents `afterTimestamp`:\r\n> Retrieve the first results to follow these seconds since epoch. This should be useful for paging forward in time, or when polling for changes. 
To avoid missing results when polling, we recommend subtracting several seconds from the last poll time and then de-duplicating.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 487600595, "label": "Option to fetch only checkins more recent than the current max checkin"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-542876047", "issue_url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3", "id": 542876047, "node_id": "MDEyOklzc3VlQ29tbWVudDU0Mjg3NjA0Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-16T20:23:36Z", "updated_at": "2019-10-16T20:23:36Z", "author_association": "MEMBER", "body": "I'm going to go with `--since=1d/2w/3h` for this.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 487600595, "label": "Option to fetch only checkins more recent than the current max checkin"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-542882604", "issue_url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3", "id": 542882604, "node_id": "MDEyOklzc3VlQ29tbWVudDU0Mjg4MjYwNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-16T20:41:23Z", "updated_at": "2019-10-16T20:41:23Z", "author_association": "MEMBER", "body": "Documented here: https://github.com/dogsheep/swarm-to-sqlite/blob/0.2/README.md#usage", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 487600595, "label": "Option to fetch only checkins more recent than the current max checkin"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/23#issuecomment-543217890", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/23", "id": 543217890, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MzIxNzg5MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-17T15:03:10Z", "updated_at": "2019-10-17T15:03:10Z", "author_association": "MEMBER", "body": "Thinking about this further: the concept of migrations may end up being in direct conflict with the `sqlite-utils` concept of creating tables on demand the first time they are used - and of creating table schemas automatically to fit the shape of the JSON that is being inserted into them.\r\n\r\nI'm going to forge ahead anyway and build this because I think it will be an interesting exploration, but it's very likely this turns out to be a bad idea in the long run!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 508190730, "label": "Extremely simple migration system"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/23#issuecomment-543222239", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/23", "id": 543222239, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MzIyMjIzOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-17T15:12:33Z", "updated_at": "2019-10-17T15:12:33Z", "author_association": "MEMBER", "body": "Migrations will run only if you open a database that 
previously existed (as opposed to opening a brand new empty database).\r\n\r\nThis means that the first time you run a command against a fresh database, migrations will not run and the `migrations` table will not be created. The _second_ time you run any command against that database the migrations will execute and populate the `migrations` table.\r\n\r\nThis also means that each migration needs to be able to sanity check the database to see if it should run or not. If it should NOT run, it will do nothing but still be marked as having executed by adding to the `migrations` table.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 508190730, "label": "Extremely simple migration system"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/25#issuecomment-543265058", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25", "id": 543265058, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MzI2NTA1OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-17T16:51:12Z", "updated_at": "2019-10-17T16:51:12Z", "author_association": "MEMBER", "body": "This migration function only runs if there is a table called `tweets` and the migration has not run before.\r\n\r\nI think this can happen if the database has just been freshly created (by a command that fetches the user's user timeline for example) and is then run a SECOND time.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 508578780, "label": "Ensure migrations don't accidentally create foreign key twice"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/25#issuecomment-543266947", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25", "id": 543266947, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MzI2Njk0Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-17T16:56:06Z", "updated_at": "2019-10-17T16:56:06Z", "author_association": "MEMBER", "body": "I wrote a test that proves that this is a problem. Should be an easy fix though.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 508578780, "label": "Ensure migrations don't accidentally create foreign key twice"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/10#issuecomment-543269396", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10", "id": 543269396, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MzI2OTM5Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-17T17:02:07Z", "updated_at": "2019-10-17T17:02:07Z", "author_association": "MEMBER", "body": "A neat trick that Click does is detecting if an interactive terminal is attached and NOT showing a progress bar if there isn't one. 
Need to figure out how to do that with tqdm.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 492297930, "label": "Rethink progress bars for various commands"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/10#issuecomment-543270714", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10", "id": 543270714, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MzI3MDcxNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-17T17:05:16Z", "updated_at": "2019-10-17T17:05:16Z", "author_association": "MEMBER", "body": "https://github.com/pallets/click/blob/716a5be90f56ce6cd506bb53d5739d09374b1636/click/_termui_impl.py#L93 is how Click does this:\r\n```\r\n self.is_hidden = not isatty(self.file)\r\n```\r\nWhere `isatty` is a Click utility function: `from ._compat import isatty`", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 492297930, "label": "Rethink progress bars for various commands"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/10#issuecomment-543271000", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10", "id": 543271000, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MzI3MTAwMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-17T17:05:59Z", "updated_at": "2019-10-17T17:05:59Z", "author_association": "MEMBER", "body": "Looks like tqdm already does a TTY check here: https://github.com/tqdm/tqdm/blob/89b73bdc30c099c5b53725806e7edf3a121c9b3a/tqdm/std.py#L889-L890", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 492297930, "label": "Rethink progress bars for various commands"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-543273540", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3", "id": 543273540, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MzI3MzU0MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-17T17:12:51Z", "updated_at": "2019-10-17T17:12:51Z", "author_association": "MEMBER", "body": "Just importing tweets here isn't enough - how are we supposed to know which tweets were imported by which search?\r\n\r\nSo I think the right thing to do here is to also create a `search_runs` table, which records each individual run of this tool (with a timestamp and the search terms used). 
Then have a `search_runs_tweets` m2m table which shows which Tweets were found by that search.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 488833975, "label": "Command for running a search and saving tweets for that search"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-543290744", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3", "id": 543290744, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MzI5MDc0NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-17T17:57:14Z", "updated_at": "2019-10-17T17:57:14Z", "author_association": "MEMBER", "body": "I have a working command now. I'm going to ship it early because it could do with some other people trying it out.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 488833975, "label": "Command for running a search and saving tweets for that search"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-544335363", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20", "id": 544335363, "node_id": "MDEyOklzc3VlQ29tbWVudDU0NDMzNTM2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-21T03:32:04Z", "updated_at": "2019-10-21T03:32:04Z", "author_association": "MEMBER", "body": "In case anyone is interested, here's an extract from the crontab I'm running these under at the moment:\r\n```\r\n1,11,21,31,41,51 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite user-timeline /home/ubuntu/twitter.db -a /home/ubuntu/auth.json --since\r\n2,7,12,17,22,27,32,37,42,47,52,57 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite home-timeline /home/ubuntu/timeline.db -a /home/ubuntu/auth.json --since\r\n6,16,26,36,46,56 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite favorites /home/ubuntu/twitter.db -a /home/ubuntu/auth.json --stop_after=50\r\n```", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 506268945, "label": "--since support for various commands for refresh-by-cron"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-544646516", "issue_url": "https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1", "id": 544646516, "node_id": "MDEyOklzc3VlQ29tbWVudDU0NDY0NjUxNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-21T18:30:14Z", "updated_at": "2019-10-21T18:30:14Z", "author_association": "MEMBER", "body": "Thanks to help from Dr. 
Laura Cantino at Science Hack Day San Francisco I've been able to pull together this query:\r\n\r\n```sql\r\nselect rsid, genotype, case genotype\r\n when 'AA' then 'brown eye color, 80% of the time'\r\n when 'AG' then 'brown eye color'\r\n when 'GG' then 'blue eye color, 99% of the time'\r\nend as interpretation from genome where rsid = 'rs12913832'\r\n```\r\n\r\nSee also https://www.snpedia.com/index.php/Rs12913832 - in particular this table:\r\n\r\n\"rs12913832_-_SNPedia\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 496415321, "label": "Figure out some interesting example SQL queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-544648863", "issue_url": "https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1", "id": 544648863, "node_id": "MDEyOklzc3VlQ29tbWVudDU0NDY0ODg2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-21T18:36:03Z", "updated_at": "2019-10-21T18:36:03Z", "author_association": "MEMBER", "body": "\"natalie__select_rsid__genotype__case_genotype_when__AA__then__brown_eye_color__80__of_the_time__when__AG__then__brown_eye_color__when__GG__then__blue_eye_color__99__of_the_time__end_as_interpretation_from_genome_where_rsid____rs12913832__an\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 496415321, "label": "Figure out some interesting example SQL queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/26#issuecomment-547713287", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/26", "id": 547713287, "node_id": "MDEyOklzc3VlQ29tbWVudDU0NzcxMzI4Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-30T02:36:13Z", "updated_at": "2019-10-30T02:36:13Z", "author_association": "MEMBER", "body": "Shipped this in 0.13: https://github.com/dogsheep/twitter-to-sqlite/releases/tag/0.13\r\n\r\nSee also this Twitter thread: https://twitter.com/simonw/status/1189369677509623809", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 513074501, "label": "Command for importing mentions timeline"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/pull/8#issuecomment-549094195", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8", "id": 549094195, "node_id": "MDEyOklzc3VlQ29tbWVudDU0OTA5NDE5NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-03T00:43:16Z", "updated_at": "2019-11-03T00:43:28Z", "author_association": "MEMBER", "body": "Also need to take #5 into account - if this command creates incomplete user records, how do we repair them?\r\n\r\nAnd make sure that if we run this command first any future commands that populate users don't break (probably just a case of using `alter=True` in a few places).", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 516763727, "label": "stargazers command, refs #4"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/dogsheep/github-to-sqlite/issues/5#issuecomment-549094229", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/5", "id": 549094229, "node_id": "MDEyOklzc3VlQ29tbWVudDU0OTA5NDIyOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-03T00:44:03Z", "updated_at": "2019-11-03T00:44:03Z", "author_association": "MEMBER", "body": "Might not need an incomplete boolean - may be possible to handle this with `alter=True` and then by filtering for users with null values in certain columns.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 493671014, "label": "Add \"incomplete\" boolean to users table for incomplete profiles"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/27#issuecomment-549095217", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27", "id": 549095217, "node_id": "MDEyOklzc3VlQ29tbWVudDU0OTA5NTIxNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-03T01:06:25Z", "updated_at": "2019-11-03T01:06:25Z", "author_association": "MEMBER", "body": "Wow, that `retweets_of_me` endpoint is almost completely useless:\r\n```\r\n$ twitter-to-sqlite fetch https://api.twitter.com/1.1/statuses/retweets_of_me.json\r\n```\r\nIt returns my own tweets that have been retweeted, but with no indication at all of who retweeted them.\r\n\r\nIt looks like this needs to be combined with this API - https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-retweets-id - to fetch the details of up to 100 recent users who actually DID retweet an individual status. But that has a one-every-12-seconds rate limit on it.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 514459062, "label": "retweets-of-me command"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/27#issuecomment-549095317", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27", "id": 549095317, "node_id": "MDEyOklzc3VlQ29tbWVudDU0OTA5NTMxNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-03T01:08:10Z", "updated_at": "2019-11-03T01:08:10Z", "author_association": "MEMBER", "body": "Hmm... one thing that could be useful is that `retweets_of_me` can support a `--since` parameter - so if run frequently it should hopefully let us know which tweets we would need to run `statuses/retweets/:id.json` against.\r\n\r\nI'm not sure if the `--since` parameter would show me a tweet that was previously retweeted but has now been retweeted again. 
I'll have a bit of a test and see.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 514459062, "label": "retweets-of-me command"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/27#issuecomment-549095463", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27", "id": 549095463, "node_id": "MDEyOklzc3VlQ29tbWVudDU0OTA5NTQ2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-03T01:10:52Z", "updated_at": "2019-11-03T01:10:52Z", "author_association": "MEMBER", "body": "I imagine it won't, since the data I would be recording and then passing to `since_id` would be the highest ID of my own tweets that have been retweeted at least once. So it won't be able to spot if I should check for fresh retweets of a given tweet.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 514459062, "label": "retweets-of-me command"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/27#issuecomment-549095641", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27", "id": 549095641, "node_id": "MDEyOklzc3VlQ29tbWVudDU0OTA5NTY0MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-03T01:12:58Z", "updated_at": "2019-11-03T01:12:58Z", "author_association": "MEMBER", "body": "It looks like Twitter really want you to subscribe to a premium API for this kind of thing and consume retweets via webhooks: https://developer.twitter.com/en/docs/accounts-and-users/subscribe-account-activity/api-reference\r\n\r\nI'm going to give up on this for the moment.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 514459062, "label": "retweets-of-me command"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-549096321", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3", "id": 549096321, "node_id": "MDEyOklzc3VlQ29tbWVudDU0OTA5NjMyMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-03T01:27:55Z", "updated_at": "2019-11-03T01:28:17Z", "author_association": "MEMBER", "body": "It would be neat if this could support `--since`, with that argument automatically finding the maximum tweet ID from a previous search that used the same exact arguments (using the `search_runs` table).", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 488833975, "label": "Command for running a search and saving tweets for that search"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-549226399", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3", "id": 549226399, "node_id": "MDEyOklzc3VlQ29tbWVudDU0OTIyNjM5OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-04T05:11:57Z", "updated_at": "2019-11-04T05:11:57Z", "author_association": "MEMBER", "body": "I'm going to add a `hash` column to `search_runs` to support that. 
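A sketch of how that hash might be computed, assuming the search arguments arrive as a plain dict; the exact scheme is spelled out in the next sentence, and the function name here is purely illustrative:

```python
import hashlib
import json


def search_args_hash(args):
    # sha1 of the key-ordered JSON serialization of the search arguments,
    # so the same search always produces the same hash.
    encoded = json.dumps(args, sort_keys=True, separators=(",", ":")).encode("utf-8")
    return hashlib.sha1(encoded).hexdigest()


# Argument order does not matter: both calls print the same hash.
print(search_args_hash({"q": "datasette", "lang": "en"}))
print(search_args_hash({"lang": "en", "q": "datasette"}))
```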
It's going to be the sha1 hash of the key-ordered JSON of the search arguments used by that run. Then `--since` can look for an identical hash and use it to identify the highest last fetched tweet to use in `since_id`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 488833975, "label": "Command for running a search and saving tweets for that search"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/3#issuecomment-549228535", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3", "id": 549228535, "node_id": "MDEyOklzc3VlQ29tbWVudDU0OTIyODUzNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-04T05:31:55Z", "updated_at": "2019-11-04T05:31:55Z", "author_association": "MEMBER", "body": "Documented here: https://github.com/dogsheep/twitter-to-sqlite/blob/801c0c2daf17d8abce9dcb5d8d610410e7e25dbe/README.md#running-searches", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 488833975, "label": "Command for running a search and saving tweets for that search"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/10#issuecomment-549230337", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/10", "id": 549230337, "node_id": "MDEyOklzc3VlQ29tbWVudDU0OTIzMDMzNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-04T05:47:18Z", "updated_at": "2019-11-04T05:47:18Z", "author_association": "MEMBER", "body": "This definition isn't quite right - it's not pulling the identity of the user who starred the repo (`users.login` ends up being the owner login instead).", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 516967682, "label": "Add this repos_starred view"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/pull/8#issuecomment-549230583", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8", "id": 549230583, "node_id": "MDEyOklzc3VlQ29tbWVudDU0OTIzMDU4Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-04T05:49:26Z", "updated_at": "2019-11-04T05:49:26Z", "author_association": "MEMBER", "body": "Adding the view from #10 would be useful here too.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 516763727, "label": "stargazers command, refs #4"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/pull/8#issuecomment-549233778", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8", "id": 549233778, "node_id": "MDEyOklzc3VlQ29tbWVudDU0OTIzMzc3OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-04T06:14:40Z", "updated_at": "2019-11-04T06:14:40Z", "author_association": "MEMBER", "body": "Spotted a tricky problem: running `github-to-sqlite starred stargazers.db` results in an incomplete `simonw` record. 
It creates a proper record for me thanks to this bit:\r\n\r\nhttps://github.com/dogsheep/github-to-sqlite/blob/ea07274667a08c67907e8bfbbccb6f0fb95ce817/github_to_sqlite/cli.py#L120-L126\r\n\r\nBut then... when it gets to the `datasette` repository which I have starred it over-writes my full user record with one that's missing most of the details, thanks to this bit:\r\n\r\nhttps://github.com/dogsheep/github-to-sqlite/blob/ea07274667a08c67907e8bfbbccb6f0fb95ce817/github_to_sqlite/utils.py#L117-L124\r\n\r\nI need to find a way of NOT over-writing a good record with a thinner one.\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 516763727, "label": "stargazers command, refs #4"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/4#issuecomment-550388354", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4", "id": 550388354, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MDM4ODM1NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-06T16:26:55Z", "updated_at": "2019-11-06T16:26:55Z", "author_association": "MEMBER", "body": "Here's a query I figured out using a window function that shows cumulative stargazers over time:\r\n```sql\r\nselect\r\n yyyymmdd,\r\n sum(n) over (\r\n order by\r\n yyyymmdd rows unbounded preceding\r\n ) as cumulative_count\r\nfrom\r\n (\r\n select\r\n substr(starred_at, 0, 11) as yyyymmdd,\r\n count(*) as n\r\n from\r\n stars\r\n group by\r\n yyyymmdd\r\n )\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 493670730, "label": "Command to fetch stargazers for one or more repos"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/10#issuecomment-550783316", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10", "id": 550783316, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MDc4MzMxNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-07T05:16:56Z", "updated_at": "2019-11-07T05:34:29Z", "author_association": "MEMBER", "body": "It looks like Apple changed the location of these in iOS 13 - they are now in separate `.gpx` files:\r\n\r\n![2FF70E95-CDEE-4241-A5C5-EE95A862E519](https://user-images.githubusercontent.com/9599/68362042-be12e000-00da-11ea-8925-7397410332d8.png)\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 519038979, "label": "Failed to import workout points"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/10#issuecomment-550806302", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10", "id": 550806302, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MDgwNjMwMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-07T05:33:31Z", "updated_at": "2019-11-07T05:33:31Z", "author_association": "MEMBER", "body": "The XML now includes references to these new files:\r\n\r\n![CBBA54FC-51FB-4BB3-927C-C2CA99237B04](https://user-images.githubusercontent.com/9599/68362716-121ec400-00dd-11ea-9846-387c7cd64c8b.jpeg)\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, 
\"rocket\": 0, \"eyes\": 0}", "issue": {"value": 519038979, "label": "Failed to import workout points"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/10#issuecomment-550824838", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10", "id": 550824838, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MDgyNDgzOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-07T05:47:07Z", "updated_at": "2019-11-07T05:47:07Z", "author_association": "MEMBER", "body": "Relevant code:\r\n\r\nhttps://github.com/dogsheep/healthkit-to-sqlite/blob/d16f45f06fbae6ec8a78cc9ca7b5b7db0413f139/healthkit_to_sqlite/utils.py#L58-L64", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 519038979, "label": "Failed to import workout points"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/healthkit-to-sqlite/issues/10#issuecomment-550828084", "issue_url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10", "id": 550828084, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MDgyODA4NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-07T05:49:24Z", "updated_at": "2019-11-07T05:49:24Z", "author_association": "MEMBER", "body": "So the fix there is going to be to detect the new `FileReference` element and load the corresponding points data from it.\r\n\r\nThis will be a little tricky because that function will need access to the zip file.\r\n\r\nIt probably won't work at all for the mode where the `export.xml` file is passed directly using the `--xml` option.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 519038979, "label": "Failed to import workout points"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/29#issuecomment-552129686", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29", "id": 552129686, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MjEyOTY4Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-09T19:27:39Z", "updated_at": "2019-11-09T19:27:39Z", "author_association": "MEMBER", "body": "I think this is fixed by the latest version of `sqlite-utils` - https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-12-1 - I'll bump the dependency.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 518725064, "label": "`import` command fails on empty files"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/29#issuecomment-552129921", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29", "id": 552129921, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MjEyOTkyMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-09T19:30:42Z", "updated_at": "2019-11-09T19:30:42Z", "author_association": "MEMBER", "body": "Confirmed, that seems to fix it:\r\n```\r\n(twitter-to-sqlite) ~/Dropbox/Development/twitter-to-sqlite $ twitter-to-sqlite import blah.db ~/Dropbox/dogsheep/twitter-2019-06-25-b31f246100821b551f2f9a23f21ac6fb565dab49dd23a35630cabbf2b94a1f03/account-suspension.js \r\nTraceback (most recent call last):\r\n File 
\"/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite\", line 11, in \r\n load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')()\r\n File \"/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py\", line 764, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py\", line 717, in main\r\n rv = self.invoke(ctx)\r\n File \"/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py\", line 1137, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py\", line 956, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py\", line 555, in invoke\r\n return callback(*args, **kwargs)\r\n File \"/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py\", line 633, in import_\r\n archive.import_from_file(db, path.name, open(path, \"rb\").read())\r\n File \"/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/archive.py\", line 224, in import_from_file\r\n db[table_name].upsert_all(rows, hash_id=\"pk\")\r\n File \"/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py\", line 1094, in upsert_all\r\n extracts=extracts,\r\n File \"/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py\", line 961, in insert_all\r\n first_record = next(records)\r\nStopIteration\r\n(twitter-to-sqlite) ~/Dropbox/Development/twitter-to-sqlite $ pip install -U sqlite-utils\r\nCollecting sqlite-utils\r\n Using cached https://files.pythonhosted.org/packages/ee/a2/1b135010c7ac8e2d7545f659e9e6c6ede0f406f20b52e08d5817e1e31a9a/sqlite_utils-1.12.1-py3-none-any.whl\r\nRequirement already satisfied, skipping upgrade: click in /Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages (from sqlite-utils) (7.0)\r\nRequirement already satisfied, skipping upgrade: tabulate in /Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages (from sqlite-utils) (0.8.5)\r\nRequirement already satisfied, skipping upgrade: click-default-group in /Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages (from sqlite-utils) (1.2.2)\r\nInstalling collected packages: sqlite-utils\r\n Found existing installation: sqlite-utils 1.11\r\n Uninstalling sqlite-utils-1.11:\r\n Successfully uninstalled sqlite-utils-1.11\r\nSuccessfully installed sqlite-utils-1.12.1\r\n(twitter-to-sqlite) ~/Dropbox/Development/twitter-to-sqlite $ twitter-to-sqlite import blah.db ~/Dropbox/dogsheep/twitter-2019-06-25-b31f246100821b551f2f9a23f21ac6fb565dab49dd23a35630cabbf2b94a1f03/account-suspension.js \r\n(twitter-to-sqlite) ~/Dropbox/Development/twitter-to-sqlite $ \r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 518725064, "label": "`import` command fails on empty files"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/dogsheep/twitter-to-sqlite/issues/30#issuecomment-552131798", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30", "id": 552131798, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MjEzMTc5OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-09T19:54:45Z", "updated_at": "2019-11-09T19:54:45Z", "author_association": "MEMBER", "body": "Good catch - not sure how that bug crept in. Removing line 116 looks like the right fix to me.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 518739697, "label": "`followers` fails because `transform_user` is called twice"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/29#issuecomment-552133449", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29", "id": 552133449, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MjEzMzQ0OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-09T20:15:15Z", "updated_at": "2019-11-09T20:15:15Z", "author_association": "MEMBER", "body": "Released: https://github.com/dogsheep/twitter-to-sqlite/releases/tag/0.15", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 518725064, "label": "`import` command fails on empty files"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/30#issuecomment-552133468", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30", "id": 552133468, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MjEzMzQ2OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-09T20:15:27Z", "updated_at": "2019-11-09T20:15:27Z", "author_association": "MEMBER", "body": "Released: https://github.com/dogsheep/twitter-to-sqlite/releases/tag/0.15", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 518739697, "label": "`followers` fails because `transform_user` is called twice"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/28#issuecomment-552133488", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/28", "id": 552133488, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MjEzMzQ4OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-09T20:15:42Z", "updated_at": "2019-11-09T20:15:42Z", "author_association": "MEMBER", "body": "Released: https://github.com/dogsheep/twitter-to-sqlite/releases/tag/0.15", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 515658861, "label": "Add indexes to followers table"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/31#issuecomment-552135263", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31", "id": 552135263, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MjEzNTI2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-09T20:38:35Z", "updated_at": "2019-11-09T20:38:35Z", "author_association": "MEMBER", "body": "Command still needs documentation and a bit more testing.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 
0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 520508502, "label": "\"friends\" command (similar to \"followers\")"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/14#issuecomment-559883311", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14", "id": 559883311, "node_id": "MDEyOklzc3VlQ29tbWVudDU1OTg4MzMxMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-29T21:30:37Z", "updated_at": "2019-11-29T21:30:37Z", "author_association": "MEMBER", "body": "I should build the command to persist ETags and obey their polling guidelines:\r\n\r\n> Events are optimized for polling with the \"ETag\" header. If no new events have been triggered, you will see a \"304 Not Modified\" response, and your current rate limit will be untouched. There is also an \"X-Poll-Interval\" header that specifies how often (in seconds) you are allowed to poll. In times of high server load, the time may increase. Please obey the header.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 530491074, "label": "Command for importing events"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/14#issuecomment-559902818", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14", "id": 559902818, "node_id": "MDEyOklzc3VlQ29tbWVudDU1OTkwMjgxOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-30T01:32:38Z", "updated_at": "2019-11-30T01:32:38Z", "author_association": "MEMBER", "body": "Prototype:\r\n```\r\npip install sqlite-utils paginate-json\r\npaginate-json \"https://api.github.com/users/simonw/events\" | sqlite-utils insert /tmp/events.db events - --pk=id\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 530491074, "label": "Command for importing events"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/12#issuecomment-594151327", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/12", "id": 594151327, "node_id": "MDEyOklzc3VlQ29tbWVudDU5NDE1MTMyNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-03-03T20:26:15Z", "updated_at": "2020-03-03T20:32:23Z", "author_association": "MEMBER", "body": "Better version (since this also includes JSON array of repository topics):\r\n```sql\r\nCREATE VIEW recent_releases AS select\r\n repos.rowid as rowid,\r\n json_object(\"label\", repos.full_name, \"href\", repos.html_url) as repo,\r\n json_object(\r\n \"href\",\r\n releases.html_url,\r\n \"label\",\r\n releases.name\r\n ) as release,\r\n substr(releases.published_at, 0, 11) as date,\r\n releases.body as body_markdown,\r\n releases.published_at,\r\n coalesce(repos.topics, '[]') as topics\r\nfrom\r\n releases\r\n join repos on repos.id = releases.repo\r\norder by\r\n releases.published_at desc\r\n```\r\nThat `repos.rowid as rowid` bit is necessary because otherwise clicking on a link in facet-by-topic doesn't return any results.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 520756546, "label": "Add this view for seeing new releases"}, 
"performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/pull/8#issuecomment-594154644", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8", "id": 594154644, "node_id": "MDEyOklzc3VlQ29tbWVudDU5NDE1NDY0NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-03-03T20:33:57Z", "updated_at": "2020-03-03T20:33:57Z", "author_association": "MEMBER", "body": "`sqlite-utils` supports proper upserts now so this problem should be easy to fix.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 516763727, "label": "stargazers command, refs #4"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/12#issuecomment-594155249", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/12", "id": 594155249, "node_id": "MDEyOklzc3VlQ29tbWVudDU5NDE1NTI0OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-03-03T20:35:17Z", "updated_at": "2020-03-03T20:35:17Z", "author_association": "MEMBER", "body": "`swarm-to-sqlite` has an example of adding views here: https://github.com/dogsheep/swarm-to-sqlite/blob/f2c89dd613fb8a7f14e5267ccc2145463b996190/swarm_to_sqlite/utils.py#L141\r\n\r\nI think that approach can be approved by first checking if the view exists, then dropping it, then recreating it. Could even try to see if the view exists and matches what we were going to set it to and do nothing if that is the case.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 520756546, "label": "Add this view for seeing new releases"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/17#issuecomment-597354514", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/17", "id": 597354514, "node_id": "MDEyOklzc3VlQ29tbWVudDU5NzM1NDUxNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-03-10T22:37:45Z", "updated_at": "2020-03-10T22:37:45Z", "author_association": "MEMBER", "body": "I should add an option to stop the moment you see a commit you have fetched before.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 578883725, "label": "Command for importing commits"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/17#issuecomment-597358364", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/17", "id": 597358364, "node_id": "MDEyOklzc3VlQ29tbWVudDU5NzM1ODM2NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-03-10T22:50:20Z", "updated_at": "2020-03-11T01:18:36Z", "author_association": "MEMBER", "body": "By default it will stop when it sees a commit that has already been stored. 
You will be able to over-ride that behaviour using `--all`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 578883725, "label": "Command for importing commits"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/34#issuecomment-601861908", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/34", "id": 601861908, "node_id": "MDEyOklzc3VlQ29tbWVudDYwMTg2MTkwOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-03-20T18:56:44Z", "updated_at": "2020-03-20T18:56:44Z", "author_association": "MEMBER", "body": "Could this be a bug in `sqlite-utils`? This table has a primary key, so why is it running a query on `rowid = ?`?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 585266763, "label": "IndexError running user-timeline command"}, "performed_via_github_app": null}
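As a closing illustration, the ETag and `X-Poll-Interval` polling guidance quoted in the events comment above could be honoured along these lines; this is a sketch using the `requests` library, with the conditional-request headers being the documented part and everything else illustrative:

```python
import requests


def poll_events(url, token, etag=None):
    # Send the saved ETag so GitHub can answer 304 Not Modified without using
    # any rate limit, and read X-Poll-Interval to learn how long to wait.
    headers = {"Authorization": "token {}".format(token)}
    if etag:
        headers["If-None-Match"] = etag
    response = requests.get(url, headers=headers)
    interval = int(response.headers.get("X-Poll-Interval", "60"))
    if response.status_code == 304:
        return etag, [], interval
    return response.headers.get("ETag"), response.json(), interval


# Usage sketch: persist the returned etag, then wait `interval` seconds
# before calling poll_events() again with it.
# etag, events, interval = poll_events(
#     "https://api.github.com/users/simonw/events", token="..."
# )
```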