issue_comments


3 rows where issue = 530491074


id: 559883311
html_url: https://github.com/dogsheep/github-to-sqlite/issues/14#issuecomment-559883311
issue_url: https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14
node_id: MDEyOklzc3VlQ29tbWVudDU1OTg4MzMxMQ==
user: simonw 9599
created_at: 2019-11-29T21:30:37Z
updated_at: 2019-11-29T21:30:37Z
author_association: MEMBER
body:
I should build the command to persist ETags and obey their polling guidelines:
> Events are optimized for polling with the "ETag" header. If no new events have been triggered, you will see a "304 Not Modified" response, and your current rate limit will be untouched. There is also an "X-Poll-Interval" header that specifies how often (in seconds) you are allowed to poll. In times of high server load, the time may increase. Please obey the header.
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: Command for importing events 530491074
performed_via_github_app:

id: 559902818
html_url: https://github.com/dogsheep/github-to-sqlite/issues/14#issuecomment-559902818
issue_url: https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14
node_id: MDEyOklzc3VlQ29tbWVudDU1OTkwMjgxOA==
user: simonw 9599
created_at: 2019-11-30T01:32:38Z
updated_at: 2019-11-30T01:32:38Z
author_association: MEMBER
body:
Prototype:
```
pip install sqlite-utils paginate-json
paginate-json "https://api.github.com/users/simonw/events" | sqlite-utils insert /tmp/events.db events - --pk=id
```
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: Command for importing events 530491074
performed_via_github_app:

id: 613641947
html_url: https://github.com/dogsheep/github-to-sqlite/issues/14#issuecomment-613641947
issue_url: https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14
node_id: MDEyOklzc3VlQ29tbWVudDYxMzY0MTk0Nw==
user: simonw 9599
created_at: 2020-04-14T19:38:24Z
updated_at: 2020-04-14T19:38:34Z
author_association: MEMBER
body:
Since events include payloads with full object representations in them (for issues, repos and more) running this command every few minutes may be all it takes to keep a constant copy of everything updated in a very rate-limit friendly manner (thanks to the ETags).
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: Command for importing events 530491074
performed_via_github_app:
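The prototype in the second comment pipes the GitHub events feed through paginate-json into sqlite-utils. The same idea can be sketched in Python; this is only an illustration, not github-to-sqlite's implementation, and for brevity it fetches a single page rather than following pagination the way paginate-json does. It assumes the requests and sqlite-utils libraries are installed.

```python
# Rough Python equivalent of the shell prototype above (single page only).
import requests
import sqlite_utils

db = sqlite_utils.Database("/tmp/events.db")

# Fetch the first page of the public events feed for a user.
events = requests.get("https://api.github.com/users/simonw/events").json()

# Upsert into an "events" table keyed on the GitHub event id.
db["events"].insert_all(events, pk="id", replace=True)
print(f"events table now has {db['events'].count} rows")
```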
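The ETag and X-Poll-Interval behaviour described in the first comment can be combined with that prototype into a polling loop. The following is a minimal sketch under stated assumptions, not github-to-sqlite's actual code: the scratch file used to persist the ETag is hypothetical, the requests library is assumed, and only the first page of events is fetched.

```python
# Sketch: persist the ETag between polls and obey X-Poll-Interval.
import pathlib
import time

import requests

ETAG_FILE = pathlib.Path("/tmp/github_events_etag.txt")  # hypothetical scratch file
EVENTS_URL = "https://api.github.com/users/simonw/events"


def poll_once():
    """Fetch the first page of events, sending the persisted ETag if we have one."""
    headers = {}
    if ETAG_FILE.exists():
        headers["If-None-Match"] = ETAG_FILE.read_text().strip()

    response = requests.get(EVENTS_URL, headers=headers)
    # GitHub says how often (in seconds) we are allowed to poll; default to 60.
    interval = int(response.headers.get("X-Poll-Interval", "60"))

    if response.status_code == 304:
        # Nothing new since the last poll - this does not count against the rate limit.
        return [], interval

    response.raise_for_status()
    if "ETag" in response.headers:
        ETAG_FILE.write_text(response.headers["ETag"])
    return response.json(), interval


if __name__ == "__main__":
    while True:
        events, interval = poll_once()
        if events:
            print(f"fetched {len(events)} new events")
        time.sleep(interval)
```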


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
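The foreign keys declared above make joins against the referenced tables straightforward. Here is a minimal sketch using Python's built-in sqlite3 module; the github.db filename is an assumption, and the login and title columns on the users and issues tables are not shown on this page (they are assumed from github-to-sqlite's usual schema).

```python
import sqlite3

# Hypothetical database file produced by github-to-sqlite.
conn = sqlite3.connect("github.db")

# Join comments to their author and issue via the declared foreign keys,
# filtering to the same issue shown on this page.
rows = conn.execute(
    """
    select issue_comments.id, users.login, issues.title, issue_comments.created_at
    from issue_comments
    join users on users.id = issue_comments.[user]
    join issues on issues.id = issue_comments.issue
    where issue_comments.issue = 530491074
    order by issue_comments.created_at
    """
).fetchall()

for comment_id, login, title, created_at in rows:
    print(comment_id, login, title, created_at)
```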