issue_comments


4 rows where issue = 952179830


All four comments belong to issue 952179830, "Command for fetching Hacker News threads from the search API" (dogsheep/hacker-news-to-sqlite#2), and were written by simonw (user 9599, MEMBER) with no reactions.

886135562 (IC_kwDODtX3eM400VsK) · 2021-07-25T02:01:11Z
https://github.com/dogsheep/hacker-news-to-sqlite/issues/2#issuecomment-886135562

That page doesn't have an API but does look easy to scrape. The other option here is the HN Search API powered by Algolia, documented at https://hn.algolia.com/api

886135922 (IC_kwDODtX3eM400Vxy) · 2021-07-25T02:06:20Z
https://github.com/dogsheep/hacker-news-to-sqlite/issues/2#issuecomment-886135922

https://hn.algolia.com/api/v1/search_by_date?query=simonwillison.net&restrictSearchableAttributes=url looks like it does what I want.

https://hn.algolia.com/api/v1/search_by_date?query=simonwillison.net&restrictSearchableAttributes=url&hitsPerPage=1000 returns 1000 at once. Otherwise you have to paginate using `&page=2` etc., up to `nbPages` pages. https://www.algolia.com/doc/api-reference/api-parameters/hitsPerPage/ says 1000 is the maximum.

886136224 (IC_kwDODtX3eM400V2g) · 2021-07-25T02:08:29Z
https://github.com/dogsheep/hacker-news-to-sqlite/issues/2#issuecomment-886136224

Prototype:

    curl "https://hn.algolia.com/api/v1/search_by_date?query=simonwillison.net&restrictSearchableAttributes=url&hitsPerPage=1000" | \
      jq .hits | \
      sqlite-utils insert hn.db items - --pk objectID --alter

886140431 (IC_kwDODtX3eM400W4P) · 2021-07-25T03:12:57Z
https://github.com/dogsheep/hacker-news-to-sqlite/issues/2#issuecomment-886140431

I'm going to build a general-purpose `hacker-news-to-sqlite search ...` command, where one of the options is to search within the URL.
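The pagination approach described in these comments — request with `hitsPerPage=1000` (Algolia's maximum), then walk `&page=N` up to `nbPages` — can be sketched in Python. The helper names here are illustrative, not part of hacker-news-to-sqlite:

```python
import json
import urllib.parse
import urllib.request

API = "https://hn.algolia.com/api/v1/search_by_date"

def build_page_url(query, page=0, hits_per_page=1000):
    """Build the URL for one page of results; 1000 is Algolia's maximum hitsPerPage."""
    params = urllib.parse.urlencode({
        "query": query,
        "restrictSearchableAttributes": "url",  # match against the URL only
        "hitsPerPage": hits_per_page,
        "page": page,
    })
    return f"{API}?{params}"

def fetch_all_hits(query):
    """Yield every hit, requesting successive pages until nbPages is exhausted."""
    page = 0
    while True:
        with urllib.request.urlopen(build_page_url(query, page)) as response:
            data = json.load(response)
        yield from data["hits"]
        page += 1
        if page >= data["nbPages"]:
            break
```

Each hit carries an `objectID`, which is why the curl prototype passes `--pk objectID` to sqlite-utils.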

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
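As a quick sketch of working with this schema directly — here creating it in an in-memory database rather than assuming a local copy of the real file — the "4 rows where issue = 952179830" view corresponds to a simple lookup on the `idx_issue_comments_issue` index:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # substitute the path to the real database
conn.executescript("""
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
""")

def comments_for_issue(issue_id):
    """Return (id, created_at, body) rows for one issue, oldest id first."""
    return conn.execute(
        "SELECT id, created_at, body FROM issue_comments "
        "WHERE issue = ? ORDER BY id",
        (issue_id,),
    ).fetchall()
```

Note that SQLite does not enforce the `REFERENCES` clauses unless `PRAGMA foreign_keys = ON` is set, so this table can be created and populated without the `users` and `issues` tables present.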
Powered by Datasette · Queries took 20.187ms · About: simonw/datasette-graphql