issue_comments
6 rows where issue = 348043884

id: 410818501
html_url: https://github.com/simonw/datasette/issues/357#issuecomment-410818501
issue_url: https://api.github.com/repos/simonw/datasette/issues/357
node_id: MDEyOklzc3VlQ29tbWVudDQxMDgxODUwMQ==
user: simonw 9599
created_at: 2018-08-06T19:04:54Z
updated_at: 2018-08-06T19:04:54Z
author_association: OWNER
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: Plugin hook for loading metadata.json 348043884
performed_via_github_app:
body:
Another potential use-case for this hook: loading metadata via a URL
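
A minimal sketch of that idea, assuming the experimental `load_metadata(metadata_value)` hook used by the prototype further down this thread (not a stable Datasette hook): treat a URL-looking value as remote metadata, fetch it once at startup, and return the parsed JSON.

```python
import requests
from datasette import hookimpl


# Sketch only: "load_metadata" is the experimental hook name from the
# prototype below, not part of Datasette's documented plugin API.
@hookimpl
def load_metadata(metadata_value):
    # Treat a URL-looking value as remote metadata and fetch it once at startup
    if metadata_value and metadata_value.startswith(("http://", "https://")):
        return requests.get(metadata_value).json()
    return None
```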

id: 558432963
html_url: https://github.com/simonw/datasette/issues/357#issuecomment-558432963
issue_url: https://api.github.com/repos/simonw/datasette/issues/357
node_id: MDEyOklzc3VlQ29tbWVudDU1ODQzMjk2Mw==
user: simonw 9599
created_at: 2019-11-26T02:40:31Z
updated_at: 2019-11-26T02:40:31Z
author_association: OWNER
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: Plugin hook for loading metadata.json 348043884
performed_via_github_app:
body:
A plugin hook for this would enable #639. Renaming this issue.

id: 558446045
html_url: https://github.com/simonw/datasette/issues/357#issuecomment-558446045
issue_url: https://api.github.com/repos/simonw/datasette/issues/357
node_id: MDEyOklzc3VlQ29tbWVudDU1ODQ0NjA0NQ==
user: simonw 9599
created_at: 2019-11-26T03:43:17Z
updated_at: 2019-11-26T03:43:17Z
author_association: OWNER
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: Plugin hook for loading metadata.json 348043884
performed_via_github_app:
body:
I think only one plugin gets to work at a time. The plugin can return a dictionary which is used for live lookups of metadata every time it's accessed - which means the plugin can itself mutate that dictionary.
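
A sketch of what "live lookups" could mean in practice: keep a reference to the dictionary returned by the single winning plugin and read from it on every access, so mutations the plugin makes later are visible immediately. The `MetadataHolder` name and method are illustrative, not Datasette internals.

```python
class MetadataHolder:
    """Illustrative only - not Datasette's real internals."""

    def __init__(self, plugin_supplied_dict):
        # Keep the reference rather than copying, so later mutations made
        # by the plugin (e.g. from a background thread) are picked up.
        self._metadata = plugin_supplied_dict

    def metadata_value(self, key, default=None):
        # Every access is a live lookup against the current dict contents
        return self._metadata.get(key, default)


# Usage: the dict returned by the plugin hook is held, not copied
live = {"title": "My instance"}
holder = MetadataHolder(live)
live["title"] = "Renamed later by the plugin"
assert holder.metadata_value("title") == "Renamed later by the plugin"
```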

id: 558459823
html_url: https://github.com/simonw/datasette/issues/357#issuecomment-558459823
issue_url: https://api.github.com/repos/simonw/datasette/issues/357
node_id: MDEyOklzc3VlQ29tbWVudDU1ODQ1OTgyMw==
user: simonw 9599
created_at: 2019-11-26T04:55:44Z
updated_at: 2019-11-26T04:56:24Z
author_association: OWNER
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: Plugin hook for loading metadata.json 348043884
performed_via_github_app:
body:
This needs to play nicely with `asyncio` - which means that the plugin hook needs to be able to interact with the event loop somehow. That said... I don't particularly want to change everywhere that accesses metadata into an `await` call. So this is tricky.
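
One possible way to square that circle, sketched under the assumption that refreshes run on the event loop while reads stay synchronous: an `asyncio` background task updates a plain dict, and everything else keeps reading the dict without `await`. The class and attribute names are made up for illustration.

```python
import asyncio


class RefreshingMetadata:
    """Sketch: reads stay synchronous, refreshes run on the event loop."""

    def __init__(self, fetch_coroutine, interval=10):
        self._data = {}
        self._fetch = fetch_coroutine  # hypothetical async callable returning a dict
        self._interval = interval

    @property
    def data(self):
        # Plain attribute access - callers never need to await
        return self._data

    async def refresh_forever(self):
        while True:
            try:
                self._data = await self._fetch()
            except Exception:
                pass  # keep serving the last successfully fetched metadata
            await asyncio.sleep(self._interval)


# Started once from code that already has a running event loop, e.g.:
#   asyncio.create_task(metadata.refresh_forever())
```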

id: 558461851
html_url: https://github.com/simonw/datasette/issues/357#issuecomment-558461851
issue_url: https://api.github.com/repos/simonw/datasette/issues/357
node_id: MDEyOklzc3VlQ29tbWVudDU1ODQ2MTg1MQ==
user: simonw 9599
created_at: 2019-11-26T05:05:21Z
updated_at: 2019-11-26T05:05:21Z
author_association: OWNER
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: Plugin hook for loading metadata.json 348043884
performed_via_github_app:
body:
Here's an example plugin I set up using the experimental hook in d11fd2cbaa6b31933b1319f81b5d1520726cb0b6

```python
import json
from datasette import hookimpl
import threading
import requests
import time


def change_over_time(m, metadata_value):
    while True:
        print(metadata_value)
        fetched = requests.get(metadata_value).json()
        counter = m["counter"]
        m.clear()
        m["counter"] = counter + 1
        m.update(fetched)
        m["counter"] += 1
        m["title"] = "{} {}".format(m.get("title", ""), m["counter"])
        time.sleep(10)


@hookimpl(trylast=True)
def load_metadata(metadata_value):
    m = {
        "counter": 0,
    }
    x = threading.Thread(target=change_over_time, args=(m, metadata_value), daemon=True)
    x.start()
    x.setName("datasette-metadata-counter")
    return m
```

It runs a separate thread that fetches the provided URL every 10 seconds:

```
datasette -m metadata.json --memory -p 8069 -m https://gist.githubusercontent.com/simonw/e8e4fcd7c0a9c951f7dd976921992157/raw/b702d18a6a078a0fb94ef1cee62e11a3396e0336/demo-metadata.json
```

I learned a bunch of things from this prototype.

First, this is the wrong place to run the code:

https://github.com/simonw/datasette/blob/d11fd2cbaa6b31933b1319f81b5d1520726cb0b6/datasette/cli.py#L337-L343

I wanted the plugin hook to be able to receive a `datasette` instance, so implementations could potentially run their own database queries. Calling the hook in the CLI function here happens BEFORE the `Datasette()` instance is created, so that doesn't work.

I wanted to build a demo of a plugin that would load metadata periodically from an external URL (see #238) - but this threaded implementation is pretty naive. It results in a hit every 10 seconds even if no-one is using Datasette!

A smarter implementation would be to fetch and cache the results - then only re-fetch them if more than 10 seconds have passed since the last time the metadata was accessed. But... doing t…
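
A sketch of that smarter implementation as described above: cache the fetched metadata and re-fetch lazily, only when it is actually accessed and more than 10 seconds have passed, so an idle Datasette makes no requests. This is just one way it could look, and it still uses blocking `requests`, which runs into the `asyncio` concern from the previous comment.

```python
import time

import requests


class LazyMetadata(dict):
    """Sketch of a lazily refreshing metadata dict - not what Datasette shipped."""

    def __init__(self, url, ttl=10):
        super().__init__()
        self._url = url
        self._ttl = ttl
        self._last_fetch = 0.0

    def _maybe_refresh(self):
        # Only hit the URL when metadata is read AND the cache is older than ttl
        now = time.monotonic()
        if now - self._last_fetch >= self._ttl:
            self._last_fetch = now
            fetched = requests.get(self._url).json()
            self.clear()
            self.update(fetched)

    def __getitem__(self, key):
        self._maybe_refresh()
        return super().__getitem__(key)

    def get(self, key, default=None):
        self._maybe_refresh()
        return super().get(key, default)
```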

id: 647189045
html_url: https://github.com/simonw/datasette/issues/357#issuecomment-647189045
issue_url: https://api.github.com/repos/simonw/datasette/issues/357
node_id: MDEyOklzc3VlQ29tbWVudDY0NzE4OTA0NQ==
user: simonw 9599
created_at: 2020-06-21T22:19:58Z
updated_at: 2020-06-21T22:19:58Z
author_association: OWNER
reactions: {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}
issue: Plugin hook for loading metadata.json 348043884
performed_via_github_app:
body:
I'm going to take this in a different direction. I'm not happy with how `metadata.(json|yaml)` keeps growing new features. Rather than having a single plugin hook for all of `metadata.json` I'm going to split out the feature that shows actual real metadata for tables and databases - `source`, `license` etc - into its own plugin-powered mechanism. So I'm going to close this ticket and spin up a new one for that.

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);