issue_comments
14 rows where issue_url = "https://api.github.com/repos/simonw/datasette/issues/683"
id | html_url | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
590518182 | https://github.com/simonw/datasette/pull/683#issuecomment-590518182 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDUxODE4Mg== | simonw 9599 | 2020-02-24T19:53:12Z | 2020-02-24T19:53:12Z | OWNER | Next steps are from comment https://github.com/simonw/datasette/issues/682#issuecomment-590517338 > I'm going to move ahead without needing that ability though. I figure SQLite writes are _fast_, and plugins can be trusted to implement just fast writes. So I'm going to support either fire-and-forget writes (they get added to the queue and a task ID is returned) or have the option to block awaiting the completion of the write (using Janus) but let callers decide which version they want. I may add optional timeouts some time in the future. > > I am going to make both `execute_write()` and `execute_write_fn()` awaitable functions though, for consistency with `.execute()` and to give me flexibility to change how they work in the future. > > I'll also add a `block=True` option to both of them which causes the function to wait for the write to be successfully executed - defaults to `False` (fire-and-forget mode). | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
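To make the two calling patterns described in that comment concrete, here is a rough sketch (not code from the PR) of how a plugin might call `.execute_write()` in fire-and-forget mode versus blocking mode. The `record_visit` function and the `visits` table are hypothetical, and `db` is assumed to be a `datasette.database.Database` instance:

```python
# Hypothetical plugin code illustrating the two write modes described above.
# `db` is assumed to be a datasette.database.Database instance; the
# `visits` table is a made-up example, not part of Datasette itself.

async def record_visit(db, path):
    # Fire-and-forget (the default): the SQL is queued for the dedicated
    # write thread and a task ID is returned without waiting.
    task_id = await db.execute_write(
        "insert into visits (path) values (?)", [path]
    )

    # Blocking: block=True waits until the write thread has actually
    # executed the statement, so a follow-up read will see the new row.
    await db.execute_write(
        "insert into visits (path) values (?)", [path], block=True
    )
    return task_id
```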
590592581 | https://github.com/simonw/datasette/pull/683#issuecomment-590592581 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDU5MjU4MQ== | simonw 9599 | 2020-02-24T23:00:44Z | 2020-02-24T23:01:09Z | OWNER | I've been testing this out by running one-off demo plugins. I saved the following in a file called `write-plugins/log_asgi.py` (it's a hacked around copy of [asgi-log-to-sqlite](https://github.com/simonw/asgi-log-to-sqlite)) and then running `datasette data.db --plugins-dir=write-plugins/`: ```python from datasette import hookimpl import sqlite_utils import time class AsgiLogToSqliteViaWriteQueue: lookup_columns = ( "path", "user_agent", "referer", "accept_language", "content_type", "query_string", ) def __init__(self, app, db): self.app = app self.db = db self._tables_ensured = False async def ensure_tables(self): def _ensure_tables(conn): db = sqlite_utils.Database(conn) for column in self.lookup_columns: table = "{}s".format(column) if not db[table].exists(): db[table].create({"id": int, "name": str}, pk="id") if "requests" not in db.table_names(): db["requests"].create( { "start": float, "method": str, "path": int, "query_string": int, "user_agent": int, "referer": int, "accept_language": int, "http_status": int, "content_type": int, "client_ip": str, "duration": float, "body_size": int, }, foreign_keys=self.lookup_columns, ) await self.db.execute_write_fn(_ensure_tables) async def __call__(self, scope, receive, send): if not self._tables_ensured: self._tables_ensured = True await self.ensure_tables() response_headers = [] body_size = 0 … | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
590593120 | https://github.com/simonw/datasette/pull/683#issuecomment-590593120 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDU5MzEyMA== | simonw 9599 | 2020-02-24T23:02:30Z | 2020-02-24T23:02:30Z | OWNER | I'm going to muck around with a couple more demo plugins - in particular one derived from [datasette-upload-csvs](https://github.com/simonw/datasette-upload-csvs) - to make sure I'm comfortable with this API - then add a couple of tests and merge it with documentation that warns "this is still an experimental feature and may change". | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
590598248 | https://github.com/simonw/datasette/pull/683#issuecomment-590598248 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDU5ODI0OA== | simonw 9599 | 2020-02-24T23:18:50Z | 2020-02-24T23:18:50Z | OWNER | I'm not convinced by the return value of the `.execute_write_fn()` method: https://github.com/simonw/datasette/blob/ab2348280206bde1390b931ae89d372c2f74b87e/datasette/database.py#L79-L83 Do I really need that `WriteResponse` class or can I do something nicer? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
590598689 | https://github.com/simonw/datasette/pull/683#issuecomment-590598689 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDU5ODY4OQ== | simonw 9599 | 2020-02-24T23:20:11Z | 2020-02-24T23:20:11Z | OWNER | I think with `block=True` it makes sense to return the return value of the function that was executed. Without it, all I really need to do is return the `uuid` so something could theoretically poll for completion later on. But is it weird having a function that returns different types depending on whether you passed `block=True` or not? Should they be differently named functions? I'm OK with the `block=True` pattern changing the return value, I think. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
590599257 | https://github.com/simonw/datasette/pull/683#issuecomment-590599257 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDU5OTI1Nw== | simonw 9599 | 2020-02-24T23:21:56Z | 2020-02-24T23:22:35Z | OWNER | Also: are UUIDs really necessary here or could I use a simpler form of task identifier? Like an in-memory counter variable that starts at 0 and increments every time this instance of Datasette issues a new task ID? The neat thing about UUIDs is that I don't have to worry if there are multiple Datasette instances accepting writes behind a load balancer. That seems pretty unlikely (especially considering SQLite databases encourage only one process to be writing at a time)... but I am experimenting with PostgreSQL support in #670 so it's probably worth ensuring these task IDs really are globally unique. I'm going to stick with UUIDs. They're short-lived enough that their size doesn't really matter. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
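For illustration, a minimal sketch of the two task-ID schemes weighed in that comment; neither function appears in the PR, and the counter variant is only unique within a single process:

```python
import itertools
import uuid

# Process-local counter: cheap and ordered, but IDs would collide across
# multiple Datasette instances running behind a load balancer.
_task_counter = itertools.count()

def next_counter_task_id():
    return next(_task_counter)

# UUID4: globally unique, so task IDs stay unambiguous even if several
# processes (or a future PostgreSQL-backed setup, see #670) accept writes.
def next_uuid_task_id():
    return str(uuid.uuid4())
```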
590606825 | https://github.com/simonw/datasette/pull/683#issuecomment-590606825 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDYwNjgyNQ== | simonw 9599 | 2020-02-24T23:47:38Z | 2020-02-24T23:47:38Z | OWNER | Another demo plugin: `delete_table.py`

```python
from datasette import hookimpl
from datasette.utils import escape_sqlite
from starlette.responses import HTMLResponse
from starlette.endpoints import HTTPEndpoint


class DeleteTableApp(HTTPEndpoint):
    def __init__(self, scope, receive, send, datasette):
        self.datasette = datasette
        super().__init__(scope, receive, send)

    async def post(self, request):
        formdata = await request.form()
        database = formdata["database"]
        db = self.datasette.databases[database]
        await db.execute_write("drop table {}".format(escape_sqlite(formdata["table"])))
        return HTMLResponse("Table has been deleted.")


@hookimpl
def asgi_wrapper(datasette):
    def wrap_with_asgi_auth(app):
        async def wrapped_app(scope, receive, send):
            if scope["path"] == "/-/delete-table":
                await DeleteTableApp(scope, receive, send, datasette)
            else:
                await app(scope, receive, send)

        return wrapped_app

    return wrap_with_asgi_auth
```

Then I saved this as `table.html` in the `write-templates/` directory:

```html+django
{% extends "default:table.html" %}

{% block content %}
<form action="/-/delete-table" method="POST">
    <p>
        <input type="hidden" name="database" value="{{ database }}">
        <input type="hidden" name="table" value="{{ table }}">
        <input type="submit" value="Delete this table">
    </p>
</form>
{{ super() }}
{% endblock %}
```

(Needs CSRF protection added)

I ran Datasette like this:

    $ datasette --plugins-dir=write-plugins/ data.db --template-dir=write-templates/

Result: I can delete tables!

<img width="596" alt="data__everything__30_132_rows_-_Mozilla_Firefox" src="https://user-images.githubusercontent.com/9599/75201302-f9cec580-571c-11ea-9c55-67a49e68ec0c.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
590607385 | https://github.com/simonw/datasette/pull/683#issuecomment-590607385 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDYwNzM4NQ== | simonw 9599 | 2020-02-24T23:49:37Z | 2020-02-24T23:49:37Z | OWNER | Here's the `upload_csv.py` plugin file I've been playing with: ```python from datasette import hookimpl from starlette.responses import PlainTextResponse, HTMLResponse from starlette.endpoints import HTTPEndpoint import csv as csv_std import codecs import sqlite_utils class UploadApp(HTTPEndpoint): def __init__(self, scope, receive, send, datasette): self.datasette = datasette super().__init__(scope, receive, send) def get_database(self): # For the moment just use the first one that's not immutable mutable = [db for db in self.datasette.databases.values() if db.is_mutable] return mutable[0] async def get(self, request): return HTMLResponse( await self.datasette.render_template( "upload_csv.html", {"database_name": self.get_database().name} ) ) async def post(self, request): formdata = await request.form() csv = formdata["csv"] # csv.file is a SpooledTemporaryFile, I can read it directly filename = csv.filename # TODO: Support other encodings: reader = csv_std.reader(codecs.iterdecode(csv.file, "utf-8")) headers = next(reader) docs = (dict(zip(headers, row)) for row in reader) if filename.endswith(".csv"): filename = filename[:-4] # Import data into a table of that name using sqlite-utils db = self.get_database() def fn(conn): writable_conn = sqlite_utils.Database(db.path) writable_conn[filename].insert_all(docs, alter=True) return writable_conn[filename].count # Without block=True we may attempt 'select count(*) from ...' # before the table has been created by the write thread count = await db.execute_write_fn(fn, block=True) return HTMLResponse( await self.datasette.render_template( "upload_csv_done.html", { … | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
590608228 | https://github.com/simonw/datasette/pull/683#issuecomment-590608228 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDYwODIyOA== | simonw 9599 | 2020-02-24T23:52:35Z | 2020-02-24T23:52:35Z | OWNER | I'm going to punt on the ability to introspect the write queue and poll for completion using a UUID for the moment. Can add those later. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
590610180 | https://github.com/simonw/datasette/pull/683#issuecomment-590610180 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDYxMDE4MA== | simonw 9599 | 2020-02-25T00:00:07Z | 2020-02-25T00:00:07Z | OWNER | Basic stuff to cover in unit tests: - Exercise `.execute_write(sql)` - both with block=True and block=False - Exercise `.execute_write_fn(fn)` in the same way - Throw 10 updates in the queue, block on just the last one, check it worked correctly I'm going to write these tests directly against a `Database()` object rather than booting up an entire Datasette instance. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
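A rough sketch of what those tests could look like, assuming a pytest fixture named `db` that yields a mutable `datasette.database.Database` backed by a temporary SQLite file (the fixture itself is omitted); this is not the test code that was merged:

```python
import pytest

# Sketch of the tests outlined above. Assumes a `db` fixture yielding a
# mutable datasette.database.Database wired to a temporary SQLite file;
# the fixture definition is omitted here.

@pytest.mark.asyncio
async def test_execute_write_block_true(db):
    await db.execute_write(
        "create table if not exists t (id integer primary key)", block=True
    )
    await db.execute_write("insert into t (id) values (1)", block=True)
    rows = (await db.execute("select id from t")).rows
    assert [1] == [r[0] for r in rows]


@pytest.mark.asyncio
async def test_ten_writes_block_on_last(db):
    await db.execute_write("create table if not exists nums (i integer)", block=True)
    # Queue nine fire-and-forget inserts...
    for i in range(9):
        await db.execute_write("insert into nums (i) values (?)", [i])
    # ...then block on the tenth. The queue is processed in order, so all
    # ten rows should be visible once this write has completed.
    await db.execute_write("insert into nums (i) values (?)", [9], block=True)
    count = (await db.execute("select count(*) from nums")).rows[0][0]
    assert 10 == count
```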
590614896 | https://github.com/simonw/datasette/pull/683#issuecomment-590614896 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDYxNDg5Ng== | simonw 9599 | 2020-02-25T00:16:51Z | 2020-02-25T00:16:51Z | OWNER | The other problem with the poll-for-UUID-completion idea: how long does this mean Datasette needs to keep holding onto the `WriteTask` objects? Maybe we say you only get to ask "is this UUID still in the queue" and if the answer is "no" then you assume the task has been completed. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
590617822 | https://github.com/simonw/datasette/pull/683#issuecomment-590617822 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDYxNzgyMg== | simonw 9599 | 2020-02-25T00:26:48Z | 2020-02-25T00:26:48Z | OWNER | This failing test is a nasty one - the whole thing just hangs (so I imagine Travis will run for a while before hopefully giving up). Here's what happens if I add `--full-trace` and then hit Ctrl+C to cancel a test run: ``` $ pytest -k test_execute_write_fn_block_true --full-trace =================================================================== test session starts =================================================================== platform darwin -- Python 3.7.5, pytest-5.2.4, py-1.8.1, pluggy-0.13.1 rootdir: /Users/simonw/Dropbox/Development/datasette, inifile: pytest.ini plugins: asyncio-0.10.0 collected 410 items / 409 deselected / 1 selected tests/test_database.py ^C^C ================================================================= 409 deselected in 4.45s ================================================================= Traceback (most recent call last): File "/Users/simonw/.local/share/virtualenvs/datasette-oJRYYJuA/lib/python3.7/site-packages/_pytest/main.py", line 193, in wrap_session session.exitstatus = doit(config, session) or 0 File "/Users/simonw/.local/share/virtualenvs/datasette-oJRYYJuA/lib/python3.7/site-packages/_pytest/main.py", line 237, in _main config.hook.pytest_runtestloop(session=session) File "/Users/simonw/.local/share/virtualenvs/datasette-oJRYYJuA/lib/python3.7/site-packages/pluggy/hooks.py", line 286, in __call__ return self._hookexec(self, self.get_hookimpls(), kwargs) File "/Users/simonw/.local/share/virtualenvs/datasette-oJRYYJuA/lib/python3.7/site-packages/pluggy/manager.py", line 93, in _hookexec return self._inner_hookexec(hook, methods, kwargs) File "/Users/simonw/.local/share/virtualenvs/datasette-oJRYYJuA/lib/python3.7/site-packages/pluggy/manager.py", line 87, in <lambda> firstresult=hook.spec.opts.get("firstresult") if hook.spec else False, File "/Users/simonw/.local/share/virtualenvs/datasette-oJRYYJuA/lib/pyt… | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
590679273 | https://github.com/simonw/datasette/pull/683#issuecomment-590679273 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDY3OTI3Mw== | simonw 9599 | 2020-02-25T04:37:21Z | 2020-02-25T04:37:21Z | OWNER | I'm happy with this now. I'm going to merge to master. | {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
590681676 | https://github.com/simonw/datasette/pull/683#issuecomment-590681676 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDY4MTY3Ng== | simonw 9599 | 2020-02-25T04:48:29Z | 2020-02-25T04:48:29Z | OWNER | Documentation: https://datasette.readthedocs.io/en/latest/internals.html#await-db-execute-write-sql-params-none-block-false | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 |
```sql
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
```