issue_comments
27 rows where issue = 396212021
id | html_url | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
451704724 | https://github.com/simonw/datasette/issues/394#issuecomment-451704724 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDQ1MTcwNDcyNA== | simonw 9599 | 2019-01-06T00:32:23Z | 2019-01-06T00:33:44Z | OWNER | I found a really nice pattern for writing the unit tests for this (though it would look even nicer with a solution to #395) ```python @pytest.mark.parametrize("prefix", ["/prefix/", "https://example.com/"]) @pytest.mark.parametrize("path", [ "/", "/fixtures", "/fixtures/compound_three_primary_keys", "/fixtures/compound_three_primary_keys/a,a,a", "/fixtures/paginated_view", ]) def test_url_prefix_config(prefix, path): for client in make_app_client(config={ "url_prefix": prefix, }): response = client.get(path) soup = Soup(response.body, "html.parser") for a in soup.findAll("a"): href = a["href"] if href not in { "https://github.com/simonw/datasette", "https://github.com/simonw/datasette/blob/master/LICENSE", "https://github.com/simonw/datasette/blob/master/tests/fixtures.py", }: assert href.startswith(prefix), (href, a.parent) ``` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
499320973 | https://github.com/simonw/datasette/issues/394#issuecomment-499320973 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDQ5OTMyMDk3Mw== | kevindkeogh 13896256 | 2019-06-06T02:07:59Z | 2019-06-06T02:07:59Z | CONTRIBUTOR | Hey was this ever merged? Trying to run this behind nginx, and encountering this issue. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
499923145 | https://github.com/simonw/datasette/issues/394#issuecomment-499923145 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDQ5OTkyMzE0NQ== | kevindkeogh 13896256 | 2019-06-07T15:10:57Z | 2019-06-07T15:11:07Z | CONTRIBUTOR | Putting this here in case anyone else encounters the same issue with nginx, I was able to resolve it by passing the header in the nginx proxy config (i.e., `proxy_set_header Host $host`). | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
556749086 | https://github.com/simonw/datasette/issues/394#issuecomment-556749086 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDU1Njc0OTA4Ng== | jsfenfen 639012 | 2019-11-21T01:15:34Z | 2019-11-21T01:21:45Z | CONTRIBUTOR | Hey @simonw, is the url_prefix config option available in another branch? It looks like you've written some tests for it above. In 0.32 I get "url_prefix is not a valid option". I think this would be *really helpful*! This would be really handy for proxying datasette in another domain's *subdirectory*. I believe this would allow folks to run upstream authentication, but the links break if the url_prefix doesn't match. I'd prefer not to host a proxied version of datasette on a subdomain (e.g. datasette.myurl.com), because then I'd have to worry about sharing authorization cookies with the subdomain, which I'd rather not do. Edit: I see the wip-url-prefix branch, I may try with that: https://github.com/simonw/datasette/commit/8da2db4b71096b19e7a9ef1929369b8483d448bf | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
567127981 | https://github.com/simonw/datasette/issues/394#issuecomment-567127981 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDU2NzEyNzk4MQ== | terrycojones 132978 | 2019-12-18T17:18:06Z | 2019-12-18T17:18:06Z | NONE | Agreed, this would be nice to have. I'm currently working around it in `nginx` with additional location blocks: ``` location /datasette/ { proxy_pass http://127.0.0.1:8001/; proxy_redirect off; include proxy_params; } location /dna-protein-genome/ { proxy_pass http://127.0.0.1:8001/dna-protein-genome/; proxy_redirect off; include proxy_params; } location /rna-protein-genome/ { proxy_pass http://127.0.0.1:8001/rna-protein-genome/; proxy_redirect off; include proxy_params; } ``` The 2nd and 3rd above are my databases. This works, but I have a small problem with URLs like `/rna-protein-genome?params....` that I could fix with some more nginx munging. I seem to do this sort of thing once every 5 years and then have to look it all up again. Thanks! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
567128636 | https://github.com/simonw/datasette/issues/394#issuecomment-567128636 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDU2NzEyODYzNg== | terrycojones 132978 | 2019-12-18T17:19:46Z | 2019-12-18T17:19:46Z | NONE | Hmmm, wait, maybe my mindless (copy/paste) use of `proxy_redirect` is causing me grief... | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
567133734 | https://github.com/simonw/datasette/issues/394#issuecomment-567133734 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDU2NzEzMzczNA== | jsfenfen 639012 | 2019-12-18T17:33:23Z | 2019-12-18T17:33:23Z | CONTRIBUTOR | FWIW I did a dumb merge of the branch here: https://github.com/jsfenfen/datasette and it seemed to work in that I could run stuff at a subdirectory, but ended up abandoning it in favor of just posting a subdomain because getting the nginx configs right was making me crazy. I still would prefer posting at a subdirectory but the subdomain seems simpler at the moment. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
567219479 | https://github.com/simonw/datasette/issues/394#issuecomment-567219479 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDU2NzIxOTQ3OQ== | terrycojones 132978 | 2019-12-18T21:24:23Z | 2019-12-18T21:24:23Z | NONE | @simonw What about allowing a base url? The `<base>....</base>` tag has been around forever. Then just use all relative URLs, which I guess is likely what you already do. See https://www.w3schools.com/TAGs/tag_base.asp | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
602904184 | https://github.com/simonw/datasette/issues/394#issuecomment-602904184 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMjkwNDE4NA== | betatim 1448859 | 2020-03-23T23:03:42Z | 2020-03-23T23:03:42Z | NONE | On mybinder.org we allow access to arbitrary processes listening on a port inside the container via a [reverse proxy](https://github.com/jupyterhub/jupyter-server-proxy). This means we need support for a proxy prefix as the proxy ends up running at a URL like `/something/random/proxy/datasette/...` An example that shows the problem is https://github.com/psychemedia/jupyterserverproxy-datasette-demo. Launch directly into a datasette instance on mybinder.org with https://mybinder.org/v2/gh/psychemedia/jupyterserverproxy-datasette-demo/master?urlpath=datasette then try to follow links inside the UI. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
602907207 | https://github.com/simonw/datasette/issues/394#issuecomment-602907207 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMjkwNzIwNw== | wragge 127565 | 2020-03-23T23:12:18Z | 2020-03-23T23:12:18Z | CONTRIBUTOR | This would also be useful for running Datasette in Jupyter notebooks on [Binder](https://mybinder.org/). While you can use [Jupyter-server-proxy](https://github.com/jupyterhub/jupyter-server-proxy) to access Datasette on Binder, the links are broken. Why run Datasette on Binder? I'm developing a [range of Jupyter notebooks](https://glam-workbench.github.io/) that are aimed at getting humanities researchers to explore data from libraries, archives, and museums. Many of them are aimed at researchers with limited digital skills, so being able to run examples in Binder without them installing anything is fantastic. For example, there are a [series of notebooks](https://glam-workbench.github.io/trove-harvester/) that help researchers harvest digitised historical newspaper articles from Trove. The metadata from this harvest is saved as a CSV file that users can download. I've also provided some extra notebooks that use Pandas etc to demonstrate ways of analysing and visualising the harvested data. But it would be really nice if, after completing a harvest, the user could spin up Datasette for some initial exploration of their harvested data without ever leaving their browser. | {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
602911133 | https://github.com/simonw/datasette/issues/394#issuecomment-602911133 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMjkxMTEzMw== | terrycojones 132978 | 2020-03-23T23:22:10Z | 2020-03-23T23:22:10Z | NONE | I just updated #652 to remove a merge conflict. I think it's an easy way to add this functionality. I don't have time to do more though, sorry! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
602913427 | https://github.com/simonw/datasette/issues/394#issuecomment-602913427 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMjkxMzQyNw== | simonw 9599 | 2020-03-23T23:27:44Z | 2020-03-23T23:27:44Z | OWNER | Thanks very much @terrycojones - I'll see if I can finish it up from here. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
602916580 | https://github.com/simonw/datasette/issues/394#issuecomment-602916580 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMjkxNjU4MA== | terrycojones 132978 | 2020-03-23T23:37:06Z | 2020-03-23T23:37:06Z | NONE | @simonw You're welcome - I was just trying it out back in December as I thought it should work. Now there's a pandemic to work on though.... so no time at all for more at the moment. BTW, I have datasette running on several protein and full (virus) genome databases I build, and it's great - thank you! Hi and best regards to you & Nat :-) | {"total_count": 1, "+1": 0, "-1": 0, "laugh": 1, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
602955699 | https://github.com/simonw/datasette/issues/394#issuecomment-602955699 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMjk1NTY5OQ== | simonw 9599 | 2020-03-24T01:34:06Z | 2020-03-24T01:34:15Z | OWNER | I don't think I'll go with the `<base>` solution purely because it doesn't work with JSON APIs - and there are quite a few places where Datasette APIs return URLs (for things like toggling facets - e.g. `suggested_facets` on https://latest.datasette.io/fixtures/facetable.json?_labels=on&_size=0 ) The good news is that if you look at the templates almost all of the URLs have been generated in Python code: https://github.com/simonw/datasette/blob/a498d0fe6590f9bdbc4faf9e0dd5faeb3b06002c/datasette/templates/table.html - so it shouldn't be too hard to fix in Python. Ideally I'd like to fix this with as few template changes as possible. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
603501719 | https://github.com/simonw/datasette/issues/394#issuecomment-603501719 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMzUwMTcxOQ== | simonw 9599 | 2020-03-24T20:59:28Z | 2020-03-24T20:59:28Z | OWNER | Here's the line I'm stuck on now: https://github.com/simonw/datasette/blob/298a899e792ebd0cd82a5f01b613c31f19082e51/datasette/views/base.py#L417 Tricky question: do I continue to rebuild URLs based on the incoming `request` (on the assumption that it has been modified to the new thing) or do I expect that I may still see un-prefixed incoming requests and need to change them? If the incoming URL paths contain the prefix, at what point do I drop that so I can run the regular URL matching code? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
603508356 | https://github.com/simonw/datasette/issues/394#issuecomment-603508356 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMzUwODM1Ng== | simonw 9599 | 2020-03-24T21:14:31Z | 2020-03-24T21:14:31Z | OWNER | I'm going to assume that whatever is proxying to Datasette leaves the full incoming URL path intact, so I'm going to need to teach the URL routing code to strip off the prefix before processing the incoming request. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
603508785 | https://github.com/simonw/datasette/issues/394#issuecomment-603508785 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMzUwODc4NQ== | simonw 9599 | 2020-03-24T21:15:28Z | 2020-03-24T21:15:28Z | OWNER | That means I should teach `AsgiRouter` how to handle an optional prefix: https://github.com/simonw/datasette/blob/298a899e792ebd0cd82a5f01b613c31f19082e51/datasette/utils/asgi.py#L81-L93 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
603509266 | https://github.com/simonw/datasette/issues/394#issuecomment-603509266 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMzUwOTI2Ng== | simonw 9599 | 2020-03-24T21:16:34Z | 2020-03-24T21:16:34Z | OWNER | Actually I'll teach `DatasetteRouter` since that subclasses `AsgiRouter` but has access to a `datasette` instance (which it can read configuration values from): https://github.com/simonw/datasette/blob/298a899e792ebd0cd82a5f01b613c31f19082e51/datasette/app.py#L750-L753 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
603525339 | https://github.com/simonw/datasette/issues/394#issuecomment-603525339 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMzUyNTMzOQ== | simonw 9599 | 2020-03-24T21:55:46Z | 2020-03-24T22:07:40Z | OWNER | OK, I have an implementation of this over in the `base-url` branch (see pull request #708) which is passing all of the unit tests. Anyone willing to give it a quick test and see if it works for your particular use-case? You can install it with: pip install https://github.com/simonw/datasette/archive/base-url.zip Then you can run Datasette like this: datasette fixtures.db --config base_url:/new-base/path/here/ | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
603539349 | https://github.com/simonw/datasette/issues/394#issuecomment-603539349 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMzUzOTM0OQ== | terrycojones 132978 | 2020-03-24T22:33:23Z | 2020-03-24T22:33:23Z | NONE | Hi Simon - I'm just (trying, at least) to follow along in the above. I can't try it out now, but I will if no one else gets to it. Sorry I didn't write any tests in the original bit of code I pushed - I was just trying to see if it could work & whether you'd want to maybe head in that direction. Anyway, thank you, I will certainly use this. Comment back here if no one tried it out & I'll make time. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
603570972 | https://github.com/simonw/datasette/issues/394#issuecomment-603570972 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMzU3MDk3Mg== | simonw 9599 | 2020-03-25T00:17:24Z | 2020-03-25T00:17:24Z | OWNER | I got this working as a proxied instance inside Binder, building on @psychemedia's work: https://github.com/simonw/jupyterserverproxy-datasette-demo/issues/1 Now that I've seen it working there I'm going to land the pull request. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
603631640 | https://github.com/simonw/datasette/issues/394#issuecomment-603631640 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMzYzMTY0MA== | simonw 9599 | 2020-03-25T04:19:08Z | 2020-03-25T04:19:08Z | OWNER | Shipped in 0.39: https://datasette.readthedocs.io/en/latest/changelog.html#v0-39 | {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
603849245 | https://github.com/simonw/datasette/issues/394#issuecomment-603849245 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwMzg0OTI0NQ== | terrycojones 132978 | 2020-03-25T13:48:13Z | 2020-03-25T13:48:13Z | NONE | Great - thanks again. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
604166918 | https://github.com/simonw/datasette/issues/394#issuecomment-604166918 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDYwNDE2NjkxOA== | wragge 127565 | 2020-03-26T00:56:30Z | 2020-03-26T00:56:30Z | CONTRIBUTOR | Thanks! I'm trying to launch Datasette from *within* a notebook using the jupyter-server-proxy and the new `base_url` parameter. While the assets load ok, and the breadcrumb navigation works, the facet links don't seem to use the `base_url`. Or have I missed something? My test repository is here: https://github.com/wragge/datasette-test | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
641889565 | https://github.com/simonw/datasette/issues/394#issuecomment-641889565 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDY0MTg4OTU2NQ== | LVerneyPEReN 58298410 | 2020-06-10T09:49:34Z | 2020-06-10T09:49:34Z | NONE | Hi, I came across this issue while looking for a way to spawn Datasette as a SQLite file viewer in JupyterLab. I found https://github.com/simonw/jupyterserverproxy-datasette-demo which seems to be the most up-to-date proof of concept, but it seems to be failing to list the available databases (at least in the Binder demo, https://hub.gke.mybinder.org/user/simonw-jupyters--datasette-demo-uw4dmlnn/datasette/, I only have `:memory`). Has anyone tried to improve on this proof of concept to have a Datasette visualization for SQLite files? Thanks! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
641908346 | https://github.com/simonw/datasette/issues/394#issuecomment-641908346 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDY0MTkwODM0Ng== | wragge 127565 | 2020-06-10T10:22:54Z | 2020-06-10T10:22:54Z | CONTRIBUTOR | There's a working demo here: https://github.com/wragge/datasette-test And if you want something that's more than just proof-of-concept, here's a notebook which does some harvesting from web archives and then displays the results using Datasette: https://nbviewer.jupyter.org/github/GLAM-Workbench/web-archives/blob/master/explore_presentations.ipynb | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 | |
642522285 | https://github.com/simonw/datasette/issues/394#issuecomment-642522285 | https://api.github.com/repos/simonw/datasette/issues/394 | MDEyOklzc3VlQ29tbWVudDY0MjUyMjI4NQ== | LVerneyPEReN 58298410 | 2020-06-11T09:15:19Z | 2020-06-11T09:15:19Z | NONE | Hi @wragge, This looks great, thanks for the share! I refactored it into a self-contained function, binding on a random available TCP port (multi-user context). I am using subprocess API directly since the `%run` magic was leaving defunct process behind :/ ![image](https://user-images.githubusercontent.com/58298410/84367566-b5d0d500-abd4-11ea-96e2-f5c05a28e506.png) ```python import socket from signal import SIGINT from subprocess import Popen, PIPE from IPython.display import display, HTML from notebook.notebookapp import list_running_servers def get_free_tcp_port(): """ Get a free TCP port. """ tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM) tcp.bind(('', 0)) _, port = tcp.getsockname() tcp.close() return port def datasette(database): """ Run datasette on an SQLite database. """ # Get current running servers servers = list_running_servers() # Get the current base url base_url = next(servers)['base_url'] # Get a free port port = get_free_tcp_port() # Create a base url for Datasette suing the proxy path proxy_url = f'{base_url}proxy/absolute/{port}/' # Display a link to Datasette display(HTML(f'<p><a href="{proxy_url}">View Datasette</a> (Click on the stop button to close the Datasette server)</p>')) # Launch Datasette with Popen( [ 'python', '-m', 'datasette', '--', database, '--port', str(port), '--config', f'base_url:{proxy_url}' ], stdout=PIPE, stderr=PIPE, bufsize=1, universal_newlines=True ) as p: print(p.stdout.readline(), end='') while True: try: line = p.stderr.readline() if not line: break print(line, end='') exit_code = p.poll() except KeyboardInterrupt: p.send_signal(SIGINT) ``` Ideal… | {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} | base_url configuration setting 396212021 |
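
To tie the thread's conclusion together: with the `base_url` setting that shipped in 0.39, the nginx setups discussed in the comments above reduce to a single location block whose path matches the configured prefix. This is a minimal sketch rather than a configuration taken from the thread; the `/datasette/` prefix and port `8001` are illustrative assumptions, and the `proxy_set_header` line comes from the earlier comment about fixing broken links behind nginx.

```
# Assumes Datasette was started with:
#   datasette fixtures.db --port 8001 --config base_url:/datasette/
location /datasette/ {
    # Pass the request path through with the prefix intact; per the comments
    # above, the routing code strips the configured base_url itself.
    proxy_pass http://127.0.0.1:8001;
    # Forward the original Host header, which an earlier comment reported
    # was needed to fix links behind nginx.
    proxy_set_header Host $host;
}
```

If the proxy rewrites the path instead (like the earlier location blocks that pass `/` upstream), the prefix handling needs to match whichever behaviour the Datasette version in use expects.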
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
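
As a quick usage sketch against this schema (it can be pasted into Datasette's SQL query page): the `reactions` column holds a JSON string and `user` is a foreign key to `users`, so SQLite's JSON1 functions can summarise this issue's comments per commenter. The `users.login` column is an assumption based on github-to-sqlite's usual schema and is not shown above; adjust the join if your `users` table differs.

```sql
-- Comment count and total reactions per commenter on issue 396212021
select
    users.login,
    count(*) as comment_count,
    sum(json_extract(issue_comments.reactions, '$.total_count')) as total_reactions
from issue_comments
join users on users.id = issue_comments.[user]
where issue_comments.issue = 396212021
group by users.login
order by comment_count desc;
```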