html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,issue,performed_via_github_app https://github.com/simonw/datasette/issues/638#issuecomment-557331366,https://api.github.com/repos/simonw/datasette/issues/638,557331366,MDEyOklzc3VlQ29tbWVudDU1NzMzMTM2Ng==,9599,2019-11-22T00:19:40Z,2019-11-22T00:19:40Z,OWNER,"We currently use `select distinct wikipedia_url ...` to suggest facets. This query would only return rows which are represented twice or more: ```sql select wikipedia_url, count(*) as n from museums where wikipedia_url is not null group by wikipedia_url having n > 1 ``` https://www.niche-museums.com/museums?sql=select%0D%0A++wikipedia_url%2C+count%28*%29+as+n%0D%0Afrom%0D%0A++museums%0D%0Awhere%0D%0A++wikipedia_url+is+not+null%0D%0Agroup+by+wikipedia_url%0D%0Ahaving+n+%3E+1","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",526913133, https://github.com/simonw/datasette/issues/638#issuecomment-557333467,https://api.github.com/repos/simonw/datasette/issues/638,557333467,MDEyOklzc3VlQ29tbWVudDU1NzMzMzQ2Nw==,9599,2019-11-22T00:28:07Z,2019-11-22T00:28:07Z,OWNER,"It's not as simple as that - `planet_int` should be a suggested facet on https://latest.datasette.io/fixtures/facetable?_facet=planet_int because it returns two filters, even though one of those two is a value of 1. Switching to the new proposed SQL statement misses this. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",526913133, https://github.com/simonw/datasette/issues/638#issuecomment-557343750,https://api.github.com/repos/simonw/datasette/issues/638,557343750,MDEyOklzc3VlQ29tbWVudDU1NzM0Mzc1MA==,9599,2019-11-22T01:14:59Z,2019-11-22T01:14:59Z,OWNER,Demo: https://latest.datasette.io/fixtures/facetable doesn't suggest `distinct_some_null` as a facet.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",526913133, https://github.com/simonw/datasette/issues/637#issuecomment-557770129,https://api.github.com/repos/simonw/datasette/issues/637,557770129,MDEyOklzc3VlQ29tbWVudDU1Nzc3MDEyOQ==,9599,2019-11-23T05:55:42Z,2019-11-23T05:56:16Z,OWNER,"Design idea: HTML: ```html

0 records

```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",525993034, https://github.com/simonw/datasette/issues/637#issuecomment-557771312,https://api.github.com/repos/simonw/datasette/issues/637,557771312,MDEyOklzc3VlQ29tbWVudDU1Nzc3MTMxMg==,9599,2019-11-23T06:17:23Z,2019-11-23T06:17:23Z,OWNER,"Demo: * https://latest.datasette.io/fixtures/123_starts_with_digits * https://latest.datasette.io/fixtures/neighborhood_search?text=foop","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",525993034, https://github.com/simonw/datasette/issues/639#issuecomment-558432868,https://api.github.com/repos/simonw/datasette/issues/639,558432868,MDEyOklzc3VlQ29tbWVudDU1ODQzMjg2OA==,9599,2019-11-26T02:40:06Z,2019-11-26T02:40:06Z,OWNER,"Unfortunately I don't think it's possible to do this with Heroku. Heroku treats all deployments as total replacements - that's part of how they achieve zero-downtime deployments, since they run the new deployment at the same time as the old deployment and then switch traffic over at the load balancer. I did have one idea that's relevant here: #238 - which would provide a mechanism for `metadata.json` to be hosted on a separate URL (e.g. a gist) and have Datasette periodically fetch a new copy. I closed that in favour of #357 - a plugin hook for loading metadata. That's still something I'm interested in exploring. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",527670799, https://github.com/simonw/datasette/issues/357#issuecomment-558432963,https://api.github.com/repos/simonw/datasette/issues/357,558432963,MDEyOklzc3VlQ29tbWVudDU1ODQzMjk2Mw==,9599,2019-11-26T02:40:31Z,2019-11-26T02:40:31Z,OWNER,A plugin hook for this would enable #639. Renaming this issue.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",348043884, https://github.com/simonw/datasette/issues/639#issuecomment-558437707,https://api.github.com/repos/simonw/datasette/issues/639,558437707,MDEyOklzc3VlQ29tbWVudDU1ODQzNzcwNw==,172847,2019-11-26T03:02:53Z,2019-11-26T03:03:29Z,NONE,"@simonw - Thanks for the reply! My reading of the heroku documents is that if one sets things up using git, then one can use ""git push"" (from a {local, GitHub, GitLab} git repository to Heroku) to ""update"" a Heroku deployment, but I'm not sure exactly how this works. However, assuming there is some way to use ""git push"" to update the Heroku deployment, the question becomes how can one do this in conjunction with datasette. Again based on my reading the heroku documents, it would seem that the following should work (but it doesn't quite): 1) Use datasette to create a deployment (named MYAPP) 2) Put it in maintenance mode 3) heroku git:clone -a MYAPP -- This results in an empty repository (as expected) 4) In another directory, heroku slugs:download -a MYAPP 5) Copy the downloaded slug into the repository 6) Make some change to metadata.json 6) Commit and push it back 7) Take the deployment out of maintenance mode 8) Refresh the deployment Using the heroku console, I've verified that the edits appear on heroku, but somehow they are not reflected in the running app. I'm hopeful that with some small tweak or perhaps the addition of a bit of voodoo, this strategy will work. 
I think it will be important to get this working for another reason: getting Heroku, Cloudcube, and datasette to work together, to overcome the slug size limitation so that large SQLite databases can be deployed to Heroku using Datasette. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",527670799, https://github.com/simonw/datasette/issues/639#issuecomment-558439989,https://api.github.com/repos/simonw/datasette/issues/639,558439989,MDEyOklzc3VlQ29tbWVudDU1ODQzOTk4OQ==,9599,2019-11-26T03:14:27Z,2019-11-26T03:14:27Z,OWNER,@jacobian does this sound like something that could work?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",527670799, https://github.com/simonw/datasette/issues/641#issuecomment-558443464,https://api.github.com/repos/simonw/datasette/issues/641,558443464,MDEyOklzc3VlQ29tbWVudDU1ODQ0MzQ2NA==,9599,2019-11-26T03:30:02Z,2019-11-26T03:30:02Z,OWNER,https://datasette.readthedocs.io/en/latest/custom_templates.html#serving-static-files,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",528442126, https://github.com/simonw/datasette/issues/357#issuecomment-558446045,https://api.github.com/repos/simonw/datasette/issues/357,558446045,MDEyOklzc3VlQ29tbWVudDU1ODQ0NjA0NQ==,9599,2019-11-26T03:43:17Z,2019-11-26T03:43:17Z,OWNER,I think only one plugin gets to work at a time. The plugin can return a dictionary which is used for live lookups of metadata every time it's accessed - which means the plugin can itself mutate that dictionary.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",348043884, https://github.com/simonw/datasette/issues/357#issuecomment-558459823,https://api.github.com/repos/simonw/datasette/issues/357,558459823,MDEyOklzc3VlQ29tbWVudDU1ODQ1OTgyMw==,9599,2019-11-26T04:55:44Z,2019-11-26T04:56:24Z,OWNER,"This needs to play nicely with `asyncio` - which means that the plugin hook needs to be able to interact with the event loop somehow. That said... I don't particularly want to change everywhere that accesses metadata into a `await` call. 
So this is tricky.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",348043884, https://github.com/simonw/datasette/issues/357#issuecomment-558461851,https://api.github.com/repos/simonw/datasette/issues/357,558461851,MDEyOklzc3VlQ29tbWVudDU1ODQ2MTg1MQ==,9599,2019-11-26T05:05:21Z,2019-11-26T05:05:21Z,OWNER,"Here's an example plugin I set up using the experimental hook in d11fd2cbaa6b31933b1319f81b5d1520726cb0b6 ```python import json from datasette import hookimpl import threading import requests import time def change_over_time(m, metadata_value): while True: print(metadata_value) fetched = requests.get(metadata_value).json() counter = m[""counter""] m.clear() m[""counter""] = counter + 1 m.update(fetched) m[""counter""] += 1 m[""title""] = ""{} {}"".format(m.get(""title"", """"), m[""counter""]) time.sleep(10) @hookimpl(trylast=True) def load_metadata(metadata_value): m = { ""counter"": 0, } x = threading.Thread(target=change_over_time, args=(m, metadata_value), daemon=True) x.start() x.setName(""datasette-metadata-counter"") return m ``` It runs a separate thread that fetches the provided URL every 10 seconds: ``` datasette -m metadata.json --memory -p 8069 -m https://gist.githubusercontent.com/simonw/e8e4fcd7c0a9c951f7dd976921992157/raw/b702d18a6a078a0fb94ef1cee62e11a3396e0336/demo-metadata.json ``` I learned a bunch of things from this prototype. First, this is the wrong place to run the code: https://github.com/simonw/datasette/blob/d11fd2cbaa6b31933b1319f81b5d1520726cb0b6/datasette/cli.py#L337-L343 I wanted the plugin hook to be able to receive a `datasette` instance, so implementations could potentially run their own database queries. Calling the hook in the CLI function here happens BEFORE the `Datasette()` instance is created, so that doesn't work. I wanted to build a demo of a plugin that would load metadata periodically from an external URL (see #238) - but this threaded implementation is pretty naive. It results in a hit every 10 seconds even if no-one is using Datasette! A smarter implementation would be to fetch and cache the results - then only re-fetch them if more than 10 seconds have passed since the last time the metadata was accessed. But... doing this neatly requires asyncio - and the plugin isn't running inside an event loop (since `uvicorn.run(ds.app()...)` has not run yet so the event loop hasn't even started). I could try and refactor everything so that all calls to read from `metadata` happen via `await`, but this feels like a pretty invasive change. It would be necessary if metadata might be read via a SQL query though. Or maybe I could set it up so the plugin can start itself running in the event loop and call back to the `datasette` object to update metadata whenever it feels like it?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",348043884, https://github.com/simonw/datasette/issues/639#issuecomment-558687342,https://api.github.com/repos/simonw/datasette/issues/639,558687342,MDEyOklzc3VlQ29tbWVudDU1ODY4NzM0Mg==,21148,2019-11-26T15:40:00Z,2019-11-26T15:40:00Z,CONTRIBUTOR,"A bit of background: the reason `heroku git:clone` brings down an empty directory is because `datasette publish heroku` uses the [builds API](https://devcenter.heroku.com/articles/build-and-release-using-the-api), rather than a `git push`, to release the app. 
I originally did this because it seemed like a lower bar than having a working `git`, but the downside is, as you found out, that tweaking the created app is hard. So there's one option -- change `datasette publish heroku` to use `git push` instead of `heroku builds:create`. @pkoppstein - what you suggested seems like it ought to work (you don't need maintenance mode, though). I'm not sure why it doesn't. You could also look into using the [slugs API](https://devcenter.heroku.com/articles/platform-api-deploying-slugs) to download the slug, change `metadata.json`, re-pack and re-upload the slug. Ultimately though I think I think @simonw's idea of reading `metadata.json` from an external source might be better (#357). Reading from an alternate URL would be fine, or you could also just stuff the whole `metadata.json` into a Heroku config var, and write a plugin to read it from there. Hope this helps a bit!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",527670799, https://github.com/simonw/datasette/issues/639#issuecomment-558852316,https://api.github.com/repos/simonw/datasette/issues/639,558852316,MDEyOklzc3VlQ29tbWVudDU1ODg1MjMxNg==,172847,2019-11-26T22:54:23Z,2019-11-26T22:54:23Z,NONE,"@jacobian - Thanks for your help. Having to upload an entire slug each time a small change is needed in `metadata.json` seems no better than the current situation so I probably won't go down that rabbit hole just yet. In any case, the really important goal is moving the SQLite file out of Heroku in a way that the Heroku app can still read it efficiently. Is this possible? Is Cloudcube the right place to start? Is there any alternative? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",527670799, https://github.com/simonw/sqlite-utils/pull/67#issuecomment-559108591,https://api.github.com/repos/simonw/sqlite-utils/issues/67,559108591,MDEyOklzc3VlQ29tbWVudDU1OTEwODU5MQ==,9599,2019-11-27T14:24:59Z,2019-11-27T14:24:59Z,OWNER,Failed due to black testing dependency: https://travis-ci.com/simonw/sqlite-utils/jobs/260995814,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",529376481, https://github.com/simonw/datasette/issues/642#issuecomment-559142893,https://api.github.com/repos/simonw/datasette/issues/642,559142893,MDEyOklzc3VlQ29tbWVudDU1OTE0Mjg5Mw==,9599,2019-11-27T15:47:36Z,2019-11-27T15:47:42Z,OWNER,"It can include options for quickly bootstrapping custom template function or SQL function plugins, which are really simple.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",529429214, https://github.com/simonw/datasette/issues/642#issuecomment-559143123,https://api.github.com/repos/simonw/datasette/issues/642,559143123,MDEyOklzc3VlQ29tbWVudDU1OTE0MzEyMw==,9599,2019-11-27T15:48:11Z,2019-11-27T15:48:11Z,OWNER,This will also make bundling static files less error-prone.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",529429214, https://github.com/simonw/datasette/issues/642#issuecomment-559146316,https://api.github.com/repos/simonw/datasette/issues/642,559146316,MDEyOklzc3VlQ29tbWVudDU1OTE0NjMxNg==,9599,2019-11-27T15:55:51Z,2019-11-27T15:55:51Z,OWNER,"One thing that put me off cookiecutter in the past 
is that I didn't think it could conditionally create files. I was wrong! You can use post- hooks to delete the files that you didn't need: https://github.com/audreyr/cookiecutter-pypackage/blob/master/hooks/post_gen_project.py I could use this mechanism to rename directories too if I needed to.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",529429214, https://github.com/simonw/datasette/issues/642#issuecomment-559207224,https://api.github.com/repos/simonw/datasette/issues/642,559207224,MDEyOklzc3VlQ29tbWVudDU1OTIwNzIyNA==,82988,2019-11-27T18:40:57Z,2019-11-27T18:41:07Z,CONTRIBUTOR,"Would cookie cutter approaches also work for creating various flavours of customised templates? I need to try to create a couple of sites for myself to get a feel for what sorts of thing are easily doable, and what cribbable cookie cutter items might be. I'm guessing https://simonwillison.net/2019/Nov/25/niche-museums/ is a good place to start from?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",529429214, https://github.com/simonw/datasette/issues/577#issuecomment-559610951,https://api.github.com/repos/simonw/datasette/issues/577,559610951,MDEyOklzc3VlQ29tbWVudDU1OTYxMDk1MQ==,9599,2019-11-28T22:10:36Z,2019-11-28T22:10:49Z,OWNER,"Better idea: take advantage of pluggy dependency injection. If a plugin takes a `render` argument we can send it a function that can be used to render a template. The need to `await render(...)` might be difficult here though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",497171390, https://github.com/simonw/datasette/issues/573#issuecomment-559632608,https://api.github.com/repos/simonw/datasette/issues/573,559632608,MDEyOklzc3VlQ29tbWVudDU1OTYzMjYwOA==,82988,2019-11-29T01:43:38Z,2019-11-29T01:43:38Z,CONTRIBUTOR,"In passing, it looks like a start was made on a datasette Jupyter server extension in https://github.com/lucasdurand/jupyter-datasette although the build fails in MyBinder.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",492153532, https://github.com/dogsheep/github-to-sqlite/issues/14#issuecomment-559883311,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14,559883311,MDEyOklzc3VlQ29tbWVudDU1OTg4MzMxMQ==,9599,2019-11-29T21:30:37Z,2019-11-29T21:30:37Z,MEMBER,"I should build the command to persist ETags and obey their polling guidelines: > Events are optimized for polling with the ""ETag"" header. If no new events have been triggered, you will see a ""304 Not Modified"" response, and your current rate limit will be untouched. There is also an ""X-Poll-Interval"" header that specifies how often (in seconds) you are allowed to poll. In times of high server load, the time may increase. 
Please obey the header.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",530491074, https://github.com/dogsheep/github-to-sqlite/issues/14#issuecomment-559902818,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14,559902818,MDEyOklzc3VlQ29tbWVudDU1OTkwMjgxOA==,9599,2019-11-30T01:32:38Z,2019-11-30T01:32:38Z,MEMBER,"Prototype: ``` pip install sqlite-utils paginate-json paginate-json ""https://api.github.com/users/simonw/events"" | sqlite-utils insert /tmp/events.db events - --pk=id ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",530491074, https://github.com/simonw/datasette/issues/639#issuecomment-559916057,https://api.github.com/repos/simonw/datasette/issues/639,559916057,MDEyOklzc3VlQ29tbWVudDU1OTkxNjA1Nw==,172847,2019-11-30T06:08:50Z,2019-11-30T06:08:50Z,NONE,"@simonw, @jacobian - I was able to resolve the metadata.json issue by adding `-m metadata.json` to the Procfile. Now `git push heroku master` picks up the changes, though I have the impression that heroku is doing more work than necessary (e.g. one of the information messages is: `Installing requirements with pip`). I also had to set the environment variable WEB_CONCURRENCY -- I used WEB_CONCURRENCY=1. I am still anxious to know whether it's possible for Datasette on Heroku to access the SQLite file at another location. Cloudcube seems the most promising, and I'm hoping it can be done by tweaking the Procfile suitably, but maybe that's too optimistic? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",527670799, https://github.com/simonw/datasette/issues/645#issuecomment-560036585,https://api.github.com/repos/simonw/datasette/issues/645,560036585,MDEyOklzc3VlQ29tbWVudDU2MDAzNjU4NQ==,9599,2019-12-01T01:28:35Z,2019-12-01T01:28:35Z,OWNER,"Plugins are currently expected to return this: ```python @hookimpl def register_output_renderer(datasette): return { ""extension"": ""test"", ""callback"": render_test } ``` We can add an optional third argument, `""should_suggest""`, which takes the same arguments as the callback but simply returns `True` or `False` depending on if the plugin can work for the current set of data. If that dictionary key is omitted, Datasette will treat this test as returning `True`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",530653633, https://github.com/simonw/datasette/issues/645#issuecomment-560036740,https://api.github.com/repos/simonw/datasette/issues/645,560036740,MDEyOklzc3VlQ29tbWVudDU2MDAzNjc0MA==,9599,2019-12-01T01:29:58Z,2019-12-01T01:29:58Z,OWNER,"It should be optionally awaitable - as should the existing `""callback""`. 
Can use the same pattern as this one: https://github.com/simonw/datasette/blob/8c642f04e0608bf537fdd1f76d64c2367fb04d57/datasette/views/base.py#L124-L135","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",530653633, https://github.com/simonw/datasette/issues/645#issuecomment-560913459,https://api.github.com/repos/simonw/datasette/issues/645,560913459,MDEyOklzc3VlQ29tbWVudDU2MDkxMzQ1OQ==,9599,2019-12-02T23:38:22Z,2019-12-02T23:38:22Z,OWNER,"I'm going to add unit tests for the hook, and as part of that I'll fix the weird thing at the moment where the plugins for the unit tests are defined inside a quoted string as opposed to their own separate file.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",530653633, https://github.com/simonw/datasette/issues/646#issuecomment-561133534,https://api.github.com/repos/simonw/datasette/issues/646,561133534,MDEyOklzc3VlQ29tbWVudDU2MTEzMzUzNA==,18017473,2019-12-03T11:50:44Z,2019-12-03T11:50:44Z,NONE,"Thanks for the reply. Will try to implement that on my end, if I have any success I will post here/ make a pull request.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",531502365, https://github.com/simonw/datasette/issues/646#issuecomment-561247711,https://api.github.com/repos/simonw/datasette/issues/646,561247711,MDEyOklzc3VlQ29tbWVudDU2MTI0NzcxMQ==,18017473,2019-12-03T16:31:39Z,2019-12-03T17:31:33Z,NONE,"> I don't think this is possible at the moment but you're right, it totally should be. Just give me a heads-up if you think you can do that quickly. I am trying to implement it with very little knowledge of how datasette works, so it will take loads of time.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",531502365, https://github.com/simonw/datasette/issues/580#issuecomment-561481377,https://api.github.com/repos/simonw/datasette/issues/580,561481377,MDEyOklzc3VlQ29tbWVudDU2MTQ4MTM3Nw==,9599,2019-12-04T05:18:25Z,2019-12-04T05:18:25Z,OWNER,`datasette-atom` shipped with a copy of these classes too: https://github.com/simonw/datasette-atom/blob/c2e84207fccff0582d7152f3966dd2952fb0b74f/tests/utils.py,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",502355384, https://github.com/simonw/datasette/issues/514#issuecomment-561494291,https://api.github.com/repos/simonw/datasette/issues/514,561494291,MDEyOklzc3VlQ29tbWVudDU2MTQ5NDI5MQ==,9599,2019-12-04T06:14:16Z,2019-12-04T06:14:16Z,OWNER,I've been successfully running the systemd recipe above by Russs on a couple of projects. I shared some notes about what's been working for me here: https://gist.github.com/simonw/63797bb10bb74e615695edd8f850844f,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459397625, https://github.com/simonw/datasette/issues/640#issuecomment-562462630,https://api.github.com/repos/simonw/datasette/issues/640,562462630,MDEyOklzc3VlQ29tbWVudDU2MjQ2MjYzMA==,9599,2019-12-06T07:19:34Z,2019-12-06T07:19:34Z,OWNER,"`heroku apps --json` returns a JSON list of apps. `heroku apps --json | sqlite-utils insert /tmp/heroku.db apps -` creates a SQLite database from them, useful for exploring them. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",527710055, https://github.com/simonw/datasette/issues/648#issuecomment-562913040,https://api.github.com/repos/simonw/datasette/issues/648,562913040,MDEyOklzc3VlQ29tbWVudDU2MjkxMzA0MA==,9599,2019-12-08T04:56:43Z,2019-12-08T04:56:43Z,OWNER,"Idea: do this with a simple template naming convention. If you hit `/about` and there is no matching database, check for a template file called `about-page.html`. If it exists, render it. Otherwise return a 404 database not found.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",534492501, https://github.com/simonw/datasette/issues/649#issuecomment-562940252,https://api.github.com/repos/simonw/datasette/issues/649,562940252,MDEyOklzc3VlQ29tbWVudDU2Mjk0MDI1Mg==,9599,2019-12-08T11:59:52Z,2019-12-08T12:00:12Z,OWNER,"The easiest solution would be to only show counts on the index pages for immutable (`-i`) databases. I don't like this, because the most common uses of Datasette don't in my opinion justify it. Most of the time Datasette will be running against a single, small, mutable database. I'd like to show counts in that case. Some options: - disable counts on the index page for mutable databases of more than one is attached - disable counts on the index page for databases where the file in disk is larger than a specified threshold (maybe 10MB? I'm making up this number) - implement an overall timer which cuts off table counting once the sum of time spent on it has gone beyond a second Worth prototyping a bit to see what works best.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",534530973, https://github.com/simonw/datasette/issues/648#issuecomment-563000942,https://api.github.com/repos/simonw/datasette/issues/648,563000942,MDEyOklzc3VlQ29tbWVudDU2MzAwMDk0Mg==,9599,2019-12-08T22:08:14Z,2019-12-08T22:08:14Z,OWNER,"Alternative idea: a new concept of ""pages"" which live inside `templates/pages/` and where the file name minus the `.html` extension defines the URL. `templates/about/me.html` would be served at `/about/me` - but only if no matching database and table were found. This only takes effect on 404 errors from core Datasette.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",534492501, https://github.com/simonw/datasette/issues/648#issuecomment-563001064,https://api.github.com/repos/simonw/datasette/issues/648,563001064,MDEyOklzc3VlQ29tbWVudDU2MzAwMTA2NA==,9599,2019-12-08T22:09:20Z,2019-12-08T22:09:20Z,OWNER,Stretch goal: it would be neat if these pages could return custom HTTP headers (eg content-type) and maybe even status codes (eg for redirects) somehow.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",534492501, https://github.com/simonw/datasette/issues/648#issuecomment-563015290,https://api.github.com/repos/simonw/datasette/issues/648,563015290,MDEyOklzc3VlQ29tbWVudDU2MzAxNTI5MA==,9599,2019-12-09T00:18:17Z,2019-12-09T00:18:17Z,OWNER,"The implementation in https://github.com/simonw/datasette/commit/c5e8cd84d3ef55ed86771ac0bde0ca91d6b0e07a acts as a proof of concept. 
It has a big flaw though: it doesn't reuse the regular render() mechanism, which means it doesn't register custom template tags from plugins. This is bad because it means that pages rendered in this way cannot take advantage of things like [datasette-template-sql](https://github.com/simonw/datasette-template-sql). This means this issue is likely dependent on #577 - a documented mechanism to allow plugins to render templates.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",534492501, https://github.com/simonw/datasette/issues/650#issuecomment-563016016,https://api.github.com/repos/simonw/datasette/issues/650,563016016,MDEyOklzc3VlQ29tbWVudDU2MzAxNjAxNg==,9599,2019-12-09T00:24:20Z,2019-12-09T00:24:20Z,OWNER,In the future I may write a script which extracts the terms from this document into a separate database table.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",534629631, https://github.com/simonw/datasette/issues/577#issuecomment-563449488,https://api.github.com/repos/simonw/datasette/issues/577,563449488,MDEyOklzc3VlQ29tbWVudDU2MzQ0OTQ4OA==,9599,2019-12-09T21:32:47Z,2019-12-09T21:32:47Z,OWNER,"I'm going to go with `Datasette(...).render_template(...)` - I need that for #648, and it makes sense to me that the `Datasette` class is the documented interface that plugins interact with.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",497171390, https://github.com/simonw/datasette/issues/577#issuecomment-564214355,https://api.github.com/repos/simonw/datasette/issues/577,564214355,MDEyOklzc3VlQ29tbWVudDU2NDIxNDM1NQ==,9599,2019-12-10T19:47:17Z,2019-12-10T19:47:17Z,OWNER,"The reason I've been dragging my heels on adding template rendering to the Datasette class is that it feels messy - should that class be responsible for both data access AND template rendering? I think I can come to terms with this thanks to plugins. The Datasette class can represent the family of features that plugins affect - which means that having it expose the template rendering API is reasonable.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",497171390, https://github.com/simonw/datasette/pull/644#issuecomment-565755208,https://api.github.com/repos/simonw/datasette/issues/644,565755208,MDEyOklzc3VlQ29tbWVudDU2NTc1NTIwOA==,6025893,2019-12-14T21:33:31Z,2019-12-14T21:33:31Z,CONTRIBUTOR,"Hi @simonw Have you had a chance to look at this at all? I'm going to have a chunk of time free next week so if there is additional work needed on this, that would be a particularly convenient time for me to revisit this. Cheers","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",530513784, https://github.com/simonw/datasette/issues/494#issuecomment-565832380,https://api.github.com/repos/simonw/datasette/issues/494,565832380,MDEyOklzc3VlQ29tbWVudDU2NTgzMjM4MA==,9599,2019-12-15T18:08:54Z,2019-12-15T18:08:54Z,OWNER,"Just saw this bug again. It's annoying. Here's the code that causes it: https://github.com/simonw/datasette/blob/d6b6c9171f3fd945c4e5e4144923ac831c43c208/datasette/cli.py#L327-L333 `files` here is the list of mutable databases. 
`immutable` is the list of immutable ones.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",449931899, https://github.com/simonw/datasette/issues/493#issuecomment-566265079,https://api.github.com/repos/simonw/datasette/issues/493,566265079,MDEyOklzc3VlQ29tbWVudDU2NjI2NTA3OQ==,9599,2019-12-16T22:04:05Z,2019-12-16T22:04:05Z,OWNER,This is particularly relevant as plugins increasingly use `metadata.json` for their plugin configuration: https://datasette.readthedocs.io/en/0.32/plugins.html#plugin-configuration,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",449886319, https://github.com/simonw/datasette/issues/394#issuecomment-567127981,https://api.github.com/repos/simonw/datasette/issues/394,567127981,MDEyOklzc3VlQ29tbWVudDU2NzEyNzk4MQ==,132978,2019-12-18T17:18:06Z,2019-12-18T17:18:06Z,NONE,"Agreed, this would be nice to have. I'm currently working around it in `nginx` with additional location blocks: ``` location /datasette/ { proxy_pass http://127.0.0.1:8001/; proxy_redirect off; include proxy_params; } location /dna-protein-genome/ { proxy_pass http://127.0.0.1:8001/dna-protein-genome/; proxy_redirect off; include proxy_params; } location /rna-protein-genome/ { proxy_pass http://127.0.0.1:8001/rna-protein-genome/; proxy_redirect off; include proxy_params; } ``` The 2nd and 3rd above are my databases. This works, but I have a small problem with URLs like `/rna-protein-genome?params....` that I could fix with some more nginx munging. I seem to do this sort of thing once every 5 years and then have to look it all up again. Thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",396212021, https://github.com/simonw/datasette/issues/394#issuecomment-567128636,https://api.github.com/repos/simonw/datasette/issues/394,567128636,MDEyOklzc3VlQ29tbWVudDU2NzEyODYzNg==,132978,2019-12-18T17:19:46Z,2019-12-18T17:19:46Z,NONE,"Hmmm, wait, maybe my mindless (copy/paste) use of `proxy_redirect` is causing me grief...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",396212021, https://github.com/simonw/datasette/issues/394#issuecomment-567133734,https://api.github.com/repos/simonw/datasette/issues/394,567133734,MDEyOklzc3VlQ29tbWVudDU2NzEzMzczNA==,639012,2019-12-18T17:33:23Z,2019-12-18T17:33:23Z,CONTRIBUTOR,"FWIW I did a dumb merge of the branch here: https://github.com/jsfenfen/datasette and it seemed to work in that I could run stuff at a subdirectory, but ended up abandoning it in favor of just posting a subdomain because getting the nginx configs right was making me crazy. I still would prefer posting at a subdirectory but the subdomain seems simpler at the moment. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",396212021, https://github.com/simonw/datasette/issues/394#issuecomment-567219479,https://api.github.com/repos/simonw/datasette/issues/394,567219479,MDEyOklzc3VlQ29tbWVudDU2NzIxOTQ3OQ==,132978,2019-12-18T21:24:23Z,2019-12-18T21:24:23Z,NONE,"@simonw What about allowing a base url. The `....` tag has been around forever. Then just use all relative URLs, which I guess is likely what you already do. 
See https://www.w3schools.com/TAGs/tag_base.asp","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",396212021, https://github.com/simonw/datasette/issues/596#issuecomment-567225156,https://api.github.com/repos/simonw/datasette/issues/596,567225156,MDEyOklzc3VlQ29tbWVudDU2NzIyNTE1Ng==,132978,2019-12-18T21:40:35Z,2019-12-18T21:40:35Z,NONE,"I initially went looking for a way to hide a column completely. Today I found the setting to truncate cells, but it applies to all cells. In my case I have text columns that can have many thousands of characters. I was wondering whether the metadata JSON would be an appropriate place to indicate how columns are displayed (on a col-by-col basis). E.g., I'd like to be able to specify that only 20 chars of a given column be shown, and the font be monospace. But maybe I can do that in some other way - I barely know anything about datasette yet, sorry!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",507454958, https://github.com/simonw/datasette/issues/596#issuecomment-567226048,https://api.github.com/repos/simonw/datasette/issues/596,567226048,MDEyOklzc3VlQ29tbWVudDU2NzIyNjA0OA==,132978,2019-12-18T21:43:13Z,2019-12-18T21:43:13Z,NONE,"Meant to add that of course it would be better not to reinvent CSS (one time was already enough). But one option would be to provide a mechanism to specify a CSS class for a column (a cell, a row...) and let the user give a URL path to a CSS file on the command line.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",507454958, https://github.com/simonw/datasette/issues/651#issuecomment-568268746,https://api.github.com/repos/simonw/datasette/issues/651,568268746,MDEyOklzc3VlQ29tbWVudDU2ODI2ODc0Ng==,9599,2019-12-22T14:37:37Z,2019-12-22T14:37:37Z,OWNER,"I've not yet been able to figure out what the escaping rule are for FTS5 queries. If we figure out how those work maybe we can bundle them as a custom function? select ... 
where docs_fts match fts_escape(:search) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",539590148, https://github.com/simonw/datasette/issues/651#issuecomment-568269476,https://api.github.com/repos/simonw/datasette/issues/651,568269476,MDEyOklzc3VlQ29tbWVudDU2ODI2OTQ3Ng==,9599,2019-12-22T14:46:37Z,2019-12-22T14:47:03Z,OWNER,"https://stackoverflow.com/a/43756146 says that an escaping mechanism that works is this one: select * from blah where term match '""bacon"" ""and"" ""eggs""' So split on whitespace and then encapsulate each search term in double quotes.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",539590148, https://github.com/simonw/datasette/issues/654#issuecomment-568274520,https://api.github.com/repos/simonw/datasette/issues/654,568274520,MDEyOklzc3VlQ29tbWVudDU2ODI3NDUyMA==,9599,2019-12-22T15:51:58Z,2019-12-22T15:51:58Z,OWNER,"Proof of concept: ```diff diff --git a/datasette/views/base.py b/datasette/views/base.py index 5182479..39d1f77 100644 --- a/datasette/views/base.py +++ b/datasette/views/base.py @@ -1,6 +1,7 @@ import asyncio import csv import itertools +import json import re import time import urllib @@ -138,28 +139,31 @@ class BaseView(AsgiView): ) extra_template_vars.update(extra_vars) + template_context = { + **context, + **{ + ""app_css_hash"": self.ds.app_css_hash(), + ""select_templates"": select_templates, + ""zip"": zip, + ""body_scripts"": body_scripts, + ""extra_css_urls"": self._asset_urls( + ""extra_css_urls"", template, context + ), + ""extra_js_urls"": self._asset_urls( + ""extra_js_urls"", template, context + ), + ""format_bytes"": format_bytes, + ""database_url"": self.database_url, + ""database_color"": self.database_color, + }, + **extra_template_vars, + } + if request.args.get(""_context""): + return Response.html(""
<pre>{}</pre>
"".format( + escape(json.dumps(template_context, default=repr, indent=4)) + )) return Response.html( - await template.render_async( - { - **context, - **{ - ""app_css_hash"": self.ds.app_css_hash(), - ""select_templates"": select_templates, - ""zip"": zip, - ""body_scripts"": body_scripts, - ""extra_css_urls"": self._asset_urls( - ""extra_css_urls"", template, context - ), - ""extra_js_urls"": self._asset_urls( - ""extra_js_urls"", template, context - ), - ""format_bytes"": format_bytes, - ""database_url"": self.database_url, - ""database_color"": self.database_color, - }, - **extra_template_vars, - } - ) + await template.render_async(template_context) ) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",541467590, https://github.com/simonw/datasette/issues/654#issuecomment-568274570,https://api.github.com/repos/simonw/datasette/issues/654,568274570,MDEyOklzc3VlQ29tbWVudDU2ODI3NDU3MA==,9599,2019-12-22T15:52:43Z,2019-12-22T15:52:43Z,OWNER,"One problem with this: what if secrets end up being dumped out in this debug view? This won't happen with default Datasette but could potentially happen with a plugin. This feature should be opt-in - maybe a `template_debug:1` config setting.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",541467590, https://github.com/simonw/datasette/issues/577#issuecomment-568276310,https://api.github.com/repos/simonw/datasette/issues/577,568276310,MDEyOklzc3VlQ29tbWVudDU2ODI3NjMxMA==,9599,2019-12-22T16:10:31Z,2019-12-22T16:10:31Z,OWNER,"The code in question currently lives in `BaseView.render()`: https://github.com/simonw/datasette/blob/d54318fc7f2565e6121920ce1ea9cb8b700e629a/datasette/views/base.py#L106-L163 Should `datasette.render_template()` do exactly this, or should it be slightly different? Plugins need the option to not pass a `request` object - so maybe that parameter becomes optional. Perhaps plugins should be able to render templates without other plugins getting to inject their own variables? Does it always make sense to dump in all of those extra template context variables?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",497171390, https://github.com/simonw/datasette/issues/654#issuecomment-568276548,https://api.github.com/repos/simonw/datasette/issues/654,568276548,MDEyOklzc3VlQ29tbWVudDU2ODI3NjU0OA==,9599,2019-12-22T16:13:11Z,2019-12-22T16:13:11Z,OWNER,"Documentation: https://datasette.readthedocs.io/en/latest/config.html#template-debug Demos: * https://latest.datasette.io/?_context=1 * https://latest.datasette.io/fixtures?_context=1 * https://latest.datasette.io/fixtures/roadside_attractions?_context=1 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",541467590, https://github.com/simonw/datasette/issues/581#issuecomment-569005894,https://api.github.com/repos/simonw/datasette/issues/581,569005894,MDEyOklzc3VlQ29tbWVudDU2OTAwNTg5NA==,9599,2019-12-26T08:03:59Z,2019-12-26T08:03:59Z,OWNER,"To solve https://github.com/simonw/datasette-atom/issues/6 the name of the current canned query should be made available somehow. 
That way the plugin configuration could specify that the title for browse/feed should be X.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",502993509, https://github.com/simonw/datasette/issues/655#issuecomment-569056952,https://api.github.com/repos/simonw/datasette/issues/655,569056952,MDEyOklzc3VlQ29tbWVudDU2OTA1Njk1Mg==,9599,2019-12-26T13:16:14Z,2019-12-26T13:16:53Z,OWNER,"I just tried copying and pasting in [Ace](https://ace.c9.io/) - an alternative rich code highlighting JavaScript editor - and it didn't work reliably there either. Maybe the fix here is to detect mobile safari and avoid loading codemirror entirely for that browser, falling back on a textarea?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",542553350, https://github.com/simonw/sqlite-utils/issues/70#issuecomment-569130037,https://api.github.com/repos/simonw/sqlite-utils/issues/70,569130037,MDEyOklzc3VlQ29tbWVudDU2OTEzMDAzNw==,9599,2019-12-26T20:39:04Z,2019-12-26T20:39:04Z,OWNER,"I hadn't thought about those at all. Are you suggesting a utility mechanism in the library for setting it up so that, for a specific foreign key, rows are deleted from other tables if the row they are pointing at is deleted?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",539204432, https://github.com/simonw/sqlite-utils/issues/66#issuecomment-569131397,https://api.github.com/repos/simonw/sqlite-utils/issues/66,569131397,MDEyOklzc3VlQ29tbWVudDU2OTEzMTM5Nw==,9599,2019-12-26T20:49:11Z,2019-12-26T20:49:11Z,OWNER,Don't forget to update the documentation. This will be quite an involved task.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",521868864, https://github.com/simonw/sqlite-utils/issues/66#issuecomment-569226620,https://api.github.com/repos/simonw/sqlite-utils/issues/66,569226620,MDEyOklzc3VlQ29tbWVudDU2OTIyNjYyMA==,9599,2019-12-27T09:05:29Z,2019-12-27T09:05:36Z,OWNER,"I'm going to start by ignoring the existing `upsert` entirely and implementing `.insert(..., replace=True)` and `$ sqlite-utils insert --replace`. Including updating the tests. Then I'll figure out how to implement the new `.upsert()` / `$ sqlite-utils upsert`. 
Then I'll update the documentation, and ship `sqlite-utils` 2.0.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",521868864, https://github.com/simonw/sqlite-utils/issues/71#issuecomment-569233996,https://api.github.com/repos/simonw/sqlite-utils/issues/71,569233996,MDEyOklzc3VlQ29tbWVudDU2OTIzMzk5Ng==,9599,2019-12-27T09:45:17Z,2019-12-27T09:45:17Z,OWNER,"It looks like those backports no longer include sqlite3 - Google Searches still find it but when you click through to launchpad you get 404s: https://launchpad.net/~jonathonf/+archive/ubuntu/codelite/+build/10511920 Maybe Travis have a newer Ubuntu I can use that ships with FTS5 in its SQLite?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",542814756, https://github.com/simonw/sqlite-utils/issues/71#issuecomment-569234096,https://api.github.com/repos/simonw/sqlite-utils/issues/71,569234096,MDEyOklzc3VlQ29tbWVudDU2OTIzNDA5Ng==,9599,2019-12-27T09:45:52Z,2019-12-27T09:45:52Z,OWNER,I'll try `bionic`: https://docs.travis-ci.com/user/reference/bionic/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",542814756, https://github.com/simonw/sqlite-utils/issues/71#issuecomment-569234571,https://api.github.com/repos/simonw/sqlite-utils/issues/71,569234571,MDEyOklzc3VlQ29tbWVudDU2OTIzNDU3MQ==,9599,2019-12-27T09:48:48Z,2019-12-27T09:48:48Z,OWNER,That fixed it: https://travis-ci.com/simonw/sqlite-utils/builds/142443259,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",542814756, https://github.com/simonw/sqlite-utils/issues/66#issuecomment-569588216,https://api.github.com/repos/simonw/sqlite-utils/issues/66,569588216,MDEyOklzc3VlQ29tbWVudDU2OTU4ODIxNg==,9599,2019-12-30T05:31:45Z,2019-12-30T05:31:45Z,OWNER,Last step: update changelog and ship 2.0. 
Then I can close this issue.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",521868864, https://github.com/simonw/sqlite-utils/pull/67#issuecomment-569844320,https://api.github.com/repos/simonw/sqlite-utils/issues/67,569844320,MDEyOklzc3VlQ29tbWVudDU2OTg0NDMyMA==,9599,2019-12-31T01:29:43Z,2019-12-31T01:29:43Z,OWNER,I don't really care about 3.5 any more.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",529376481, https://github.com/simonw/sqlite-utils/issues/66#issuecomment-569844426,https://api.github.com/repos/simonw/sqlite-utils/issues/66,569844426,MDEyOklzc3VlQ29tbWVudDU2OTg0NDQyNg==,9599,2019-12-31T01:30:20Z,2019-12-31T01:30:20Z,OWNER,"I shipped 2.0 - release notes here: https://sqlite-utils.readthedocs.io/en/stable/changelog.html#v2 I also wrote about it on my blog: https://simonwillison.net/2019/Dec/30/sqlite-utils-2/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",521868864, https://github.com/simonw/sqlite-utils/issues/73#issuecomment-570931650,https://api.github.com/repos/simonw/sqlite-utils/issues/73,570931650,MDEyOklzc3VlQ29tbWVudDU3MDkzMTY1MA==,9599,2020-01-05T17:34:33Z,2020-01-05T17:34:33Z,OWNER,Released as 2.0.1 https://github.com/simonw/sqlite-utils/releases/tag/2.0.1,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",545407916, https://github.com/simonw/sqlite-utils/issues/73#issuecomment-571138093,https://api.github.com/repos/simonw/sqlite-utils/issues/73,571138093,MDEyOklzc3VlQ29tbWVudDU3MTEzODA5Mw==,82988,2020-01-06T13:28:31Z,2020-01-06T13:28:31Z,NONE,"I think I actually had several issues in play... The missing key was one, but I think there is also an issue as per below. For example, in the following: ```python def init_testdb(dbname='test.db'): if os.path.exists(dbname): os.remove(dbname) conn = sqlite3.connect(dbname) db = Database(conn) return conn, db conn, db = init_testdb() c = conn.cursor() c.executescript('CREATE TABLE ""test1"" (""Col1"" TEXT, ""Col2"" TEXT, PRIMARY KEY (""Col1""));') c.executescript('CREATE TABLE ""test2"" (""Col1"" TEXT, ""Col2"" TEXT, PRIMARY KEY (""Col1""));') print('Test 1...') for i in range(3): db['test1'].upsert_all([{'Col1':'a', 'Col2':'x'},{'Col1':'b', 'Col2':'x'}], pk=('Col1')) db['test2'].upsert_all([{'Col1':'a', 'Col2':'x'},{'Col1':'b', 'Col2':'x'}], pk=('Col1')) print('Test 2...') for i in range(3): db['test1'].upsert_all([{'Col1':'a', 'Col2':'x'},{'Col1':'b', 'Col2':'x'}], pk=('Col1')) db['test2'].upsert_all([{'Col1':'a', 'Col2':'x'},{'Col1':'b', 'Col2':'x'}, {'Col1':'c','Col2':'x'}], pk=('Col1')) print('Done...') --------------------------------------------------------------------------- Test 1... Test 2... 
IndexError: list index out of range --------------------------------------------------------------------------- IndexError Traceback (most recent call last) in 22 print('Test 2...') 23 for i in range(3): ---> 24 db['test1'].upsert_all([{'Col1':'a', 'Col2':'x'},{'Col1':'b', 'Col2':'x'}], pk=('Col1')) 25 db['test2'].upsert_all([{'Col1':'a', 'Col2':'x'},{'Col1':'b', 'Col2':'x'}, 26 {'Col1':'c','Col2':'x'}], pk=('Col1')) /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in upsert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, extracts) 1157 alter=alter, 1158 extracts=extracts, -> 1159 upsert=True, 1160 ) 1161 /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, ignore, replace, extracts, upsert) 1097 # self.last_rowid will be 0 if a ""INSERT OR IGNORE"" happened 1098 if (hash_id or pk) and self.last_rowid: -> 1099 row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0] 1100 if hash_id: 1101 self.last_pk = row[hash_id] IndexError: list index out of range ``` the first test works but the second fails. Is the length of the list of items being upserted leaking somewhere?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",545407916, https://github.com/simonw/datasette/issues/576#issuecomment-571366326,https://api.github.com/repos/simonw/datasette/issues/576,571366326,MDEyOklzc3VlQ29tbWVudDU3MTM2NjMyNg==,9599,2020-01-06T23:50:33Z,2020-01-06T23:50:33Z,OWNER,I wrote about this a bit here: https://simonwillison.net/2020/Jan/6/sitemap-xml/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",497170355, https://github.com/dogsheep/github-to-sqlite/issues/16#issuecomment-571412923,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/16,571412923,MDEyOklzc3VlQ29tbWVudDU3MTQxMjkyMw==,15092,2020-01-07T03:06:46Z,2020-01-07T03:06:46Z,NONE,"I re-tried after doing `auth`, and I get the same result.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",546051181, https://github.com/simonw/sqlite-utils/issues/73#issuecomment-572870032,https://api.github.com/repos/simonw/sqlite-utils/issues/73,572870032,MDEyOklzc3VlQ29tbWVudDU3Mjg3MDAzMg==,9599,2020-01-10T04:38:41Z,2020-01-10T04:38:41Z,OWNER,"Odd.. I'm not able to replicate that error. Here's what I got: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",545407916, https://github.com/simonw/sqlite-utils/issues/74#issuecomment-572871797,https://api.github.com/repos/simonw/sqlite-utils/issues/74,572871797,MDEyOklzc3VlQ29tbWVudDU3Mjg3MTc5Nw==,9599,2020-01-10T04:47:55Z,2020-01-10T04:47:55Z,OWNER,"This is odd. I'd love to see more about that result object. Could you try running `pytest --pdb` and then `result.exit_code, result.exception` in the PDB prompt, something like this? 
``` $ pytest --pdb ========================================================= test session starts ========================================================= platform darwin -- Python 3.7.4, pytest-5.2.2, py-1.8.0, pluggy-0.13.0 rootdir: /Users/simonw/Dropbox/Development/sqlite-utils plugins: cov-2.8.1 collected 216 items tests/test_black.py s [ 0%] tests/test_cli.py F >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> traceback >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> db_path = '/private/var/folders/bl/5x847xbj2yb7xmp7f2tz7l280000gn/T/pytest-of-simonw/pytest-3/test_tables0/test.db' def test_tables(db_path): result = CliRunner().invoke(cli.cli, [""tables1"", db_path]) > assert '[{""table"": ""Gosh""},\n {""table"": ""Gosh2""}]' == result.output.strip() E assert '[{""table"": ""...e"": ""Gosh2""}]' == '' E - [{""table"": ""Gosh""}, E - {""table"": ""Gosh2""}] tests/test_cli.py:28: AssertionError >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> entering PDB >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> PDB post_mortem (IO-capturing turned off) >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> > /Users/simonw/Dropbox/Development/sqlite-utils/tests/test_cli.py(28)test_tables() -> assert '[{""table"": ""Gosh""},\n {""table"": ""Gosh2""}]' == result.output.strip() (Pdb) result.exit_code, result.exception (1, OperationalError('near ""/"": syntax error')) ``` That should show the exception that caused the script to fail to run.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",546073980, https://github.com/simonw/sqlite-utils/issues/73#issuecomment-573047321,https://api.github.com/repos/simonw/sqlite-utils/issues/73,573047321,MDEyOklzc3VlQ29tbWVudDU3MzA0NzMyMQ==,82988,2020-01-10T14:02:56Z,2020-01-10T14:09:23Z,NONE,"Hmmm... just tried with installs from pip and the repo (v2.0.0 and v2.0.1) and I get the error each time (start of second run through the second loop). Could it be sqlite3? I'm on 3.30.1. UPDATE: just tried it on jupyter.org/try and I get the error there, too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",545407916, https://github.com/simonw/sqlite-utils/issues/74#issuecomment-573388052,https://api.github.com/repos/simonw/sqlite-utils/issues/74,573388052,MDEyOklzc3VlQ29tbWVudDU3MzM4ODA1Mg==,15092,2020-01-12T06:51:30Z,2020-01-12T06:51:30Z,CONTRIBUTOR,"Thanks. That showed me that there was a click cli runner error, and setting `export LANG=en_US.UTF-8` fixed it. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",546073980, https://github.com/simonw/sqlite-utils/issues/74#issuecomment-573389669,https://api.github.com/repos/simonw/sqlite-utils/issues/74,573389669,MDEyOklzc3VlQ29tbWVudDU3MzM4OTY2OQ==,15092,2020-01-12T07:21:17Z,2020-01-12T07:21:17Z,CONTRIBUTOR,"I guess there is some extra flag for ` CliRunner.invoke` to check exitcode and raise the exception, or that should be an extra assert added.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",546073980, https://github.com/simonw/datasette/issues/657#issuecomment-575321322,https://api.github.com/repos/simonw/datasette/issues/657,575321322,MDEyOklzc3VlQ29tbWVudDU3NTMyMTMyMg==,1055831,2020-01-16T20:01:43Z,2020-01-16T20:01:43Z,NONE,"I have successfully tested datasette using a parquet VIRTUAL TABLE. In the first terminal: ```datasette airports.db --load-extension=libparquet``` In another terminal I load the same sqlite db file using the sqlite3 cli client. ```$ sqlite3 airports.db``` and then load the parquet extension and create the virtual table. ``` sqlite> .load /home/darreng/metars/libparquet sqlite> CREATE VIRTUAL TABLE mytable USING parquet('/home/xx/data.parquet'); ``` Now the parquet virtual table is usable by the datasette web UI. Its not an ideal solution but is a proof that datasette works the parquet extension.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",548591089, https://github.com/simonw/sqlite-utils/issues/70#issuecomment-575799104,https://api.github.com/repos/simonw/sqlite-utils/issues/70,575799104,MDEyOklzc3VlQ29tbWVudDU3NTc5OTEwNA==,26292069,2020-01-17T21:20:17Z,2020-01-17T21:20:17Z,NONE,"Omg sorry I took so long to reply! On SQL we can say how the foreign key behaves when it is deleted or updated on the parent table (see https://www.sqlitetutorial.net/sqlite-foreign-key/ for more details). I did not see clearly how to create tables with this feature on sqlite-utils library.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",539204432, https://github.com/simonw/datasette/issues/655#issuecomment-575950911,https://api.github.com/repos/simonw/datasette/issues/655,575950911,MDEyOklzc3VlQ29tbWVudDU3NTk1MDkxMQ==,9599,2020-01-19T00:13:56Z,2020-01-19T00:13:56Z,OWNER,"https://www.shellcheck.net/ has a workaround for this issue: ![9BA173CA-7F1E-447D-A672-1E02E46CA9B7](https://user-images.githubusercontent.com/9599/72672359-81822f80-3a0d-11ea-81a7-a9a0f1b0c34b.jpeg) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",542553350, https://github.com/simonw/datasette/issues/659#issuecomment-575951056,https://api.github.com/repos/simonw/datasette/issues/659,575951056,MDEyOklzc3VlQ29tbWVudDU3NTk1MTA1Ng==,9599,2020-01-19T00:15:39Z,2020-01-19T00:15:39Z,OWNER,"Great feedback, thanks. This is mainly caused by the lack of an official Datasette website! 
I've been working on that - I'll move the news section out of the readme as soon as it's live.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",551834842, https://github.com/simonw/datasette/issues/656#issuecomment-576293773,https://api.github.com/repos/simonw/datasette/issues/656,576293773,MDEyOklzc3VlQ29tbWVudDU3NjI5Mzc3Mw==,6371750,2020-01-20T14:17:11Z,2020-01-20T14:17:11Z,CONTRIBUTOR,Seems that headers and definitions has simply to be filled as an HTML table in the description field of matadata.json.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",546961357, https://github.com/simonw/datasette/issues/657#issuecomment-576759416,https://api.github.com/repos/simonw/datasette/issues/657,576759416,MDEyOklzc3VlQ29tbWVudDU3Njc1OTQxNg==,1055831,2020-01-21T16:20:19Z,2020-01-21T16:20:19Z,NONE,"Hi, I've completed some changes to my fork of datasette that allows it to automatically create the parquet virtual table when you supply it with a filename that has the "".parquet"" extension. I had to figure out how to make the ""CREATE VIRTUAL TABLE"" statement only be applied to the fake in memory parquet database and not to any others that were also being loaded. Thus it supports mixed mode databases e.g ``` datasette my_test.parquet normal_sqlite_file.db --load-extension=libparquet.so --load-extensio n=mod_spatialite.so ``` Please see my changes here: https://github.com/dazzag24/datasette/commit/8e18394353114f17291fd1857073b1e0485a1faf Thanks ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",548591089, https://github.com/simonw/datasette/pull/660#issuecomment-576935300,https://api.github.com/repos/simonw/datasette/issues/660,576935300,MDEyOklzc3VlQ29tbWVudDU3NjkzNTMwMA==,9599,2020-01-21T23:27:59Z,2020-01-21T23:27:59Z,OWNER,"Most excellent fix, thank you.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",552773632, https://github.com/simonw/datasette/issues/651#issuecomment-579675357,https://api.github.com/repos/simonw/datasette/issues/651,579675357,MDEyOklzc3VlQ29tbWVudDU3OTY3NTM1Nw==,2181410,2020-01-29T09:45:00Z,2021-07-14T19:26:06Z,NONE,"Hi Simon Thank you for adding the escape_function, but it does not work on my datasette-installation (0.33). I've added the following file to my datasette-dir: `/plugins/sql_functions.py`: ```python from datasette import hookimpl def escape_fts_query(query): bits = query.split() return ' '.join('""{}""'.format(bit.replace('""', '')) for bit in bits) @hookimpl def prepare_connection(conn): conn.create_function(""escape_fts_query"", 1, escape_fts_query)` ``` It has no effect on the standard queries to the tables though, as they still produce errors when including any characters like '-', '/', '+' or '?' Does the function only work when using costum queries, where I can include the escape_fts-function explicitly in the sql-query? PS. I'm calling datasette with --plugins=plugins, and my other plugins work just fine. PPS. 
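When I run the helper on its own in Python it does what I'd expect (the search term here is just an invented example with the kind of characters that break for me):

```python
# Same function as in the plugin above, run standalone.
def escape_fts_query(query):
    bits = query.split()
    return ' '.join('""{}""'.format(bit.replace('""', '')) for bit in bits)

print(escape_fts_query('foo-bar 1/2'))
# prints: ""foo-bar"" ""1/2""
```

so my question is really about where datasette applies it, not about the function itself.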
The fts5 virtual table is created with 'sqlite3' like so: `CREATE VIRTUAL TABLE ""cases_fts"" USING FTS5( title, subtitle, resume, suggestion, presentation, detail = full, content_rowid = 'id', content = 'cases', tokenize='unicode61', 'remove_diacritics 2', 'tokenchars ""-_""' );` Thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",539590148, https://github.com/simonw/datasette/pull/663#issuecomment-579928863,https://api.github.com/repos/simonw/datasette/issues/663,579928863,MDEyOklzc3VlQ29tbWVudDU3OTkyODg2Mw==,9599,2020-01-29T19:48:05Z,2020-01-29T19:48:05Z,OWNER,Needs documentation.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",557077945, https://github.com/simonw/datasette/issues/661#issuecomment-580075725,https://api.github.com/repos/simonw/datasette/issues/661,580075725,MDEyOklzc3VlQ29tbWVudDU4MDA3NTcyNQ==,134771,2020-01-30T04:17:51Z,2020-01-30T04:17:51Z,NONE,Thanks for the elegant solution to the problem as stated. I'm packaging right now :-),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",555832585, https://github.com/simonw/sqlite-utils/issues/77#issuecomment-580515506,https://api.github.com/repos/simonw/sqlite-utils/issues/77,580515506,MDEyOklzc3VlQ29tbWVudDU4MDUxNTUwNg==,9599,2020-01-30T23:48:41Z,2020-01-30T23:48:41Z,OWNER,"Potential design: a `conversions={}` option. Used like this: ```python db[table].insert(record, conversions={""geom"": ""GeomFromText(?, 4326)""}) ``` The `conversions=` key would be supported on `.insert()`, `.insert_all()`, `.upsert()` etc. It could also be passed to the `db.table()` constructor function: ```python table = db.table( ""features"", pk=""id"", conversions={ ""geom"": ""GeomFromText(?, 4326)"" } ) # Then used like this: table.insert(record) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",557825032, https://github.com/simonw/sqlite-utils/pull/75#issuecomment-580523995,https://api.github.com/repos/simonw/sqlite-utils/issues/75,580523995,MDEyOklzc3VlQ29tbWVudDU4MDUyMzk5NQ==,9599,2020-01-31T00:21:11Z,2020-01-31T00:21:11Z,OWNER,"This makes sense, thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",546078359, https://github.com/simonw/sqlite-utils/issues/77#issuecomment-580527238,https://api.github.com/repos/simonw/sqlite-utils/issues/77,580527238,MDEyOklzc3VlQ29tbWVudDU4MDUyNzIzOA==,9599,2020-01-31T00:34:02Z,2020-01-31T00:34:02Z,OWNER,Documentation: https://sqlite-utils.readthedocs.io/en/stable/python-api.html#python-api-conversions,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",557825032, https://github.com/simonw/sqlite-utils/pull/80#issuecomment-580567505,https://api.github.com/repos/simonw/sqlite-utils/issues/80,580567505,MDEyOklzc3VlQ29tbWVudDU4MDU2NzUwNQ==,9599,2020-01-31T03:39:19Z,2020-01-31T03:39:19Z,OWNER,"Still needs documentation and tests. 
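For the documentation, the kind of usage I have in mind is roughly this sketch (hypothetical - it only works against this branch, and the names may still change):

```python
# Hypothetical sketch of the hook this PR is exploring - not part of
# any released sqlite-utils API. The callback runs once, right after
# the table is first created.
from sqlite_utils import Database

db = Database('demo.db')

def setup_table(table):
    # e.g. add a SpatiaLite geometry column or an index here
    print('created table:', table.name)

table = db.table('features', pk='id', on_create=setup_table)
table.insert({'id': 1, 'name': 'Example'})  # creates the table, then runs the hook
```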
Also I'm not certain that this should be an argument you can pass to the `.table()` constructor, need to think that over.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",557892819, https://github.com/simonw/sqlite-utils/pull/80#issuecomment-580567604,https://api.github.com/repos/simonw/sqlite-utils/issues/80,580567604,MDEyOklzc3VlQ29tbWVudDU4MDU2NzYwNA==,9599,2020-01-31T03:39:58Z,2020-01-31T03:39:58Z,OWNER,Perhaps this should be called `after_create` instead of `on_create`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",557892819, https://github.com/simonw/sqlite-utils/pull/80#issuecomment-580567886,https://api.github.com/repos/simonw/sqlite-utils/issues/80,580567886,MDEyOklzc3VlQ29tbWVudDU4MDU2Nzg4Ng==,9599,2020-01-31T03:41:31Z,2020-01-31T03:41:31Z,OWNER,I think it does make sense to be able to pass it to the `.table()` constructor.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",557892819, https://github.com/simonw/sqlite-utils/pull/80#issuecomment-580569059,https://api.github.com/repos/simonw/sqlite-utils/issues/80,580569059,MDEyOklzc3VlQ29tbWVudDU4MDU2OTA1OQ==,9599,2020-01-31T03:48:41Z,2020-01-31T03:48:41Z,OWNER,"This may not be the right feature after all, see https://github.com/simonw/geojson-to-sqlite/issues/6#issuecomment-580569002","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",557892819, https://github.com/simonw/sqlite-utils/pull/80#issuecomment-580584269,https://api.github.com/repos/simonw/sqlite-utils/issues/80,580584269,MDEyOklzc3VlQ29tbWVudDU4MDU4NDI2OQ==,9599,2020-01-31T05:08:04Z,2020-01-31T05:08:04Z,OWNER,Ditching this since it won't actually solve my problem.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",557892819, https://github.com/simonw/sqlite-utils/issues/73#issuecomment-580745213,https://api.github.com/repos/simonw/sqlite-utils/issues/73,580745213,MDEyOklzc3VlQ29tbWVudDU4MDc0NTIxMw==,82988,2020-01-31T14:02:38Z,2020-01-31T14:21:09Z,NONE,"So the conundrum continues.. The simple test case above now runs, but if I upsert a large number of new records (successfully) and then try to upsert a fewer number of new records to a different table, I get the same error. If I run the same upserts again (which in the first case means there are no new records to add, because they were already added), the second upsert works correctly. 
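Boiled down, the shape of the code that fails looks roughly like this (table names and row counts are invented - the real records come from an API and are much bigger):

```python
# Not a verified minimal repro, just the pattern I am running.
from sqlite_utils import Database

db = Database('test.db')
big = [{'id': i, 'value': 'x'} for i in range(10000)]
small = [{'id': i, 'value': 'y'} for i in range(10)]

db['table_a'].upsert_all(big, pk='id')    # large batch of new rows: works
db['table_b'].upsert_all(small, pk='id')  # small batch to another table: errors
db['table_b'].upsert_all(small, pk='id')  # running it again: works
```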
It feels as if the number of items added via an upsert >> the number of items I try to add in an upsert immediately after, I get the error.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",545407916, https://github.com/simonw/sqlite-utils/issues/81#issuecomment-581071010,https://api.github.com/repos/simonw/sqlite-utils/issues/81,581071010,MDEyOklzc3VlQ29tbWVudDU4MTA3MTAxMA==,9599,2020-02-01T21:27:00Z,2020-02-01T21:27:00Z,OWNER,"Here's the current method: https://github.com/simonw/sqlite-utils/blob/f7289174e66ae4d91d57de94bbd9d09fabf7aff4/sqlite_utils/db.py#L823-L845 If I make it a utility function instead of a class method I could ensure it is directly importable like so: ```python from sqlite_utils import detect_column_types ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",558600274, https://github.com/simonw/sqlite-utils/issues/81#issuecomment-581071116,https://api.github.com/repos/simonw/sqlite-utils/issues/81,581071116,MDEyOklzc3VlQ29tbWVudDU4MTA3MTExNg==,9599,2020-02-01T21:28:35Z,2020-02-01T21:28:53Z,OWNER,"Should I keep `table.detect_column_types()` working so as not to break existing code? If it was part of the documented API then I wouldn't break that without bumping to 3.x. Since it's undocumented I'm going to make it as a breaking change instead (and bump the `geojson-to-sqlite` dependency version).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",558600274, https://github.com/simonw/sqlite-utils/issues/81#issuecomment-581071235,https://api.github.com/repos/simonw/sqlite-utils/issues/81,581071235,MDEyOklzc3VlQ29tbWVudDU4MTA3MTIzNQ==,9599,2020-02-01T21:30:09Z,2020-02-01T21:30:09Z,OWNER,Actually I'll put it in the `utils.py` module.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",558600274, https://github.com/simonw/sqlite-utils/issues/81#issuecomment-581071434,https://api.github.com/repos/simonw/sqlite-utils/issues/81,581071434,MDEyOklzc3VlQ29tbWVudDU4MTA3MTQzNA==,9599,2020-02-01T21:32:34Z,2020-02-01T21:32:34Z,OWNER,While I'm at it I think I'll rename it to `suggest_column_types` - it's not really detecting them since the input is just a list of dictionaries.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",558600274, https://github.com/simonw/sqlite-utils/issues/82#issuecomment-581651409,https://api.github.com/repos/simonw/sqlite-utils/issues/82,581651409,MDEyOklzc3VlQ29tbWVudDU4MTY1MTQwOQ==,9599,2020-02-03T22:32:41Z,2020-02-03T22:32:41Z,OWNER,This should work - the data should be chunked automatically. 
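To spell out what I mean by chunked: the CLI `insert` command goes through `insert_all()`, which is supposed to split the rows into batches rather than building one giant INSERT - along these lines (sketch only; the batch size shown is an assumption):

```python
# Illustration of the intended behaviour, not a fix: rows are written
# in batches of batch_size, so SQL variable limits should not be hit.
from sqlite_utils import Database

db = Database('meteorites.db')
rows = [{'id': str(i), 'name': 'rock {}'.format(i)} for i in range(1000)]
db['meteorites'].insert_all(rows, pk='id', batch_size=100)
```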
It looks like this is a bug.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",559197745, https://github.com/simonw/sqlite-utils/issues/82#issuecomment-581652388,https://api.github.com/repos/simonw/sqlite-utils/issues/82,581652388,MDEyOklzc3VlQ29tbWVudDU4MTY1MjM4OA==,9599,2020-02-03T22:35:44Z,2020-02-03T22:35:44Z,OWNER,"I can't replicate this problem: ``` /tmp $ sqlite-utils --version sqlite-utils, version 2.2 /tmp $ curl ""https://data.nasa.gov/resource/y77d-th95.json"" | sqlite-utils insert meteorites.db meteorites - --pk=id % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 100 240k 0 240k 0 0 185k 0 --:--:-- 0:00:01 --:--:-- 185k ``` Could you run `sqlite-utils --version` and tell me what you get?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",559197745,