id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason
267513424,MDU6SXNzdWUyNjc1MTM0MjQ=,1,Addressable pages for every row in a table,9599,closed,0,,2857392,6,2017-10-23T00:44:16Z,2017-10-24T14:11:04Z,2017-10-24T14:11:03Z,OWNER,," /database-name-7sha256/table-name/compound-pk
/database-name-7sha256/table-name/compound-pk.json
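A rough sketch of how the path component could be built from a compound primary key (a hypothetical helper, just to think it through):
```python
from urllib.parse import quote

def path_from_row_pks(pk_values):
    # Percent-encode each primary key value so commas, slashes and
    # non-string types survive the round trip, then join with commas:
    # ('usa', 'ca') becomes 'usa,ca'
    return ','.join(quote(str(v), safe='') for v in pk_values)
```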
Tricky part will be figuring out what the primary key is - especially since it could be a compound primary key and it might involve different data types.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
267726219,MDU6SXNzdWUyNjc3MjYyMTk=,16,Default HTML/CSS needs to look reasonable and be responsive,9599,closed,0,,2857392,6,2017-10-23T16:05:22Z,2017-11-11T20:19:07Z,2017-11-11T20:19:07Z,OWNER,,"Version one should have the following characteristics:
- Looks OK
- Works great on mobile
- Loads extremely fast
- No JavaScript! At least not in v1.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
267788884,MDU6SXNzdWUyNjc3ODg4ODQ=,23,Support Django-style filters in querystring arguments,9599,closed,0,,2857392,6,2017-10-23T19:29:42Z,2017-10-25T04:23:03Z,2017-10-25T04:23:02Z,OWNER,,"e.g.
/database/table?name__contains=Simon&age__gte=4
Same format as Django: double underscore as the split.
If you need to match against a column that happens to contain a double underscore in its official name, do this:
/database/table?weird__column__exact=Simon
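Sketch of the split logic (illustrative only):
```python
OPERATIONS = {'exact', 'contains', 'gte', 'lte'}

def parse_filter(key):
    # Split on the last double underscore; if the suffix is a known
    # operation use it, otherwise treat the whole key as the column
    # name and fall back to the default operation
    if '__' in key:
        column, _, op = key.rpartition('__')
        if op in OPERATIONS:
            return column, op
    return key, 'exact'
```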
__exact is the default operation if none is supplied.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
273678673,MDU6SXNzdWUyNzM2Nzg2NzM=,85,Detect foreign keys and use them to link HTML pages together,9599,closed,0,,2919870,6,2017-11-14T06:12:05Z,2017-11-19T06:08:19Z,2017-11-19T06:08:19Z,OWNER,,"https://stackoverflow.com/a/44430157/6083 documents the PRAGMA needed to extract foreign key references for a table.
At a minimum we can link column values known to be foreign keys to the corresponding row page.
We could try to summarize the linked row in some way too - somehow extracting a sensible link title, maybe based on additional configuration in the metadata.json file.
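For reference, a sketch of reading that PRAGMA from Python (table name is a placeholder):
```python
import sqlite3

conn = sqlite3.connect('fixtures.db')
# Each row describes one outgoing foreign key:
# (id, seq, referenced_table, from_column, to_column, ...)
for fk in conn.execute('PRAGMA foreign_key_list([mytable])'):
    print(fk)
```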
Still todo:
- [x] Fix it so the csvs-to-sqlite refactoring command correctly creates a primary key on generated tables
- [x] Ship new csvs-to-sqlite with refactoring command
- [x] Refactor column logic to be more predictable in our templates (the rowid special case)
- [x] Mechanism by which table metadata can specify the ""label"" column for a table
- [x] Automatically set the label column as the first column that isn't a primary key (falling back on primary key)
- [x] Code which runs a ""select id, label from table where id in (...)"" query as part of the tableview and populates a lookup dictionary
- [x] Modify templates to use values from that lookup dictionary",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/85/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
280315352,MDU6SXNzdWUyODAzMTUzNTI=,167,Nasty bug: last column not being correctly displayed,9599,closed,0,,2949431,6,2017-12-07T23:23:46Z,2017-12-10T01:00:21Z,2017-12-10T01:00:20Z,OWNER,,"e.g. https://datasette-bwnojrhmmg.now.sh/dk3-bde9a9a/dk?source__contains=http

The JSON output shows that the column is there, but is being displayed incorrectly:
https://datasette-bwnojrhmmg.now.sh/dk3-bde9a9a/dk.jsono?source__contains=http

",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/167/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
310533258,MDU6SXNzdWUzMTA1MzMyNTg=,191,Figure out how to bundle a more up-to-date SQLite,9599,closed,0,,,6,2018-04-02T16:33:25Z,2018-07-10T17:46:13Z,2018-07-10T17:46:13Z,OWNER,,"The version of SQLite that ships with Python 3 is a bit limited - it doesn't support row values, for example: https://www.sqlite.org/rowvalue.html
Figure out how to bundle a more recent SQLite engine with datasette. We need to figure out two cases:
* Bundling a recent version in a Dockerfile build. I expect this to be quite easy.
* Making a more recent version available to people hacking around in Mac OS X. I have no idea how to start on this.
I want it working on Mac OS X too because I don't want to force Docker as a dependency for anyone who just wants to hack around with Datasette a little and run the test suite.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/191/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
314471743,MDU6SXNzdWUzMTQ0NzE3NDM=,211,Load plugins from a `--plugins-dir=plugins/` directory,9599,closed,0,,,6,2018-04-16T01:17:43Z,2018-04-16T05:22:02Z,2018-04-16T05:22:02Z,OWNER,,"In #14 and 33c7c53ff87c2 I've added working support for setuptools entry_points plugins. These can be installed from PyPI using `pip install ...`.
I imagine some projects will benefit from being able to add plugins without first publishing them to PyPI. Datasette already supports [loading custom templates](http://datasette.readthedocs.io/en/latest/custom_templates.html#custom-templates) like so:
datasette serve --template-dir=mytemplates/ mydb.db
I propose an additional option, `--plugins-dir=` which specifies a directory full of `blah.py` files which will be loaded into Datasette when the application server starts.
datasette serve --plugins-dir=myplugins/ mydb.db
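A minimal sketch of the loading step (module naming and error handling still to be decided):
```python
import importlib.util
from pathlib import Path

def load_plugins(plugins_dir):
    # Import every *.py file in the directory as its own module
    for filepath in Path(plugins_dir).glob('*.py'):
        spec = importlib.util.spec_from_file_location(filepath.stem, str(filepath))
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
```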
This will also need to be supported by `datasette publish` as those Python files should be copied up as part of the deployment.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
314506446,MDU6SXNzdWUzMTQ1MDY0NDY=,214,Ability for plugins to define extra JavaScript and CSS,9599,closed,0,,,6,2018-04-16T05:29:34Z,2020-09-30T20:36:11Z,2018-04-18T03:13:03Z,OWNER,,"This can hook in to the existing `extra_css_urls` and `extra_js_urls` mechanism:
https://github.com/simonw/datasette/blob/b2955d9065ea019500c7d072bcd9d49d1967f051/datasette/app.py#L304-L305
The plugins should be able to bundle their own assets though, so it will also have to integrate with the `/static/` static mounts mechanism somehow:
https://github.com/simonw/datasette/blob/b2955d9065ea019500c7d072bcd9d49d1967f051/datasette/app.py#L1255-L1257
Refs #14",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/214/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
322787470,MDU6SXNzdWUzMjI3ODc0NzA=,259,inspect() should detect many-to-many relationships,9599,closed,0,,,6,2018-05-14T12:03:58Z,2019-05-23T03:55:37Z,2019-05-23T03:55:37Z,OWNER,,"Relates to #255 - in particular supporting facets across M2M relationships.
It should be possible for `.inspect()` to notice when a table has two foreign keys to two different tables, and assume that this means there is an M2M relationship between those tables.
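Sketch of that heuristic (illustrative, using sqlite3 directly):
```python
def looks_like_m2m(conn, table):
    # A candidate m2m table has exactly two foreign keys pointing
    # at two different tables
    fks = conn.execute(
        'PRAGMA foreign_key_list([{}])'.format(table)
    ).fetchall()
    referenced = {fk[2] for fk in fks}  # index 2 is the referenced table
    return len(fks) == 2 and len(referenced) == 2
```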
When rendering a table with a m2m relationship we could display the first X associated records as a comma separated list of hyperlinks in a new column on the table view, with a column name derived from the table on the other side.
Since SQLite doesn't have RANK or an equivalent of https://www.xaprb.com/blog/2006/12/02/how-to-number-rows-in-mysql/ this would be implemented as N+1 queries (one query per cell for which we want to display an m2m summary). This should be OK in SQLite: https://sqlite.org/np1queryprob.html",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/259/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
333086005,MDU6SXNzdWUzMzMwODYwMDU=,313,Deploy demo of Datasette on every commit that passes tests,9599,closed,0,,,6,2018-06-17T19:19:12Z,2018-06-17T21:52:58Z,2018-06-17T21:52:58Z,OWNER,,We can use Travis CI and Zeit Now to ensure there is always a live demo of current master. We can ship archived demos for releases as well.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/313/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
349827640,MDU6SXNzdWUzNDk4Mjc2NDA=,359,Faceted browse against a JSON list of tags,9599,closed,0,,,6,2018-08-12T17:01:14Z,2019-05-29T21:39:12Z,2019-05-03T00:21:44Z,OWNER,,"If a table has a `[""foo"", ""bar"", ""baz""]` JSON column allow that to be faceted against.
- [x] Support `?column__arraycontains=x` filter queries
- [x] Support `?_facet_array=column` faceting",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/359/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
395236066,MDU6SXNzdWUzOTUyMzYwNjY=,393,"CSV export in ""Advanced export"" pane doesn't respect query",1727065,closed,0,,,6,2019-01-02T12:39:41Z,2021-06-17T18:14:24Z,2019-01-03T02:44:10Z,NONE,,"It looks like there's an inconsistency when exporting to CSV via the web interface. Say I'm looking at [songs released in 1989](https://fivethirtyeight.datasettes.com/fivethirtyeight-c300360/classic-rock%2Fclassic-rock-song-list?Release+Year__exact=1989) in the `classic-rock/classic-rock-song-list` table from the Five Thirty Eight data. The JSON and CSV export links at the top of the page both give me filtered data using `Release+Year__exact=1989` in the URL. In the `Advanced export` tab, though, the CSV option gives me the whole data set, while the JSON options preserve the query.
It may be that this is intended behaviour related to the streaming CSV stuff [discussed here](https://github.com/simonw/datasette/issues/266), but if that's the case then I think it should be a little clearer.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/393/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
449818897,MDU6SXNzdWU0NDk4MTg4OTc=,24,Additional Column Constraints?,98555,closed,0,,,6,2019-05-29T13:47:03Z,2019-06-13T06:47:17Z,2019-06-13T06:30:26Z,NONE,,"I'm looking to import data from XML with a pre-defined schema that maps fairly closely to a relational database.
In particular, it has explicit annotations for when fields are required, optional, or when a default value should be inferred.
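Something like this would cover my use case (the API shape here is just a guess):
```python
from sqlite_utils import Database

db = Database('schema.db')
# Hypothetical: mark a column NOT NULL and give another a DEFAULT
db['items'].create(
    {'id': int, 'name': str, 'quantity': int},
    pk='id',
    not_null={'name'},
    defaults={'quantity': 0},
)
```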
Would there be value in adding the ability to define `NOT NULL` and `DEFAULT` column constraints to sqlite-utils?",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
459621683,MDU6SXNzdWU0NTk2MjE2ODM=,521,Easier way of creating custom row templates,9599,closed,0,,,6,2019-06-23T21:49:27Z,2019-07-03T03:23:56Z,2019-07-03T03:23:56Z,OWNER,,"I was messing around with a custom `_rows_and_columns.html` template and ended up with this:
```html
{% for row in display_rows %}
{% for cell in row %}
{% if cell.column == ""First_Name"" %}
{{ cell.value }}
{% elif cell.column == ""Last_Name"" %}
{{ cell.value }}
{% elif cell.column == ""Short_Description"" %}
{{ cell.column }}: {{ cell.value }}
{% else %}
{{ cell.column }}: {{ cell.value }}
{% endif %}
{% endfor %}
{% endfor %}
```
This is nasty. I'd like to be able to do something like this instead:
```
{% for row in display_rows %}
{{ row[""First_Name""] }} {{ row[""Last_Name""] }}
...
```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/521/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
465815372,MDU6SXNzdWU0NjU4MTUzNzI=,37,Experiment with type hints,9599,closed,0,,,6,2019-07-09T14:30:34Z,2021-08-18T21:48:57Z,2021-08-18T21:48:57Z,OWNER,,"Since it's designed to be used in Jupyter or for rapid prototyping in an IDE (and it's still pretty small) `sqlite-utils` feels like a great candidate for me to finally try out Python type hints.
https://veekaybee.github.io/2019/07/08/python-type-hints/ is good.
It suggests the mypy docs for getting started: https://mypy.readthedocs.io/en/latest/existing_code.html plus this tutorial: https://pymbook.readthedocs.io/en/latest/typehinting.html",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/37/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
488833975,MDU6SXNzdWU0ODg4MzM5NzU=,3,Command for running a search and saving tweets for that search,9599,closed,0,,,6,2019-09-03T21:29:56Z,2019-11-04T05:31:56Z,2019-11-04T05:31:16Z,MEMBER,, $ twitter-to-sqlite search dogsheep,206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
512996469,MDU6SXNzdWU1MTI5OTY0Njk=,607,Ways to improve fuzzy search speed on larger data sets?,8431341,closed,0,,,6,2019-10-27T17:31:37Z,2019-11-07T03:38:10Z,2019-11-07T03:38:10Z,NONE,,"I have an sqlite table with 16 million rows in it. Having read @simonw's article ""[Fast Autocomplete Search for Your Website](https://24ways.org/2018/fast-autocomplete-search-for-your-website/)"" I was curious to try datasette to see what kind of query performance I could get out of it. In truth I don't need to do full text search since all I would like to do is give my users a way to search for the names of investors such as ""Warren Buffet"", or ""Tim Cook"" (whose names are in a single column).
On the first search, Datasette takes over 20 seconds to return all records associated with `elon musk`:
> (screenshot omitted)
If I rerun the same search, it then takes almost 9 seconds:
> (screenshot omitted)
That's far too slow to implement an autocomplete feature. I could reduce the latency by making a special table of only unique investor names, thereby reducing the search space to less than a million rows (then I'd need to implement a way to add only new investor names to the table as I receive new data... about 4,000 rows a day). If I did that, I'm still concerned the new table wouldn't be lean enough to look up investor names quickly. Plus, even if I can implement the autocomplete feature, I would still have to look up records for that investor, which would take between 8 and 20 seconds.
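The kind of thing I have in mind for that unique-names table (rough sketch, assuming FTS5 is available; table and column names are made up):
```python
import sqlite3

conn = sqlite3.connect('investors.db')
# Build a small FTS5 table holding just the distinct names
conn.executescript('''
CREATE VIRTUAL TABLE investor_names USING fts5(name);
INSERT INTO investor_names (name)
    SELECT DISTINCT investor_name FROM records;
''')
```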
Are there any tricks for speeding this up?
Here's my hardware:
> (screenshot omitted)
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/607/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
513008936,MDU6SXNzdWU1MTMwMDg5MzY=,608,"Improve UI of ""datasette publish cloudrun"" to reduce chances of accidentally over-writing a service",9599,closed,0,,,6,2019-10-27T19:21:28Z,2019-11-08T02:51:36Z,2019-11-08T02:48:46Z,OWNER,,"The concept of a ""service"" in Cloud Run is crucial: if you deploy to the same service, you will over-write what you deployed there last!
As such, I'd like to make service a required positional argument for `publish cloudrun`:
datasette publish cloudrun my-service one.db two.db three.db
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/608/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
529429214,MDU6SXNzdWU1Mjk0MjkyMTQ=,642,Provide a cookiecutter template for creating new plugins,9599,closed,0,,3268330,6,2019-11-27T15:46:36Z,2020-06-20T03:20:33Z,2020-06-20T03:20:25Z,OWNER,,See this conversation: https://twitter.com/psychemedia/status/1199707352540368896,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/642/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
545407916,MDU6SXNzdWU1NDU0MDc5MTY=,73,upsert_all() throws issue when upserting to empty table,82988,closed,0,,,6,2020-01-05T11:58:57Z,2020-01-31T14:21:09Z,2020-01-05T17:20:18Z,NONE,,"If I try to add a list of `dict`s to an empty table using `upsert_all`, I get an error:
```python
import sqlite3
from sqlite_utils import Database
import pandas as pd
conx = sqlite3.connect(':memory:')
cx = conx.cursor()
cx.executescript('CREATE TABLE ""test"" (""Col1"" TEXT);')
q=""SELECT * FROM test;""
pd.read_sql(q, conx) #shows empty table
db = Database(conx)
db['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}])
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
in
1 db = Database(conx)
----> 2 db['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}])
/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in upsert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, extracts)
1157 alter=alter,
1158 extracts=extracts,
-> 1159 upsert=True,
1160 )
1161
/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, ignore, replace, extracts, upsert)
1040 sql = ""INSERT OR IGNORE INTO [{table}]({pks}) VALUES({pk_placeholders});"".format(
1041 table=self.name,
-> 1042 pks="", "".join([""[{}]"".format(p) for p in pks]),
1043 pk_placeholders="", "".join([""?"" for p in pks]),
1044 )
TypeError: 'NoneType' object is not iterable
```
A hacky workaround in use is:
```python
try:
db['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}])
except:
db['test'].insert_all([{'Col1':'a'},{'Col1':'b'}])
```",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/73/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
573583971,MDU6SXNzdWU1NzM1ODM5NzE=,689,"""Templates considered"" comment broken in >=0.35",35075,closed,0,,,6,2020-03-01T17:31:21Z,2020-04-05T19:39:44Z,2020-04-05T19:39:44Z,NONE,,"Noticed that the ""Templates Considered"" comment is missing in 0.37. Believe I traced it back to #664 as you can see it in https://v0-34.datasette.io/ but not https://v0-35.datasette.io/. Looking at the template context debug between the two you can see what is missing from 0.35 vs. 0.34:
```diff
< ""datasette_version"": ""0.34"",
< ""app_css_hash"": ""ffa51a"",
< ""select_templates"": [
< ""*index.html""
< ],
< ""zip"": """",
< ""body_scripts"": [],
< ""extra_css_urls"": """",
< ""extra_js_urls"": """",
< ""format_bytes"": """",
< ""database_url"": "">"",
< ""database_color"": "">""
---
> ""datasette_version"": ""0.35"",
> ""database_url"": "">"",
> ""database_color"": "">""
```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/689/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
585633142,MDU6SXNzdWU1ODU2MzMxNDI=,706,"Documentation for the ""request"" object",9599,closed,0,,3268330,6,2020-03-22T02:55:50Z,2020-05-30T13:20:00Z,2020-05-27T22:31:22Z,OWNER,,"Since that object is passed to the `extra_template_vars` hooks AND the classes registered by `register_facet_classes` it should be part of the documented interface on https://datasette.readthedocs.io/en/stable/internals.html
I could also start passing it to the `register_output_renderer` callback.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/706/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
592829135,MDU6SXNzdWU1OTI4MjkxMzU=,713,Support YAML in metadata - metadata.yaml,9599,closed,0,,,6,2020-04-02T18:10:05Z,2020-04-02T19:36:17Z,2020-04-02T19:30:55Z,OWNER,,"I was originally going to do this with a plugin - see #357 - but the more I work with `metadata.json` the more I want it to just accept YAML as an optional alternative to JSON.
The best example why is still this one: https://github.com/simonw/russian-ira-facebook-ads-datasette/blob/master/russian-ads-metadata.yaml
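For example, a canned query with a multi-line SQL string reads far more cleanly in YAML (names here are illustrative):
```yaml
databases:
  fixtures:
    queries:
      neighborhood_search:
        sql: |-
          select neighborhood, name
          from facetable
          where neighborhood like '%' || :text || '%'
```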
YAML is just SO much better than JSON for multi-line strings - in particular HTML and SQL, both of which are common in `metadata.json` files.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/713/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
610408908,MDU6SXNzdWU2MTA0MDg5MDg=,34,Command for retrieving dependents for a repo,9599,closed,0,,,6,2020-04-30T21:47:51Z,2020-05-03T15:53:01Z,2020-05-03T15:53:01Z,MEMBER,,"I really, really want to start grabbing this data: https://github.com/simonw/datasette/network/dependents",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
613755043,MDU6SXNzdWU2MTM3NTUwNDM=,110,Support decimal.Decimal type,134771,closed,0,,,6,2020-05-07T03:57:19Z,2020-05-11T01:58:20Z,2020-05-11T01:50:11Z,NONE,,"Decimal types in Postgres cause a failure in db.py data type selection
---
I have a Django app using a MoneyField, which uses a `numeric(14,0)` data type in Postgres (https://www.postgresql.org/docs/9.3/datatype-numeric.html). When attempting to export that table I get the following error:
```bash
$ db-to-sqlite --table isaweb_proposal ""postgres://connection"" test.db
....
column_type=COLUMN_TYPE_MAPPING[column_type],
KeyError:
```
Looking at `sqlite_utils/db.py` at line 292-ish it's clear that there is no matching type for what I assume SQLAlchemy interprets as Python decimal.Decimal.
From the [SQLite docs](https://www.sqlite.org/datatype3.html#affinity_name_examples) it looks like DECIMAL in other DBs are considered numeric.
I'm not quite sure if it's as simple as adding a data type to that list or if there are repercussions beyond it.
Thanks for a great tool!",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
631932926,MDU6SXNzdWU2MzE5MzI5MjY=,801,allow_by_query setting for configuring permissions with a SQL statement,9599,closed,0,,3268330,6,2020-06-05T20:30:19Z,2020-06-11T18:58:56Z,2020-06-11T18:58:49Z,OWNER,,"> Idea: an `""allow_sql""` key with a SQL query that gets passed the actor JSON as `:actor` and can extract the relevant keys from it and return 1 or 0.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/698#issuecomment-639787304_
See also #800",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/801/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
632673972,MDU6SXNzdWU2MzI2NzM5NzI=,804,python tests/fixtures.py command has a bug,9599,closed,0,,5512395,6,2020-06-06T19:17:36Z,2020-06-09T20:01:30Z,2020-06-09T19:58:34Z,OWNER,,"This command is meant to write out `fixtures.db`, `metadata.json` and a plugins directory:
```
$ python tests/fixtures.py /tmp/fixtures.db /tmp/metadata.json /tmp/plugins/
Test tables written to /tmp/fixtures.db
- metadata written to /tmp/metadata.json
Traceback (most recent call last):
File ""tests/fixtures.py"", line 833, in
(""my_plugin.py"", PLUGIN1),
NameError: name 'PLUGIN1' is not defined
```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/804/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
634139848,MDU6SXNzdWU2MzQxMzk4NDg=,813,Mechanism for specifying allow_sql permission in metadata.json,9599,closed,0,,5512395,6,2020-06-08T04:57:19Z,2020-06-09T00:09:57Z,2020-06-09T00:07:19Z,OWNER,,"Split from #811. It would be useful if fine-grained permissions configured in `metadata.json` could be used to specify if a user is allowed to execute arbitrary SQL queries.
We have a permission check call for this already: https://github.com/simonw/datasette/blob/9397d718345c4b35d2a5c55bfcbd1468876b5ab9/datasette/views/database.py#L159
But there's currently no way to implement this check without writing a plugin.
I think a `""allow_sql"": {...}` block at the database level in `metadata.json` (sibling to the current `""allow""` block for that database implemented in #811) would be a good option for this.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/813/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
635147716,MDU6SXNzdWU2MzUxNDc3MTY=,825,Way to enable a default=False permission for anonymous users,9599,closed,0,,5512395,6,2020-06-09T06:26:27Z,2020-06-09T17:19:19Z,2020-06-09T17:01:10Z,OWNER,,"I'd like plugins to be able to ship with a default that says ""anonymous users cannot do this"", but allow site administrators to override that such that anonymous users can use the feature after all.
This is tricky because right now the anonymous user doesn't have an actor dictionary at all, so there's no key to match to an allow block.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/825/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
636426530,MDU6SXNzdWU2MzY0MjY1MzA=,829,Ability to set ds_actor cookie such that it expires,9599,closed,0,,5512395,6,2020-06-10T17:31:40Z,2020-06-10T19:41:35Z,2020-06-10T19:40:05Z,OWNER,,I need this for `datasette-auth-github`: https://github.com/simonw/datasette-auth-github/issues/62#issuecomment-642152076,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/829/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
637342551,MDU6SXNzdWU2MzczNDI1NTE=,834,startup() plugin hook,9599,closed,0,,5533512,6,2020-06-11T21:48:14Z,2020-06-28T19:38:50Z,2020-06-13T17:56:12Z,OWNER,,"It might be useful to have a `startup` hook which gets passed the `datasette` object as soon as Datasette has finished initializing.
My initial use-case for this is configuration verification - checking that the `""plugins""` configuration block for this plugin contains valid details.
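Sketch of what a plugin using the hook might look like (hook signature not final; the plugin name and setting are made up):
```python
from datasette import hookimpl

@hookimpl
def startup(datasette):
    # Verify this plugin's configuration block before serving traffic
    config = datasette.plugin_config('my-plugin') or {}
    assert 'api_key' in config, 'my-plugin requires an api_key setting'
```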
I imagine there are plenty of other potential uses for this as well.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/834/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
638241779,MDU6SXNzdWU2MzgyNDE3Nzk=,846,"""Too many open files"" error running tests",9599,closed,0,,,6,2020-06-13T22:11:40Z,2020-06-14T00:26:31Z,2020-06-14T00:26:31Z,OWNER,,"I got this on my laptop:
```pytest
...
/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.7/site-packages/jinja2/loaders.py:171: in get_source
f = open_if_exists(filename)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
filename = '/Users/simon/Dropbox/Development/datasette/datasette/templates/400.html', mode = 'rb'
def open_if_exists(filename, mode='rb'):
""""""Returns a file descriptor for the filename if that file exists,
otherwise `None`.
""""""
try:
> return open(filename, mode)
E OSError: [Errno 24] Too many open files: '/Users/simon/Dropbox/Development/datasette/datasette/templates/400.html'
/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.7/site-packages/jinja2/utils.py:154: OSError
```
Based on the conversation in https://github.com/pytest-dev/pytest/issues/2970 I'm worried that my tests are opening too many files without closing them.
In particular... I call `sqlite3.connect(filepath)` a LOT - and I don't ever call `conn.close()` on those opened connections:
https://github.com/simonw/datasette/blob/cf7a2bdb404734910ec07abc7571351a2d934828/datasette/database.py#L58-L60
Could this be resulting in my tests eventually opening too many unclosed file handles? How could I confirm this?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/846/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
644309017,MDU6SXNzdWU2NDQzMDkwMTc=,864,datasette.add_message() doesn't work inside plugins,9599,closed,0,,5533512,6,2020-06-24T04:30:06Z,2020-06-29T00:51:01Z,2020-06-29T00:51:01Z,OWNER,,"Similar problem to #863 - calling `datasette.add_message()` in a view registered using the `register_routes()` plugin hook doesn't work, because the code that writes accumulated messages to the `ds_messages` signed cookie lives in the `BaseView` class here:
https://github.com/simonw/datasette/blob/28bb1c51897f3956861755e345e18b8e0b1423ac/datasette/views/base.py#L94-L97",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/864/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
677326155,MDU6SXNzdWU2NzczMjYxNTU=,930,Datasette sdist is missing templates (hence broken when installing from Homebrew),9599,closed,0,,,6,2020-08-12T02:20:16Z,2020-08-12T03:30:59Z,2020-08-12T03:30:59Z,OWNER,,Pretty nasty bug this: I'm getting 500 errors for all pages that try to render a template after installing the newly released Datasette 0.47 - both from `pip install` and via Homebrew.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/930/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
679779797,MDU6SXNzdWU2Nzk3Nzk3OTc=,939,extra_ plugin hooks should take the same arguments,9599,closed,0,,,6,2020-08-16T16:04:54Z,2020-08-16T18:25:05Z,2020-08-16T16:50:29Z,OWNER,,"- [x] `extra_css_urls(template, database, table, datasette)`
- [x] `extra_js_urls(template, database, table, datasette)`
- [x] `extra_body_script(template, database, table, view_name, datasette)`
- [x] `extra_template_vars(template, database, table, view_name, request, datasette)`
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/938#issuecomment-674544691_",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/939/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
691265198,MDU6SXNzdWU2OTEyNjUxOTg=,7,"Mechanism for differentiating between ""by me"" and ""liked by me""",9599,closed,0,,,6,2020-09-02T17:44:37Z,2020-09-02T21:06:28Z,2020-09-02T21:06:28Z,MEMBER,,"Some of the content I'm indexing is by me - photos I've taken, tweets I wrote, commits, comments I posted.
Some of it is stuff that I've ""liked"" or ""bookmarked"" in some way - favourited tweets, Pocket articles, starred GitHub repos.
It would be useful to be able to differentiate between the two.",197431109,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
707478649,MDU6SXNzdWU3MDc0Nzg2NDk=,173,Progress bar for sqlite-utils insert,9599,closed,0,,,6,2020-09-23T15:43:56Z,2021-11-01T08:42:24Z,2020-10-27T18:16:04Z,OWNER,,"It would be nice if `sqlite-utils insert` had a progress bar, for when it's churning through huge CSV files.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/173/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
710506708,MDU6SXNzdWU3MTA1MDY3MDg=,978,Rendering glitch with column headings on mobile,9599,closed,0,,5971510,6,2020-09-28T19:04:45Z,2020-10-08T23:54:40Z,2020-09-28T22:43:01Z,OWNER,,"
https://latest-with-plugins.datasette.io/fixtures?sql=select%0D%0A++dateutil_parse%28%2210+october+2020+3pm%22%29%2C%0D%0A++dateutil_easter%28%222020%22%29%2C%0D%0A++dateutil_parse_fuzzy%28%22This+is+due+10+september%22%29%2C%0D%0A++dateutil_parse%28%221%2F2%2F2020%22%29%2C%0D%0A++dateutil_parse%28%222020-03-04%22%29%2C%0D%0A++dateutil_parse_dayfirst%28%222020-03-04%22%29%2C%0D%0A++dateutil_easter%282020%29",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/978/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
714449879,MDU6SXNzdWU3MTQ0NDk4Nzk=,992,"Change ""--config foo:bar"" to ""--setting foo bar""",9599,closed,0,,6055094,6,2020-10-05T01:27:45Z,2020-11-24T20:01:54Z,2020-11-24T20:01:54Z,OWNER,,I designed the config format before I had a good feel for CLI design using Click. `--config max_page_size 2000` is better than `--config max_page_size:2000`.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/992/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
718723543,MDU6SXNzdWU3MTg3MjM1NDM=,1014,Add Link: pagination HTTP headers,9599,closed,0,,6026070,6,2020-10-10T23:42:40Z,2020-10-23T19:44:05Z,2020-10-11T00:18:51Z,OWNER,,Spun off from #782. These can go on all of the JSON endpoints that support pagination.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1014/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
729096595,MDU6SXNzdWU3MjkwOTY1OTU=,1051,Better display of binary data on arbitrary query results page,9599,closed,0,,,6,2020-10-25T19:38:06Z,2020-10-29T22:12:16Z,2020-10-29T22:01:39Z,OWNER,,"https://latest.datasette.io/fixtures?sql=select+rowid%2C+data+from+binary_data+order+by+rowid+limit+101
Problem: if these were larger fields that HTML page could have multiple megabytes of Python binary string representations on it.
It should behave more like the regular table view does:
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1051/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
732905360,MDU6SXNzdWU3MzI5MDUzNjA=,1067,"Table actions menu on view pages, not on query pages",9599,closed,0,,6026070,6,2020-10-30T05:56:39Z,2020-10-31T17:51:31Z,2020-10-31T17:40:14Z,OWNER,,Follow-on from #1066.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1067/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
733796942,MDU6SXNzdWU3MzM3OTY5NDI=,1075,PrefixedUrlString mechanism broke everything,9599,closed,0,,6026070,6,2020-10-31T19:58:05Z,2020-10-31T20:48:51Z,2020-10-31T20:48:51Z,OWNER,,Added in 7a67bc7a569509d65b3a8661e0ad2c65f0b09166 refs #1026. Lots of tests are failing now.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1075/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
737476423,MDU6SXNzdWU3Mzc0NzY0MjM=,198,Support order by relevance against FTS4,9599,closed,0,,,6,2020-11-06T05:36:31Z,2020-11-06T18:30:44Z,2020-11-06T18:30:44Z,OWNER,,"For #192 and #197 I've decided I want to be able to order by relevance in FTS4 as well as FTS5.
This means I need to port over my work on bm25() from https://github.com/simonw/sqlite-fts4 (since I don't want to add a full dependency).",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/198/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
906356331,MDU6SXNzdWU5MDYzNTYzMzE=,263,`sqlite-utils indexes` command,9599,closed,0,,,6,2021-05-29T04:52:34Z,2021-06-03T04:34:38Z,2021-06-03T04:34:38Z,OWNER,,"While working on #260 I realized there's no command to show indexes in a database, even though there is one for showing tables and one for triggers.
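The underlying query could be something along these lines (sketch):
```python
import sqlite3

conn = sqlite3.connect('my.db')
# sqlite_master lists indexes alongside tables and triggers;
# sql is None for automatically created indexes
for name, table, sql in conn.execute(
    ""select name, tbl_name, sql from sqlite_master where type = 'index'""
):
    print(name, table, sql)
```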
I should implement #261 first.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/263/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
907645813,MDU6SXNzdWU5MDc2NDU4MTM=,57,"Error: Use either --since or --since_id, not both",42904,closed,0,,,6,2021-05-31T18:11:04Z,2021-08-20T00:01:31Z,2021-08-20T00:01:31Z,CONTRIBUTOR,,"I'm using the following command:
```
twitter-to-sqlite user-timeline -a twitter-auth.json twitter/tweets.db --since
```
Which gives the following error:
```
Error: Use either --since or --since_id, not both
```
Running without `--since`.
```
Traceback (most recent call last):
File ""/usr/local/bin/twitter-to-sqlite"", line 8, in
sys.exit(cli())
File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__
return self.main(*args, **kwargs)
File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1062, in main
rv = self.invoke(ctx)
File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 763, in invoke
return __callback(*args, **kwargs)
File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 317, in user_timeline
for tweet in bar:
File ""/usr/local/lib/python3.9/site-packages/click/_termui_impl.py"", line 328, in generator
for rv in self.iter:
File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 234, in fetch_user_timeline
yield from fetch_timeline(
File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 202, in fetch_timeline
raise Exception(str(tweets[""errors""]))
Exception: [{'code': 44, 'message': 'since_id parameter is invalid.'}]
```
```
Python 3.9.5
twitter-to-sqlite, version 0.21.3
```",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/57/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
926777310,MDU6SXNzdWU5MjY3NzczMTA=,290,`db.query()` method (renamed `db.execute_returning_dicts()`),9599,closed,0,,,6,2021-06-22T03:03:54Z,2021-06-24T23:17:38Z,2021-06-24T22:54:43Z,OWNER,,"Most of this library deals with lists of Python dictionaries - `.insert_all()`, `.rows`, `.rows_where()`, `.search()`.
The `db.execute()` method is the only thing that returns a `sqlite3` cursor.
There is a clumsily named `db.execute_returning_dicts(sql)` method but it's not currently mentioned in the documentation.
It needs a better name, and needs to be properly documented.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/290/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
940077168,MDU6SXNzdWU5NDAwNzcxNjg=,1389,"""searchmode"": ""raw"" in table metadata",9599,closed,0,,,6,2021-07-08T17:32:10Z,2021-07-10T18:33:13Z,2021-07-10T18:33:13Z,OWNER,,"> http://localhost:8001/index/summary?_search=language%3Aeng&_sort=title&_searchmode=raw
>
> But I'm not able to manage it in the metadata file. Here is mine (note that the sort column is taken into account)
> Here it is:
>
> ```
> {
> ""databases"": {
> ""index"": {
> ""tables"": {
> ""summary"": {
> ""sort"": ""title"",
> ""searchmode"": ""raw""
> }
> }
> }
> }
> }
> ```
_Originally posted by @Krazybug in https://github.com/simonw/datasette/issues/759#issuecomment-624860451_",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1389/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
944326512,MDU6SXNzdWU5NDQzMjY1MTI=,296,"`table.search(..., quote=True)` parameter and `sqlite-utils search --quote` option",32427188,closed,0,,,6,2021-07-14T11:26:47Z,2021-08-18T20:13:12Z,2021-08-18T20:10:48Z,NONE,,"Hi,
Recently got this error:
```
Traceback (most recent call last):
File """", line 1, in
File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/__init__.py"", line 38, in
start(""/home/ethan/git/music-metadata-indexer/sample"", ""/home/ethan/git/music-metadata-indexer/test.db"")
File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/__init__.py"", line 23, in start
scanner.build_database()
File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/scan.py"", line 79, in build_database
_import_song(self.db, Path(dirpath).joinpath(f), self.logger)
File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/scan.py"", line 23, in _import_song
db.add_song(filepath)
File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/index.py"", line 166, in add_song
for match in self.search(""albums"", album):
File ""/home/ethan/git/music-metadata-indexer/env/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1625, in search
cursor = self.db.execute(
File ""/home/ethan/git/music-metadata-indexer/env/lib/python3.9/site-packages/sqlite_utils/db.py"", line 243, in execute
return self.conn.execute(sql, parameters)
sqlite3.OperationalError: fts5: syntax error near "".""
```
So, the error seems to suggest there was a ""."" character somewhere in the SQL command that was causing the error. I did a little digging and found this in the docs: https://www.sqlite.org/fts5.html#fts5_strings. ""."" is one of the many prohibited characters.
My solution was to just strip these out of the query using this line
`query = query.translate({e: None for e in itertools.chain(range(0,26), range(27, 48), range(58,65), range(91,95), [96], range(123,128))})`
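For comparison, the FTS5 docs allow quoting the whole query instead of stripping characters - a sketch of that approach:
```python
def quote_fts_query(query):
    # Wrap the user input in double quotes so FTS5 treats it as a plain
    # string; embedded double quotes are escaped by doubling them
    return '""' + query.replace('""', '""""') + '""'
```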
Perhaps this could be included into the `table.search()` function?
",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/296/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
963897111,MDU6SXNzdWU5NjM4OTcxMTE=,309,"sqlite-utils insert errors should show SQL and parameters, if possible",16622642,closed,0,,,6,2021-08-09T11:24:14Z,2021-08-09T23:40:29Z,2021-08-09T22:25:58Z,NONE,,"I've tried several approaches, but this is the current one:
```sh
echo $json-line | sqlite-utils insert json.db jsontable --truncate --alter --detect-types -
```
In all cases, I get this error:
```sh
OverflowError: Python int too large to convert to SQLite INTEGER
Traceback (most recent call last):
File ""/home/sean/.local/bin/sqlite-utils"", line 8, in
sys.exit(cli())
File ""/usr/lib/python3/dist-packages/click/core.py"", line 764, in __call__
return self.main(*args, **kwargs)
File ""/usr/lib/python3/dist-packages/click/core.py"", line 717, in main
rv = self.invoke(ctx)
File ""/usr/lib/python3/dist-packages/click/core.py"", line 1137, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""/usr/lib/python3/dist-packages/click/core.py"", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""/usr/lib/python3/dist-packages/click/core.py"", line 555, in invoke
return callback(*args, **kwargs)
File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 841, in insert
insert_upsert_implementation(
File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 780, in insert_upsert_implementation
db[table].insert_all(
File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2145, in insert_all
self.insert_chunk(
File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1957, in insert_chunk
result = self.db.execute(query, params)
File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/db.py"", line 257, in execute
return self.conn.execute(sql, parameters)
```
I googled the error and checked SO answers and advice, all good. I changed my JSON file to not use integers so I no longer get this error. Of course, that makes using the database a bit harder, so I also tried to solve the problem by modifying DB structure (while using integers in JSON).
If I change all `INTEGER` data types to something else (`STRING`, `TEXT`) and try to import again using `--truncate`, I still get this error. I suppose I should tell sqlite-utils which columns should use a non-INTEGER data type rather than rely on it to check my SQL table configuration.
If that is the case, can this error be a bit more specific for easier troubleshooting - maybe tell us which record caused the problem when that error is thrown?
My table has 60+ columns, many of which use 64-bit integers (not all records are large or known in advance), so while I can modify JSON to use strings instead of integers, it decreases usability and finding out which records have values for which SQLite integers aren't sufficient requires some work (I'm thinking about parsing all integers with `jq` and sorting output by length to identify those columns, but I'd prefer if sqlite-utils could tell me that).
My environment:
- Python 3.8.10
- sqlite-utils 3.14
- pandas 1.3.1
- numpy 1.21.1
- sqlite-fts4 1.0.1
- sqlite 3.31.1-4ubuntu0.2
",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/309/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
974987856,MDU6SXNzdWU5NzQ5ODc4NTY=,1442,Mechanism to cause specific branches to deploy their own demos,9599,closed,0,,,6,2021-08-19T19:41:39Z,2021-08-19T21:11:45Z,2021-08-19T21:09:40Z,OWNER,,"A useful capability would be if it was super-easy to say ""any pushes to branch X should be deployed to `latest-X.datasette.io`"".
I'd like to use this for the column query information work in #1434",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1442/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1052851176,I_kwDOBm6k_c4-wTvo,1507,ReadTheDocs build failed for 0.59.2 release,9599,closed,0,,,6,2021-11-14T05:24:34Z,2021-11-14T05:41:55Z,2021-11-14T05:41:55Z,OWNER,,"I had to cancel the 0.59.2 release because ReadTheDocs was failing to build the documentation.
https://readthedocs.org/projects/datasette/builds/15268454/
```
/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/bin/python -m sphinx -T -b html -d _build/doctrees -D language=en . _build/html
Running Sphinx v1.8.5
loading translations [en]... done
making output directory...
building [mo]: targets for 0 po files that are out of date
building [html]: targets for 27 source files that are out of date
updating environment: 27 added, 0 changed, 0 removed
reading sources... [ 3%] authentication
Traceback (most recent call last):
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/cmd/build.py"", line 304, in build_main
app.build(args.force_all, filenames)
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/application.py"", line 341, in build
self.builder.build_update()
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 347, in build_update
len(to_build))
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 360, in build
updated_docnames = set(self.read())
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 468, in read
self._read_serial(docnames)
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 490, in _read_serial
self.read_doc(docname)
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/builders/__init__.py"", line 534, in read_doc
doctree = read_doc(self.app, self.env, self.env.doc2path(docname))
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/io.py"", line 318, in read_doc
pub.publish()
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/core.py"", line 219, in publish
self.apply_transforms()
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/core.py"", line 200, in apply_transforms
self.document.transformer.apply_transforms()
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/transforms/__init__.py"", line 90, in apply_transforms
Transformer.apply_transforms(self)
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/docutils/transforms/__init__.py"", line 171, in apply_transforms
transform.apply(**kwargs)
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/transforms/__init__.py"", line 245, in apply
apply_source_workaround(n)
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/util/nodes.py"", line 94, in apply_source_workaround
for classifier in reversed(node.parent.traverse(nodes.classifier)):
TypeError: argument to reversed() must be a sequence
Exception occurred:
File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/0.59.2/lib/python2.7/site-packages/sphinx/util/nodes.py"", line 94, in apply_source_workaround
for classifier in reversed(node.parent.traverse(nodes.classifier)):
TypeError: argument to reversed() must be a sequence
The full traceback has been saved in /tmp/sphinx-err-vkl0oE.log, if you want to report the issue to the developers.
Please also report this if it was a user error, so that a better error message can be provided next time.
A bug report can be filed in the tracker at . Thanks!
```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1507/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1067771698,I_kwDOCGYnMM4_pOcy,348,Command for creating an empty database,9599,closed,0,,7558727,6,2021-11-30T23:24:27Z,2022-01-13T07:06:59Z,2022-01-09T20:33:20Z,OWNER,,"I sometimes find the need to create an empty SQLite database file - for example if I want to enable WAL on it before using it with another script. I currently do that like this:
sqlite3 my.db vacuum
sqlite-utils enable-wal my.db
It would be nice if `sqlite-utils` had a convenience command for doing this.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/348/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1076388044,I_kwDOBm6k_c5AKGDM,1547,Writable canned queries fail to load custom templates,127565,closed,0,,7571612,6,2021-12-10T03:31:48Z,2022-01-13T22:27:59Z,2021-12-19T21:12:00Z,CONTRIBUTOR,,"I've created a canned query with `""write"": true` set. I've also created a custom template for it, but the template doesn't seem to be found. If I look in the HTML I see (`stock_exchange` is the db name):
``
My non-writeable canned queries pick up custom templates as expected, and if I look at their HTML I see the canned query name added to the templates considered (the canned query here is `date_search`):
``
So it seems like the writeable canned query is behaving differently for some reason. Is it an authentication thing? I'm using the built in `--root` authentication.
Thanks!
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1547/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1083669410,I_kwDOBm6k_c5Al3ui,1566,Release Datasette 0.60,9599,closed,0,,7571612,6,2021-12-17T22:58:12Z,2022-01-14T01:59:55Z,2022-01-14T01:59:55Z,OWNER,,Using this as a tracking issue. I'm hoping to get the bulk of the JSON redesign work from the refactor in #1554 in for this release.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1566/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1087913724,I_kwDOBm6k_c5A2D78,1577,Drop support for Python 3.6,9599,closed,0,,3268330,6,2021-12-23T18:17:03Z,2022-01-25T23:30:03Z,2022-01-20T04:31:41Z,OWNER,,"*Original title: Decide when to drop support for Python 3.6*
> `context_vars` can solve this but they were introduced in Python 3.7: https://www.python.org/dev/peps/pep-0567/
>
> Python 3.6 support ends in a few days time, and it looks like Glitch has updated to 3.7 now - so maybe I can get away with Datasette needing 3.7 these days?
>
> Tweeted about that here: https://twitter.com/simonw/status/1473761478155010048
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1576#issuecomment-999878907_",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1577/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1097128334,I_kwDOCGYnMM5BZNmO,371,Support mutating row in `--convert` without returning it,9599,closed,0,,7558727,6,2022-01-09T07:38:44Z,2022-01-10T19:27:30Z,2022-01-09T20:06:15Z,OWNER,,"Currently you have to do this:
```
$ sqlite-utils insert dogs.db dogs dogs.json --convert '
row[""is_good""] = 1
return row'
```
Would be neat if this worked too:
```
$ sqlite-utils insert dogs.db dogs dogs.json \
--convert 'row[""is_good""] = 1'
```",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/371/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1126604194,I_kwDOBm6k_c5DJp2i,1632,"datasette one.db one.db opens database twice, as one and one_2",9599,closed,0,,3268330,6,2022-02-07T23:14:47Z,2022-03-19T04:04:49Z,2022-02-07T23:50:01Z,OWNER,,"> ```
> % mkdir /tmp/data
> % cp ~/Dropbox/Development/datasette/fixtures.db /tmp/data
> % datasette /tmp/data/*.db /tmp/data/created.db --create -p 8852
> ...
> INFO: Uvicorn running on http://127.0.0.1:8852 (Press CTRL+C to quit)
> ^CINFO: Shutting down
> % datasette /tmp/data/*.db /tmp/data/created.db --create -p 8852
> ...
> INFO: 127.0.0.1:49533 - ""GET / HTTP/1.1"" 200 OK
> ```
> The first time I ran Datasette I got two databases - `fixtures` and `created`
>
> BUT... when I ran Datasette the second time it looked like this:
>
> (screenshot omitted)
>
> This is the same result you get if you run:
>
> datasette /tmp/data/fixtures.db /tmp/data/created.db /tmp/data/created.db
>
> This is caused by this Datasette issue:
> - https://github.com/simonw/datasette/issues/509
>
> So... either I teach Datasette to de-duplicate multiple identical file paths passed to the command, or I can't use `/data/*.db` in the `Dockerfile` here and I need to go back to other solutions for the challenge described in this comment: https://github.com/simonw/datasette-publish-fly/pull/12#issuecomment-1031971831
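
For illustration, de-duplicating the resolved paths while preserving order is a single pass (a sketch, not the actual Datasette fix):
```
from pathlib import Path

def dedupe_paths(paths):
    # keep the first occurrence of each resolved path, preserving order
    seen = set()
    unique = []
    for path in paths:
        resolved = Path(path).resolve()
        if resolved not in seen:
            seen.add(resolved)
            unique.append(path)
    return unique

print(dedupe_paths(['/tmp/data/created.db', '/tmp/data/created.db']))
# -> ['/tmp/data/created.db']
```
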
_Originally posted by @simonw in https://github.com/simonw/datasette-publish-fly/pull/12#issuecomment-1032029874_",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1632/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1145882578,I_kwDOCGYnMM5ETMfS,408,`deterministic=True` fails on versions of SQLite prior to 3.8.3,24938923,closed,0,,,6,2022-02-21T14:36:43Z,2022-03-13T16:54:09Z,2022-03-02T00:38:11Z,NONE,,"Hi, love your work.
I am unable to look up indexes in a database using sqlite-utils:
`sqlite-utils indexes city_spec.db --table`
or
`sqlite-utils indexes city_spec.db MyTable`
**Software**
sqlite-utils, version 3.24
sqlite3 --version: 3.36.0
**Output:**

    Traceback (most recent call last):
      File ""/opt/app-root/bin/sqlite-utils"", line 8, in <module>
        sys.exit(cli())
      File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1128, in __call__
        return self.main(*args, **kwargs)
      File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1053, in main
        rv = self.invoke(ctx)
      File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1659, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 1395, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 754, in invoke
        return __callback(*args, **kwargs)
      File ""/opt/app-root/lib64/python3.8/site-packages/click/decorators.py"", line 26, in new_func
        return f(get_current_context(), *args, **kwargs)
      File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/cli.py"", line 2123, in indexes
        ctx.invoke(
      File ""/opt/app-root/lib64/python3.8/site-packages/click/core.py"", line 754, in invoke
        return __callback(*args, **kwargs)
      File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/cli.py"", line 1624, in query
        db.register_fts4_bm25()
      File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 403, in register_fts4_bm25
        self.register_function(rank_bm25, deterministic=True)
      File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 399, in register_function
        register(fn)
      File ""/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py"", line 392, in register
        self.conn.create_function(name, arity, fn, **kwargs)
    sqlite3.NotSupportedError: deterministic=True requires SQLite 3.8.3 or higher
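
The usual pattern for fixing this class of error is to try the registration with `deterministic=True` and fall back when the underlying SQLite is too old (a sketch, not necessarily how `sqlite-utils` resolved it):
```
import sqlite3

def register_compat(conn, name, arity, fn):
    # SQLite < 3.8.3 raises NotSupportedError for deterministic=True,
    # and Python < 3.8 raises TypeError for the keyword itself
    try:
        conn.create_function(name, arity, fn, deterministic=True)
    except (sqlite3.NotSupportedError, TypeError):
        conn.create_function(name, arity, fn)
```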
",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/408/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1174423568,I_kwDOBm6k_c5GAEgQ,1670,Ship Datasette 0.61,9599,closed,0,,,6,2022-03-20T02:47:54Z,2022-03-23T18:32:32Z,2022-03-23T18:32:03Z,OWNER,,"Let the alpha bake for a while, since #1668 is a big last-minute change.
After shipping, release a new `datasette-hashed-urls` that depends on it, and also this:
- https://github.com/simonw/datasette-hashed-urls/issues/11",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1670/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1223699280,I_kwDOBm6k_c5I8CtQ,1739,.db downloads should be served with an ETag,9599,closed,0,,,6,2022-05-03T05:11:21Z,2022-05-04T18:21:18Z,2022-05-03T14:59:51Z,OWNER,,"I noticed that my Pyodide Datasette prototype is downloading the same database file every single time rather than the browser caching it:
(screenshot omitted)
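
For reference, the conditional-request round trip an ETag enables looks roughly like this (a minimal sketch, not Datasette's implementation):
```
import hashlib

def serve_db(request_headers, db_bytes):
    # derive a stable validator from the file contents
    etag = hashlib.md5(db_bytes).hexdigest()
    if request_headers.get('if-none-match') == etag:
        return 304, {}, b''  # the browser reuses its cached copy
    return 200, {'etag': etag}, db_bytes
```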
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1739/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1352932038,I_kwDOCGYnMM5QpBrG,470,Upgrade `--load-extension` to accept entrypoints like Datasette,9599,closed,0,,8355157,6,2022-08-27T03:53:20Z,2022-08-27T05:55:49Z,2022-08-27T05:55:48Z,OWNER,,"Imitate:
- https://github.com/simonw/datasette/pull/1789
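
The `ext:entrypoint` syntax splits on the first colon; a sketch of that parsing, assuming behaviour matching the Datasette PR rather than the final `sqlite-utils` code:
```
def parse_load_extension(value):
    # 'ext' -> ('ext', None)
    # 'ext:sqlite3_foo_init' -> ('ext', 'sqlite3_foo_init')
    path, _, entrypoint = value.partition(':')
    return path, entrypoint or None
```
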
```
# would load the default entrypoint, like before
datasette data.db --load-extension ext
# loads the extensions with the ""sqlite3_foo_init"" entrpoint
datasette data.db --load-extension ext:sqlite3_foo_init
# loads the extensions with the ""sqlite3_bar_init"" entrpoint
datasette data.db --load-extension ext:sqlite3_bar_init
```",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/470/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1363440999,I_kwDOBm6k_c5RRHVn,1804,Ability to set a custom facet_size per table,9599,closed,0,,,6,2022-09-06T15:11:40Z,2022-09-07T00:21:56Z,2022-09-06T18:06:53Z,OWNER,,"Suggestion from Discord: https://discord.com/channels/823971286308356157/823971286941302908/1016725586351247430
> Is it possible to limit the facet size per database or even per table?
This is a really good idea, it could be done in `metadata.yml`.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1804/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1468709531,I_kwDOBm6k_c5Xirqb,1915,Interactive demo of Datasette 1.0 write APIs,9599,closed,0,,,6,2022-11-29T21:16:03Z,2022-11-30T04:05:46Z,2022-11-30T04:05:46Z,OWNER,,"I'm going to try to get this working on https://latest.datasette.io/ - it already has a way for people to sign in as root, but none of the databases there are writable.
So I'm going to build a plugin which adds a writable named in-memory database.
And some kind of mechanism for clearing out that database on a regular basis - maybe tables in that database get deleted automatically an hour after they are created?
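
A sketch of what that clean-up mechanism could look like (everything here, including the `_created` bookkeeping table, is hypothetical):
```
import asyncio
import time

async def cleanup_loop(db):
    # drop demo tables more than an hour after they were created
    while True:
        cutoff = time.time() - 3600
        result = await db.execute(
            'select name from _created where created_at < ?', [cutoff]
        )
        for row in result:
            await db.execute_write('drop table [{}]'.format(row['name']))
        await asyncio.sleep(60)
```
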
(Would be neat to display their time-left-until-deleted too)",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1915/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1495431932,I_kwDOBm6k_c5ZInr8,1951,`datasette.create_token(...)` method for creating signed API tokens,9599,closed,0,,8711695,6,2022-12-14T01:25:34Z,2022-12-14T02:43:45Z,2022-12-14T02:42:05Z,OWNER,,"I need this for:
- #1947
And I can refactor this to use it too:
- #1855
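
For illustration, the kind of test usage such a method could enable (the signature and fixture here are assumptions, not the final design):
```
# hypothetical usage in a test; create_token() was still being designed
# when this issue was opened
async def test_create_token(datasette):
    token = datasette.create_token('root', expires_after=3600)
    response = await datasette.client.get(
        '/-/actor.json',
        headers={'Authorization': 'Bearer {}'.format(token)},
    )
    assert response.status_code == 200
```
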
By making this a documented internal API it can be used by other plugins too. It's also going to be really useful for writing tests.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1951/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1615862295,I_kwDOBm6k_c5gUBoX,2036,"`publish cloudrun` reuses image tags, which can lead to very surprising deploy problems",9599,closed,0,,,6,2023-03-08T20:11:44Z,2023-03-08T20:57:34Z,2023-03-08T20:57:34Z,OWNER,,"See this issue:
- https://github.com/simonw/datasette.io/issues/141",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2036/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1617769847,I_kwDOJHON9s5gbTV3,7,Folder support,9599,closed,0,,,6,2023-03-09T18:21:33Z,2023-03-09T20:48:18Z,2023-03-09T20:48:18Z,MEMBER,,Notes can live in folders. These relationships should be exported too.,611552758,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1718517882,I_kwDOCGYnMM5mboB6,545,Try out Trogon for a tui interface,9599,closed,0,,,6,2023-05-21T14:08:25Z,2023-05-21T19:33:13Z,2023-05-21T18:41:58Z,OWNER,,https://github.com/Textualize/trogon,140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/545/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
1718607907,I_kwDOCGYnMM5mb-Aj,551,Make as many examples in the CLI docs as possible copy-and-pastable,9599,closed,0,,,6,2023-05-21T19:04:10Z,2023-05-21T21:04:04Z,2023-05-21T20:57:24Z,OWNER,,"e.g. in this section:
https://sqlite-utils.datasette.io/en/stable/cli.html#running-queries-directly-against-csv-or-json
The little copy button will also copy the `$ ` prefix, which breaks the examples when pasted.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed