issues
290 rows where comments = 2, state = "closed" and type = "issue"
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | pull_request | body | repo | type | active_lock_reason | performed_via_github_app | reactions | draft | state_reason |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
267517348 | MDU6SXNzdWUyNjc1MTczNDg= | 9 | Initial test suite | simonw 9599 | closed | 0 | Ship first public release 2857392 | 2 | 2017-10-23T01:28:46Z | 2017-10-24T05:55:33Z | 2017-10-24T05:55:33Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/9/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
267828746 | MDU6SXNzdWUyNjc4Mjg3NDY= | 24 | Implement full URL design | simonw 9599 | closed | 0 | Ship first public release 2857392 | 2 | 2017-10-23T21:49:05Z | 2017-10-24T14:12:00Z | 2017-10-24T14:12:00Z | OWNER | Full URL design: /database-name /database-name.json /database-name-7sha256 /database-name-7sha256.json /database-name/table-name /database-name/table-name.json /database-name-7sha256/table-name /database-name-7sha256/table-name.json /database-name-7sha256/table-name/compound-pk /database-name-7sha256/table-name/compound-pk.json | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/24/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267857622 | MDU6SXNzdWUyNjc4NTc2MjI= | 25 | Endpoint that returns SQL ready to be piped into DB | simonw 9599 | closed | 0 | 2 | 2017-10-24T00:19:26Z | 2017-11-15T05:11:12Z | 2017-11-15T05:11:11Z | OWNER | It would be cool if I could figure out a way to generate both the create table statements and the inserts for an individual table or the entire database and then stream them down to the client. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/25/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
268453968 | MDU6SXNzdWUyNjg0NTM5Njg= | 37 | Ability to serialize massive JSON without blocking event loop | simonw 9599 | closed | 0 | 2 | 2017-10-25T15:58:03Z | 2020-05-30T17:29:20Z | 2020-05-30T17:29:20Z | OWNER | We run the risk of someone attempting a select statement that returns thousands of rows and hence takes several seconds just to JSON encode the response, effectively blocking the event loop and pausing all other traffic. The Twisted community have a solution for this, can we adapt that in some way? http://as.ynchrono.us/2010/06/asynchronous-json_18.html?m=1 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/37/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
272694136 | MDU6SXNzdWUyNzI2OTQxMzY= | 50 | Unit tests against application itself | simonw 9599 | closed | 0 | Ship first public release 2857392 | 2 | 2017-11-09T19:31:49Z | 2017-11-11T22:23:22Z | 2017-11-11T22:23:22Z | OWNER | Use Sanic’s testing mechanism. Tests should create a temporary SQLite database file on disk by executing SQL that is stored in the tests themselves. For the moment we can just test the JSON API more thoroughly and sanity check that the HTML output doesn’t throw any errors. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/50/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273026602 | MDU6SXNzdWUyNzMwMjY2MDI= | 52 | Solution for temporarily uploading DB so it can be built by docker | simonw 9599 | closed | 0 | 2 | 2017-11-10T18:55:25Z | 2017-12-10T03:02:57Z | 2017-12-10T03:02:57Z | OWNER | For the `datasette publish` command I ideally need a way of uploading the specified DB to somewhere temporary on the internet so that when the Dockerfile is built by the final hosting location it can download that database as part of the build process. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/52/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273127117 | MDU6SXNzdWUyNzMxMjcxMTc= | 55 | Ship first version to PyPI | simonw 9599 | closed | 0 | Ship first public release 2857392 | 2 | 2017-11-11T07:38:48Z | 2017-11-13T21:19:43Z | 2017-11-13T21:19:43Z | OWNER | Just before doing this, update the Dockerfile template to `pip install datasette` https://github.com/simonw/datasette/blob/65e350ca2a4845c25752a62c16ba58cfe2c14b9b/datasette/utils.py#L125 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/55/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273192789 | MDU6SXNzdWUyNzMxOTI3ODk= | 67 | Command that builds a local docker container | simonw 9599 | closed | 0 | Ship first public release 2857392 | 2 | 2017-11-12T02:13:29Z | 2017-11-13T16:17:52Z | 2017-11-13T16:17:52Z | OWNER | Be nice to indicate that this isn't just for Now. Shouldn't be too hard either. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/67/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273296178 | MDU6SXNzdWUyNzMyOTYxNzg= | 73 | _nocache=1 query string option for use with sort-by-random | simonw 9599 | closed | 0 | 2 | 2017-11-13T02:57:10Z | 2018-05-28T17:25:15Z | 2018-05-28T17:25:15Z | OWNER | The one place where we wouldn’t want caching is if we have something which uses sort by random to return random items. We can offer a _nocache=1 querystring argument to support this. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/73/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273569477 | MDU6SXNzdWUyNzM1Njk0Nzc= | 80 | Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination) | simonw 9599 | closed | 0 | Ship first public release 2857392 | 2 | 2017-11-13T20:37:46Z | 2017-11-13T22:09:46Z | 2017-11-13T22:09:46Z | OWNER | Final versions should be deployed using the first released version of datasette. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/80/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
274160723 | MDU6SXNzdWUyNzQxNjA3MjM= | 100 | TemplateAssertionError: no filter named 'tojson' | coisnepe 13304454 | closed | 0 | 2 | 2017-11-15T13:43:41Z | 2017-11-16T09:25:10Z | 2017-11-16T00:14:13Z | NONE | A 500 error is raised upon clicking on the name of a table on the homepage, say _http://0.0.0.0:8001/_ to _http://0.0.0.0:8001/test_check-c1f4771/users_ The API part seems to function as intended, though... ``` 2017-11-15 14:33:57 - (sanic)[ERROR]: Traceback (most recent call last): File "/usr/local/lib/python3.5/dist-packages/sanic/app.py", line 503, in handle_request response = await response File "/usr/local/lib/python3.5/dist-packages/datasette/app.py", line 155, in get return await self.view_get(request, name, hash, **kwargs) File "/usr/local/lib/python3.5/dist-packages/datasette/app.py", line 219, in view_get **context, File "/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py", line 84, in render return html(self.render_string(template, request, **context)) File "/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py", line 81, in render_string return self.env.get_template(template).render(**context) File "/usr/lib/python3/dist-packages/jinja2/environment.py", line 812, in get_template return self._load_template(name, self.make_globals(globals)) File "/usr/lib/python3/dist-packages/jinja2/environment.py", line 786, in _load_template template = self.loader.load(self, name, globals) File "/usr/lib/python3/dist-packages/jinja2/loaders.py", line 125, in load code = environment.compile(source, name, filename) File "/usr/lib/python3/dist-packages/jinja2/environment.py", line 565, in compile self.handle_exception(exc_info, source_hint=source_hint) File "/usr/lib/python3/dist-packages/jinja2/environment.py", line 754, in handle_exception reraise(exc_type, exc_value, tb) File "/usr/lib/python3/dist-packages/jinja2/_compat.py", line 37, in reraise raise value.with_traceback(tb) File "/usr/local/lib/python3.5/dist-packages/datasette/templates/table.html", line 29, in template <pre>params = {{ query.params|tojson(4) }}</pre> File "/usr/lib/python3/dist-packages/jinja2/environment.py", line 515, i… | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/100/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
274578142 | MDU6SXNzdWUyNzQ1NzgxNDI= | 110 | Add --load-extension option to datasette for loading extra SQLite extensions | simonw 9599 | closed | 0 | 2 | 2017-11-16T16:26:19Z | 2017-11-16T18:38:30Z | 2017-11-16T16:58:50Z | OWNER | This would allow users with extra SQLite extensions installed (like spatialite) to load them at runtime. Inspired by this comment: https://github.com/simonw/datasette/issues/46#issuecomment-344810525 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/110/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
275135393 | MDU6SXNzdWUyNzUxMzUzOTM= | 125 | Plot rows on a map with Leaflet and Leaflet.markercluster | simonw 9599 | closed | 0 | 2 | 2017-11-19T06:05:05Z | 2018-04-26T15:14:31Z | 2018-04-26T15:14:31Z | OWNER | https://github.com/Leaflet/Leaflet.markercluster would allow us to paginate-load in an enormous set of rows with latitude/longitude points, e.g. https://australian-dunnies.now.sh/ Here's a demo of it loading 50,000 markers: https://leaflet.github.io/Leaflet.markercluster/example/marker-clustering-realworld.50000.html - and it looks like it's easy to support progress bars if we were iteratively loading 1,000 markers at a time using datasette pagination. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/125/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
275135719 | MDU6SXNzdWUyNzUxMzU3MTk= | 127 | Filtered tables should show count of all matching rows, if fast enough | simonw 9599 | closed | 0 | Foreign key edition 2919870 | 2 | 2017-11-19T06:13:29Z | 2017-11-24T22:02:01Z | 2017-11-24T22:02:01Z | OWNER | Relates to #86. If you are viewing a filtered page e.g. https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9/bob-ross%2Felements-by-episode?CLOUDS=1 we should show the count of matching rows. Since this could be an expensive operation, we will run it with a strict time limit (maybe 50ms). If the time limit is exceeded we will display "many" instead, perhaps? Maybe even link to a count(*) query that would get the full 1000ms time limit which the user can click on if they like (that could even Ajax-in the result). | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/127/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
275164558 | MDU6SXNzdWUyNzUxNjQ1NTg= | 129 | Hide FTS-created tables by default on the database index page | simonw 9599 | closed | 0 | 2 | 2017-11-19T14:50:42Z | 2017-11-22T20:22:02Z | 2017-11-22T20:19:04Z | OWNER | SQLite databases that use FTS include a number of automatically generated tables, e.g.: https://sf-trees-search.now.sh/sf-trees-search-a899b92 <img width="730" alt="sf-trees-search_and_sf-trees-search" src="https://user-images.githubusercontent.com/9599/32991960-bf05abee-ccf9-11e7-8bd1-54dcde4ca491.png"> Of these, only the `Street_Tree_List` table is actually relevant to the user. We can detect which tables are FTS tables by first finding the virtual tables: sqlite> .headers on sqlite> select * from sqlite_master where rootpage = 0; type|name|tbl_name|rootpage|sql table|Search|Search|0|CREATE VIRTUAL TABLE "Street_Tree_List_fts" USING FTS4 ("qAddress", "qCaretaker", "qSpecies") Then parsing the above to figure out which ones are USING FTS? - then assume that any table which starts with that `Street_Tree_List_fts` prefix was created to support search: sqlite> select * from sqlite_master where type='table' and tbl_name like 'Street_Tree_List_fts%'; type|name|tbl_name|rootpage|sql table|Search_content|Search_content|10355|CREATE TABLE 'Street_Tree_List_fts_content'(docid INTEGER PRIMARY KEY, 'c0qAddress', 'c1qCaretaker', 'c2qSpecies') table|Search_segments|Search_segments|10356|CREATE TABLE 'Street_Tree_List_fts_segments'(blockid INTEGER PRIMARY KEY, block BLOB) table|Search_segdir|Search_segdir|10357|CREATE TABLE 'Street_Tree_List_fts_segdir'(level INTEGER,idx INTEGER,start_block INTEGER,leaves_end_block INTEGER,end_block INTEGER,root BLOB,PRIMARY KEY(level, idx)) table|Search_docsize|Search_docsize|10359|CREATE TABLE 'Street_Tree_List_fts_docsize'(docid INTEGER PRIMARY KEY, size BLOB) table|Search_stat|Search_stat|10360|CREATE TABLE 'Street_Tree_List_fts_stat'(id INTEGER PRIMARY KEY, value BLOB) We won't hide these completely - instead, we'll default the database index view to not showing them with a message that says "5 hidden tables" and support ?_hidden=1 to display them. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/129/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
275493851 | MDU6SXNzdWUyNzU0OTM4NTE= | 139 | Build a visualization plugin for Vega | simonw 9599 | closed | 0 | 2 | 2017-11-20T20:47:41Z | 2018-07-10T17:48:18Z | 2018-07-10T17:48:18Z | OWNER | https://vega.github.io/vega/examples/population-pyramid/ for example looks pretty easy to hook up to Datasette. Depends on #14 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/139/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
276455748 | MDU6SXNzdWUyNzY0NTU3NDg= | 146 | datasette publish gcloud | simonw 9599 | closed | 0 | 2 | 2017-11-23T18:55:03Z | 2019-06-24T06:48:20Z | 2019-06-24T06:48:20Z | OWNER | See also #103 It looks like you can start a Google Cloud VM with a "docker container" option - and the Google Cloud Registry is easy to push containers to. So it would be feasible to have `datasette publish gcloud ...` automatically build a container, push it to GCR, then start a new VM instance with it: https://cloud.google.com/container-registry/docs/pushing-and-pulling | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/146/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
276718605 | MDU6SXNzdWUyNzY3MTg2MDU= | 151 | Set up a pattern portfolio | simonw 9599 | closed | 0 | 2 | 2017-11-25T02:09:49Z | 2020-07-02T00:13:24Z | 2020-05-03T03:13:16Z | OWNER | https://www.slideshare.net/nataliedowne/practical-maintainable-css/75 This will be a single page that demonstrates all of the different CSS styles and classes available to Datasette. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/151/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
279547886 | MDU6SXNzdWUyNzk1NDc4ODY= | 163 | Document the querystring argument for setting a different time limit | simonw 9599 | closed | 0 | 2 | 2017-12-05T22:05:08Z | 2021-03-23T02:44:33Z | 2017-12-06T15:06:57Z | OWNER | http://datasette.readthedocs.io/en/latest/sql_queries.html#query-limits Need to explain why this is useful too. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/163/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
280014287 | MDU6SXNzdWUyODAwMTQyODc= | 165 | metadata.json support for per-database and per-table information | simonw 9599 | closed | 0 | Custom templates edition 2949431 | 2 | 2017-12-07T06:15:34Z | 2017-12-07T16:48:34Z | 2017-12-07T16:47:29Z | OWNER | Every database and every table should be able to support the following optional metadata: title description description_html license license_url source source_url If `description_html` is provided it over-rides `description` and will be displayed unescaped. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/165/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
291639118 | MDU6SXNzdWUyOTE2MzkxMTg= | 183 | Custom Queries - escaping strings | psychemedia 82988 | closed | 0 | 2 | 2018-01-25T16:49:13Z | 2019-06-24T06:45:07Z | 2019-06-24T06:45:07Z | CONTRIBUTOR | If a SQLite table column name contains spaces, they are usually referred to in double quotes: `SELECT * FROM mytable WHERE "gappy column name"="my value";` In the JSON metadata file, this is passed by escaping the double quotes: `"queries": {"my query": "SELECT * FROM mytable WHERE \"gappy column name\"=\"my value\";"}` When specifying a custom query in `metadata.json` using double quotes, these are then rendered in the *datasette* query box using single quotes: `SELECT * FROM mytable WHERE 'gappy column name'='my value';` which does not work. Alternatively, a valid custom query can be passed using backticks (\`) to quote the column name and single (unescaped) quotes for the matched value: ``"queries": {"my query": "SELECT * FROM mytable WHERE `gappy column name`='my value';"}`` | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/183/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
312312125 | MDU6SXNzdWUzMTIzMTIxMjU= | 194 | Rename table_rows and filtered_table_rows to have _count suffix | simonw 9599 | closed | 0 | 2 | 2018-04-08T14:53:37Z | 2018-04-09T05:25:22Z | 2018-04-09T05:25:22Z | OWNER | These fields represent counts of items: "table_rows": 131, "filtered_table_rows": 8, But the names make it sound like they might be arrays full of rows. Adding a `_count` suffix would make this more clear: "table_rows_count": 131, "filtered_table_rows_count": 8, | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/194/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
314847571 | MDU6SXNzdWUzMTQ4NDc1NzE= | 220 | Investigate syntactic sugar for plugins | simonw 9599 | closed | 0 | 2 | 2018-04-16T23:01:39Z | 2020-06-11T21:50:06Z | 2020-06-11T21:49:55Z | OWNER | Suggested by @andrewhayward on Twitter: https://twitter.com/arhayward/status/986015118965268480?s=21 > Have you considered a basic abstraction on top of that, for standard hook features? ``` @sql_function random_integer(a,b): return random.randint(a,b) @template_filter uppercase(str): return str.upper() ``` Maybe `from datasette.plugins import template_filter`? Would have to work out how to get this to play well with pluggy | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/220/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
315738696 | MDU6SXNzdWUzMTU3Mzg2OTY= | 226 | Unit tests for installable plugins | simonw 9599 | closed | 0 | 2 | 2018-04-19T06:05:32Z | 2020-11-24T19:52:51Z | 2020-11-24T19:52:46Z | OWNER | I'd like more thorough unit test coverage of the plugins mechanism - in particular for installable plugins. I think I can do this while still having the code live in the same repo, by creating a subdirectory in tests/example_plugin with its own setup.py and then running `python setup.py install` as part of the test runner. I imagine I will need to bump the version number every time I change the plugin in case someone runs the test again in the same virtual environment. If that doesn't work I can instead ship a datasette-plugins-tests package to PyPI and add that as a tests_require dependency. Refs #14 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/226/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
317475156 | MDU6SXNzdWUzMTc0NzUxNTY= | 237 | Support for ?_search_colname=blah searches | simonw 9599 | closed | 0 | 2 | 2018-04-25T04:29:53Z | 2018-05-05T22:56:42Z | 2018-05-05T22:33:23Z | OWNER | Right now the `_search=` argument searches across all fields in a full-text index, for example: https://san-francisco.datasettes.com/sf-film-locations-84594a7/Film_Locations_in_San_Francisco?_search=justin SQLite FTS also supports searches within a specified field, for example: https://san-francisco.datasettes.com/sf-film-locations-84594a7?sql=select+rowid%2C+*+from+Film_Locations_in_San_Francisco+where+rowid+in+%28select+rowid+from+%5BFilm_Locations_in_San_Francisco_fts%5D+where+%5BLocations%5D+match+%3Asearch%29+order+by+rowid+limit+101&search=justin ``` select rowid, * from Film_Locations_in_San_Francisco where rowid in ( select rowid from [Film_Locations_in_San_Francisco_fts] where [Locations] match :search ) order by rowid limit 101 ``` The `_search=` parameter could be extended to support this using `_search_colname=`. This should also be able to support columns with spaces and special characters in their names, something like this: `_search_Column%20With%20Spaces=foo` | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/237/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
317760361 | MDU6SXNzdWUzMTc3NjAzNjE= | 239 | Support for hidden tables in metadata.json | simonw 9599 | closed | 0 | 2 | 2018-04-25T19:21:17Z | 2018-04-26T03:45:12Z | 2018-04-26T03:43:10Z | OWNER | Since we already have a hidden feature, let's expose it more to our users | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/239/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
318737808 | MDU6SXNzdWUzMTg3Mzc4MDg= | 243 | --spatialite option for datasette publish commands | simonw 9599 | closed | 0 | 2 | 2018-04-29T18:19:32Z | 2018-05-31T14:17:53Z | 2018-05-31T14:17:53Z | OWNER | Performs the necessary incantations to install Spatialite on Zeit Now or Heroku and sets the corresponding environment variable to ensure the module is correctly loaded by datasette serve. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/243/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
319954545 | MDU6SXNzdWUzMTk5NTQ1NDU= | 248 | /-/plugins should show version of each installed plugin | simonw 9599 | closed | 0 | 2 | 2018-05-03T14:50:45Z | 2018-05-04T18:25:40Z | 2018-05-04T18:05:04Z | OWNER | Refs #244 https://stackoverflow.com/questions/20180543/how-to-check-version-of-python-modules ``` >>> import pkg_resources >>> pkg_resources.get_distribution('datasette_cluster_map').version '0.4' ``` | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/248/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
324451322 | MDU6SXNzdWUzMjQ0NTEzMjI= | 273 | Figure out a way to have /-/version return current git commit hash | simonw 9599 | closed | 0 | 2 | 2018-05-18T15:16:56Z | 2018-05-22T19:35:22Z | 2018-05-22T19:35:22Z | OWNER | https://fivethirtyeight.datasettes.com/-/versions reports Datasette version `0.21` This isn't actually correct. The deploy script for that site actually deploys current master using `https://github.com/simonw/datasette/archive/master.zip`: https://github.com/simonw/fivethirtyeight-datasette/blob/66b4b0dfedd7237bc8c02d3e26d905bca7b84069/Dockerfile#L9 Ideally this would show the current commit hash, but I'm not at all sure if it's possible to derive that from `pip install https://github.com/simonw/datasette/archive/master.zip`. Is there another mechanism that could be used to reliably `pip install` current master but still provide access to the most recent commit hash? | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/273/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
324652142 | MDU6SXNzdWUzMjQ2NTIxNDI= | 274 | Rename --limit to --config, add --help-config | simonw 9599 | closed | 0 | 2 | 2018-05-19T18:57:42Z | 2018-05-20T17:04:55Z | 2018-05-20T17:04:11Z | OWNER | #270 introduced `--limit` but on further thought it should be called `--config` instead. `--page_size` should become `--config default_page_size:1000` Add `--help-config` to show full help listing all config settings. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/274/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
326189744 | MDU6SXNzdWUzMjYxODk3NDQ= | 285 | num_threads and cache_max_age should be --config options | simonw 9599 | closed | 0 | 2 | 2018-05-24T16:04:51Z | 2018-05-27T00:53:35Z | 2018-05-27T00:43:33Z | OWNER | https://github.com/simonw/datasette/blob/58b5a37dbbf13868a46bcbb284509434e66eca25/datasette/app.py#L106 And https://github.com/simonw/datasette/blob/58b5a37dbbf13868a46bcbb284509434e66eca25/datasette/views/base.py#L325 Refs #275 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/285/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
327383759 | MDU6SXNzdWUzMjczODM3NTk= | 295 | Extract unit tests for inspect out to test_inspect.py | simonw 9599 | closed | 0 | 2 | 2018-05-29T15:55:04Z | 2019-05-11T21:40:32Z | 2019-05-11T21:40:32Z | OWNER | Right now they are bundled up as API unit tests for a relatively unimportant endpoint. They should be their own thing. Blocks #294 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/295/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
327459829 | MDU6SXNzdWUzMjc0NTk4Mjk= | 298 | URLify URLs in results from custom SQL statements / views | simonw 9599 | closed | 0 | 2 | 2018-05-29T19:41:07Z | 2018-07-24T04:53:20Z | 2018-07-24T03:56:50Z | OWNER | Consider this custom query: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3?sql=select+user%2C+%28%27https%3A%2F%2Ftwitter.com%2F%27+%7C%7C+user%29+as+user_url%2C+created_at%2C+text%2C+url+from+%5Btwitter-ratio%2Fsenators%5D+limit+10%3B ```select user, ('https://twitter.com/' || user) as user_url, created_at, text, url from [twitter-ratio/senators] limit 10;```  It would be nice if these URLs were turned into links, as happens on the table view page: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3/twitter-ratio%2Fsenators  This currently does not happen because the table view render logic takes a different path through `display_columns_and_rows()` which includes this bit: https://github.com/simonw/datasette/blob/b0a95da96386ddf99816911e08df86178ffa9a89/datasette/views/table.py#L195-L202 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/298/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
328171513 | MDU6SXNzdWUzMjgxNzE1MTM= | 302 | test-2.3.sqlite database filename throws a 404 | simonw 9599 | closed | 0 | 0.23.1 3439337 | 2 | 2018-05-31T14:50:58Z | 2018-06-21T15:21:17Z | 2018-06-21T15:21:16Z | OWNER | The following almost works: datasette test-2.3.sqlite http://127.0.0.1:8001/test-2.3-c88bc35/HighWays loads OK, but http://127.0.0.1:8001/test-2.3-c88bc35 throws a 404:  | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/302/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
329147284 | MDU6SXNzdWUzMjkxNDcyODQ= | 305 | Add contributor guidelines to docs | simonw 9599 | closed | 0 | 2 | 2018-06-04T17:25:30Z | 2019-06-24T06:40:19Z | 2019-06-24T06:40:19Z | OWNER | https://channels.readthedocs.io/en/latest/contributing.html is a nice example of this done well. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/305/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
331343824 | MDU6SXNzdWUzMzEzNDM4MjQ= | 309 | On 404s with a trailing slash redirect to that page without a trailing slash | simonw 9599 | closed | 0 | 0.23.1 3439337 | 2 | 2018-06-11T20:46:49Z | 2018-06-21T15:22:02Z | 2018-06-21T15:13:15Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/309/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
333326107 | MDU6SXNzdWUzMzMzMjYxMDc= | 317 | Travis CI fails to upload new releases to PyPI | simonw 9599 | closed | 0 | 0.23.1 3439337 | 2 | 2018-06-18T15:44:26Z | 2018-06-21T15:45:47Z | 2018-06-21T15:45:47Z | OWNER | https://travis-ci.org/simonw/datasette/jobs/393684139 ``` ... removing build/bdist.linux-x86_64/wheel Uploading distributions to https://upload.pypi.org/legacy/ Uploading datasette-0.23-py3-none-any.whl 100%|██████████| 201k/201k [00:00<00:00, 1.02MB/s] HTTPError: 403 Client Error: Invalid or non-existent authentication information. for url: https://upload.pypi.org/legacy/ ``` | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/317/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
334149717 | MDU6SXNzdWUzMzQxNDk3MTc= | 319 | Incorrect display of compound primary keys with foreign key relationships | simonw 9599 | closed | 0 | 0.23.1 3439337 | 2 | 2018-06-20T16:09:36Z | 2018-06-21T15:58:15Z | 2018-06-21T14:56:41Z | OWNER | https://registry.datasette.io/registry-7d4f81f/datasette_tags  Underlying JSON looks [like this](https://registry.datasette.io/registry-7d4f81f/datasette_tags.json?_labels=on): ``` { "database": "registry", "table": "datasette_tags", "is_view": false, "human_description_en": "", "rows": [ { "datasette_id": { "value": 1, "label": "Global Power Plant Database" }, "tag": { "value": "geospatial", "label": "geospatial" } }, ```` Bug is likely somewhere in here: https://github.com/simonw/datasette/blob/e04f5b0d348ef7275a0a5ab9eb53527105132885/datasette/views/table.py#L143-L207 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/319/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
336936010 | MDU6SXNzdWUzMzY5MzYwMTA= | 331 | Datasette throws error when loading spatialite db without extension loaded | psychemedia 82988 | closed | 0 | 2 | 2018-06-29T09:51:14Z | 2022-01-20T21:29:40Z | 2018-07-10T15:13:36Z | CONTRIBUTOR | When starting datasette on a SpatiaLite database *without* loading the SpatiaLite extension (using e.g. `--load-extension=/usr/local/lib/mod_spatialite.dylib`) an error is thrown and the server fails to start: ``` datasette -p 8003 adminboundaries.db Serve! files=('adminboundaries.db',) on port 8003 Traceback (most recent call last): File "/Users/ajh59/anaconda3/bin/datasette", line 11, in <module> sys.exit(cli()) File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py", line 722, in __call__ return self.main(*args, **kwargs) File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py", line 697, in main rv = self.invoke(ctx) File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py", line 1066, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py", line 895, in invoke return ctx.invoke(self.callback, **ctx.params) File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py", line 535, in invoke return callback(*args, **kwargs) File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/cli.py", line 552, in serve ds.inspect() File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/app.py", line 273, in inspect "tables": inspect_tables(conn, self.metadata.get("databases", {}).get(name, {})) File "/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/inspect.py", line 79, in inspect_tables "PRAGMA table_info({});".format(escape_sqlite(table)) sqlite3.OperationalError: no such module: VirtualSpatialIndex ``` It would be nice to trap this and return a message saying something like: ``` It looks like you're trying to load a SpatiaLite database? Make sure you load in the SpatiaLite extension when starting datasette. Read more: https://datasette.readthedocs.io/en/latest/spatialite.html ``` | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/331/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
339095976 | MDU6SXNzdWUzMzkwOTU5NzY= | 334 | extra_options not passed to heroku publisher | kamicut 719357 | closed | 0 | 2 | 2018-07-06T23:26:12Z | 2018-07-24T04:53:21Z | 2018-07-10T01:46:04Z | NONE | I might be wrong but I was not able to publish to `heroku` with `--extra-options`, I think `extra_options` is not being used in this function [here](https://github.com/simonw/datasette/blob/master/datasette/utils.py#L369). Any help appreciated! | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/334/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
344701755 | MDU6SXNzdWUzNDQ3MDE3NTU= | 350 | Don't list default plugins on /-/plugins | simonw 9599 | closed | 0 | 2 | 2018-07-26T05:38:00Z | 2018-08-28T17:13:50Z | 2018-08-28T16:48:19Z | OWNER | https://dbbe707.datasette.io/-/plugins is showing "datasette.publish.now" and "datasette.publish.heroku" | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/350/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
369716228 | MDU6SXNzdWUzNjk3MTYyMjg= | 366 | Default built image size over Zeit Now 100MiB limit | gfrmin 416374 | closed | 0 | 2 | 2018-10-12T21:27:17Z | 2018-11-05T06:23:32Z | 2018-11-05T06:23:32Z | CONTRIBUTOR | Using `datasette publish now` with no other custom options on a small (43KB) SQLite database leads to the error "The built image size (373.5M) exceeds the 100MiB limit". I think this is because of a recent Zeit change: https://github.com/zeit/now-cli/issues/1523 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/366/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
398089089 | MDU6SXNzdWUzOTgwODkwODk= | 399 | /-/versions for official Docker image returns wrong Datasette version | simonw 9599 | closed | 0 | 2 | 2019-01-11T01:19:58Z | 2019-01-13T23:31:59Z | 2019-01-13T23:10:45Z | OWNER | ``` docker run -p 8001:8001 datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 ``` http://0.0.0.0:8001/-/versions returns this: ``` { "datasette": { "version": "0+unknown" }, ... ``` This is because the Docker image is built by copying in the Datasette source code, which confuses versioneer. Maybe the Docker image should install the code using a wheel or similar? | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/399/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
400511206 | MDU6SXNzdWU0MDA1MTEyMDY= | 403 | How does persistence work? | ccorcos 1794527 | closed | 0 | 2 | 2019-01-17T23:41:57Z | 2019-01-19T05:47:55Z | 2019-01-18T06:51:14Z | NONE | I was under the impression that now.sh is for stateless microservices. So where are these SQLite databases stored and when do they get created and destroyed? | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/403/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
403617881 | MDU6SXNzdWU0MDM2MTc4ODE= | 405 | .json?_nl=on option for exporting newline-delimited JSON | simonw 9599 | closed | 0 | 2 | 2019-01-28T01:10:45Z | 2019-01-28T01:49:00Z | 2019-01-28T01:48:37Z | OWNER | The neat thing about newline-delimited JSON is that you don't have to read an entire array (of potentially thousands of objects) into memory in order to parse it - you can parse things a line at a time instead. It will look like this: `https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on` ``` {"pk": 1, "planet_int": 1, "on_earth": 1, "state": "CA", "city_id": 1, "neighborhood": "Mission"} {"pk": 2, "planet_int": 1, "on_earth": 1, "state": "CA", "city_id": 1, "neighborhood": "Dogpatch"} {"pk": 3, "planet_int": 1, "on_earth": 1, "state": "CA", "city_id": 1, "neighborhood": "SOMA"} {"pk": 4, "planet_int": 1, "on_earth": 1, "state": "CA", "city_id": 1, "neighborhood": "Tenderloin"} {"pk": 5, "planet_int": 1, "on_earth": 1, "state": "CA", "city_id": 1, "neighborhood": "Bernal Heights"} ``` I added this as part of the `sqlite-utils json` CLI command in this commit - I think Datasette should offer it as well: https://github.com/simonw/sqlite-utils/commit/5466c9745dfef858286146ea158ffd5a71391d10 It can be offered alongside `_stream=on` (which currently only works for CSV, but it could work for JSON as well thanks to this trick). | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/405/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
406055201 | MDU6SXNzdWU0MDYwNTUyMDE= | 406 | Support nullable foreign keys in _labels mode | simonw 9599 | closed | 0 | simonw 9599 | 2 | 2019-02-03T05:34:20Z | 2019-11-02T22:39:28Z | 2019-11-02T22:30:27Z | OWNER | Currently if there's a null in a foreign key we get "None" displayed in the inflated view: <img width="412" alt="screen shot 2019-02-02 at 9 33 37 pm" src="https://user-images.githubusercontent.com/9599/52173123-46221e80-2732-11e9-8dcb-f6f4bc768c33.png"> | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/406/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
413740684 | MDU6SXNzdWU0MTM3NDA2ODQ= | 11 | Detect numpy types when creating tables | simonw 9599 | closed | 0 | 2 | 2019-02-23T21:09:35Z | 2019-02-24T04:02:20Z | 2019-02-24T04:02:20Z | OWNER | Inspired by #8 | sqlite-utils 140912432 | issue | {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/11/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
413871266 | MDU6SXNzdWU0MTM4NzEyNjY= | 18 | .insert/.upsert/.insert_all/.upsert_all should add missing columns | simonw 9599 | closed | 0 | 1.0 4348046 | 2 | 2019-02-24T21:36:11Z | 2019-05-25T00:42:11Z | 2019-05-25T00:42:11Z | OWNER | This is a larger change, but it would be incredibly useful: if you attempt to insert or update a document with a field that does not currently exist in the underlying table, sqlite-utils should add the appropriate column for you. | sqlite-utils 140912432 | issue | {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/18/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
421985685 | MDU6SXNzdWU0MjE5ODU2ODU= | 421 | Documentation for ?_hash=1 and Datasette's hashed URL caching | simonw 9599 | closed | 0 | 0.28 4305096 | 2 | 2019-03-17T23:08:36Z | 2019-05-19T05:32:37Z | 2019-05-19T05:31:27Z | OWNER | Follow on from #418 - the Datasette documentation needs an entire section (probably a new page) describing exactly how the hash-in-URL caching mechanism works. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/421/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
430103450 | MDU6SXNzdWU0MzAxMDM0NTA= | 425 | Submitting SQL on hide page is broken | simonw 9599 | closed | 0 | 2 | 2019-04-07T04:21:31Z | 2019-04-12T05:12:13Z | 2019-04-12T05:00:53Z | OWNER | Clicking the submit button here doesn't work correctly: https://3a208a4.datasette.io/fixtures?sql=select+%2A+from+compound_three_primary_keys+order+by+pk1%2C+pk2%2C+pk3+limit+101&_hide_sql=1 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/425/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
432371762 | MDU6SXNzdWU0MzIzNzE3NjI= | 428 | Make ?_fts_table=x and ?_fts_pk=y available as URL parameters on table view | simonw 9599 | closed | 0 | 2 | 2019-04-12T03:30:55Z | 2019-04-12T04:30:29Z | 2019-04-12T04:21:25Z | OWNER | These can currently only be set using `metadata.json`: https://datasette.readthedocs.io/en/0.27/full_text_search.html#configuring-full-text-search-for-a-table-or-view There's no reason not to support these as URL parameters as well. That way it would be easy to use FTS search against a view without having to use `metadata.json`. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/428/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
438200529 | MDU6SXNzdWU0MzgyMDA1Mjk= | 438 | Plugins are loaded when running pytest | russss 45057 | closed | 0 | 2 | 2019-04-29T08:25:58Z | 2019-05-02T05:09:18Z | 2019-05-02T05:09:11Z | CONTRIBUTOR | If I have a datasette plugin installed on my system, its hooks are called when running the main datasette tests. This is probably undesirable, especially with the inspect hook in #437, as the plugin may rely on inspected state that the tests don't know about. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/438/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
442330564 | MDU6SXNzdWU0NDIzMzA1NjQ= | 457 | Ability to "publish cloudrun" with no user input | simonw 9599 | closed | 0 | 2 | 2019-05-09T16:42:51Z | 2019-05-09T19:41:31Z | 2019-05-09T16:45:08Z | OWNER | If you attempt to deploy a new version of a cloudrun deployment, the script currently pauses and asks for user input for the service name like this: ```77d4d7de-3dfc-4acc-9a23-efe16230f318 2019-05-09T15:01:48+00:00 52S gs://datasette-222320_cloudbuild/source/1557414063.1-3a82df8096e9434b93511b0588d8d155.tgz gcr.io/datasette-222320/sf-trees (+1 more) SUCCESS Service name: (sf-trees): USER INPUT REQUIRED HERE Deploying container to Cloud Run service [sf-trees] in project [datasette-222320] region [us-central1] ✓ Deploying... Done. ✓ Creating Revision... ✓ Routing traffic... ✓ Setting IAM Policy... ``` This is incompatible with running under CI. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/457/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
443040665 | MDU6SXNzdWU0NDMwNDA2NjU= | 466 | Move "no such module: VirtualSpatialIndex" code elsewhere | simonw 9599 | closed | 0 | 0.28 4305096 | 2 | 2019-05-11T22:09:00Z | 2022-01-20T21:29:41Z | 2019-05-11T22:57:22Z | OWNER | We currently show a useful warning (from #331) when the user tries to open a spatialite database without first loading the module: https://github.com/simonw/datasette/blob/c692cd291111050483a32bea1ee08e994a0b781b/datasette/app.py#L547-L554 This code is part of `.inspect()` which is going away - see #462 - so I need to find somewhere else for it to live. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/466/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
444711254 | MDU6SXNzdWU0NDQ3MTEyNTQ= | 467 | Index page row counts only for DBs with < 30 tables (10ms count limit per table) | simonw 9599 | closed | 0 | 0.28 4305096 | 2 | 2019-05-16T01:21:36Z | 2019-05-16T03:03:45Z | 2019-05-16T03:03:45Z | OWNER | Split out from #460. If a database is mutable, calculating row counts gets expensive. I'm only going to calculate row counts for the index page if it has less than X tables (both hidden and non-hidden) AND each table can be counted in less than 10ms. If any count takes longer than 10ms I'll cancel the counting entirely. We currently show an inaccurate count if this happens, which is just confusing. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/467/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
448189298 | MDU6SXNzdWU0NDgxODkyOTg= | 486 | Ability to add extra routes and related templates | clausjuhl 2181410 | closed | 0 | 2 | 2019-05-24T14:04:25Z | 2019-05-24T14:43:28Z | 2019-05-24T14:43:09Z | NONE | Hi Simon. Thanks for an excellent job! Datasette is such an obviously good idea (once you have that idea!) and so well done. The only thing that I miss is the ability to add extra routes (with associated jinja2 templates). For most of the datasets that I would like to publish, I would also like at least a page that describes the data (semantics, provenance, biases...) and a page explaining our cookie and privacy policies (which would allow us to use something like Google Analytics). | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/486/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
448395665 | MDU6SXNzdWU0NDgzOTU2NjU= | 22 | Release notes for 1.0 | simonw 9599 | closed | 0 | 1.0 4348046 | 2 | 2019-05-25T00:58:03Z | 2019-05-25T01:18:27Z | 2019-05-25T01:06:52Z | OWNER | https://github.com/simonw/sqlite-utils/compare/0.14...251e473 | sqlite-utils 140912432 | issue | {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/22/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
457201907 | MDU6SXNzdWU0NTcyMDE5MDc= | 513 | Is it possible to publish to Heroku despite slug size being too large? | chrismp 7936571 | closed | 0 | 2 | 2019-06-18T00:12:02Z | 2019-06-21T22:35:54Z | 2019-06-21T22:35:54Z | NONE | I'm trying to push more than 1.5GB worth of SQLite databases -- 535MB compressed -- to Heroku but I get this error when I run the `datasette publish heroku` command. Compiled slug size: 535.5M is too large (max is 500M). Can I publish the databases and make datasette work on Heroku despite the large slug size? | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/513/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
459627549 | MDU6SXNzdWU0NTk2Mjc1NDk= | 523 | Show total/unfiltered row count when filtering | rixx 2657547 | closed | 0 | 2 | 2019-06-23T22:56:48Z | 2019-06-24T01:38:14Z | 2019-06-24T01:38:14Z | CONTRIBUTOR | When I'm seeing a filtered view of a table, I'd like to be able to see something like '2 rows where status != "closed" (of 1000 total)' to have a context for the data I'm seeing – e.g. currently my database is being filled by an importer, so this information would be super helpful. Since this information would be a performance hit, maybe something like '12 rows where status != "closed" (of ??? total)' with lazy-loading on-click(?) could be applied (Or via a "How many total?" tooltip, or …) | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/523/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
459714943 | MDU6SXNzdWU0NTk3MTQ5NDM= | 525 | Add section on sqlite-utils enable-fts to the search documentation | simonw 9599 | closed | 0 | simonw 9599 | 2 | 2019-06-24T06:39:16Z | 2019-06-24T16:36:35Z | 2019-06-24T16:29:43Z | OWNER | https://datasette.readthedocs.io/en/stable/full_text_search.html already has a section about csvs-to-sqlite; sqlite-utils is even more relevant. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/525/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
462423839 | MDU6SXNzdWU0NjI0MjM4Mzk= | 33 | index_foreign_keys / index-foreign-keys utilities | simonw 9599 | closed | 0 | 2 | 2019-06-30T16:42:03Z | 2019-06-30T23:54:11Z | 2019-06-30T23:50:55Z | OWNER | Sometimes it's good to have indices on all columns that are foreign keys, to allow for efficient reverse lookups. This would be a useful utility: $ sqlite-utils index-foreign-keys database.db | sqlite-utils 140912432 | issue | {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/33/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
462430920 | MDU6SXNzdWU0NjI0MzA5MjA= | 35 | table.update(...) method | simonw 9599 | closed | 0 | 2 | 2019-06-30T18:06:15Z | 2019-07-28T15:43:52Z | 2019-07-28T15:43:52Z | OWNER | Spun off from #23 - this method will allow a user to update a specific row. Currently the only way to do that it is to call `.upsert({full record})` with the primary key field matching an existing record - but this does not support partial updates. ```python db["events"].update(3, {"name": "Renamed"}) ``` This method only works on an existing table, so there's no need for a `pk="id"` specifier - it can detect the primary key by looking at the table. If the primary key is compound the first argument can be a tuple: ```python db["events_venues"].update((3, 2), {"custom_label": "Label"}) ``` The method can be called without the second dictionary argument. Doing this selects the row specified by the primary key (throwing an error if it does not exist) and remembers it so that chained operations can be carried out - see proposal in https://github.com/simonw/sqlite-utils/issues/23#issuecomment-507055345 | sqlite-utils 140912432 | issue | {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/35/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
464779810 | MDU6SXNzdWU0NjQ3Nzk4MTA= | 541 | Plugin hook for adding extra template context variables | simonw 9599 | closed | 0 | 2 | 2019-07-05T21:37:05Z | 2019-07-06T00:05:59Z | 2019-07-06T00:05:59Z | OWNER | It turns out I need this for https://github.com/simonw/datasette-auth-github/issues/5 It can be modelled on the `extra_body_script` hook: https://datasette.readthedocs.io/en/stable/plugins.html#extra-body-script-template-database-table-view-name-datasette | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/541/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
464990184 | MDU6SXNzdWU0NjQ5OTAxODQ= | 547 | Release notes for 0.29 | simonw 9599 | closed | 0 | Datasette 0.29 4471010 | 2 | 2019-07-07T20:30:28Z | 2019-07-08T03:31:59Z | 2019-07-08T03:31:59Z | OWNER | There's a lot of stuff... https://github.com/simonw/datasette/compare/0.28...master | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/547/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
470542938 | MDU6SXNzdWU0NzA1NDI5Mzg= | 562 | Facet by array shouldn't suggest for arrays that are not arrays-of-strings | simonw 9599 | closed | 0 | 2 | 2019-07-19T20:51:29Z | 2019-11-01T19:42:10Z | 2019-11-01T19:37:55Z | OWNER | It's triggering for arrays that look like this at the moment: ```json [ { "type": "HKWorkoutEventTypeSegment", "date": "2019-05-21 09:43:50 -0700", "duration": "12.2780519704024", "durationUnit": "min" }, { "type": "HKWorkoutEventTypeSegment", "date": "2019-05-21 09:43:50 -0700", "duration": "19.467273102204", "durationUnit": "min" } ] ``` | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/562/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
470691622 | MDU6SXNzdWU0NzA2OTE2MjI= | 5 | Add progress bar | simonw 9599 | closed | 0 | 2 | 2019-07-20T16:29:07Z | 2019-07-22T03:30:13Z | 2019-07-22T02:49:22Z | MEMBER | Showing a progress bar would be nice, using Click. The easiest way to do this would probably be to hook it up to the length of the compressed content, and update it as this code pushes more XML bytes through the parser: https://github.com/dogsheep/healthkit-to-sqlite/blob/d64299765064501f4efdd9a0b21dbdba9ec4287f/healthkit_to_sqlite/utils.py#L6-L10 | healthkit-to-sqlite 197882382 | issue | {"url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
471628483 | MDU6SXNzdWU0NzE2Mjg0ODM= | 44 | Utilities for building lookup tables | simonw 9599 | closed | 0 | 2 | 2019-07-23T10:59:58Z | 2019-07-23T13:07:01Z | 2019-07-23T13:07:01Z | OWNER | While building https://github.com/dogsheep/healthkit-to-sqlite I found a need for a neat mechanism for easily building lookup tables - tables where each unique value in a column is replaced by a foreign key to a separate table. csvs-to-sqlite currently creates those with its "extract" mechanism - but that's written as custom code against Pandas. I'd like to eventually replace Pandas with sqlite-utils there. See also #42 | sqlite-utils 140912432 | issue | {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/44/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
471818939 | MDU6SXNzdWU0NzE4MTg5Mzk= | 48 | Jupyter notebook demo of the library, launchable on Binder | simonw 9599 | closed | 0 | 2 | 2019-07-23T17:05:05Z | 2022-01-26T02:08:46Z | 2022-01-26T02:08:39Z | OWNER | sqlite-utils 140912432 | issue | {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/48/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||||
488835586 | MDU6SXNzdWU0ODg4MzU1ODY= | 4 | Command for importing data from a Twitter Export file | simonw 9599 | closed | 0 | 2 | 2019-09-03T21:34:13Z | 2019-10-11T06:45:02Z | 2019-10-11T06:45:02Z | MEMBER | Twitter lets you export all of your data as an archive file: https://twitter.com/settings/your_twitter_data A command for importing this data into SQLite would be extremely useful. $ twitter-to-sqlite import twitter.db path-to-archive.zip | twitter-to-sqlite 206156866 | issue | {"url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
491219910 | MDU6SXNzdWU0OTEyMTk5MTA= | 61 | importing CSV to SQLite as library | witeshadow 17739 | closed | 0 | 2 | 2019-09-09T17:12:40Z | 2019-11-04T16:25:01Z | 2019-11-04T16:25:01Z | NONE | CSV can be imported to SQLite when using the CLI, but I don't see documentation for doing this when using it as a library. | sqlite-utils 140912432 | issue | {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/61/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
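For reference, the library-level equivalent of the CLI import is a couple of lines combining `csv.DictReader` with `insert_all()`. A minimal sketch (file and table names are illustrative):

```python
import csv
import sqlite_utils

db = sqlite_utils.Database("data.db")
with open("data.csv", newline="") as fp:
    # Stream rows straight from the CSV into a new (or existing) table.
    db["data"].insert_all(csv.DictReader(fp))
```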
493670426 | MDU6SXNzdWU0OTM2NzA0MjY= | 3 | Command to fetch all repos belonging to a user or organization | simonw 9599 | closed | 0 | 2 | 2019-09-14T21:54:21Z | 2019-09-17T00:17:53Z | 2019-09-17T00:17:53Z | MEMBER | How about this: $ github-to-sqlite repos simonw | github-to-sqlite 207052882 | issue | {"url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
493671014 | MDU6SXNzdWU0OTM2NzEwMTQ= | 5 | Add "incomplete" boolean to users table for incomplete profiles | simonw 9599 | closed | 0 | 2 | 2019-09-14T22:01:50Z | 2020-03-23T19:23:31Z | 2020-03-23T19:23:30Z | MEMBER | User profiles that are fetched from e.g. stargazers (#4) are incomplete - they have a login but they don't have a name, company etc. Add an `incomplete` boolean flag to the `users` table to record this. Then later I can add a `backfill-users` command which loops through and fetches missing data for those incomplete profiles. | github-to-sqlite 207052882 | issue | {"url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
494685791 | MDU6SXNzdWU0OTQ2ODU3OTE= | 574 | Improve usage description of --host option | terrycojones 132978 | closed | 0 | 2 | 2019-09-17T15:12:12Z | 2019-11-01T21:58:17Z | 2019-11-01T21:57:54Z | NONE | It would be nice if the `--host` option had a clearer description. I tried to get datasette running on an AWS instance and it took a while to realize it was only listening on localhost. So I wanted to make it listen on a non-localhost interface and tried giving a couple of values to `--host` (a host name, then an interface name), but none of them worked. In the end I read the source to see that the option is passed to `uvicorn` and looked at the uvicorn docs, which also didn't help. Then I searched the web for "example running datasette on a host" which led me to https://github.com/simonw/datasette/issues/514 where I saw someone using `-h 0.0.0.0`. I tried that and it works. That usage could be mentioned somewhere, and might save someone else some time. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/574/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
503190241 | MDU6SXNzdWU1MDMxOTAyNDE= | 584 | Codec error in some CSV exports | simonw 9599 | closed | 0 | 2 | 2019-10-07T01:15:34Z | 2021-06-17T18:13:20Z | 2019-10-18T05:23:16Z | OWNER | Got this exploring my Swarm checkins:  `/swarm/stickers.csv?stickerType=messageOnly&_size=max` | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/584/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
503218205 | MDU6SXNzdWU1MDMyMTgyMDU= | 586 | Enable browser caching for plugin statics with datasette-auth | simonw 9599 | closed | 0 | 2 | 2019-10-07T03:47:14Z | 2019-10-07T15:46:04Z | 2019-10-07T15:46:03Z | OWNER | An authenticated Datasette I run is seeing delays on every page load. On looking at the network inspector it turns out it's because datasette-vega is nearly 1MB and a `cache-control: private` header is preventing it from being cached! <img width="1071" alt="github__repos__32_rows_where_where_private___1" src="https://user-images.githubusercontent.com/9599/66284064-f96f8400-e87a-11e9-875f-5a6c4a8c4f41.png"> This may well turn out to be a bug in `datasette-auth-github` but it's still worth tracking here because caching of static assets from plugins is very important. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/586/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
504238461 | MDU6SXNzdWU1MDQyMzg0NjE= | 6 | sqlite3.OperationalError: table users has no column named bio | dazzag24 1055831 | closed | 0 | 2 | 2019-10-08T19:39:52Z | 2019-10-13T05:31:28Z | 2019-10-13T05:30:19Z | NONE | ``` $ github-to-sqlite repos github.db $ github-to-sqlite starred github.db dazzag24 Traceback (most recent call last): File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/bin/github-to-sqlite", line 10, in <module> sys.exit(cli()) File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 764, in __call__ return self.main(*args, **kwargs) File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 717, in main rv = self.invoke(ctx) File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 555, in invoke return callback(*args, **kwargs) File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/cli.py", line 106, in starred utils.save_stars(db, user, stars) File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py", line 177, in save_stars user_id = save_user(db, user) File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py", line 61, in save_user return db["users"].upsert(to_save, pk="id").last_pk File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py", line 1067, in upsert extracts=extracts, File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py", line 916, in insert extracts=extracts, File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py", line 1024, in insert_all result = self.db.conn.execute(sql, values)… | github-to-sqlite 207052882 | issue | {"url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
505674949 | MDU6SXNzdWU1MDU2NzQ5NDk= | 17 | import command should empty all archive-* tables first | simonw 9599 | closed | 0 | 2 | 2019-10-11T06:58:43Z | 2019-10-11T15:40:08Z | 2019-10-11T15:40:08Z | MEMBER | Can have a CLI option for NOT doing that. | twitter-to-sqlite 206156866 | issue | {"url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
508190730 | MDU6SXNzdWU1MDgxOTA3MzA= | 23 | Extremely simple migration system | simonw 9599 | closed | 0 | 2 | 2019-10-17T02:13:57Z | 2019-10-17T16:57:17Z | 2019-10-17T16:57:17Z | MEMBER | Needed for #12. This is going to be an incredibly simple version of the Django migration system. * A `migrations` table, keeping track of which migrations were applied (and when) * A `migrate()` function which applies any pending migrations * A `MIGRATIONS` constant which is a list of functions to be applied The function names will be detected and used as the names of the migrations. Every time you run the CLI tool it will call the `migrate()` function before doing anything else. Needs to take into account that there might be no tables at all. As such, migration functions should sanity check that the tables they are going to work on actually exist. | twitter-to-sqlite 206156866 | issue | {"url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/23/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
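A minimal sketch of the scheme described above, using only the standard library (function, column, and table names are illustrative, not the actual twitter-to-sqlite code):

```python
import datetime
import sqlite3

def add_since_id_column(conn):
    # Sanity check first - the table may not exist on a fresh database.
    exists = conn.execute(
        "select 1 from sqlite_master where type = 'table' and name = 'tweets'"
    ).fetchone()
    if exists:
        conn.execute("alter table tweets add column since_id integer")

MIGRATIONS = [add_since_id_column]

def migrate(conn):
    conn.execute(
        "create table if not exists migrations "
        "(name text primary key, applied_at text)"
    )
    applied = {row[0] for row in conn.execute("select name from migrations")}
    for fn in MIGRATIONS:
        if fn.__name__ not in applied:
            fn(conn)
            conn.execute(
                "insert into migrations (name, applied_at) values (?, ?)",
                (fn.__name__, datetime.datetime.utcnow().isoformat()),
            )
    conn.commit()
```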
508578780 | MDU6SXNzdWU1MDg1Nzg3ODA= | 25 | Ensure migrations don't accidentally create foreign key twice | simonw 9599 | closed | 0 | 2 | 2019-10-17T16:08:50Z | 2019-10-17T16:56:47Z | 2019-10-17T16:56:47Z | MEMBER | Is it possible for these lines to run against a database table that already has these foreign keys? https://github.com/dogsheep/twitter-to-sqlite/blob/c9295233f219c446fa2085cace987067488a31b9/twitter_to_sqlite/migrations.py#L21-L22 | twitter-to-sqlite 206156866 | issue | {"url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
516748849 | MDU6SXNzdWU1MTY3NDg4NDk= | 612 | CSV export is broken for tables with null foreign keys | simonw 9599 | closed | 0 | 2 | 2019-11-02T22:52:47Z | 2021-06-17T18:13:20Z | 2019-11-02T23:12:53Z | OWNER | Following on from #406 - this CSV export appears to be broken: https://14da705.datasette.io/fixtures/foreign_key_references.csv?_labels=on&_size=max ```csv pk,foreign_key_with_label,foreign_key_with_label_label,foreign_key_with_no_label,foreign_key_with_no_label_label 1,1,hello,1,1 2,, ``` That second row should have 5 values, but it only has 4. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/612/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
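The underlying fix is to always emit one cell per header, padding out the label cells that go missing when a foreign key is null. A tiny illustration of that padding (a hypothetical helper, not the actual Datasette code):

```python
def pad_row(row, width):
    # Emit one cell per CSV header, padding missing label cells
    # (caused by null foreign keys) with empty strings.
    return row + [""] * (width - len(row))

print(pad_row(["2", "", "", ""], 5))  # ['2', '', '', '', '']
```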
518506242 | MDU6SXNzdWU1MTg1MDYyNDI= | 616 | Datasette FTS detection bug | null92 49656826 | closed | 0 | 2 | 2019-11-06T14:25:47Z | 2019-11-08T15:31:33Z | 2019-11-08T02:06:56Z | NONE | I'm having trouble with datasette. I deployed EXACTLY the same project on two different apps on Heroku. Both have databases (not all of them) with FTS activated, but only one detects it and works fine. You can take a look here: With search: http://teste-templates.herokuapp.com/amazonia_protege/car Without search: http://bases.vortex.media/amazonia_protege/car | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/616/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
518739697 | MDU6SXNzdWU1MTg3Mzk2OTc= | 30 | `followers` fails because `transform_user` is called twice | jacobian 21148 | closed | 0 | 2 | 2019-11-06T20:44:52Z | 2019-11-09T20:15:28Z | 2019-11-09T19:55:52Z | CONTRIBUTOR | Trying to run `twitter-to-sqlite followers` errors out: ``` Traceback (most recent call last): File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite", line 10, in <module> sys.exit(cli()) File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 764, in __call__ return self.main(*args, **kwargs) File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 717, in main rv = self.invoke(ctx) File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 555, in invoke return callback(*args, **kwargs) File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py", line 130, in followers go(bar.update) File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py", line 116, in go utils.save_users(db, [profile]) File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py", line 302, in save_users transform_user(user) File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py", line 181, in transform_user user["created_at"] = parser.parse(user["created_at"]) File "/Users/jacob/Librar… | twitter-to-sqlite 206156866 | issue | {"url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
520507306 | MDU6SXNzdWU1MjA1MDczMDY= | 618 | Mechanism for seeing indexes on a specific table | simonw 9599 | closed | 0 | 2 | 2019-11-09T20:10:41Z | 2019-11-10T01:40:05Z | 2019-11-10T01:30:25Z | OWNER | The only way to see the indexes that apply to a specific table at the moment is to run the following SQL manually: ```sql select * from sqlite_master where type = 'index' and tbl_name=? ``` For example: <img width="964" alt="f__select___from_sqlite_master_where_tbl_name____following__and_type__index_" src="https://user-images.githubusercontent.com/9599/68534478-d4db5180-02e9-11ea-9b4d-44dab2c314c9.png"> It would be good if this list of indexes was displayed in a neater way on the table page. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/618/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
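Running that query from Python is straightforward; a minimal sketch (database file and table name are hypothetical):

```python
import sqlite3

conn = sqlite3.connect("fixtures.db")  # hypothetical database file
indexes = conn.execute(
    "select name, sql from sqlite_master where type = 'index' and tbl_name = ?",
    ("facetable",),  # hypothetical table name
).fetchall()
for name, sql in indexes:
    print(name, sql)
```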
520508502 | MDU6SXNzdWU1MjA1MDg1MDI= | 31 | "friends" command (similar to "followers") | simonw 9599 | closed | 0 | 2 | 2019-11-09T20:20:20Z | 2022-09-20T05:05:03Z | 2020-02-07T07:03:28Z | MEMBER | Current list of commands: ``` followers Save followers for specified user (defaults to... followers-ids Populate followers table with IDs of account followers friends-ids Populate followers table with IDs of account friends ``` Obvious omission here is `friends`, which would be powered by `https://api.twitter.com/1.1/friends/list.json`: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-list | twitter-to-sqlite 206156866 | issue | {"url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
521329771 | MDU6SXNzdWU1MjEzMjk3NzE= | 628 | Render jinja2 templates in async mode | simonw 9599 | closed | 0 | 2 | 2019-11-12T05:01:55Z | 2019-11-14T23:28:09Z | 2019-11-14T23:14:24Z | OWNER | I started playing with this in #404 and got good results but it didn't work in Python 3.5. As of #627 I don't support 3.5 any more so this can go ahead. Rendering templates in async mode will mean that template plugins can include async code... which opens the door to custom template functions that execute SQL queries! | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/628/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
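For reference, Jinja2's async mode is switched on with `enable_async=True`, after which templates can call async functions directly and are rendered with `render_async()`. A minimal self-contained sketch:

```python
import asyncio
import jinja2

env = jinja2.Environment(enable_async=True)

async def count_rows():
    await asyncio.sleep(0)  # stand-in for an async SQL query
    return 42

async def main():
    # In async mode Jinja2 automatically awaits coroutines returned
    # by function calls inside template expressions.
    template = env.from_string("Rows: {{ count_rows() }}")
    print(await template.render_async(count_rows=count_rows))

asyncio.run(main())
```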
521995039 | MDU6SXNzdWU1MjE5OTUwMzk= | 632 | Upgrade datasette publish Heroku runtime | simonw 9599 | closed | 0 | 2 | 2019-11-13T06:46:19Z | 2019-11-13T16:44:07Z | 2019-11-13T16:43:23Z | OWNER | ``` Python has released a security update! Please consider upgrading to python-3.6.9 ``` https://devcenter.heroku.com/articles/python-support#supported-runtimes shows 3.8.0 is now supported. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/632/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
522352520 | MDU6SXNzdWU1MjIzNTI1MjA= | 634 | Don't run tests twice when releasing a tag | simonw 9599 | closed | 0 | 2 | 2019-11-13T17:02:42Z | 2020-09-15T20:37:58Z | 2020-09-15T20:37:58Z | OWNER | Shipping a release currently runs the tests twice: https://travis-ci.org/simonw/datasette/builds/611463728 It does a regular test run on Python 3.6/7/8 - then the "Release tagged version" step runs the tests again before publishing to PyPI! This second run is not necessary. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/634/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
534530973 | MDU6SXNzdWU1MzQ1MzA5NzM= | 649 | Reduce table counts on index page with many databases | simonw 9599 | closed | 0 | 2 | 2019-12-08T11:56:37Z | 2020-02-29T01:08:29Z | 2020-02-29T01:08:29Z | OWNER | Since #467 the index page has attempted to run optimistic table counts. My personal Dogsheep has enough connected databases and tables that the page can still take way too long to load - sometimes more than twenty seconds. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/649/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
557825032 | MDU6SXNzdWU1NTc4MjUwMzI= | 77 | Ability to insert data that is transformed by a SQL function | simonw 9599 | closed | 0 | 2 | 2020-01-30T23:45:55Z | 2022-02-05T00:04:25Z | 2020-01-31T00:24:32Z | OWNER | I want to be able to run the equivalent of this SQL insert: ```python # Convert to "Well Known Text" format wkt = shape(geojson['geometry']).wkt # Insert and commit the record conn.execute("INSERT INTO places (id, name, geom) VALUES(null, ?, GeomFromText(?, 4326))", ( "Wales", wkt )) conn.commit() ``` From the Datasette SpatiaLite docs: https://datasette.readthedocs.io/en/stable/spatialite.html To do this, I need a way of telling `sqlite-utils` that a specific column should be wrapped in `GeomFromText(?, 4326)`. | sqlite-utils 140912432 | issue | {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/77/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
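This issue led to the `conversions=` parameter in sqlite-utils, which wraps the bound value in the given SQL expression at insert time. A minimal sketch (running it requires the SpatiaLite extension to be loaded, since `GeomFromText()` comes from SpatiaLite):

```python
import sqlite_utils

db = sqlite_utils.Database("places.db")
# The WKT string is bound as the ? parameter inside the wrapping
# SQL expression, exactly like the raw INSERT shown above.
db["places"].insert(
    {"name": "Wales", "geom": "POINT(-3.76 52.33)"},
    conversions={"geom": "GeomFromText(?, 4326)"},
)
```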
562787785 | MDU6SXNzdWU1NjI3ODc3ODU= | 667 | Allow injecting configuration data from plugins | xrotwang 870184 | closed | 0 | 2 | 2020-02-10T19:50:15Z | 2020-02-12T16:18:22Z | 2020-02-12T09:21:22Z | NONE | I'm trying to customize datasette as explorer for [CLDF](https://cldf.clld.org) datasets. Such datasets can be converted automatically to SQLite, which then can be fed to datasette, (e.g. https://github.com/cldf/cookbook/blob/master/recipes/datasette/README.md). Part of this customization would be support for the "special" data types described in the [CLDF ontology](https://cldf.clld.org/v1.0/terms.rdf). But while rendering of the values can be customized via the `render_cell` hook in a plugin, e.g. custom labels for foreign keys must be specified through the config file. It would be nice to be able to programmatically inject config data from plugins as well. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/667/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
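For reference, the `render_cell` hook mentioned above looks like this in a plugin; a minimal sketch (the column name and rendering are illustrative):

```python
from datasette import hookimpl

@hookimpl
def render_cell(value, column):
    # Return a replacement rendering for matching cells, or None to
    # fall through to Datasette's default rendering.
    if column == "cldf_type":  # hypothetical column name
        return "CLDF: {}".format(value)
    return None
```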
563347679 | MDU6SXNzdWU1NjMzNDc2Nzk= | 668 | Make it easier to load SpatiaLite | simonw 9599 | closed | 0 | 2 | 2020-02-11T17:03:43Z | 2022-01-20T21:29:41Z | 2021-01-04T20:18:39Z | OWNER | ``` $ datasette spatial.db Serve! files=('spatial.db',) (immutables=()) on port 8001 ERROR: conn=<sqlite3.Connection object at 0x11e388f10>, sql = 'PRAGMA table_info(SpatialIndex);', params = None: no such module: VirtualSpatialIndex Usage: datasette serve [OPTIONS] [FILES]... Error: It looks like you're trying to load a SpatiaLite database without first loading the SpatiaLite module. Read more: https://datasette.readthedocs.io/en/latest/spatialite.html ``` This error message could sniff around in the common locations for the SpatiaLite module and output the CLI command you should use to enable it: ``` datasette spatial.db --load-extension=/usr/local/lib/mod_spatialite.dylib ``` Even better: if Datasette had a `--spatialite` option which automatically loads the extension from common locations, if it can find it. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/668/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
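A minimal sketch of the location-sniffing idea behind a `--spatialite` option (the listed paths are common install locations, not an exhaustive or authoritative list):

```python
from pathlib import Path

# Common SpatiaLite module locations on Linux and macOS (illustrative):
SPATIALITE_PATHS = (
    "/usr/lib/x86_64-linux-gnu/mod_spatialite.so",
    "/usr/local/lib/mod_spatialite.dylib",
)

def find_spatialite():
    # Return the first module found, or None so the caller can show
    # the helpful error message described above.
    for path in SPATIALITE_PATHS:
        if Path(path).exists():
            return path
    return None
```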
565837965 | MDU6SXNzdWU1NjU4Mzc5NjU= | 87 | Should detect collections.OrderedDict as a regular dictionary | simonw 9599 | closed | 0 | 2 | 2020-02-16T02:06:34Z | 2020-02-16T02:20:59Z | 2020-02-16T02:20:59Z | OWNER | ``` File "...python3.7/site-packages/sqlite_utils/db.py", line 292, in create_table column_type=COLUMN_TYPE_MAPPING[column_type], KeyError: <class 'collections.OrderedDict'> ``` | sqlite-utils 140912432 | issue | {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/87/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
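One way to fix this is to resolve the column type with `isinstance()` checks rather than an exact class lookup, so dict subclasses like `OrderedDict` behave like `dict`. A minimal sketch (simplified mapping, not the actual sqlite-utils code):

```python
from collections import OrderedDict

# Simplified mapping - the real one in sqlite_utils.db is larger.
COLUMN_TYPE_MAPPING = {str: "TEXT", int: "INTEGER", float: "FLOAT", dict: "TEXT"}

def column_type_for(value):
    # isinstance() matches subclasses, so OrderedDict resolves via dict.
    for klass, column_type in COLUMN_TYPE_MAPPING.items():
        if isinstance(value, klass):
            return column_type
    raise KeyError(type(value))

print(column_type_for(OrderedDict(a=1)))  # TEXT, rather than a KeyError
```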
569268612 | MDU6SXNzdWU1NjkyNjg2MTI= | 679 | Release 0.36 | simonw 9599 | closed | 0 | 2 | 2020-02-22T02:41:01Z | 2020-02-22T03:52:13Z | 2020-02-22T03:52:13Z | OWNER | I think we have enough changes to warrant a release - and I want to take advantage of the changes to the `prepare_connection()` plugin hook in #678 Changes since 0.35 so far: https://github.com/simonw/datasette/compare/0.35...be2265b0e811d0ac2875c2f748125c17b0f9289e - [x] Update ecosystem page - [x] Write release notes - [x] Ship the release | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/679/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
576722115 | MDU6SXNzdWU1NzY3MjIxMTU= | 696 | Single failing unit test when run inside the Docker image | simonw 9599 | closed | 0 | Datasette 1.0 3268330 | 2 | 2020-03-06T06:16:36Z | 2021-03-29T17:04:19Z | 2021-03-07T07:41:18Z | OWNER | ``` docker run -it -v `pwd`:/mnt datasetteproject/datasette:latest /bin/bash root@0e1928cfdf79:/# cd /mnt root@0e1928cfdf79:/mnt# pip install -e .[test] root@0e1928cfdf79:/mnt# pytest ``` I get one failure! It was for `test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]` ``` def test_searchable(app_client, path, expected_rows): response = app_client.get(path) > assert expected_rows == response.json["rows"] E AssertionError: assert [[1, 'barry c...sel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E + [] E - [[1, 'barry cat', 'terry dog', 'panther'], E - [2, 'terry dog', 'sara weasel', 'puma']] ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/695#issuecomment-595614469_ | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/696/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
578883725 | MDU6SXNzdWU1Nzg4ODM3MjU= | 17 | Command for importing commits | simonw 9599 | closed | 0 | 2 | 2020-03-10T21:55:12Z | 2020-03-11T02:47:37Z | 2020-03-11T02:47:37Z | MEMBER | Using this API: https://api.github.com/repos/dogsheep/github-to-sqlite/commits | github-to-sqlite 207052882 | issue | {"url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/17/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
585266763 | MDU6SXNzdWU1ODUyNjY3NjM= | 34 | IndexError running user-timeline command | simonw 9599 | closed | 0 | 2 | 2020-03-20T18:54:08Z | 2020-03-20T19:20:52Z | 2020-03-20T19:20:37Z | MEMBER | ``` $ twitter-to-sqlite user-timeline data.db --screen_name Allen_Joines Traceback (most recent call last): File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite", line 11, in <module> load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 764, in __call__ return self.main(*args, **kwargs) File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 717, in main rv = self.invoke(ctx) File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 555, in invoke return callback(*args, **kwargs) File "/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py", line 256, in user_timeline utils.save_tweets(db, chunk) File "/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py", line 289, in save_tweets db["users"].upsert(user, pk="id", alter=True) File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py", line 1128, in upsert conversions=conversions, File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py", line 1157, in upsert_all upsert=True, File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py", line 1096, in insert_all row = lis… | twitter-to-sqlite 206156866 | issue | {"url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/34/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
585850715 | MDU6SXNzdWU1ODU4NTA3MTU= | 19 | Enable full-text search for more stuff (like commits, issues and issue_comments) | simonw 9599 | closed | 0 | 1.0 5225818 | 2 | 2020-03-23T00:19:56Z | 2020-03-23T19:06:39Z | 2020-03-23T19:06:39Z | MEMBER | Currently FTS is only enabled for repos and releases. | github-to-sqlite 207052882 | issue | {"url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/19/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
586561727 | MDU6SXNzdWU1ODY1NjE3Mjc= | 21 | Turn GitHub API errors into exceptions | simonw 9599 | closed | 0 | 1.0 5225818 | 2 | 2020-03-23T22:37:24Z | 2020-03-23T23:48:23Z | 2020-03-23T23:48:22Z | MEMBER | This would have really helped in debugging the mess in #13. Running with this `auth.json` is a useful demo: ```json {"github_personal_token": ""} ``` | github-to-sqlite 207052882 | issue | {"url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/21/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
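GitHub's API reports errors as JSON objects with a `message` key, so the translation into exceptions can be a thin wrapper around each request. A minimal sketch (not the actual github-to-sqlite implementation):

```python
import requests

class GitHubError(Exception):
    pass

def fetch(url, token=None):
    headers = {"Authorization": "token {}".format(token)} if token else {}
    response = requests.get(url, headers=headers)
    data = response.json()
    # Error responses look like {"message": "Bad credentials"} - raise
    # instead of silently saving the error payload to the database.
    if not response.ok and isinstance(data, dict) and "message" in data:
        raise GitHubError(data["message"])
    return data
```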
587322443 | MDU6SXNzdWU1ODczMjI0NDM= | 710 | Remove Zeit Now v1 support | simonw 9599 | closed | 0 | 2 | 2020-03-24T22:39:49Z | 2020-04-04T23:05:12Z | 2020-04-04T23:05:12Z | OWNER | It will remain supported as a plugin but since no-one can sign up for Docker hosting any more (for over a year now) there's no point including it in Datasette core. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/710/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
587398703 | MDU6SXNzdWU1ODczOTg3MDM= | 711 | Release notes for Datasette 0.39 | simonw 9599 | closed | 0 | Datasette 0.39 5234079 | 2 | 2020-03-25T02:31:13Z | 2020-03-25T04:06:55Z | 2020-03-25T04:06:55Z | OWNER | Then I can ship it. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/711/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed |
Advanced export
JSON shape: default, array, newline-delimited, object
```sql
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT,
   [active_lock_reason] TEXT,
   [performed_via_github_app] TEXT,
   [reactions] TEXT,
   [draft] INTEGER,
   [state_reason] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
```
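The indexes at the end of the schema are what keep per-repo and per-user filters on this table fast. A minimal sketch of querying a local copy (the `github.db` filename is hypothetical; the repo id is the `datasette 107914493` value that appears in the rows above):

```python
import sqlite3

conn = sqlite3.connect("github.db")  # hypothetical local copy of this database
# idx_issues_repo makes per-repo filters like this one cheap:
for number, title in conn.execute(
    "select number, title from issues where repo = ? and state = 'closed'",
    (107914493,),
):
    print(number, title)
```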