issue_comments


30 rows where reactions = "{"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}"
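For reference, the query behind this filtered view is roughly the following (a sketch based on the table schema shown at the bottom of the page; the exact SQL Datasette generates, including parameter binding and its row limit, may differ):

```sql
select *
from issue_comments
where reactions = '{"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}'
order by id;
```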

id html_url issue_url node_id user created_at updated_at author_association body reactions issue performed_via_github_app
344810525 https://github.com/simonw/datasette/issues/46#issuecomment-344810525 https://api.github.com/repos/simonw/datasette/issues/46 MDEyOklzc3VlQ29tbWVudDM0NDgxMDUyNQ== ingenieroariel 54999 2017-11-16T04:11:25Z 2017-11-16T04:11:25Z CONTRIBUTOR @simonw On the spatialite support, here is some info to make it work and a screenshot: [screenshot: https://user-images.githubusercontent.com/54999/32873420-f8a6d5a0-ca59-11e7-8a73-7d58d467e413.png] I used the following Dockerfile: ``` FROM prolocutor/python3-sqlite-ext:3.5.1-spatialite as build RUN mkdir /code ADD . /code/ RUN pip install /code/ EXPOSE 8001 CMD ["datasette", "serve", "/code/ne.sqlite", "--host", "0.0.0.0"] ``` and added this to `prepare_connection`: ``` conn.enable_load_extension(True) conn.execute("SELECT load_extension('/usr/local/lib/mod_spatialite.so')") ``` {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468
379595253 https://github.com/simonw/datasette/issues/185#issuecomment-379595253 https://api.github.com/repos/simonw/datasette/issues/185 MDEyOklzc3VlQ29tbWVudDM3OTU5NTI1Mw== simonw 9599 2018-04-09T00:24:10Z 2018-04-09T00:24:10Z OWNER @carlmjohnson in case you aren't following along with #189 I've shipped the first working prototype of sort-by-column - you can try it out here: https://datasette-issue-189-demo-2.now.sh/salaries-7859114-7859114/2017+Maryland+state+salaries?_search=university&_sort_desc=annual_salary {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Metadata should be a nested arbitrary KV store 299760684  
400903871 https://github.com/simonw/datasette/issues/57#issuecomment-400903871 https://api.github.com/repos/simonw/datasette/issues/57 MDEyOklzc3VlQ29tbWVudDQwMDkwMzg3MQ== simonw 9599 2018-06-28T04:01:38Z 2018-06-28T04:01:38Z OWNER Shipped to Docker Hub: https://hub.docker.com/r/datasetteproject/datasette/ I did this manually the first time. I'll set Travis up to do this automatically in #329 {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Ship a Docker image of the whole thing 273127694  
403672561 https://github.com/simonw/datasette/issues/334#issuecomment-403672561 https://api.github.com/repos/simonw/datasette/issues/334 MDEyOklzc3VlQ29tbWVudDQwMzY3MjU2MQ== simonw 9599 2018-07-10T01:45:28Z 2018-07-10T01:45:28Z OWNER Tested with `datasette publish heroku fixtures.db --extra-options="--config sql_time_limit_ms:4000"` https://blooming-anchorage-31561.herokuapp.com/-/config {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} extra_options not passed to heroku publisher 339095976  
453262703 https://github.com/simonw/datasette/issues/271#issuecomment-453262703 https://api.github.com/repos/simonw/datasette/issues/271 MDEyOklzc3VlQ29tbWVudDQ1MzI2MjcwMw== simonw 9599 2019-01-10T21:35:18Z 2019-01-10T21:35:18Z OWNER It turns out this was much easier to support than I expected: https://github.com/simonw/datasette/commit/eac08f0dfc61a99e8887442fc247656d419c76f8 {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Mechanism for automatically picking up changes when on-disk .db file changes 324162476  
489104146 https://github.com/simonw/datasette/pull/434#issuecomment-489104146 https://api.github.com/repos/simonw/datasette/issues/434 MDEyOklzc3VlQ29tbWVudDQ4OTEwNDE0Ng== simonw 9599 2019-05-03T13:56:45Z 2019-05-03T13:56:45Z OWNER This is amazing - works an absolute treat. Thank you very much! {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} "datasette publish cloudrun" command to publish to Google Cloud Run 434321685  
509066513 https://github.com/simonw/datasette/issues/478#issuecomment-509066513 https://api.github.com/repos/simonw/datasette/issues/478 MDEyOklzc3VlQ29tbWVudDUwOTA2NjUxMw== simonw 9599 2019-07-08T03:30:41Z 2019-07-08T03:30:41Z OWNER This worked as intended - thanks @glasnt! https://travis-ci.org/simonw/datasette/builds/555580006 [screenshot: https://user-images.githubusercontent.com/9599/60781155-0b178300-a0f6-11e9-91c8-219334a1c213.png] The release has been deployed to PyPI even while the Docker image is still being built. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Make it so Docker build doesn't delay PyPI release 445868234
582211745 https://github.com/simonw/datasette/pull/653#issuecomment-582211745 https://api.github.com/repos/simonw/datasette/issues/653 MDEyOklzc3VlQ29tbWVudDU4MjIxMTc0NQ== simonw 9599 2020-02-05T02:28:05Z 2020-02-05T02:28:05Z OWNER This is shipped in Datasette 0.35. Here's a demo of it working: https://latest.datasette.io/fixtures?sql=--+this+is+a+comment%0D%0Aselect+*+from+%5B123_starts_with_digits%5D Compare with https://v0-34.datasette.io/fixtures?sql=--+this+is+a+comment%0D%0Aselect+*+from+%5B123_starts_with_digits%5D which returned an error. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} allow leading comments in SQL input field 541331755  
620095649 https://github.com/simonw/datasette/issues/731#issuecomment-620095649 https://api.github.com/repos/simonw/datasette/issues/731 MDEyOklzc3VlQ29tbWVudDYyMDA5NTY0OQ== simonw 9599 2020-04-27T16:32:44Z 2020-04-27T16:32:44Z OWNER Documentation: https://datasette.readthedocs.io/en/latest/config.html#configuration-directory-mode {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Option to automatically configure based on directory layout 605110015  
623027889 https://github.com/dogsheep/github-to-sqlite/issues/38#issuecomment-623027889 https://api.github.com/repos/dogsheep/github-to-sqlite/issues/38 MDEyOklzc3VlQ29tbWVudDYyMzAyNzg4OQ== simonw 9599 2020-05-02T23:15:11Z 2020-05-02T23:15:11Z MEMBER This is one of the use-cases for the `repos_starred` view: it allows you to easily run this kind of query without having to construct the SQL by hand. Here's a demo: https://github-to-sqlite.dogsheep.net/github/repos_starred?name__contains=twitter My philosophy here is to keep the raw tables (like `stars`) as normalized as possible, then use SQL views which expose the data in a form that's easier to query. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} [Feature Request] Support Repo Name in Search 🥺 611284481
626431484 https://github.com/simonw/sqlite-utils/issues/110#issuecomment-626431484 https://api.github.com/repos/simonw/sqlite-utils/issues/110 MDEyOklzc3VlQ29tbWVudDYyNjQzMTQ4NA== simonw 9599 2020-05-11T01:58:20Z 2020-05-11T01:58:20Z OWNER Released in 2.9 https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v2-9 {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Support decimal.Decimal type 613755043  
735283033 https://github.com/simonw/datasette/pull/1112#issuecomment-735283033 https://api.github.com/repos/simonw/datasette/issues/1112 MDEyOklzc3VlQ29tbWVudDczNTI4MzAzMw== simonw 9599 2020-11-28T19:53:36Z 2020-11-28T19:53:36Z OWNER Thanks! {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Fix --metadata doc usage 752749485  
742024588 https://github.com/simonw/datasette/issues/1134#issuecomment-742024588 https://api.github.com/repos/simonw/datasette/issues/1134 MDEyOklzc3VlQ29tbWVudDc0MjAyNDU4OA== simonw 9599 2020-12-09T20:19:59Z 2020-12-09T20:20:33Z OWNER https://byraadsarkivet.aarhus.dk/db/cases?_searchmode=raw&_search=sundhedsfrem%2A is an absolutely beautiful example of a themed Datasette! Very excited to show this to people. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} "_searchmode=raw" throws an index out of range error when combined with "_search_COLUMN" 760312579  
748562330 https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-748562330 https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31 MDEyOklzc3VlQ29tbWVudDc0ODU2MjMzMA== RhetTbull 41546558 2020-12-20T04:45:08Z 2020-12-20T04:45:08Z CONTRIBUTOR Fixes the issue mentioned here: https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-748436115 {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Update for Big Sur 771511344  
753219407 https://github.com/simonw/datasette/issues/983#issuecomment-753219407 https://api.github.com/repos/simonw/datasette/issues/983 MDEyOklzc3VlQ29tbWVudDc1MzIxOTQwNw== simonw 9599 2020-12-31T22:38:45Z 2020-12-31T22:39:10Z OWNER You'll be able to add JavaScript plugins using a bunch of different mechanisms: - In a custom template, dropping the code in to a `<script>` block - A bookmarklet that injects an extra script (I'm really excited to try this out) - A separate `script.js` file that's loaded into Datasette using the `"extra_js_urls"` metadata option, documented here: https://docs.datasette.io/en/stable/custom_templates.html#custom-css-and-javascript - A plugin you can install, like `datasette-vega` or `datasette-cluster-map` - since plugins can bundle their own script files that then get loaded on pages via this hook: https://docs.datasette.io/en/stable/plugin_hooks.html#extra-js-urls-template-database-table-columns-view-name-request-datasette {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} JavaScript plugin hooks mechanism similar to pluggy 712260429  
756453945 https://github.com/simonw/datasette/issues/1091#issuecomment-756453945 https://api.github.com/repos/simonw/datasette/issues/1091 MDEyOklzc3VlQ29tbWVudDc1NjQ1Mzk0NQ== simonw 9599 2021-01-07T23:42:50Z 2021-01-07T23:42:50Z OWNER @henry501 it looks like you spotted a bug in the documentation - I just addressed that, the fix is now live here: https://docs.datasette.io/en/latest/deploying.html#running-datasette-behind-a-proxy {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} .json and .csv exports fail to apply base_url 742011049  
778854808 https://github.com/simonw/sqlite-utils/issues/227#issuecomment-778854808 https://api.github.com/repos/simonw/sqlite-utils/issues/227 MDEyOklzc3VlQ29tbWVudDc3ODg1NDgwOA== simonw 9599 2021-02-14T22:46:54Z 2021-02-14T22:46:54Z OWNER Fix is released in 3.5. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Error reading csv files with large column data 807174161  
829885904 https://github.com/simonw/datasette/issues/1310#issuecomment-829885904 https://api.github.com/repos/simonw/datasette/issues/1310 MDEyOklzc3VlQ29tbWVudDgyOTg4NTkwNA== ColinMaudry 3747136 2021-04-30T06:58:46Z 2021-04-30T07:26:11Z NONE I made it work with openpyxl. I'm not sure all the code under `@hookimpl` is necessary... but it works :) ```python from datasette import hookimpl from datasette.utils.asgi import Response from openpyxl import Workbook from openpyxl.writer.excel import save_virtual_workbook from openpyxl.cell import WriteOnlyCell from openpyxl.styles import Alignment, Font, PatternFill from tempfile import NamedTemporaryFile def render_spreadsheet(rows): wb = Workbook(write_only=True) ws = wb.create_sheet() ws = wb.active ws.title = "decp" columns = rows[0].keys() headers = [] for col in columns : c = WriteOnlyCell(ws, col) c.fill = PatternFill("solid", fgColor="DDEFFF") headers.append(c) ws.append(headers) for row in rows: wsRow = [] for col in columns: c = WriteOnlyCell(ws, row[col]) if col == "objet" : c.alignment = Alignment(wrapText = True) wsRow.append(c) ws.append(wsRow) with NamedTemporaryFile() as tmp: wb.save(tmp.name) tmp.seek(0) return Response( tmp.read(), headers={ 'Content-Disposition': 'attachment; filename=decp.xlsx', 'Content-type': 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet' } ) @hookimpl def register_output_renderer(): return {"extension": "xlsx", "render": render_spreadsheet, "can_render": lambda: False} ``` The key part was to find the right function to wrap the spreadsheet object `wb`. `NamedTemporaryFile()` did it! I'll update this issue when the plugin is packaged and ready for broader use. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} I'm creating a plugin to export a spreadsheet file (.ods or .xlsx) 870125126  
847237524 https://github.com/simonw/datasette/issues/1320#issuecomment-847237524 https://api.github.com/repos/simonw/datasette/issues/1320 MDEyOklzc3VlQ29tbWVudDg0NzIzNzUyNA== simonw 9599 2021-05-24T18:15:56Z 2021-05-24T18:15:56Z OWNER Added some new documentation about that here: https://github.com/simonw/datasette/blob/c0a748e5c3f498fa8c139b420d07dd3dea612379/docs/installation.rst#installing-plugins {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Can't use apt-get in Dockerfile when using datasetteproj/datasette as base 884952179  
850057694 https://github.com/simonw/datasette/issues/619#issuecomment-850057694 https://api.github.com/repos/simonw/datasette/issues/619 MDEyOklzc3VlQ29tbWVudDg1MDA1NzY5NA== simonw 9599 2021-05-28T02:03:05Z 2021-05-28T02:03:05Z OWNER I nearly got this working, but I ran into one last problem: the code path for when an error is raised but the user specified `?_shape=array`. I'll open a draft PR with where I've got to so far. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} "Invalid SQL" page should let you edit the SQL 520655983  
852712106 https://github.com/simonw/datasette/issues/1127#issuecomment-852712106 https://api.github.com/repos/simonw/datasette/issues/1127 MDEyOklzc3VlQ29tbWVudDg1MjcxMjEwNg== simonw 9599 2021-06-02T04:28:55Z 2021-06-02T04:28:55Z OWNER This became resizable in #1236. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Make the custom SQL query text box larger or resizable 756818250  
895587441 https://github.com/simonw/sqlite-utils/issues/309#issuecomment-895587441 https://api.github.com/repos/simonw/sqlite-utils/issues/309 IC_kwDOCGYnMM41YZRx simonw 9599 2021-08-09T22:15:45Z 2021-08-09T22:15:45Z OWNER ``` OverflowError: Python int too large to convert to SQLite INTEGER >>> import sys >>> def find_variables(tb, vars): to_find = list(vars) found = {} for var in to_find: if var in tb.tb_frame.f_locals: vars.remove(var) found[var] = tb.tb_frame.f_locals[var] if vars and tb.tb_next: found.update(find_variables(tb.tb_next, vars)) return found ... >>> find_variables(sys.last_traceback, ["sql", "params"]) {'params': [34223049823094832094802398430298048240], 'sql': 'INSERT INTO [row] ([v]) VALUES (?);'} ``` {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} sqlite-utils insert errors should show SQL and parameters, if possible 963897111  
943620649 https://github.com/simonw/datasette/pull/1458#issuecomment-943620649 https://api.github.com/repos/simonw/datasette/issues/1458 IC_kwDOBm6k_c44PoIp simonw 9599 2021-10-14T18:38:58Z 2021-10-14T18:38:58Z OWNER This is a great idea, thanks. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Rework the `--static` documentation a bit 988555009  
967801997 https://github.com/simonw/datasette/issues/1380#issuecomment-967801997 https://api.github.com/repos/simonw/datasette/issues/1380 IC_kwDOBm6k_c45r3yN Segerberg 7094907 2021-11-13T08:05:37Z 2021-11-13T08:09:11Z NONE @glasnt yeah I guess that could be an option. I run datasette on large databases > 75gb and the startup time is a bit slow for me even with -i --inspect-file options. Here's a quick sketch for a plugin that will reload db's in a folder that you set for the plugin in metadata.json. If you request /-reload-db new db's will be added. (You probably want to implement some authentication for this =) ) https://gist.github.com/Segerberg/b96a0e0a5389dce2396497323cda7042 {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Serve all db files in a folder 924748955  
968380387 https://github.com/simonw/sqlite-utils/issues/335#issuecomment-968380387 https://api.github.com/repos/simonw/sqlite-utils/issues/335 IC_kwDOCGYnMM45uE_j simonw 9599 2021-11-14T22:55:56Z 2021-11-14T22:55:56Z OWNER OK, this should fix it. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} sqlite-utils index-foreign-keys fails due to pre-existing index 1042569687  
974765825 https://github.com/simonw/datasette/issues/93#issuecomment-974765825 https://api.github.com/repos/simonw/datasette/issues/93 IC_kwDOBm6k_c46Gb8B simonw 9599 2021-11-21T07:00:21Z 2021-11-21T07:00:21Z OWNER Closing this in favour of Datasette Desktop: https://datasette.io/desktop {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Package as standalone binary 273944952  
1040959312 https://github.com/simonw/sqlite-utils/pull/407#issuecomment-1040959312 https://api.github.com/repos/simonw/sqlite-utils/issues/407 IC_kwDOCGYnMM4-C8dQ simonw 9599 2022-02-16T00:58:32Z 2022-02-16T00:58:32Z OWNER This is honestly one of the most complete PRs I've ever seen for a feature of this size. Thanks so much for this! {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Add SpatiaLite helpers to CLI 1138948786  
1065245831 https://github.com/simonw/sqlite-utils/issues/413#issuecomment-1065245831 https://api.github.com/repos/simonw/sqlite-utils/issues/413 IC_kwDOCGYnMM4_flyH simonw 9599 2022-03-11T15:59:14Z 2022-03-11T15:59:14Z OWNER Hint from https://twitter.com/AdamChainz/status/1502311047612575745 > Try: > > `autodoc_typehints = 'description'` > > For a list-of-arguments format > > https://sphinx-doc.org/en/master/usage/extensions/autodoc.html#confval-autodoc_typehints {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Display autodoc type information more legibly 1166587040  
1073468996 https://github.com/simonw/sqlite-utils/issues/415#issuecomment-1073468996 https://api.github.com/repos/simonw/sqlite-utils/issues/415 IC_kwDOCGYnMM4_-9ZE simonw 9599 2022-03-21T04:14:42Z 2022-03-21T04:14:42Z OWNER I can fix this like so: ``` % sqlite-utils convert demo.db demo foo '{"foo": "bar"}' --multi --dry-run abc --- becomes: {"foo": "bar"} Would affect 1 row ``` Diff is this: ```diff diff --git a/sqlite_utils/cli.py b/sqlite_utils/cli.py index 0cf0468..b2a0440 100644 --- a/sqlite_utils/cli.py +++ b/sqlite_utils/cli.py @@ -2676,7 +2676,10 @@ def convert( raise click.ClickException(str(e)) if dry_run: # Pull first 20 values for first column and preview them - db.conn.create_function("preview_transform", 1, lambda v: fn(v) if v else v) + preview = lambda v: fn(v) if v else v + if multi: + preview = lambda v: json.dumps(fn(v), default=repr) if v else v + db.conn.create_function("preview_transform", 1, preview) sql = """ select [{column}] as value, ``` {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Convert with `--multi` and `--dry-run` flag does not work 1171599874  
1080141111 https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1080141111 https://api.github.com/repos/simonw/sqlite-utils/issues/420 IC_kwDOCGYnMM5AYaU3 simonw 9599 2022-03-28T03:25:57Z 2022-03-28T03:54:37Z OWNER So now this should solve your problem: ``` echo '[{"name": "notaword"}, {"name": "word"}] ' | python3 -m sqlite_utils insert listings.db listings - --convert ' import enchant d = enchant.Dict("en_US") def convert(row): global d row["is_dictionary_word"] = d.check(row["name"]) ' ``` {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Document how to use a `--convert` function that runs initialization code first 1178546862  
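The reactions column stores the GitHub reaction counts as a JSON blob in a TEXT column, so the filter above matches the serialized string exactly. The same "at least one hooray reaction" condition can be expressed against the JSON itself; this is a sketch assuming the SQLite build includes the JSON1 functions:

```sql
select id, [user], created_at, body
from issue_comments
where json_extract(reactions, '$.hooray') >= 1;
```

The exact-match filter works for this page because every matching row serializes the reactions dictionary with the same key order; json_extract does not depend on that.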


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
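The [user] and [issue] columns are foreign keys into the [users] and [issues] tables, which is how the rows above show author names and issue titles rather than bare ids. Here is a sketch of joining through those keys yourself; it assumes [users] has a [login] column and [issues] has a [title] column, which are not part of the schema shown here:

```sql
select
    issue_comments.id,
    users.login as author,
    issues.title as issue_title,
    issue_comments.created_at,
    issue_comments.body
from issue_comments
join users on users.id = issue_comments.[user]
join issues on issues.id = issue_comments.issue
order by issue_comments.created_at desc;
```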