issue_comments
9,947 rows sorted by body descending
Suggested facets: user, author_association, reactions
created_at (date) — more than 1,000 distinct values
- 2021-03-22 66
- 2021-11-19 60
- 2022-11-16 59
- 2020-10-15 52
- 2020-09-22 51
- 2020-10-30 49
- 2022-10-26 47
- 2022-03-21 46
- 2020-12-18 43
- 2020-06-09 42
- 2022-10-27 42
- 2022-12-13 42
- 2020-06-18 41
- 2022-10-25 41
- 2020-10-20 40
- 2022-01-09 40
- 2022-06-14 40
- 2020-05-27 39
- 2021-11-16 39
- 2021-12-16 39
- 2020-12-30 38
- 2022-12-15 37
- 2023-03-08 37
- 2020-10-09 36
- 2021-11-20 36
- 2022-01-20 36
- 2022-03-19 36
- 2020-09-15 34
- 2021-11-29 34
- 2022-11-18 34
- 2023-05-08 34
- 2020-06-08 33
- 2021-01-04 33
- 2021-05-27 33
- 2022-02-06 33
- 2020-06-01 32
- 2022-03-05 32
- 2022-12-14 32
- 2019-06-24 31
- 2020-09-21 31
- 2021-08-13 31
- 2022-04-28 31
- 2022-08-27 31
- 2023-03-09 31
- 2019-06-23 30
- 2021-08-09 30
- 2022-11-14 30
- 2022-12-18 30
- 2018-04-16 29
- 2020-06-16 29
- 2020-10-16 29
- 2021-12-18 29
- 2020-02-14 28
- 2020-06-06 28
- 2020-11-24 28
- 2020-12-31 28
- 2021-01-03 28
- 2021-12-19 28
- 2022-02-02 28
- 2022-05-20 28
- 2022-12-16 28
- 2017-11-13 27
- 2017-11-14 27
- 2020-03-23 27
- 2020-05-30 27
- 2022-09-26 27
- 2022-10-24 27
- 2022-11-15 27
- 2022-11-30 27
- 2023-05-21 27
- 2023-05-25 27
- 2021-02-14 26
- 2021-06-16 26
- 2022-04-26 26
- 2022-07-15 26
- 2022-10-14 26
- 2020-06-29 25
- 2020-09-24 25
- 2020-12-29 25
- 2021-01-25 25
- 2021-06-03 25
- 2022-02-03 25
- 2022-03-15 25
- 2020-02-24 24
- 2020-04-18 24
- 2020-08-18 24
- 2020-09-14 24
- 2020-10-31 24
- 2022-08-14 24
- 2017-11-19 23
- 2020-05-03 23
- 2020-10-19 23
- 2020-10-29 23
- 2021-02-18 23
- 2021-11-14 23
- 2020-05-31 22
- 2020-09-30 22
- 2020-11-05 22
- 2020-11-30 22
- 2020-12-17 22
- 2022-09-15 22
- 2022-11-03 22
- 2022-11-29 22
- 2020-05-06 21
- 2020-05-28 21
- 2021-02-25 21
- 2021-06-19 21
- 2021-08-18 21
- 2021-12-17 21
- 2022-01-06 21
- 2022-03-11 21
- 2022-04-27 21
- 2022-10-30 21
- 2022-12-09 21
- 2017-11-16 20
- 2018-05-27 20
- 2018-07-10 20
- 2019-05-03 20
- 2019-05-11 20
- 2019-11-11 20
- 2020-03-24 20
- 2020-06-11 20
- 2020-06-23 20
- 2020-07-08 20
- 2020-09-07 20
- 2020-10-11 20
- 2020-10-23 20
- 2021-01-06 20
- 2021-02-20 20
- 2021-03-04 20
- 2021-08-24 20
- 2022-01-26 20
- 2022-05-02 20
- 2022-09-06 20
- 2023-02-07 20
- 2019-11-04 19
- 2020-03-25 19
- 2020-05-02 19
- 2020-08-12 19
- 2021-10-24 19
- 2021-12-11 19
- 2022-01-08 19
- 2022-02-18 19
- 2022-03-07 19
- 2022-08-23 19
- 2018-04-09 18
- 2018-05-22 18
- 2019-05-23 18
- 2019-06-13 18
- 2019-06-22 18
- 2020-03-26 18
- 2020-04-05 18
- 2020-05-01 18
- 2020-05-04 18
- 2020-05-05 18
- 2020-06-03 18
- 2020-06-13 18
- 2020-07-01 18
- 2020-09-23 18
- 2020-10-21 18
- 2021-01-01 18
- 2021-05-23 18
- 2021-05-29 18
- 2021-08-10 18
- 2022-01-13 18
- 2022-02-07 18
- 2022-10-07 18
- 2017-10-24 17
- 2018-05-24 17
- 2018-06-21 17
- 2020-06-28 17
- 2020-08-28 17
- 2020-12-09 17
- 2021-02-12 17
- 2021-03-23 17
- 2021-03-29 17
- 2021-08-19 17
- 2021-12-23 17
- 2022-09-05 17
- 2022-09-07 17
- 2022-09-27 17
- 2023-06-29 17
- 2017-10-23 16
- 2017-11-11 16
- 2018-05-28 16
- 2019-03-15 16
- 2020-04-27 16
- 2020-05-08 16
- 2020-05-10 16
- 2020-08-11 16
- 2020-08-16 16
- 2020-10-14 16
- 2020-11-29 16
- 2020-12-03 16
- 2021-03-27 16
- 2021-06-05 16
- 2021-07-10 16
- 2021-07-31 16
- 2021-11-15 16
- 2022-03-20 16
- 2022-11-01 16
- 2022-11-11 16
- 2022-12-06 16
- 2017-11-15 15
- 2018-05-16 15
- 2018-06-18 15
- 2019-10-14 15
- 2020-04-29 15
- 2020-05-11 15
- 2020-06-05 15
- 2020-06-10 15
- 2020-06-12 15
- 2020-08-09 15
- 2020-09-17 15
- 2020-10-22 15
- 2021-01-05 15
- 2021-02-19 15
- 2021-05-31 15
- 2021-06-26 15
- 2022-01-11 15
- 2017-12-10 14
- 2019-03-17 14
- 2019-07-05 14
- 2020-02-04 14
- 2020-06-07 14
- 2020-09-12 14
- 2020-10-08 14
- 2020-10-25 14
- 2020-11-06 14
- 2020-12-16 14
- 2021-08-01 14
- 2021-08-28 14
- 2022-01-21 14
- 2022-02-05 14
- 2022-11-04 14
- 2022-12-31 14
- 2023-04-11 14
- 2023-04-27 14
- 2017-11-17 13
- 2018-03-28 13
- 2018-04-08 13
- 2019-02-24 13
- 2019-07-08 13
- 2019-11-12 13
- 2019-11-13 13
- 2020-02-25 13
- 2020-05-20 13
- 2020-09-03 13
- 2020-10-10 13
- 2021-04-03 13
- 2021-06-06 13
- 2021-12-20 13
- 2022-03-06 13
- 2022-04-24 13
- 2022-06-13 13
- 2017-10-25 12
- 2017-11-12 12
- 2018-04-17 12
- 2019-04-12 12
- 2019-10-16 12
- 2019-11-09 12
- 2020-02-23 12
- 2020-04-16 12
- 2020-04-26 12
- 2020-04-30 12
- 2020-10-17 12
- 2020-12-04 12
- 2020-12-14 12
- 2021-02-26 12
- 2021-05-19 12
- 2021-06-15 12
- 2021-07-16 12
- 2022-01-14 12
- 2022-12-02 12
- 2022-12-08 12
- 2023-01-25 12
- 2023-07-08 12
- 2017-12-07 11
- 2018-05-29 11
- 2018-05-31 11
- 2018-07-24 11
- 2019-05-16 11
- 2019-06-30 11
- 2019-10-11 11
- 2019-10-17 11
- 2019-11-08 11
- 2020-02-22 11
- 2020-06-02 11
- 2020-06-30 11
- 2020-07-18 11
- 2020-07-27 11
- 2020-08-30 11
- 2020-09-28 11
- 2020-11-02 11
- 2020-11-12 11
- 2020-12-12 11
- 2021-02-27 11
- 2021-04-02 11
- 2021-04-04 11
- 2021-04-05 11
- 2021-06-02 11
- 2021-06-18 11
- 2021-08-12 11
- 2021-12-10 11
- 2021-12-22 11
- 2022-01-10 11
- 2022-02-09 11
- 2022-09-14 11
- 2022-12-03 11
- 2023-04-06 11
- 2018-04-14 10
- 2018-05-23 10
- 2018-06-15 10
- 2018-06-16 10
- 2019-04-29 10
- 2019-07-07 10
- 2019-07-19 10
- 2019-11-03 10
- 2019-11-26 10
- 2020-04-02 10
- 2020-06-24 10
- 2020-07-26 10
- 2020-08-21 10
- 2020-09-08 10
- 2021-05-24 10
- 2021-08-02 10
- 2021-08-17 10
- 2022-01-12 10
- 2022-01-19 10
- 2022-03-23 10
- 2022-12-17 10
- 2023-01-09 10
- 2023-04-13 10
- 2023-06-25 10
- 2017-12-09 9
- 2018-04-15 9
- 2018-04-20 9
- 2018-11-05 9
- 2019-05-28 9
- 2019-06-25 9
- 2020-02-13 9
- 2020-03-08 9
- 2020-03-20 9
- 2020-04-06 9
- 2020-04-13 9
- 2020-04-28 9
- 2020-05-07 9
- 2020-07-24 9
- 2020-09-02 9
- 2020-12-19 9
- 2021-01-02 9
- 2021-01-22 9
- 2021-02-02 9
- 2021-03-10 9
- 2021-03-20 9
- 2021-05-28 9
- 2021-06-22 9
- 2021-07-08 9
- 2021-07-15 9
- 2021-12-12 9
- 2022-02-04 9
- 2022-03-08 9
- 2022-04-13 9
- 2022-05-03 9
- 2022-05-17 9
- 2022-06-21 9
- 2022-07-05 9
- 2022-07-18 9
- 2022-08-18 9
- 2022-09-28 9
- 2023-01-11 9
- 2023-01-21 9
- 2023-02-06 9
- 2017-11-29 8
- 2017-11-30 8
- 2018-03-27 8
- 2018-03-30 8
- 2018-05-25 8
- 2019-05-19 8
- 2019-05-21 8
- 2019-05-25 8
- 2019-05-29 8
- 2019-06-15 8
- 2019-07-20 8
- 2019-10-21 8
- 2020-01-31 8
- 2020-02-16 8
- 2020-03-16 8
- 2020-04-10 8
- 2020-04-21 8
- 2020-06-14 8
- 2020-06-21 8
- 2020-07-02 8
- 2020-07-31 8
- 2020-08-19 8
- 2020-10-01 8
- 2020-10-12 8
- 2021-01-12 8
- 2021-01-24 8
- 2021-02-15 8
- 2021-02-23 8
- 2021-06-07 8
- 2021-07-02 8
- 2021-07-11 8
- 2021-09-23 8
- 2021-10-08 8
- 2021-10-13 8
- 2021-12-14 8
- 2022-01-07 8
- 2022-02-16 8
- 2022-03-24 8
- 2022-03-25 8
- 2022-04-11 8
- 2022-06-20 8
- 2022-08-21 8
- 2022-09-02 8
- 2022-10-18 8
- 2022-10-28 8
- 2023-01-29 8
- 2023-03-22 8
- 2023-03-26 8
- 2023-05-07 8
- 2017-11-09 7
- 2017-11-20 7
- 2017-11-21 7
- 2018-06-17 7
- 2018-07-26 7
- 2019-01-28 7
- 2019-05-02 7
- 2019-06-09 7
- 2019-06-18 7
- 2019-10-07 7
- 2019-10-18 7
- 2019-10-30 7
- 2020-01-29 7
- 2020-02-27 7
- 2020-03-06 7
- 2020-03-21 7
- 2020-03-31 7
- 2020-04-01 7
- 2020-04-15 7
- 2020-04-24 7
- 2020-05-15 7
- 2020-05-21 7
- 2020-05-29 7
- 2020-06-20 7
- 2020-07-06 7
- 2020-08-29 7
- 2020-09-16 7
- 2020-10-24 7
- 2020-10-27 7
- 2020-11-04 7
- 2020-11-07 7
- 2021-01-18 7
- 2021-02-11 7
- 2021-02-22 7
- 2021-05-17 7
- 2021-07-14 7
- 2021-07-25 7
- 2021-07-30 7
- 2021-08-04 7
- 2021-08-14 7
- 2021-08-25 7
- 2021-09-21 7
- 2021-09-22 7
- 2021-11-17 7
- 2021-11-30 7
- 2021-12-01 7
- 2021-12-07 7
- 2021-12-13 7
- 2022-02-08 7
- 2022-03-13 7
- 2022-03-14 7
- 2022-03-17 7
- 2022-04-08 7
- 2022-04-21 7
- 2022-09-01 7
- 2022-09-09 7
- 2022-09-29 7
- 2022-10-08 7
- 2022-11-02 7
- 2022-11-12 7
- 2023-03-10 7
- 2017-10-26 6
- 2017-11-22 6
- 2017-11-23 6
- 2017-12-08 6
- 2018-04-03 6
- 2018-04-18 6
- 2018-04-26 6
- 2018-05-12 6
- 2018-05-13 6
- 2018-05-14 6
- 2018-05-21 6
- 2018-05-26 6
- 2018-06-04 6
- 2018-07-12 6
- 2019-04-13 6
- 2019-07-03 6
- 2019-07-09 6
- 2019-07-11 6
- 2019-07-14 6
- 2019-09-02 6
- 2019-11-01 6
- 2019-11-07 6
- 2019-12-18 6
- 2019-12-22 6
- 2020-02-15 6
- 2020-03-05 6
- 2020-03-14 6
- 2020-03-22 6
- 2020-04-23 6
- 2020-06-22 6
- 2020-06-27 6
- 2020-09-01 6
- 2020-09-09 6
- 2020-10-06 6
- 2020-10-26 6
- 2020-12-13 6
- 2021-01-15 6
- 2021-03-09 6
- 2021-03-12 6
- 2021-03-21 6
- 2021-03-28 6
- 2021-06-01 6
- 2021-06-11 6
- 2021-07-22 6
- 2021-08-08 6
- 2021-08-20 6
- 2021-08-23 6
- 2021-09-08 6
- 2021-10-07 6
- 2021-10-14 6
- 2021-11-28 6
- 2022-01-25 6
- 2022-03-02 6
- 2022-03-22 6
- 2022-04-25 6
- 2022-07-10 6
- 2022-07-20 6
- 2022-08-02 6
- 2022-09-23 6
- 2022-10-02 6
- 2022-10-12 6
- 2022-10-31 6
- 2022-11-17 6
- 2023-01-07 6
- 2023-01-17 6
- 2023-03-06 6
- 2023-03-20 6
- 2023-05-26 6
- 2023-07-14 6
- 2017-11-18 5
- 2017-11-27 5
- 2018-05-17 5
- 2018-05-30 5
- 2018-06-05 5
- 2018-07-11 5
- 2018-08-11 5
- 2018-09-19 5
- 2019-03-19 5
- 2019-06-28 5
- 2019-07-23 5
- 2019-07-24 5
- 2019-11-14 5
- 2019-11-27 5
- 2019-12-26 5
- 2020-01-30 5
- 2020-02-28 5
- 2020-03-27 5
- 2020-03-30 5
- 2020-04-17 5
- 2020-05-25 5
- 2020-06-19 5
- 2020-06-26 5
- 2020-07-30 5
- 2020-08-10 5
- 2020-08-15 5
- 2020-10-28 5
- 2020-11-13 5
- 2020-11-28 5
- 2020-12-01 5
- 2020-12-05 5
- 2020-12-08 5
- 2020-12-22 5
- 2020-12-23 5
- 2021-01-07 5
- 2021-01-17 5
- 2021-01-28 5
- 2021-01-29 5
- 2021-03-08 5
- 2021-03-18 5
- 2021-03-31 5
- 2021-06-12 5
- 2021-06-13 5
- 2021-06-14 5
- 2021-06-25 5
- 2021-08-05 5
- 2021-08-06 5
- 2021-08-07 5
- 2021-09-13 5
- 2021-12-06 5
- 2021-12-15 5
- 2022-02-15 5
- 2022-03-16 5
- 2022-03-26 5
- 2022-03-30 5
- 2022-04-22 5
- 2022-04-30 5
- 2022-07-27 5
- 2022-08-13 5
- 2022-08-28 5
- 2022-08-30 5
- 2022-08-31 5
- 2022-09-17 5
- 2022-09-22 5
- 2022-10-05 5
- 2022-10-13 5
- 2022-11-13 5
- 2022-12-01 5
- 2023-02-08 5
- 2023-02-22 5
- 2023-03-12 5
- 2023-03-29 5
- 2023-04-12 5
- 2023-07-17 5
- 2017-11-10 4
- 2017-12-01 4
- 2017-12-04 4
- 2017-12-05 4
- 2018-04-11 4
- 2018-04-12 4
- 2018-06-28 4
- 2018-06-29 4
- 2018-07-13 4
- 2018-07-14 4
- 2018-07-18 4
- 2019-01-02 4
- 2019-01-17 4
- 2019-04-07 4
- 2019-04-11 4
- 2019-05-05 4
- 2019-05-09 4
- 2019-06-04 4
- 2019-07-04 4
- 2019-07-06 4
- 2019-07-22 4
- 2019-07-28 4
- 2019-08-17 4
- 2019-08-23 4
- 2019-09-03 4
- 2019-09-04 4
- 2019-10-13 4
- 2019-11-18 4
- 2019-11-19 4
- 2019-12-08 4
- 2019-12-27 4
- 2020-02-01 4
- 2020-02-29 4
- 2020-04-22 4
- 2020-05-12 4
- 2020-06-17 4
- 2020-07-16 4
- 2020-08-24 4
- 2020-08-25 4
- 2020-09-11 4
- 2020-09-18 4
- 2020-09-20 4
- 2020-10-05 4
- 2020-10-07 4
- 2020-11-03 4
- 2020-11-15 4
- 2020-11-21 4
- 2020-12-02 4
- 2020-12-21 4
- 2020-12-24 4
- 2021-01-26 4
- 2021-03-05 4
- 2021-03-07 4
- 2021-04-29 4
- 2021-06-20 4
- 2021-06-23 4
- 2021-07-07 4
- 2021-07-18 4
- 2021-08-03 4
- 2021-08-26 4
- 2021-09-07 4
- 2021-10-18 4
- 2021-10-19 4
- 2021-10-30 4
- 2021-11-21 4
- 2021-11-22 4
- 2021-12-08 4
- 2022-02-11 4
- 2022-03-01 4
- 2022-03-18 4
- 2022-05-16 4
- 2022-05-27 4
- 2022-06-22 4
- 2022-07-02 4
- 2022-07-17 4
- 2022-08-15 4
- 2022-08-20 4
- 2022-09-19 4
- 2022-09-21 4
- 2022-10-01 4
- 2022-10-06 4
- 2022-11-19 4
- 2022-11-23 4
- 2023-01-02 4
- 2023-02-10 4
- 2023-06-14 4
- 2023-07-02 4
- 2023-07-09 4
- 2017-11-24 3
- 2017-12-02 3
- 2018-03-21 3
- 2018-04-13 3
- 2018-04-22 3
- 2018-05-03 3
- 2018-05-06 3
- 2018-05-11 3
- 2018-05-18 3
- 2018-06-20 3
- 2018-07-31 3
- 2018-08-12 3
- 2018-08-16 3
- 2018-08-28 3
- 2018-11-19 3
- 2019-01-10 3
- 2019-01-13 3
- 2019-01-18 3
- 2019-02-06 3
- 2019-02-23 3
- 2019-04-15 3
- 2019-04-18 3
- 2019-05-01 3
- 2019-05-13 3
- 2019-05-14 3
- 2019-05-20 3
- 2019-05-26 3
- 2019-05-27 3
- 2019-06-08 3
- 2019-06-11 3
- 2019-06-14 3
- 2019-07-12 3
- 2019-07-15 3
- 2019-09-14 3
- 2019-10-02 3
- 2019-10-06 3
- 2019-11-02 3
- 2019-11-15 3
- 2019-11-22 3
- 2019-12-03 3
- 2019-12-09 3
- 2020-01-10 3
- 2020-03-01 3
- 2020-03-02 3
- 2020-03-03 3
- 2020-03-09 3
- 2020-05-19 3
- 2020-05-26 3
- 2020-06-15 3
- 2020-07-21 3
- 2020-07-22 3
- 2020-08-01 3
- 2020-08-13 3
- 2020-10-02 3
- 2020-11-01 3
- 2020-11-11 3
- 2020-11-22 3
- 2020-12-10 3
- 2021-01-09 3
- 2021-02-05 3
- 2021-02-06 3
- 2021-03-01 3
- 2021-03-13 3
- 2021-03-14 3
- 2021-03-19 3
- 2021-03-25 3
- 2021-04-20 3
- 2021-04-24 3
- 2021-04-28 3
- 2021-05-12 3
- 2021-05-18 3
- 2021-05-21 3
- 2021-06-09 3
- 2021-06-17 3
- 2021-06-27 3
- 2021-07-13 3
- 2021-07-26 3
- 2021-07-28 3
- 2021-09-04 3
- 2021-10-02 3
- 2021-10-12 3
- 2021-10-16 3
- 2021-10-28 3
- 2021-12-02 3
- 2022-01-15 3
- 2022-01-28 3
- 2022-02-23 3
- 2022-02-24 3
- 2022-03-12 3
- 2022-03-28 3
- 2022-03-29 3
- 2022-04-12 3
- 2022-04-14 3
- 2022-04-29 3
- 2022-07-07 3
- 2022-08-17 3
- 2022-09-16 3
- 2022-09-20 3
- 2022-09-24 3
- 2022-10-03 3
- 2022-10-11 3
- 2022-10-29 3
- 2022-11-20 3
- 2022-12-23 3
- 2023-01-20 3
- 2023-01-26 3
- 2023-01-28 3
- 2023-03-11 3
- 2023-03-18 3
- 2023-03-24 3
- 2023-03-30 3
- 2023-03-31 3
- 2023-04-01 3
- 2023-04-05 3
- 2023-04-15 3
- 2023-05-09 3
- 2023-05-12 3
- 2023-05-15 3
- 2023-06-26 3
- 2023-07-18 3
- 2017-10-27 2
- 2017-11-06 2
- 2017-11-07 2
- 2017-11-26 2
- 2017-12-03 2
- 2018-01-09 2
- 2018-04-10 2
- 2018-04-19 2
- 2018-04-21 2
- 2018-05-04 2
- 2018-05-05 2
- 2018-05-15 2
- 2018-06-07 2
- 2018-08-06 2
- 2019-01-03 2
- 2019-01-11 2
- 2019-03-14 2
- 2019-03-20 2
- 2019-03-28 2
- 2019-03-31 2
- 2019-04-14 2
- 2019-04-21 2
- 2019-05-04 2
- 2019-05-24 2
- 2019-06-05 2
- 2019-06-06 2
- 2019-06-16 2
- 2019-07-16 2
- 2019-07-18 2
- 2019-07-26 2
- 2019-08-03 2
- 2019-08-31 2
- 2019-09-08 2
- 2019-10-10 2
- 2019-10-12 2
- 2019-10-24 2
- 2019-10-25 2
- 2019-10-27 2
- 2019-11-06 2
- 2019-11-10 2
- 2019-11-23 2
- 2019-11-29 2
- 2019-11-30 2
- 2019-12-01 2
- 2019-12-04 2
- 2019-12-31 2
- 2020-01-05 2
- 2020-01-06 2
- 2020-01-12 2
- 2020-01-19 2
- 2020-01-21 2
- 2020-02-03 2
- 2020-02-11 2
- 2020-02-12 2
- 2020-03-10 2
- 2020-03-17 2
- 2020-03-28 2
- 2020-04-04 2
- 2020-04-14 2
- 2020-04-19 2
- 2020-06-04 2
- 2020-07-03 2
- 2020-07-07 2
- 2020-07-09 2
- 2020-07-12 2
- 2020-07-17 2
- 2020-07-25 2
- 2020-07-29 2
- 2020-09-26 2
- 2020-09-27 2
- 2020-09-29 2
- 2020-11-09 2
- 2020-11-16 2
- 2020-11-17 2
- 2020-11-20 2
- 2020-11-25 2
- 2020-12-20 2
- 2020-12-27 2
- 2020-12-28 2
- 2021-01-08 2
- 2021-01-11 2
- 2021-01-14 2
- 2021-01-19 2
- 2021-01-30 2
- 2021-02-01 2
- 2021-02-03 2
- 2021-02-07 2
- 2021-02-08 2
- 2021-02-17 2
- 2021-02-28 2
- 2021-03-03 2
- 2021-03-15 2
- 2021-03-24 2
- 2021-04-12 2
- 2021-04-14 2
- 2021-04-18 2
- 2021-05-11 2
- 2021-05-26 2
- 2021-06-10 2
- 2021-06-21 2
- 2021-06-28 2
- 2021-07-09 2
- 2021-07-19 2
- 2021-07-24 2
- 2021-08-16 2
- 2021-08-22 2
- 2021-09-14 2
- 2021-09-28 2
- 2021-09-29 2
- 2021-10-05 2
- 2021-10-09 2
- 2021-10-20 2
- 2021-10-22 2
- 2021-10-26 2
- 2021-10-27 2
- 2021-11-01 2
- 2021-11-03 2
- 2021-11-13 2
- 2021-11-18 2
- 2021-11-23 2
- 2021-11-25 2
- 2021-12-04 2
- 2021-12-24 2
- 2021-12-26 2
- 2021-12-27 2
- 2022-02-13 2
- 2022-02-17 2
- 2022-03-09 2
- 2022-04-10 2
- 2022-05-12 2
- 2022-05-31 2
- 2022-06-15 2
- 2022-06-23 2
- 2022-06-28 2
- 2022-07-01 2
- 2022-07-04 2
- 2022-07-19 2
- 2022-07-22 2
- 2022-07-28 2
- 2022-08-16 2
- 2022-09-03 2
- 2022-09-08 2
- 2022-09-25 2
- 2022-10-04 2
- 2022-10-15 2
- 2022-11-10 2
- 2022-12-10 2
- 2023-01-05 2
- 2023-01-06 2
- 2023-01-10 2
- 2023-01-19 2
- 2023-01-22 2
- 2023-01-23 2
- 2023-01-24 2
- 2023-02-05 2
- 2023-03-19 2
- 2023-03-21 2
- 2023-04-07 2
- 2023-04-16 2
- 2023-05-02 2
- 2023-05-03 2
- 2023-05-22 2
- 2023-07-01 2
- 2023-07-03 2
- 2023-07-10 2
- 2017-10-30 1
- …
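The per-day counts above are a simple grouped aggregate over the `created_at` column. A minimal sketch of reproducing them with Python's built-in `sqlite3`, assuming a local copy of the database containing an `issue_comments` table with ISO-8601 timestamps (the sample rows below are stand-ins, not real data):

```python
import sqlite3

# Build a tiny stand-in table so the query is runnable anywhere;
# against the real database you would open its file instead of :memory:.
conn = sqlite3.connect(":memory:")
conn.execute("create table issue_comments (created_at text, body text)")
conn.executemany(
    "insert into issue_comments values (?, ?)",
    [
        ("2021-03-22T10:00:00Z", "a"),
        ("2021-03-22T11:30:00Z", "b"),
        ("2021-11-19T09:00:00Z", "c"),
    ],
)

# The facet list is equivalent to grouping on the date part of
# created_at and ordering by count descending.
rows = conn.execute(
    """
    select date(created_at) as day, count(*) as n
    from issue_comments
    group by day
    order by n desc, day
    """
).fetchall()
print(rows)  # [('2021-03-22', 2), ('2021-11-19', 1)]
```

SQLite's `date()` accepts the trailing `Z` (UTC) indicator, so the GitHub-style timestamps can be grouped directly without preprocessing.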
id | html_url | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
541664602 | https://github.com/simonw/datasette/pull/595#issuecomment-541664602 | https://api.github.com/repos/simonw/datasette/issues/595 | MDEyOklzc3VlQ29tbWVudDU0MTY2NDYwMg== | tomchristie 647359 | 2019-10-14T13:03:10Z | 2019-10-14T13:03:10Z | NONE | 🤷♂️ @stonebig's suggestion would be the best I got too, *if* you want to support 3.5->3.8. It's either that, or hold off on 3.8 support until you're ready to go to 3.6->3.8. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | bump uvicorn to 0.9.0 to be Python-3.8 friendly 506300941 | |
1257063174 | https://github.com/simonw/sqlite-utils/issues/490#issuecomment-1257063174 | https://api.github.com/repos/simonw/sqlite-utils/issues/490 | IC_kwDOCGYnMM5K7UMG | jeqo 6180701 | 2022-09-24T20:50:15Z | 2022-09-24T20:50:15Z | NONE | 🤯 this is beautiful. Thanks @simonw ! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ability to insert multi-line files 1382457780 | |
359697938 | https://github.com/simonw/datasette/issues/176#issuecomment-359697938 | https://api.github.com/repos/simonw/datasette/issues/176 | MDEyOklzc3VlQ29tbWVudDM1OTY5NzkzOA== | gijs 7193 | 2018-01-23T07:17:56Z | 2018-01-23T07:17:56Z | NONE | 👍 I'd like this too! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add GraphQL endpoint 285168503 | |
548508237 | https://github.com/simonw/datasette/issues/176#issuecomment-548508237 | https://api.github.com/repos/simonw/datasette/issues/176 | MDEyOklzc3VlQ29tbWVudDU0ODUwODIzNw== | eads 634572 | 2019-10-31T18:25:44Z | 2019-10-31T18:25:44Z | NONE | 👋 I'd be interested in building this out in Q1 or Q2 of 2020 if nobody has tackled it by then. I would love to integrate Datasette into @thechicagoreporter's practice, but we're also fully committed to GraphQL moving forward. | {"total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 2, "rocket": 0, "eyes": 0} | Add GraphQL endpoint 285168503 | |
1190995982 | https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-1190995982 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31 | IC_kwDOD079W85G_SgO | jakewilkins 19231792 | 2022-07-21T03:26:38Z | 2023-04-14T22:41:31Z | NONE | 👋 Any update on getting this merged? Alternatively, is there a work around for this issue to unblock myself? edit to add: huge fan of both this project and `osxphotos`, thanks so much for your work here 🙏 If I had any experience with Python I would offer to help but somehow I've managed to not write any Python in 10+ years of programming 😅 Edit again to add: > Alternatively, is there a work around for this issue to unblock myself? Yes, there is. I was able to apply the patch of this PR and it applies (mostly) cleanly and works. - verified I have a high enough version of `osxphotos` - downloaded the .patch of this (by appending `.patch` to the URL) - edited the patch to remove the `setup.py` changes - `cd` to the directory containing `dogsheep-photos` and `git apply 31.patch` | {"total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Update for Big Sur 771511344 | |
811362316 | https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-811362316 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31 | MDEyOklzc3VlQ29tbWVudDgxMTM2MjMxNg== | PabloLerma 871250 | 2021-03-31T19:14:39Z | 2021-03-31T19:14:39Z | NONE | 👋 could I help somehow for this to be merged? As Big Sur is going to be more used as the time goes I think it would be nice to merge and publish a new version. Nice work! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Update for Big Sur 771511344 | |
708520800 | https://github.com/simonw/datasette/issues/1019#issuecomment-708520800 | https://api.github.com/repos/simonw/datasette/issues/1019 | MDEyOklzc3VlQ29tbWVudDcwODUyMDgwMA== | jsfenfen 639012 | 2020-10-14T16:37:19Z | 2020-10-14T16:37:19Z | CONTRIBUTOR | 🎉 Thanks so much @simonw ! 🎉 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | "Edit SQL" button on canned queries 721050815 | |
552134876 | https://github.com/dogsheep/twitter-to-sqlite/issues/29#issuecomment-552134876 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29 | MDEyOklzc3VlQ29tbWVudDU1MjEzNDg3Ng== | jacobian 21148 | 2019-11-09T20:33:38Z | 2019-11-09T20:33:38Z | CONTRIBUTOR | ❤️ thanks! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | `import` command fails on empty files 518725064 | |
813134227 | https://github.com/simonw/datasette/issues/1293#issuecomment-813134227 | https://api.github.com/repos/simonw/datasette/issues/1293 | MDEyOklzc3VlQ29tbWVudDgxMzEzNDIyNw== | simonw 9599 | 2021-04-05T01:19:31Z | 2021-04-05T01:19:31Z | OWNER | | addr | opcode | p1 | p2 | p3 | p4 | p5 | comment | |--------|---------------|------|------|------|-----------------------|------|-----------| | 0 | Init | 0 | 47 | 0 | | 00 | | | 1 | OpenRead | 0 | 51 | 0 | 15 | 00 | | | 2 | Integer | 15 | 2 | 0 | | 00 | | | 3 | Once | 0 | 15 | 0 | | 00 | | | 4 | OpenEphemeral | 2 | 1 | 0 | k(1,) | 00 | | | 5 | VOpen | 1 | 0 | 0 | vtab:3E692C362158 | 00 | | | 6 | String8 | 0 | 5 | 0 | CPAD_2020a_SuperUnits | 00 | | | 7 | SCopy | 7 | 6 | 0 | | 00 | | | 8 | Integer | 2 | 3 | 0 | | 00 | | | 9 | Integer | 2 | 4 | 0 | | 00 | | | 10 | VFilter | 1 | 15 | 3 | | 00 | | | 11 | Rowid | 1 | 8 | 0 | | 00 | | | 12 | MakeRecord | 8 | 1 | 9 | C | 00 | | | 13 | IdxInsert | 2 | 9 | 8 | 1 | 00 | | | 14 | VNext | 1 | 11 | 0 | | 00 | | | 15 | Return | 2 | 0 | 0 | | 00 | | | 16 | Rewind | 2 | 46 | 0 | | 00 | | | 17 | Column | 2 | 0 | 1 | | 00 | | | 18 | IsNull | 1 | 45 | 0 | | 00 | | | 19 | SeekRowid | 0 | 45 | 1 | | 00 | | | 20 | Column … | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Show column metadata plus links for foreign keys on arbitrary query results 849978964 | |
1499797384 | https://github.com/simonw/datasette/issues/2054#issuecomment-1499797384 | https://api.github.com/repos/simonw/datasette/issues/2054 | IC_kwDOBm6k_c5ZZReI | dsisnero 6213 | 2023-04-07T00:46:50Z | 2023-04-07T00:46:50Z | NONE | you should have a look at Roda written in ruby . | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Make detailed notes on how table, query and row views work right now 1657861026 | |
1032126353 | https://github.com/simonw/sqlite-utils/issues/403#issuecomment-1032126353 | https://api.github.com/repos/simonw/sqlite-utils/issues/403 | IC_kwDOCGYnMM49hP-R | fgregg 536941 | 2022-02-08T01:45:15Z | 2022-02-08T01:45:31Z | CONTRIBUTOR | you can hack something like this to achieve this result: `sqlite-utils convert my_database my_table rowid "{'id': value}" --multi` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Document how to add a primary key to a rowid table using `sqlite-utils transform --pk` 1126692066 | |
1271008997 | https://github.com/simonw/datasette/issues/1836#issuecomment-1271008997 | https://api.github.com/repos/simonw/datasette/issues/1836 | IC_kwDOBm6k_c5Lwg7l | fgregg 536941 | 2022-10-07T02:00:37Z | 2022-10-07T02:00:49Z | CONTRIBUTOR | yes, and i also think that this is causing the apparent memory problems in #1480. when the container starts up, it will make some operation on the database in `immutable` mode which apparently makes some small change to the db file. if that's so, then the db files will be copied to the read/write layer which counts against cloudrun's memory allocation! running a test of that now. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | docker image is duplicating db files somehow 1400374908 | |
892276385 | https://github.com/simonw/datasette/issues/1419#issuecomment-892276385 | https://api.github.com/repos/simonw/datasette/issues/1419 | IC_kwDOBm6k_c41Lw6h | fgregg 536941 | 2021-08-04T00:58:49Z | 2021-08-04T00:58:49Z | CONTRIBUTOR | yes, [filter clause on aggregate queries were added to sqlite3 in 3.30](https://www.sqlite.org/releaselog/3_30_1.html) | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | `publish cloudrun` should deploy a more recent SQLite version 959710008 | |
1440811364 | https://github.com/simonw/datasette/issues/2027#issuecomment-1440811364 | https://api.github.com/repos/simonw/datasette/issues/2027 | IC_kwDOBm6k_c5V4Qlk | gk7279 19700859 | 2023-02-22T21:19:47Z | 2023-02-22T21:19:47Z | NONE | yes @dmick . How did you make your public IP redirect to your uvicorn server? Instead of nginx, I have apache2 on my GCP VM. Any pointers here are helpful too. Thanks. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | How to redirect from "/" to a specific db/table 1590183272 | |
1251845216 | https://github.com/dogsheep/twitter-to-sqlite/issues/31#issuecomment-1251845216 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31 | IC_kwDODEm0Qs5KnaRg | dckc 150986 | 2022-09-20T05:05:03Z | 2022-09-20T05:05:03Z | NONE | yay! Thanks a bunch for the `twitter-to-sqlite friends` command! The twitter "Download an archive of your data" feature doesn't include who I follow, so this is particularly handy. The whole Dogsheep thing is great :) I've written about similar things under [cloud-services](https://www.madmode.com/search/label/cloud-services/): - 2021: [Closet Librarian Approach to Cloud Services](https://www.madmode.com/2021/closet-librarian-approach-cloud-services.html) - 2015: [jukekb \- Browse iTunes libraries and upload playlists to Google Music](https://www.madmode.com/2015/jukekb) | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | "friends" command (similar to "followers") 520508502 | |
847271122 | https://github.com/simonw/datasette/issues/1327#issuecomment-847271122 | https://api.github.com/repos/simonw/datasette/issues/1327 | MDEyOklzc3VlQ29tbWVudDg0NzI3MTEyMg== | GmGniap 20846286 | 2021-05-24T19:10:21Z | 2021-05-24T19:10:21Z | NONE | wow, thanks a lot @simonw , problem is solved. I converted my current json file into utf-8 format with Python script. It's working now. I'm using with Window 10. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Support Unicode characters in metadata.json 892457208 | |
991405755 | https://github.com/simonw/sqlite-utils/issues/353#issuecomment-991405755 | https://api.github.com/repos/simonw/sqlite-utils/issues/353 | IC_kwDOCGYnMM47F6a7 | fgregg 536941 | 2021-12-11T01:38:29Z | 2021-12-11T01:38:29Z | CONTRIBUTOR | wow! that's awesome! thanks so much, @simonw! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Allow passing a file of code to "sqlite-utils convert" 1077102934 | |
846138580 | https://github.com/simonw/datasette/pull/1325#issuecomment-846138580 | https://api.github.com/repos/simonw/datasette/issues/1325 | MDEyOklzc3VlQ29tbWVudDg0NjEzODU4MA== | stonebig 4312421 | 2021-05-21T18:00:10Z | 2021-05-21T18:00:10Z | NONE | would be nice to have | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Update itsdangerous requirement from ~=1.1 to >=1.1,<3.0 890073989 | |
1078126065 | https://github.com/simonw/datasette/issues/1684#issuecomment-1078126065 | https://api.github.com/repos/simonw/datasette/issues/1684 | IC_kwDOBm6k_c5AQuXx | fgregg 536941 | 2022-03-24T20:08:56Z | 2022-03-24T20:13:19Z | CONTRIBUTOR | would be nice if the behavior was 1. try to facet all the columns 2. for bigger tables try to facet the indexed columns 3. for the biggest tables, turn off autofacetting completely This is based on my assumption that what determines autofaceting is the rarity of unique values. Which may not be true! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Mechanism for disabling faceting on large tables only 1179998071 | |
992986587 | https://github.com/simonw/datasette/issues/1553#issuecomment-992986587 | https://api.github.com/repos/simonw/datasette/issues/1553 | IC_kwDOBm6k_c47L8Xb | fgregg 536941 | 2021-12-13T22:57:04Z | 2021-12-13T22:57:04Z | CONTRIBUTOR | would also be good if the header said the what the max row limit was | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | if csv export is truncated in non streaming mode set informative response header 1079111498 | |
1318777114 | https://github.com/simonw/sqlite-utils/issues/510#issuecomment-1318777114 | https://api.github.com/repos/simonw/sqlite-utils/issues/510 | IC_kwDOCGYnMM5OmvEa | chapmanjacobd 7908073 | 2022-11-17T15:09:47Z | 2022-11-17T15:09:47Z | CONTRIBUTOR | why close? is the only problem that the _config table that incorrectly says 4 for fts5? if so, that's still something that should be fixed | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Cannot enable FTS5 despite it being available 1434911255 | |
1303660293 | https://github.com/simonw/sqlite-utils/issues/50#issuecomment-1303660293 | https://api.github.com/repos/simonw/sqlite-utils/issues/50 | IC_kwDOCGYnMM5NtEcF | chapmanjacobd 7908073 | 2022-11-04T14:38:36Z | 2022-11-04T14:38:36Z | CONTRIBUTOR | where did you see the limit as 999? I believe the limit has been 32766 for quite some time. If you could detect which one this could speed up batch insert of some types of data significantly | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | "Too many SQL variables" on large inserts 473083260 | |
1272357976 | https://github.com/simonw/datasette/issues/1836#issuecomment-1272357976 | https://api.github.com/repos/simonw/datasette/issues/1836 | IC_kwDOBm6k_c5L1qRY | fgregg 536941 | 2022-10-08T16:56:51Z | 2022-10-08T16:56:51Z | CONTRIBUTOR | when you are running from docker, you **always** will want to run as `mode=ro` because the same thing that is causing duplication in the inspect layer will cause duplication in the final container read/write layer when `datasette serve` runs. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | docker image is duplicating db files somehow 1400374908 | |
1271020193 | https://github.com/simonw/datasette/issues/1836#issuecomment-1271020193 | https://api.github.com/repos/simonw/datasette/issues/1836 | IC_kwDOBm6k_c5Lwjqh | fgregg 536941 | 2022-10-07T02:15:05Z | 2022-10-07T02:21:08Z | CONTRIBUTOR | when i hack the connect method to open non mutable files with "mode=ro" and not "immutable=1" https://github.com/simonw/datasette/blob/eff112498ecc499323c26612d707908831446d25/datasette/database.py#L79 then: ```bash 870 B RUN /bin/sh -c datasette inspect nlrb.db --inspect-file inspect-data.json ``` the `datasette inspect` layer is only the size of the json file! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | docker image is duplicating db files somehow 1400374908 | |
541390656 | https://github.com/simonw/datasette/issues/593#issuecomment-541390656 | https://api.github.com/repos/simonw/datasette/issues/593 | MDEyOklzc3VlQ29tbWVudDU0MTM5MDY1Ng== | stonebig 4312421 | 2019-10-13T06:22:07Z | 2019-10-13T06:22:07Z | NONE | well, I succeeded to make uvicorn work. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | make uvicorn optional dependancy (because not ok on windows python yet) 506183241 | |
1125083348 | https://github.com/simonw/datasette/issues/1298#issuecomment-1125083348 | https://api.github.com/repos/simonw/datasette/issues/1298 | IC_kwDOBm6k_c5DD2jU | llimllib 7150 | 2022-05-12T14:43:51Z | 2022-05-12T14:43:51Z | NONE | user report: I found this issue because the first time I tried to use datasette for real, I displayed a large table, and thought there was no horizontal scroll bar at all. I didn't even consider that I had to scroll all the way to the end of the page to find it. Just chipping in to say that this confused me, and I didn't even find the scroll bar until after I saw this issue. I don't know what the right answer is, but IMO the UI should suggest to the user that there is a way to view the data that's hidden to the right. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | improve table horizontal scroll experience 855476501 | |
344424382 | https://github.com/simonw/datasette/issues/93#issuecomment-344424382 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQyNDM4Mg== | atomotic 67420 | 2017-11-14T22:42:16Z | 2017-11-14T22:42:16Z | NONE | tried quickly, this seems working: ``` ~ pip3 install pyinstaller ~ pyinstaller -F --add-data /usr/local/lib/python3.6/site-packages/datasette/templates:datasette/templates --add-data /usr/local/lib/python3.6/site-packages/datasette/static:datasette/static /usr/local/bin/datasette ~ du -h dist/datasette 6.8M dist/datasette ~ file dist/datasette dist/datasette: Mach-O 64-bit executable x86_64 ``` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
1258738740 | https://github.com/simonw/datasette/issues/1818#issuecomment-1258738740 | https://api.github.com/repos/simonw/datasette/issues/1818 | IC_kwDOBm6k_c5LBtQ0 | nelsonjchen 5363 | 2022-09-26T22:52:45Z | 2022-09-26T22:55:57Z | NONE | thoughts on order of precedence to use: * sqlite-utils count, if present. closest thing to a standard i guess. * row(max_id) if like, the first and/or last x amount of rows ids are all contiguous. kind of a cheap/dumb/imperfect heuristic to see if the table is dump/not dump. if the check passes, still stick on `est.` after the display. * count(*) if enabled in datasette | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Setting to turn off table row counts entirely 1384549993 | |
1258878311 | https://github.com/simonw/datasette/issues/526#issuecomment-1258878311 | https://api.github.com/repos/simonw/datasette/issues/526 | IC_kwDOBm6k_c5LCPVn | fgregg 536941 | 2022-09-27T02:19:48Z | 2022-09-27T02:19:48Z | CONTRIBUTOR | this sql query doesn't trip up `maximum_returned_rows` but does timeout ```sql with recursive counter(x) as ( select 0 union select x + 1 from counter ) select * from counter LIMIT 10 OFFSET 100000000 ``` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Stream all results for arbitrary SQL and canned queries 459882902 | |
1577355134 | https://github.com/simonw/sqlite-utils/issues/557#issuecomment-1577355134 | https://api.github.com/repos/simonw/sqlite-utils/issues/557 | IC_kwDOCGYnMM5eBId- | chapmanjacobd 7908073 | 2023-06-05T19:26:26Z | 2023-06-05T19:26:26Z | CONTRIBUTOR | this isn't really actionable... I'm just being a whiny baby. I have tasted the milk of being able to use `upsert_all`, `insert_all`, etc without having to write DDL to create tables. The meat of the issue is that SQLITE doesn't make rowid stable between vacuums so it is not possible to take shortcuts | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Aliased ROWID option for tables created from alter=True commands 1740150327 | |
1264219650 | https://github.com/simonw/sqlite-utils/issues/493#issuecomment-1264219650 | https://api.github.com/repos/simonw/sqlite-utils/issues/493 | IC_kwDOCGYnMM5LWnYC | chapmanjacobd 7908073 | 2022-10-01T03:22:50Z | 2022-10-01T03:23:58Z | CONTRIBUTOR | this is likely what you are looking for: https://stackoverflow.com/a/51076749/697964 but yeah I would say just disable smart quotes | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Tiny typographical error in install/uninstall docs 1386562662 | |
1077047152 | https://github.com/simonw/datasette/pull/1582#issuecomment-1077047152 | https://api.github.com/repos/simonw/datasette/issues/1582 | IC_kwDOBm6k_c5AMm9w | fgregg 536941 | 2022-03-24T04:07:58Z | 2022-03-24T04:07:58Z | CONTRIBUTOR | this has been obviated by the datasette-hashed-urls plugin | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | don't set far expiry if hash is '000' 1090055810 | |
1077047295 | https://github.com/simonw/datasette/issues/1581#issuecomment-1077047295 | https://api.github.com/repos/simonw/datasette/issues/1581 | IC_kwDOBm6k_c5AMm__ | fgregg 536941 | 2022-03-24T04:08:18Z | 2022-03-24T04:08:18Z | CONTRIBUTOR | this has been addressed by the datasette-hashed-urls plugin | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | when hashed urls are turned on, the _memory db has improperly long-lived cache expiry 1089529555 | |
993014772 | https://github.com/simonw/datasette/issues/1553#issuecomment-993014772 | https://api.github.com/repos/simonw/datasette/issues/1553 | IC_kwDOBm6k_c47MDP0 | fgregg 536941 | 2021-12-13T23:46:18Z | 2021-12-13T23:46:18Z | CONTRIBUTOR | these headers would also be relevant for json exports of custom queries | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | if csv export is truncated in non streaming mode set informative response header 1079111498 | |
893114612 | https://github.com/simonw/datasette/issues/1419#issuecomment-893114612 | https://api.github.com/repos/simonw/datasette/issues/1419 | IC_kwDOBm6k_c41O9j0 | fgregg 536941 | 2021-08-05T02:29:06Z | 2021-08-05T02:29:06Z | CONTRIBUTOR | there's a lot of complexity here, that's probably not worth addressing. i got what i needed by patching the dockerfile that cloudrun uses to install a newer version of sqlite. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | `publish cloudrun` should deploy a more recent SQLite version 959710008 | |
1008164116 | https://github.com/simonw/sqlite-utils/issues/365#issuecomment-1008164116 | https://api.github.com/repos/simonw/sqlite-utils/issues/365 | IC_kwDOCGYnMM48F10U | fgregg 536941 | 2022-01-08T22:18:57Z | 2022-01-08T22:18:57Z | CONTRIBUTOR | the table with the query ran so bad was about 50k. i think the scenario should not be worse than no stats. i also did not know that sqlite was so different from postgres and needed an explicit analyze call. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | create-index should run analyze after creating index 1096558279 | |
1356657451 | https://github.com/simonw/datasette/issues/1771#issuecomment-1356657451 | https://api.github.com/repos/simonw/datasette/issues/1771 | IC_kwDOBm6k_c5Q3PMr | mustafa0x 1473102 | 2022-12-18T04:04:32Z | 2022-12-18T04:04:32Z | NONE | the problem is: ``` .select-wrapper select:focus { outline: none; } ``` I sometimes add this js: ``` window.addEventListener('keydown', function check_tab(e) { if (e.key === 'Tab') { document.documentElement.classList.add('user-is-tabbing') window.removeEventListener('keydown', check_tab) } }) ``` and then in the css, using a `html.user-is-tabbing` selector undo any outlines I removed. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | minor a11y: <select> has no visual indicator when tabbed to 1306984363 | |
1258803261 | https://github.com/simonw/datasette/pull/1820#issuecomment-1258803261 | https://api.github.com/repos/simonw/datasette/issues/1820 | IC_kwDOBm6k_c5LB9A9 | fgregg 536941 | 2022-09-27T00:03:09Z | 2022-09-27T00:03:09Z | CONTRIBUTOR | the pattern in this PR `max_returned_rows` control the maximum rows rendered through html and json, and the csv render bypasses that. i think it would be better to have each of these different query renderers have more direct control for how many rows to fetch, instead of relying on the internals of the `execute` method. generally, users will not want to paginate through tens of thousands of results, but often will want to download a full query as json or as csv. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | [SPIKE] Don't truncate query CSVs 1386456717 | |
1008164786 | https://github.com/simonw/sqlite-utils/issues/365#issuecomment-1008164786 | https://api.github.com/repos/simonw/sqlite-utils/issues/365 | IC_kwDOCGYnMM48F1-y | fgregg 536941 | 2022-01-08T22:24:19Z | 2022-01-08T22:24:19Z | CONTRIBUTOR | the out-of-date scenario you describe could be addressed by automatically adding an analyze to the insert or convert commands if they implicate an index | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | create-index should run analyze after creating index 1096558279 | |
1271035998 | https://github.com/simonw/datasette/issues/1301#issuecomment-1271035998 | https://api.github.com/repos/simonw/datasette/issues/1301 | IC_kwDOBm6k_c5Lwnhe | fgregg 536941 | 2022-10-07T02:38:04Z | 2022-10-07T02:38:04Z | CONTRIBUTOR | the only mode that `publish cloudrun` supports right now is immutable | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Publishing to cloudrun with immutable mode? 860722711 | |
993078038 | https://github.com/simonw/datasette/issues/526#issuecomment-993078038 | https://api.github.com/repos/simonw/datasette/issues/526 | IC_kwDOBm6k_c47MSsW | fgregg 536941 | 2021-12-14T01:46:52Z | 2021-12-14T01:46:52Z | CONTRIBUTOR | the nested query idea is very nice, and i stole if for [my client side paginator](https://observablehq.com/d/1d5da3a3c3f2f347#DatasetteClient). However, it won't do the right thing if the original query orders by random(). If you go the nested query route, maybe raise a 4XX status code if the query has such a clause? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Stream all results for arbitrary SQL and canned queries 459882902 | |
991754237 | https://github.com/simonw/datasette/issues/1549#issuecomment-991754237 | https://api.github.com/repos/simonw/datasette/issues/1549 | IC_kwDOBm6k_c47HPf9 | fgregg 536941 | 2021-12-11T19:14:39Z | 2021-12-11T19:14:39Z | CONTRIBUTOR | that option is not available on [custom queries](https://labordata.bunkum.us/odpr-962a140?sql=with+local_union_filings+as+%28%0D%0A++select+*+from+lm_data+%0D%0A++where%0D%0A++++yr_covered+%3E+cast%28strftime%28%27%25Y%27%2C+%27now%27%2C+%27-5+years%27%29+as+int%29%0D%0A++++and+desig_name+%3D+%27LU%27%0D%0A++order+by+yr_covered+desc%0D%0A%29%2C%0D%0Amost_recent_filing+as+%28%0D%0A++select%0D%0A++++*%0D%0A++from+local_union_filings%0D%0A++group+by%0D%0A++++f_num%0D%0A%29%0D%0Aselect%0D%0A++*%0D%0Afrom%0D%0A++most_recent_filing%0D%0Awhere%0D%0A++next_election+%3E%3D+strftime%28%27%25Y-%25m%27%2C+%27now%27%29%0D%0A++and+next_election+%3C+strftime%28%27%25Y-%25m%27%2C+%27now%27%2C+%27%2B1+year%27%29%0D%0Aorder+by%0D%0A++members+desc%3B). | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Redesign CSV export to improve usability 1077620955 | |
983890815 | https://github.com/simonw/datasette/issues/1519#issuecomment-983890815 | https://api.github.com/repos/simonw/datasette/issues/1519 | IC_kwDOBm6k_c46pPt_ | phubbard 157158 | 2021-12-01T17:50:09Z | 2021-12-01T17:50:09Z | NONE | thanks so very much for the prompt attention and fix! Plus, the animated GIF showing the bug is just extra and I love it. Interactions like this are why I love open source. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | base_url is omitted in JSON and CSV views 1058790545 | |
1009548580 | https://github.com/simonw/sqlite-utils/issues/365#issuecomment-1009548580 | https://api.github.com/repos/simonw/sqlite-utils/issues/365 | IC_kwDOCGYnMM48LH0k | fgregg 536941 | 2022-01-11T02:43:34Z | 2022-01-11T02:43:34Z | CONTRIBUTOR | thanks so much! always a pleasure to see how you work through these things | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | create-index should run analyze after creating index 1096558279 | |
915302885 | https://github.com/simonw/datasette/issues/1464#issuecomment-915302885 | https://api.github.com/repos/simonw/datasette/issues/1464 | IC_kwDOBm6k_c42jmnl | ctb 51016 | 2021-09-08T14:44:50Z | 2021-09-08T14:44:50Z | CONTRIBUTOR | thanks for the response! full errors attached; excerpt: ``` ... def test_searchmode(table_metadata, querystring, expected_rows): with make_app_client( metadata={"databases": {"fixtures": {"tables": {"searchable": table_metadata}}}} ) as client: response = client.get("/fixtures/searchable.json?" + querystring) > assert expected_rows == response.json["rows"] E AssertionError: assert [[1, 'barry c...sel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Use -v to get the full diff /Users/t/dev/datasette/tests/test_api.py:1115: AssertionError ``` [errors.txt](https://github.com/simonw/datasette/files/7129719/errors.txt) A quick scan of #1223 suggests you're right. Unfortunately, pysqlite3-binary isn't available for Mac OS X, so I can't quickly check that that fixes it; will do so later. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | clean checkout & clean environment has test failures 991191951 | |
623623696 | https://github.com/simonw/datasette/pull/725#issuecomment-623623696 | https://api.github.com/repos/simonw/datasette/issues/725 | MDEyOklzc3VlQ29tbWVudDYyMzYyMzY5Ng== | stonebig 4312421 | 2020-05-04T18:16:54Z | 2020-05-04T18:16:54Z | NONE | thanks a lot, Simon | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Update aiofiles requirement from ~=0.4.0 to >=0.4,<0.6 598891570 | |
1269847461 | https://github.com/simonw/datasette/issues/1480#issuecomment-1269847461 | https://api.github.com/repos/simonw/datasette/issues/1480 | IC_kwDOBm6k_c5LsFWl | fgregg 536941 | 2022-10-06T11:21:49Z | 2022-10-06T11:21:49Z | CONTRIBUTOR | thanks @simonw, i'll spend a little more time trying to figure out why this isn't working on cloudrun, and then will flip over to fly if i can't. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Exceeding Cloud Run memory limits when deploying a 4.8G database 1015646369 | |
1258871525 | https://github.com/simonw/datasette/issues/526#issuecomment-1258871525 | https://api.github.com/repos/simonw/datasette/issues/526 | IC_kwDOBm6k_c5LCNrl | fgregg 536941 | 2022-09-27T02:09:32Z | 2022-09-27T02:14:53Z | CONTRIBUTOR | thanks @simonw, i learned something i didn't know about sqlite's execution model! > Imagine if Datasette CSVs did allow unlimited retrievals. Someone could hit the CSV endpoint for that recursive query and tie up Datasette's SQL connection effectively forever. why wouldn't the `sqlite_timelimit` guard prevent that? --- on my local version which has the code to [turn off truncations for query csv](#1820), `sqlite_timelimit` does protect me. ![Screenshot 2022-09-26 at 22-14-31 Error 500](https://user-images.githubusercontent.com/536941/192415680-94b32b7f-868f-4b89-8194-5752d45f6009.png) | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Stream all results for arbitrary SQL and canned queries 459882902 | |
1214437408 | https://github.com/simonw/datasette/issues/1779#issuecomment-1214437408 | https://api.github.com/repos/simonw/datasette/issues/1779 | IC_kwDOBm6k_c5IYtgg | fgregg 536941 | 2022-08-14T19:42:58Z | 2022-08-14T19:42:58Z | CONTRIBUTOR | thanks @simonw! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | google cloudrun updated their limits on maxscale based on memory and cpu count 1334628400 | |
1105464661 | https://github.com/simonw/datasette/pull/1574#issuecomment-1105464661 | https://api.github.com/repos/simonw/datasette/issues/1574 | IC_kwDOBm6k_c5B5A1V | dholth 208018 | 2022-04-21T16:51:24Z | 2022-04-21T16:51:24Z | NONE | tfw you have more ephemeral storage than upstream bandwidth ``` FROM python:3.10-slim AS base RUN apt update && apt -y install zstd ENV DATASETTE_SECRET 'sosecret' RUN --mount=type=cache,target=/root/.cache/pip pip install -U datasette datasette-pretty-json datasette-graphql ENV PORT 8080 EXPOSE 8080 FROM base AS pack COPY . /app WORKDIR /app RUN datasette inspect --inspect-file inspect-data.json RUN zstd --rm *.db FROM base AS unpack COPY --from=pack /app /app WORKDIR /app CMD ["/bin/bash", "-c", "shopt -s nullglob && zstd --rm -d *.db.zst && datasette serve --host 0.0.0.0 --cors --inspect-file inspect-data.json --metadata metadata.json --create --port $PORT *.db"] ``` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | introduce new option for datasette package to use a slim base image 1084193403 | |
1271101072 | https://github.com/simonw/datasette/issues/1480#issuecomment-1271101072 | https://api.github.com/repos/simonw/datasette/issues/1480 | IC_kwDOBm6k_c5Lw3aQ | fgregg 536941 | 2022-10-07T04:39:10Z | 2022-10-07T04:39:10Z | CONTRIBUTOR | switching from `immutable=1` to `mode=ro` completely addressed this. see https://github.com/simonw/datasette/issues/1836#issuecomment-1271100651 for details. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Exceeding Cloud Run memory limits when deploying a 4.8G database 1015646369 | |
1592110694 | https://github.com/simonw/sqlite-utils/issues/529#issuecomment-1592110694 | https://api.github.com/repos/simonw/sqlite-utils/issues/529 | IC_kwDOCGYnMM5e5a5m | chapmanjacobd 7908073 | 2023-06-14T23:11:47Z | 2023-06-14T23:12:12Z | CONTRIBUTOR | sorry i was wrong. `sqlite-utils --raw-lines` works correctly ``` sqlite-utils --raw-lines :memory: "SELECT * FROM (VALUES ('test'), ('line2'))" | cat -A test$ line2$ sqlite-utils --csv --no-headers :memory: "SELECT * FROM (VALUES ('test'), ('line2'))" | cat -A test$ line2$ ``` I think this was fixed somewhat recently | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Microsoft line endings 1581090327 | |
1296076803 | https://github.com/simonw/datasette/issues/1872#issuecomment-1296076803 | https://api.github.com/repos/simonw/datasette/issues/1872 | IC_kwDOBm6k_c5NQJAD | mroswell 192568 | 2022-10-30T02:50:34Z | 2022-10-30T02:50:34Z | CONTRIBUTOR | should this issue be under https://github.com/simonw/datasette-publish-vercel/issues ? Perhaps I just need to update: datasette-publish-vercel==0.11 in requirements.txt? I'll try that and see what happens... | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | SITE-BUSTING ERROR: "render_template() called before await ds.invoke_startup()" 1428560020 | |
858813675 | https://github.com/simonw/datasette/issues/858#issuecomment-858813675 | https://api.github.com/repos/simonw/datasette/issues/858 | MDEyOklzc3VlQ29tbWVudDg1ODgxMzY3NQ== | rachelll4 56045588 | 2021-06-10T17:27:46Z | 2021-06-10T17:27:46Z | NONE | shell=True is added to line 56 (I guess it used to be 54) of heroku.py as detailed in the original issue. (for posterity) | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | publish heroku does not work on Windows 10 642388564 | |
1404065571 | https://github.com/simonw/datasette/pull/2003#issuecomment-1404065571 | https://api.github.com/repos/simonw/datasette/issues/2003 | IC_kwDOBm6k_c5TsFcj | fgregg 536941 | 2023-01-25T18:44:42Z | 2023-01-25T18:44:42Z | CONTRIBUTOR | see this related discussion to a change in API in sqlite-utils https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567932 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Show referring tables and rows when the referring foreign key is compound 1555701851 | |
1268613335 | https://github.com/simonw/datasette/issues/1480#issuecomment-1268613335 | https://api.github.com/repos/simonw/datasette/issues/1480 | IC_kwDOBm6k_c5LnYDX | fgregg 536941 | 2022-10-05T15:45:49Z | 2022-10-05T15:45:49Z | CONTRIBUTOR | running into this as i continue to grow my labor data warehouse. Here a CloudRun PM says the container size should **not** count against memory: https://stackoverflow.com/a/56570717 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Exceeding Cloud Run memory limits when deploying a 4.8G database 1015646369 | |
765502845 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765502845 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTUwMjg0NQ== | cobiadigital 25372415 | 2021-01-22T15:53:19Z | 2021-01-22T15:53:19Z | NONE | rs7903146 Influences risk of Type-2 diabetes https://www.snpedia.com/index.php/Rs7903146 ``` select rsid, genotype, case genotype when 'CC' then 'Normal (lower) risk of Type 2 Diabetes and Gestational Diabetes.' when 'CT' then '1.4x increased risk for diabetes (and perhaps colon cancer).' when 'TT' then '2x increased risk for Type-2 diabetes' end as interpretation from genome where rsid = 'rs7903146' ``` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Figure out some interesting example SQL queries 496415321 | |
765523517 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765523517 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTUyMzUxNw== | cobiadigital 25372415 | 2021-01-22T16:20:25Z | 2021-01-22T16:20:25Z | NONE | rs53576: the oxytocin receptor (OXTR) gene ``` select rsid, genotype, case genotype when 'AA' then 'Lack of empathy?' when 'AG' then 'Lack of empathy?' when 'GG' then 'Optimistic and empathetic; handle stress well' end as interpretation from genome where rsid = 'rs53576' ``` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Figure out some interesting example SQL queries 496415321 | |
765525338 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765525338 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTUyNTMzOA== | cobiadigital 25372415 | 2021-01-22T16:22:44Z | 2021-01-22T16:22:44Z | NONE | rs1333049 associated with coronary artery disease https://www.snpedia.com/index.php/Rs1333049 ``` select rsid, genotype, case genotype when 'CC' then '1.9x increased risk for coronary artery disease' when 'CG' then '1.5x increased risk for CAD' when 'GG' then 'normal' end as interpretation from genome where rsid = 'rs1333049' ``` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Figure out some interesting example SQL queries 496415321 | |
704503719 | https://github.com/dogsheep/github-to-sqlite/pull/48#issuecomment-704503719 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/48 | MDEyOklzc3VlQ29tbWVudDcwNDUwMzcxOQ== | adamjonas 755825 | 2020-10-06T19:26:59Z | 2020-10-06T19:26:59Z | CONTRIBUTOR | ref #46 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add pull requests 681228542 | |
688481374 | https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688481374 | https://api.github.com/repos/simonw/sqlite-utils/issues/149 | MDEyOklzc3VlQ29tbWVudDY4ODQ4MTM3NA== | simonw 9599 | 2020-09-07T19:19:08Z | 2020-09-07T19:19:08Z | OWNER | reading through the code for `github-to-sqlite repos` - one of the things it does is calls `save_license` for each repo: https://github.com/dogsheep/github-to-sqlite/blob/39b2234253096bd579feed4e25104698b8ccd2ba/github_to_sqlite/utils.py#L259-L262 ```python def save_license(db, license): if license is None: return None return db["licenses"].insert(license, pk="key", replace=True).last_pk ``` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | FTS table with 7 rows has _fts_docsize table with 9,141 rows 695319258 | |
1258167564 | https://github.com/simonw/datasette/issues/526#issuecomment-1258167564 | https://api.github.com/repos/simonw/datasette/issues/526 | IC_kwDOBm6k_c5K_h0M | fgregg 536941 | 2022-09-26T14:57:44Z | 2022-09-26T15:08:36Z | CONTRIBUTOR | reading the database execute method i have a few questions. https://github.com/simonw/datasette/blob/cb1e093fd361b758120aefc1a444df02462389a3/datasette/database.py#L229-L242 --- unless i'm missing something (which is very likely!!), the `max_returned_rows` argument doesn't actually offer any protections against running very expensive queries. It's not like adding a `LIMIT max_rows` argument. it make sense that it isn't because, the query could already have an `LIMIT` argument. Doing something like `select * from (query) limit {max_returned_rows}` **might** be protective but wouldn't always. Instead the code executes the full original query, and if still has time it fetches out the first `max_rows + 1` rows. this *does* offer some protection of memory exhaustion, as you won't hydrate a huge result set into python (however, there are [data flow patterns](https://github.com/simonw/datasette/issues/1727#issuecomment-1258129113) that could avoid that too) given the current architecture, i don't see how creating a new connection would be use? --- If we just removed the `max_return_rows` limitation, then i think most things would be fine **except** for the QueryViews. Right now rendering, just [5000 rows takes a lot of client-side memory](https://github.com/simonw/datasette/issues/1655) so some form of pagination would be required. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Stream all results for arbitrary SQL and canned queries 459882902 | |
712397537 | https://github.com/simonw/datasette/issues/1032#issuecomment-712397537 | https://api.github.com/repos/simonw/datasette/issues/1032 | MDEyOklzc3VlQ29tbWVudDcxMjM5NzUzNw== | saulpw 236498 | 2020-10-19T19:37:55Z | 2020-10-19T19:37:55Z | NONE | python-dateutil is awesome, but it can only guess at one date at a time. So if you have a column of dates that are (presumably) in the same format, it can't use the full set of dates to deduce the format. Also, once it has parsed a date, you can't get the format it used, whether to parse or render other dates. These limitations prevent it from being a silver bullet for date parsing, though they're not enough for me to stop using it! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Bring date parsing into Datasette core 724878151 | |
582103280 | https://github.com/simonw/datasette/issues/665#issuecomment-582103280 | https://api.github.com/repos/simonw/datasette/issues/665 | MDEyOklzc3VlQ29tbWVudDU4MjEwMzI4MA== | simonw 9599 | 2020-02-04T20:36:48Z | 2020-02-04T20:36:48Z | OWNER | pyparsing has an example based on SQLite SELECT statements: https://github.com/pyparsing/pyparsing/blob/8d9ab59a2b2767ad56c9b852c325075113718c0a/examples/select_parser.py https://github.com/lark-parser/lark is a relatively new (less than two years old) parsing library that looks promising too. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Introduce a SQL statement parser in Python 559964149 | |
1592052320 | https://github.com/simonw/sqlite-utils/issues/535#issuecomment-1592052320 | https://api.github.com/repos/simonw/sqlite-utils/issues/535 | IC_kwDOCGYnMM5e5Mpg | chapmanjacobd 7908073 | 2023-06-14T22:05:28Z | 2023-06-14T22:05:28Z | CONTRIBUTOR | piping to `jq` is good enough usually | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | rows: --transpose or psql extended view-like functionality 1655860104 | |
1538975545 | https://github.com/simonw/sqlite-utils/issues/538#issuecomment-1538975545 | https://api.github.com/repos/simonw/sqlite-utils/issues/538 | IC_kwDOCGYnMM5buuc5 | xavdid 1231935 | 2023-05-08T20:06:35Z | 2023-05-08T20:06:35Z | NONE | perfect, thank you! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | `table.upsert_all` fails to write rows when `not_null` is present 1695428235 | |
884910320 | https://github.com/simonw/datasette/issues/1401#issuecomment-884910320 | https://api.github.com/repos/simonw/datasette/issues/1401 | IC_kwDOBm6k_c40vqjw | fgregg 536941 | 2021-07-22T13:26:01Z | 2021-07-22T13:26:01Z | CONTRIBUTOR | ordered lists didn't work either, btw | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | unordered list is not rendering bullet points in description_html on database page 950664971 | |
1008166084 | https://github.com/simonw/sqlite-utils/issues/365#issuecomment-1008166084 | https://api.github.com/repos/simonw/sqlite-utils/issues/365 | IC_kwDOCGYnMM48F2TE | fgregg 536941 | 2022-01-08T22:32:47Z | 2022-01-08T22:32:47Z | CONTRIBUTOR | or using “ pragma optimize” | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | create-index should run analyze after creating index 1096558279 | |
641603457 | https://github.com/simonw/datasette/issues/806#issuecomment-641603457 | https://api.github.com/repos/simonw/datasette/issues/806 | MDEyOklzc3VlQ29tbWVudDY0MTYwMzQ1Nw== | simonw 9599 | 2020-06-09T21:57:32Z | 2020-06-09T21:57:32Z | OWNER | operation, procedure, process as alternative words for those menu items? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Release Datasette 0.44 632753851 | |
915279711 | https://github.com/simonw/datasette/issues/1464#issuecomment-915279711 | https://api.github.com/repos/simonw/datasette/issues/1464 | IC_kwDOBm6k_c42jg9f | ctb 51016 | 2021-09-08T14:16:49Z | 2021-09-08T14:16:49Z | CONTRIBUTOR | on commit d57ab156b35ec642 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | clean checkout & clean environment has test failures 991191951 | |
1270988081 | https://github.com/simonw/datasette/issues/1836#issuecomment-1270988081 | https://api.github.com/repos/simonw/datasette/issues/1836 | IC_kwDOBm6k_c5Lwb0x | fgregg 536941 | 2022-10-07T01:19:01Z | 2022-10-07T01:27:35Z | CONTRIBUTOR | okay, some progress!! running some sql against a database file causes that file to get duplicated even if it doesn't apparently change the file. make a little test script like this: ```python # test_sql.py import sqlite3 import sys db_name = sys.argv[1] conn = sqlite3.connect(f'file:/app/{db_name}', uri=True) cur = conn.cursor() cur.execute('select count(*) from filing') print(cur.fetchone()) ``` then ```docker RUN python test_sql.py nlrb.db ``` produced a layer that's the same size as `nlrb.db`!! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | docker image is duplicating db files somehow 1400374908 | |
954384496 | https://github.com/simonw/datasette/pull/1495#issuecomment-954384496 | https://api.github.com/repos/simonw/datasette/issues/1495 | IC_kwDOBm6k_c444sBw | fgregg 536941 | 2021-10-29T03:07:13Z | 2021-10-29T03:07:13Z | CONTRIBUTOR | okay @simonw, made the requested changes. tests are running locally. i think this is ready for you to look at again. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Allow routes to have extra options 1033678984 | |
1264223554 | https://github.com/simonw/sqlite-utils/issues/409#issuecomment-1264223554 | https://api.github.com/repos/simonw/sqlite-utils/issues/409 | IC_kwDOCGYnMM5LWoVC | chapmanjacobd 7908073 | 2022-10-01T03:42:50Z | 2022-10-01T03:42:50Z | CONTRIBUTOR | oh weird. it inserts into db2 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | `with db:` for transactions 1149661489 | |
1250901367 | https://github.com/simonw/datasette/issues/1813#issuecomment-1250901367 | https://api.github.com/repos/simonw/datasette/issues/1813 | IC_kwDOBm6k_c5Kjz13 | adipasquale 883348 | 2022-09-19T11:34:45Z | 2022-09-19T11:34:45Z | CONTRIBUTOR | oh and by writing this I just realized the difference: the URL on fly.io is with a custom SQL command whereas the local one is without. It seems that there is no pagination when using custom SQL commands which makes sense Sorry for this useless issue, maybe this can be useful for someone else / me in the future. Thanks again for this wonderful project ! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | missing next and next_url in JSON responses from an instance deployed on Fly 1377811868 | |
1274153135 | https://github.com/simonw/sqlite-utils/pull/498#issuecomment-1274153135 | https://api.github.com/repos/simonw/sqlite-utils/issues/498 | IC_kwDOCGYnMM5L8giv | chapmanjacobd 7908073 | 2022-10-11T06:34:31Z | 2022-10-11T06:34:31Z | CONTRIBUTOR | nevermind it was because I was running `db[table].transform`. The fts tables would still be there but the triggers would be dropped | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | fix: enable-fts permanently save triggers 1404013495 | |
1049879118 | https://github.com/simonw/datasette/issues/1641#issuecomment-1049879118 | https://api.github.com/repos/simonw/datasette/issues/1641 | IC_kwDOBm6k_c4-k-JO | fgregg 536941 | 2022-02-24T13:49:26Z | 2022-02-24T13:49:26Z | CONTRIBUTOR | maybe worth considering adding buttons for paren, asterisk, etc. under the input text box on mobile? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Tweak mobile keyboard settings 1149310456 | |
541324637 | https://github.com/simonw/datasette/issues/593#issuecomment-541324637 | https://api.github.com/repos/simonw/datasette/issues/593 | MDEyOklzc3VlQ29tbWVudDU0MTMyNDYzNw== | stonebig 4312421 | 2019-10-12T13:22:50Z | 2019-10-12T13:22:50Z | NONE | maybe situation is to change ? I see this in uvicorn https://github.com/encode/uvicorn/pull/423 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | make uvicorn optional dependancy (because not ok on windows python yet) 506183241 | |
1210675046 | https://github.com/simonw/datasette/issues/1779#issuecomment-1210675046 | https://api.github.com/repos/simonw/datasette/issues/1779 | IC_kwDOBm6k_c5IKW9m | fgregg 536941 | 2022-08-10T13:28:37Z | 2022-08-10T13:28:37Z | CONTRIBUTOR | maybe a simpler solution is to set the maxscale to like 2? since datasette is not set up to make use of container scaling anyway? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | google cloudrun updated their limits on maxscale based on memory and cpu count 1334628400 | |
626006493 | https://github.com/simonw/datasette/issues/619#issuecomment-626006493 | https://api.github.com/repos/simonw/datasette/issues/619 | MDEyOklzc3VlQ29tbWVudDYyNjAwNjQ5Mw== | davidszotten 412005 | 2020-05-08T20:29:12Z | 2020-05-08T20:29:12Z | NONE | just trying out datasette and quite like it, thanks! i found this issue annoying enough to have a go at a fix. have you any thoughts on a good approach? (i'm happy to dig in myself if you haven't thought about it yet, but wanted to check if you had an idea for how to fix when you raised the issue) | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | "Invalid SQL" page should let you edit the SQL 520655983 | |
1189010812 | https://github.com/simonw/sqlite-utils/issues/423#issuecomment-1189010812 | https://api.github.com/repos/simonw/sqlite-utils/issues/423 | IC_kwDOCGYnMM5G3t18 | fgregg 536941 | 2022-07-19T12:47:39Z | 2022-07-19T12:47:39Z | CONTRIBUTOR | just ran into this! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .extract() doesn't set foreign key when extracted columns contain NULL value 1199158210 | |
1001222213 | https://github.com/dogsheep/twitter-to-sqlite/issues/62#issuecomment-1001222213 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62 | IC_kwDODEm0Qs47rXBF | swyxio 6764957 | 2021-12-26T17:59:25Z | 2021-12-26T17:59:25Z | NONE | just confirmed that this error does not occur when i use my public main account. gets more interesting! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | KeyError: 'created_at' for private accounts? 1088816961 | |
992971072 | https://github.com/simonw/datasette/issues/526#issuecomment-992971072 | https://api.github.com/repos/simonw/datasette/issues/526 | IC_kwDOBm6k_c47L4lA | fgregg 536941 | 2021-12-13T22:29:34Z | 2021-12-13T22:29:34Z | CONTRIBUTOR | just came by to open this issue. would make my data analysis in observable a lot better! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Stream all results for arbitrary SQL and canned queries 459882902 | |
1270936982 | https://github.com/simonw/datasette/issues/1836#issuecomment-1270936982 | https://api.github.com/repos/simonw/datasette/issues/1836 | IC_kwDOBm6k_c5LwPWW | fgregg 536941 | 2022-10-07T00:52:41Z | 2022-10-07T00:52:41Z | CONTRIBUTOR | it's not that the inspect command is somehow changing the db files. if i set them to only read-only, the "inspect" layer still has the same very large size. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | docker image is duplicating db files somehow 1400374908 | |
861035862 | https://github.com/dogsheep/github-to-sqlite/issues/64#issuecomment-861035862 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/64 | MDEyOklzc3VlQ29tbWVudDg2MTAzNTg2Mg== | khimaros 231498 | 2021-06-14T22:29:20Z | 2021-06-14T22:29:20Z | NONE | it looks like the v4 GraphQL API is the only way to get data beyond 90 days from GitHub. this is significant change, but may be worth considering in the future. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | feature: support "events" 920636216 | |
747130908 | https://github.com/dogsheep/google-takeout-to-sqlite/issues/2#issuecomment-747130908 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/2 | MDEyOklzc3VlQ29tbWVudDc0NzEzMDkwOA== | khimaros 231498 | 2020-12-17T00:47:04Z | 2020-12-17T00:47:43Z | NONE | it looks like almost all of the memory consumption is coming from `json.load()`. another direction here may be to use the new "Semantic Location History" data which is already broken down by year and month. it also provides much more interesting data, such as estimated address, form of travel, etc. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | killed by oomkiller on large location-history 769376447 | |
463917744 | https://github.com/simonw/datasette/issues/187#issuecomment-463917744 | https://api.github.com/repos/simonw/datasette/issues/187 | MDEyOklzc3VlQ29tbWVudDQ2MzkxNzc0NA== | phoenixjun 4190962 | 2019-02-15T05:58:44Z | 2019-02-15T05:58:44Z | NONE | is this supported or not? you can comment if it is not supported so that people like me can stop trying. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Windows installation error 309033998 | |
1364345119 | https://github.com/simonw/datasette/issues/1614#issuecomment-1364345119 | https://api.github.com/repos/simonw/datasette/issues/1614 | IC_kwDOBm6k_c5RUkEf | fgregg 536941 | 2022-12-23T21:27:10Z | 2022-12-23T21:27:10Z | CONTRIBUTOR | is this issue closed by #1893? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Try again with SQLite codemirror support 1115435536 | |
1509951952 | https://github.com/simonw/sqlite-utils/issues/425#issuecomment-1509951952 | https://api.github.com/repos/simonw/sqlite-utils/issues/425 | IC_kwDOCGYnMM5aAAnQ | Dhyanesh97 89400147 | 2023-04-15T20:14:58Z | 2023-04-15T20:14:58Z | NONE | is this change released ? Because when we run docker containers issue still persists for production deployments. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | `sqlite3.NotSupportedError`: deterministic=True requires SQLite 3.8.3 or higher 1203842656 | |
609393513 | https://github.com/simonw/datasette/pull/627#issuecomment-609393513 | https://api.github.com/repos/simonw/datasette/issues/627 | MDEyOklzc3VlQ29tbWVudDYwOTM5MzUxMw== | stonebig 4312421 | 2020-04-05T10:23:57Z | 2020-04-05T10:23:57Z | NONE | is there any specific reason to stick to Jinja2~=2.10.3 when there is Jinja-2.11.1 ? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Support Python 3.8, stop supporting Python 3.5 521323012 | |
1007844190 | https://github.com/simonw/datasette/pull/1574#issuecomment-1007844190 | https://api.github.com/repos/simonw/datasette/issues/1574 | IC_kwDOBm6k_c48Ente | fgregg 536941 | 2022-01-08T00:42:12Z | 2022-01-08T00:42:12Z | CONTRIBUTOR | is there a reason to not always use the slim option? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | introduce new option for datasette package to use a slim base image 1084193403 | |
997128712 | https://github.com/simonw/datasette/issues/1561#issuecomment-997128712 | https://api.github.com/repos/simonw/datasette/issues/1561 | IC_kwDOBm6k_c47bvoI | fgregg 536941 | 2021-12-18T02:35:48Z | 2021-12-18T02:35:48Z | CONTRIBUTOR | interesting! i love this feature. this + full caching with cloudflare is really super! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | add hash id to "_memory" url if hashed url mode is turned on and crossdb is also turned on 1082765654 | |
1260909128 | https://github.com/simonw/datasette/issues/1062#issuecomment-1260909128 | https://api.github.com/repos/simonw/datasette/issues/1062 | IC_kwDOBm6k_c5LJ_JI | fgregg 536941 | 2022-09-28T13:22:53Z | 2022-09-28T14:09:54Z | CONTRIBUTOR | if you went this route: ```python with sqlite_timelimit(conn, time_limit_ms): c.execute(query) for chunk in c.fetchmany(chunk_size): yield from chunk ``` then `time_limit_ms` would probably have to be greatly extended, because the time spent in the loop will depend on the downstream processing. i wonder if this was why you were thinking this feature would need a dedicated connection? --- reading more, there's no real limit i can find on the number of active cursors (or more precisely active prepared statements objects, because sqlite doesn't really have cursors). maybe something like this would be okay? ```python with sqlite_timelimit(conn, time_limit_ms): c.execute(query) # step through at least one to evaluate the statement, not sure if this is necessary yield c.execute.fetchone() for chunk in c.fetchmany(chunk_size): yield from chunk ``` this seems quite weird that there's not more of limit of the number of active prepared statements, but i haven't been able to find one. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Refactor .csv to be an output renderer - and teach register_output_renderer to stream all rows 732674148 | |
1032120014 | https://github.com/simonw/sqlite-utils/issues/26#issuecomment-1032120014 | https://api.github.com/repos/simonw/sqlite-utils/issues/26 | IC_kwDOCGYnMM49hObO | fgregg 536941 | 2022-02-08T01:32:34Z | 2022-02-08T01:32:34Z | CONTRIBUTOR | if you are curious about prior art, https://github.com/jsnell/json-to-multicsv is really good! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Mechanism for turning nested JSON into foreign keys / many-to-many 455486286 | |
1002825217 | https://github.com/simonw/datasette/issues/1583#issuecomment-1002825217 | https://api.github.com/repos/simonw/datasette/issues/1583 | IC_kwDOBm6k_c47xeYB | fgregg 536941 | 2021-12-30T00:34:16Z | 2021-12-30T00:34:16Z | CONTRIBUTOR | if that is not desirable, it might be good to document that users might want to set up a lifecycle rule to automatically delete these build artifacts. something like https://stackoverflow.com/questions/59937542/can-i-delete-container-images-from-google-cloud-storage-artifacts-bucket | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | consider adding deletion step of cloudbuild artifacts to gcloud publish 1090810196 | |
949604763 | https://github.com/simonw/datasette/issues/1284#issuecomment-949604763 | https://api.github.com/repos/simonw/datasette/issues/1284 | IC_kwDOBm6k_c44mdGb | fgregg 536941 | 2021-10-22T12:54:34Z | 2021-10-22T12:54:34Z | CONTRIBUTOR | i'm going to take a swing at this today. we'll see. | {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Feature or Documentation Request: Individual table as home page template 845794436 | |
1353721442 | https://github.com/simonw/datasette/issues/1619#issuecomment-1353721442 | https://api.github.com/repos/simonw/datasette/issues/1619 | IC_kwDOBm6k_c5QsCZi | noslouch 2090382 | 2022-12-15T21:20:53Z | 2022-12-15T21:20:53Z | NONE | i'm also getting bit by this. I'm trying to set up an nginx reverse proxy in front of multiple datasette backends. When I run it locally or behind the proxy, I see the `base_url` value added a second time to the path for various action links on table pages (view as JSON, sort by column, etc). | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | JSON link on row page is 404 if base_url setting is used 1121583414 | |
1404070841 | https://github.com/simonw/sqlite-utils/pull/203#issuecomment-1404070841 | https://api.github.com/repos/simonw/sqlite-utils/issues/203 | IC_kwDOCGYnMM5TsGu5 | fgregg 536941 | 2023-01-25T18:47:18Z | 2023-01-25T18:47:18Z | CONTRIBUTOR | i'll adopt this PR to make the changes @simonw suggested https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567932 | {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | changes to allow for compound foreign keys 743384829 | |
1321241426 | https://github.com/simonw/datasette/issues/1886#issuecomment-1321241426 | https://api.github.com/repos/simonw/datasette/issues/1886 | IC_kwDOBm6k_c5OwItS | fgregg 536941 | 2022-11-20T20:58:54Z | 2022-11-20T20:58:54Z | CONTRIBUTOR | i wrote up a blog post of how i'm using it! https://bunkum.us/2022/11/20/mgdo-stack.html | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Call for birthday presents: if you're using Datasette, let us know how you're using it here 1447050738 | |
344430299 | https://github.com/simonw/datasette/issues/93#issuecomment-344430299 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQzMDI5OQ== | atomotic 67420 | 2017-11-14T23:06:33Z | 2017-11-14T23:06:33Z | NONE | i will look better tomorrow, it's late i surely made some mistake https://asciinema.org/a/ZyAWbetrlriDadwWyVPUWB94H | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
964205475 | https://github.com/simonw/sqlite-utils/issues/26#issuecomment-964205475 | https://api.github.com/repos/simonw/sqlite-utils/issues/26 | IC_kwDOCGYnMM45eJuj | fgregg 536941 | 2021-11-09T14:31:29Z | 2021-11-09T14:31:29Z | CONTRIBUTOR | i was just reaching for a tool to do this this morning | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Mechanism for turning nested JSON into foreign keys / many-to-many 455486286 |
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
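A minimal sketch of working with this schema using Python's built-in sqlite3 module. It builds the table in an in-memory database and runs the same "sorted by body descending" query this page shows; the inserted row values are illustrative placeholders, not real records, and SQLite's foreign key enforcement is off by default so the REFERENCES clauses are not checked here.

```python
import sqlite3

# Create the issue_comments table from the schema above in an
# in-memory database (swap ":memory:" for a real database file).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
""")

# Insert one sample row (values are placeholders for illustration).
conn.execute(
    "INSERT INTO issue_comments (id, user, created_at, body, issue)"
    " VALUES (?, ?, ?, ?, ?)",
    (1, 9599, "2020-06-09T21:57:32Z", "example comment", 632753851),
)

# Comments sorted by body descending, mirroring this page's sort order.
rows = conn.execute(
    "SELECT id, body FROM issue_comments ORDER BY body DESC LIMIT 5"
).fetchall()
print(rows)
```

The same query is what Datasette's `?_sort_desc=body` parameter generates against the live table.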