From c5791156d92615f25696ba93dae5bb2dcc192c98 Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Mon, 7 Mar 2022 14:04:10 -0800
Subject: [PATCH 001/952] Code of conduct, refs #1654

---
 CODE_OF_CONDUCT.md | 128 +++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 128 insertions(+)
 create mode 100644 CODE_OF_CONDUCT.md

diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md
new file mode 100644
index 00000000..14d4c567
--- /dev/null
+++ b/CODE_OF_CONDUCT.md
@@ -0,0 +1,128 @@
+# Contributor Covenant Code of Conduct
+
+## Our Pledge
+
+We as members, contributors, and leaders pledge to make participation in our
+community a harassment-free experience for everyone, regardless of age, body
+size, visible or invisible disability, ethnicity, sex characteristics, gender
+identity and expression, level of experience, education, socio-economic status,
+nationality, personal appearance, race, religion, or sexual identity
+and orientation.
+
+We pledge to act and interact in ways that contribute to an open, welcoming,
+diverse, inclusive, and healthy community.
+
+## Our Standards
+
+Examples of behavior that contributes to a positive environment for our
+community include:
+
+* Demonstrating empathy and kindness toward other people
+* Being respectful of differing opinions, viewpoints, and experiences
+* Giving and gracefully accepting constructive feedback
+* Accepting responsibility and apologizing to those affected by our mistakes,
+  and learning from the experience
+* Focusing on what is best not just for us as individuals, but for the
+  overall community
+
+Examples of unacceptable behavior include:
+
+* The use of sexualized language or imagery, and sexual attention or
+  advances of any kind
+* Trolling, insulting or derogatory comments, and personal or political attacks
+* Public or private harassment
+* Publishing others' private information, such as a physical or email
+  address, without their explicit permission
+* Other conduct which could reasonably be considered inappropriate in a
+  professional setting
+
+## Enforcement Responsibilities
+
+Community leaders are responsible for clarifying and enforcing our standards of
+acceptable behavior and will take appropriate and fair corrective action in
+response to any behavior that they deem inappropriate, threatening, offensive,
+or harmful.
+
+Community leaders have the right and responsibility to remove, edit, or reject
+comments, commits, code, wiki edits, issues, and other contributions that are
+not aligned to this Code of Conduct, and will communicate reasons for moderation
+decisions when appropriate.
+
+## Scope
+
+This Code of Conduct applies within all community spaces, and also applies when
+an individual is officially representing the community in public spaces.
+Examples of representing our community include using an official e-mail address,
+posting via an official social media account, or acting as an appointed
+representative at an online or offline event.
+
+## Enforcement
+
+Instances of abusive, harassing, or otherwise unacceptable behavior may be
+reported to the community leaders responsible for enforcement at
+`swillison+datasette-code-of-conduct@gmail.com`.
+All complaints will be reviewed and investigated promptly and fairly.
+
+All community leaders are obligated to respect the privacy and security of the
+reporter of any incident.
+
+## Enforcement Guidelines
+
+Community leaders will follow these Community Impact Guidelines in determining
+the consequences for any action they deem in violation of this Code of Conduct:
+
+### 1. Correction
+
+**Community Impact**: Use of inappropriate language or other behavior deemed
+unprofessional or unwelcome in the community.
+
+**Consequence**: A private, written warning from community leaders, providing
+clarity around the nature of the violation and an explanation of why the
+behavior was inappropriate. A public apology may be requested.
+
+### 2. Warning
+
+**Community Impact**: A violation through a single incident or series
+of actions.
+
+**Consequence**: A warning with consequences for continued behavior. No
+interaction with the people involved, including unsolicited interaction with
+those enforcing the Code of Conduct, for a specified period of time. This
+includes avoiding interactions in community spaces as well as external channels
+like social media. Violating these terms may lead to a temporary or
+permanent ban.
+
+### 3. Temporary Ban
+
+**Community Impact**: A serious violation of community standards, including
+sustained inappropriate behavior.
+
+**Consequence**: A temporary ban from any sort of interaction or public
+communication with the community for a specified period of time. No public or
+private interaction with the people involved, including unsolicited interaction
+with those enforcing the Code of Conduct, is allowed during this period.
+Violating these terms may lead to a permanent ban.
+
+### 4. Permanent Ban
+
+**Community Impact**: Demonstrating a pattern of violation of community
+standards, including sustained inappropriate behavior, harassment of an
+individual, or aggression toward or disparagement of classes of individuals.
+
+**Consequence**: A permanent ban from any sort of public interaction within
+the community.
+
+## Attribution
+
+This Code of Conduct is adapted from the [Contributor Covenant][homepage],
+version 2.0, available at
+https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
+
+Community Impact Guidelines were inspired by [Mozilla's code of conduct
+enforcement ladder](https://github.com/mozilla/diversity).
+
+[homepage]: https://www.contributor-covenant.org
+
+For answers to common questions about this code of conduct, see the FAQ at
+https://www.contributor-covenant.org/faq. Translations are available at
+https://www.contributor-covenant.org/translations.

From 239aed182053903ed69108776b6864d42bfe1eb4 Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Tue, 15 Mar 2022 08:36:35 -0700
Subject: [PATCH 002/952] Revert "Code of conduct, refs #1654"

This reverts commit c5791156d92615f25696ba93dae5bb2dcc192c98.

Refs #1658
---
 CODE_OF_CONDUCT.md | 128 ---------------------------------------------
 1 file changed, 128 deletions(-)
 delete mode 100644 CODE_OF_CONDUCT.md

diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md
deleted file mode 100644
index 14d4c567..00000000
--- a/CODE_OF_CONDUCT.md
+++ /dev/null
@@ -1,128 +0,0 @@
-# Contributor Covenant Code of Conduct
-
-## Our Pledge
-
-We as members, contributors, and leaders pledge to make participation in our
-community a harassment-free experience for everyone, regardless of age, body
-size, visible or invisible disability, ethnicity, sex characteristics, gender
-identity and expression, level of experience, education, socio-economic status,
-nationality, personal appearance, race, religion, or sexual identity
-and orientation.
-
-We pledge to act and interact in ways that contribute to an open, welcoming,
-diverse, inclusive, and healthy community.
-
-## Our Standards
-
-Examples of behavior that contributes to a positive environment for our
-community include:
-
-* Demonstrating empathy and kindness toward other people
-* Being respectful of differing opinions, viewpoints, and experiences
-* Giving and gracefully accepting constructive feedback
-* Accepting responsibility and apologizing to those affected by our mistakes,
-  and learning from the experience
-* Focusing on what is best not just for us as individuals, but for the
-  overall community
-
-Examples of unacceptable behavior include:
-
-* The use of sexualized language or imagery, and sexual attention or
-  advances of any kind
-* Trolling, insulting or derogatory comments, and personal or political attacks
-* Public or private harassment
-* Publishing others' private information, such as a physical or email
-  address, without their explicit permission
-* Other conduct which could reasonably be considered inappropriate in a
-  professional setting
-
-## Enforcement Responsibilities
-
-Community leaders are responsible for clarifying and enforcing our standards of
-acceptable behavior and will take appropriate and fair corrective action in
-response to any behavior that they deem inappropriate, threatening, offensive,
-or harmful.
-
-Community leaders have the right and responsibility to remove, edit, or reject
-comments, commits, code, wiki edits, issues, and other contributions that are
-not aligned to this Code of Conduct, and will communicate reasons for moderation
-decisions when appropriate.
-
-## Scope
-
-This Code of Conduct applies within all community spaces, and also applies when
-an individual is officially representing the community in public spaces.
-Examples of representing our community include using an official e-mail address,
-posting via an official social media account, or acting as an appointed
-representative at an online or offline event.
-
-## Enforcement
-
-Instances of abusive, harassing, or otherwise unacceptable behavior may be
-reported to the community leaders responsible for enforcement at
-`swillison+datasette-code-of-conduct@gmail.com`.
-All complaints will be reviewed and investigated promptly and fairly.
-
-All community leaders are obligated to respect the privacy and security of the
-reporter of any incident.
-
-## Enforcement Guidelines
-
-Community leaders will follow these Community Impact Guidelines in determining
-the consequences for any action they deem in violation of this Code of Conduct:
-
-### 1. Correction
-
-**Community Impact**: Use of inappropriate language or other behavior deemed
-unprofessional or unwelcome in the community.
-
-**Consequence**: A private, written warning from community leaders, providing
-clarity around the nature of the violation and an explanation of why the
-behavior was inappropriate. A public apology may be requested.
-
-### 2. Warning
-
-**Community Impact**: A violation through a single incident or series
-of actions.
-
-**Consequence**: A warning with consequences for continued behavior. No
-interaction with the people involved, including unsolicited interaction with
-those enforcing the Code of Conduct, for a specified period of time. This
-includes avoiding interactions in community spaces as well as external channels
-like social media. Violating these terms may lead to a temporary or
-permanent ban.
-
-### 3. Temporary Ban
-
-**Community Impact**: A serious violation of community standards, including
-sustained inappropriate behavior.
-
-**Consequence**: A temporary ban from any sort of interaction or public
-communication with the community for a specified period of time. No public or
-private interaction with the people involved, including unsolicited interaction
-with those enforcing the Code of Conduct, is allowed during this period.
-Violating these terms may lead to a permanent ban.
-
-### 4. Permanent Ban
-
-**Community Impact**: Demonstrating a pattern of violation of community
-standards, including sustained inappropriate behavior, harassment of an
-individual, or aggression toward or disparagement of classes of individuals.
-
-**Consequence**: A permanent ban from any sort of public interaction within
-the community.
-
-## Attribution
-
-This Code of Conduct is adapted from the [Contributor Covenant][homepage],
-version 2.0, available at
-https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
-
-Community Impact Guidelines were inspired by [Mozilla's code of conduct
-enforcement ladder](https://github.com/mozilla/diversity).
-
-[homepage]: https://www.contributor-covenant.org
-
-For answers to common questions about this code of conduct, see the FAQ at
-https://www.contributor-covenant.org/faq. Translations are available at
-https://www.contributor-covenant.org/translations.

From 5a353a32b9c4d75acbe3193fd72f735a8e78516a Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Tue, 15 Mar 2022 08:37:14 -0700
Subject: [PATCH 003/952] Revert "Fixed tests for urlsafe_components, refs #1650"

This reverts commit bb499942c15c4e2cfa4b6afab8f8debe5948c009.

Refs #1658
---
 tests/test_utils.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/tests/test_utils.py b/tests/test_utils.py
index ff4f649a..1c3ab495 100644
--- a/tests/test_utils.py
+++ b/tests/test_utils.py
@@ -19,8 +19,8 @@ from unittest.mock import patch
         ("foo", ["foo"]),
         ("foo,bar", ["foo", "bar"]),
         ("123,433,112", ["123", "433", "112"]),
-        ("123-2C433,112", ["123,433", "112"]),
-        ("123-2F433-2F112", ["123/433/112"]),
+        ("123%2C433,112", ["123,433", "112"]),
+        ("123%2F433%2F112", ["123/433/112"]),
     ],
 )
 def test_urlsafe_components(path, expected):

From 77e718c3ffb30473759a8b1ed347f73cb2ff5cfe Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Tue, 15 Mar 2022 08:37:31 -0700
Subject: [PATCH 004/952] Revert "Fix bug with percentage redirects, close #1650"

This reverts commit c85d669de387b40e667fd6942c6cc1c15b4f5964.
Refs #1658
---
 datasette/utils/__init__.py | 7 +------
 tests/test_html.py          | 4 ----
 2 files changed, 1 insertion(+), 10 deletions(-)

diff --git a/datasette/utils/__init__.py b/datasette/utils/__init__.py
index e7c9fb1c..79feeef6 100644
--- a/datasette/utils/__init__.py
+++ b/datasette/utils/__init__.py
@@ -10,7 +10,6 @@ import markupsafe
 import mergedeep
 import os
 import re
-import secrets
 import shlex
 import tempfile
 import typing
@@ -1173,8 +1172,4 @@ def dash_encode(s: str) -> str:
 @documented
 def dash_decode(s: str) -> str:
     "Decodes a dash-encoded string, so ``-2Ffoo-2Fbar`` -> ``/foo/bar``"
-    # Avoid accidentally decoding a %2f style sequence
-    temp = secrets.token_hex(16)
-    s = s.replace("%", temp)
-    decoded = urllib.parse.unquote(s.replace("-", "%"))
-    return decoded.replace(temp, "%")
+    return urllib.parse.unquote(s.replace("-", "%"))
diff --git a/tests/test_html.py b/tests/test_html.py
index 55d78c05..de703284 100644
--- a/tests/test_html.py
+++ b/tests/test_html.py
@@ -961,10 +961,6 @@ def test_no_alternate_url_json(app_client, path):
             "/fivethirtyeight/twitter-ratio%2Fsenators",
             "/fivethirtyeight/twitter-2Dratio-2Fsenators",
         ),
-        (
-            "/fixtures/table%2Fwith%2Fslashes",
-            "/fixtures/table-2Fwith-2Fslashes",
-        ),
         # query string should be preserved
         ("/foo/bar%2Fbaz?id=5", "/foo/bar-2Fbaz?id=5"),
     ),
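[Editor's note, not part of the patch series: the revert above removes a guard from `dash_decode()`. Below is a minimal standalone sketch of the two variants shown in the diff, to illustrate what that guard did — the function names here are illustrative, not Datasette's API.]

```python
import secrets
import urllib.parse


def dash_decode_naive(s: str) -> str:
    # The version this revert restores: treat "-" as the escape character.
    # Any literal "%XX" sequence already in the input gets decoded too,
    # which is the bug.
    return urllib.parse.unquote(s.replace("-", "%"))


def dash_decode_guarded(s: str) -> str:
    # The removed version: hide literal "%" behind a random token so that
    # unquote() only decodes sequences that came from "-" escapes.
    temp = secrets.token_hex(16)
    s = s.replace("%", temp)
    decoded = urllib.parse.unquote(s.replace("-", "%"))
    return decoded.replace(temp, "%")


# A literal %2F in the input should survive decoding untouched:
print(dash_decode_naive("foo%2Fbar"))    # "foo/bar"   - wrongly decoded
print(dash_decode_guarded("foo%2Fbar"))  # "foo%2Fbar" - preserved
```

[The same `secrets`-based guard reappears in `tilde_decode()` in the tilde encoding patch further below.]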

From 645381a5ed23c016281e8c6c7d141518f91b67e5 Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Tue, 15 Mar 2022 08:36:35 -0700
Subject: [PATCH 005/952] Add code of conduct again

Refs #1658
---
 CODE_OF_CONDUCT.md | 128 +++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 128 insertions(+)
 create mode 100644 CODE_OF_CONDUCT.md

diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md
new file mode 100644
index 00000000..14d4c567
--- /dev/null
+++ b/CODE_OF_CONDUCT.md
@@ -0,0 +1,128 @@
+# Contributor Covenant Code of Conduct
+
+## Our Pledge
+
+We as members, contributors, and leaders pledge to make participation in our
+community a harassment-free experience for everyone, regardless of age, body
+size, visible or invisible disability, ethnicity, sex characteristics, gender
+identity and expression, level of experience, education, socio-economic status,
+nationality, personal appearance, race, religion, or sexual identity
+and orientation.
+
+We pledge to act and interact in ways that contribute to an open, welcoming,
+diverse, inclusive, and healthy community.
+
+## Our Standards
+
+Examples of behavior that contributes to a positive environment for our
+community include:
+
+* Demonstrating empathy and kindness toward other people
+* Being respectful of differing opinions, viewpoints, and experiences
+* Giving and gracefully accepting constructive feedback
+* Accepting responsibility and apologizing to those affected by our mistakes,
+  and learning from the experience
+* Focusing on what is best not just for us as individuals, but for the
+  overall community
+
+Examples of unacceptable behavior include:
+
+* The use of sexualized language or imagery, and sexual attention or
+  advances of any kind
+* Trolling, insulting or derogatory comments, and personal or political attacks
+* Public or private harassment
+* Publishing others' private information, such as a physical or email
+  address, without their explicit permission
+* Other conduct which could reasonably be considered inappropriate in a
+  professional setting
+
+## Enforcement Responsibilities
+
+Community leaders are responsible for clarifying and enforcing our standards of
+acceptable behavior and will take appropriate and fair corrective action in
+response to any behavior that they deem inappropriate, threatening, offensive,
+or harmful.
+
+Community leaders have the right and responsibility to remove, edit, or reject
+comments, commits, code, wiki edits, issues, and other contributions that are
+not aligned to this Code of Conduct, and will communicate reasons for moderation
+decisions when appropriate.
+
+## Scope
+
+This Code of Conduct applies within all community spaces, and also applies when
+an individual is officially representing the community in public spaces.
+Examples of representing our community include using an official e-mail address,
+posting via an official social media account, or acting as an appointed
+representative at an online or offline event.
+
+## Enforcement
+
+Instances of abusive, harassing, or otherwise unacceptable behavior may be
+reported to the community leaders responsible for enforcement at
+`swillison+datasette-code-of-conduct@gmail.com`.
+All complaints will be reviewed and investigated promptly and fairly.
+
+All community leaders are obligated to respect the privacy and security of the
+reporter of any incident.
+
+## Enforcement Guidelines
+
+Community leaders will follow these Community Impact Guidelines in determining
+the consequences for any action they deem in violation of this Code of Conduct:
+
+### 1. Correction
+
+**Community Impact**: Use of inappropriate language or other behavior deemed
+unprofessional or unwelcome in the community.
+
+**Consequence**: A private, written warning from community leaders, providing
+clarity around the nature of the violation and an explanation of why the
+behavior was inappropriate. A public apology may be requested.
+
+### 2. Warning
+
+**Community Impact**: A violation through a single incident or series
+of actions.
+
+**Consequence**: A warning with consequences for continued behavior. No
+interaction with the people involved, including unsolicited interaction with
+those enforcing the Code of Conduct, for a specified period of time. This
+includes avoiding interactions in community spaces as well as external channels
+like social media. Violating these terms may lead to a temporary or
+permanent ban.
+
+### 3. Temporary Ban
+
+**Community Impact**: A serious violation of community standards, including
+sustained inappropriate behavior.
+
+**Consequence**: A temporary ban from any sort of interaction or public
+communication with the community for a specified period of time. No public or
+private interaction with the people involved, including unsolicited interaction
+with those enforcing the Code of Conduct, is allowed during this period.
+Violating these terms may lead to a permanent ban.
+
+### 4. Permanent Ban
+
+**Community Impact**: Demonstrating a pattern of violation of community
+standards, including sustained inappropriate behavior, harassment of an
+individual, or aggression toward or disparagement of classes of individuals.
+
+**Consequence**: A permanent ban from any sort of public interaction within
+the community.
+
+## Attribution
+
+This Code of Conduct is adapted from the [Contributor Covenant][homepage],
+version 2.0, available at
+https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
+
+Community Impact Guidelines were inspired by [Mozilla's code of conduct
+enforcement ladder](https://github.com/mozilla/diversity).
+
+[homepage]: https://www.contributor-covenant.org
+
+For answers to common questions about this code of conduct, see the FAQ at
+https://www.contributor-covenant.org/faq. Translations are available at
+https://www.contributor-covenant.org/translations.

From c10cd48baf106659bf3f129ad7bfb2226be73821 Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Mon, 7 Mar 2022 11:56:59 -0800
Subject: [PATCH 006/952] Min pytest-asyncio of 0.17

So that the asyncio_mode in pytest.ini does not produce a warning on
older versions of that library.
---
 setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/setup.py b/setup.py
index 8e69c2f5..e70839d6 100644
--- a/setup.py
+++ b/setup.py
@@ -69,7 +69,7 @@ setup(
         "test": [
             "pytest>=5.2.2,<7.1.0",
             "pytest-xdist>=2.2.1,<2.6",
-            "pytest-asyncio>=0.10,<0.19",
+            "pytest-asyncio>=0.17,<0.19",
             "beautifulsoup4>=4.8.1,<4.11.0",
             "black==22.1.0",
             "pytest-timeout>=1.4.2,<2.2",

From a35393b29cfb5b8abdc6a94e577af1c9a5c13652 Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Tue, 15 Mar 2022 11:01:57 -0700
Subject: [PATCH 007/952] Tilde encoding (#1659)

Closes #1657

Refs #1439
---
 datasette/app.py             | 11 +++++----
 datasette/url_builder.py     | 10 ++++----
 datasette/utils/__init__.py  | 37 ++++++++++++++++-------------
 datasette/views/base.py      | 25 +++++++++++---------
 datasette/views/table.py     | 14 +++++++----
 docs/csv_export.rst          | 18 ---------------
 docs/internals.rst           | 34 +++++++++++++--------------
 tests/test_api.py            | 17 ++++----------
 tests/test_cli.py            |  6 ++---
 tests/test_html.py           | 45 +++++++++++++++++++++---------------
 tests/test_internals_urls.py |  2 +-
 tests/test_table_api.py      |  9 +++++---
 tests/test_table_html.py     |  2 +-
 tests/test_utils.py          | 36 +++++++++--------------------
 14 files changed, 125 insertions(+), 141 deletions(-)

diff --git a/datasette/app.py b/datasette/app.py
index 7abccc05..b39ef7cd 100644
--- a/datasette/app.py
+++ b/datasette/app.py
@@ -1211,11 +1211,14 @@ class DatasetteRouter:
         return await self.handle_404(request, send)

     async def handle_404(self, request, send, exception=None):
-        # If path contains % encoding, redirect to dash encoding
+        # If path contains % encoding, redirect to tilde encoding
         if "%" in request.path:
-            # Try the same path but with "%" replaced by "-"
-            # and "-" replaced with "-2D"
-            new_path = request.path.replace("-", "-2D").replace("%", "-")
+            # Try the same path but with "%" replaced by "~"
+            # and "~" replaced with "~7E"
+            # and "." replaced with "~2E"
+            new_path = (
+                request.path.replace("~", "~7E").replace("%", "~").replace(".", "~2E")
+            )
             if request.query_string:
                 new_path += "?{}".format(request.query_string)
             await asgi_send_redirect(send, new_path)
diff --git a/datasette/url_builder.py b/datasette/url_builder.py
index eebfe31e..9f072462 100644
--- a/datasette/url_builder.py
+++ b/datasette/url_builder.py
@@ -1,4 +1,4 @@
-from .utils import dash_encode, path_with_format, HASH_LENGTH, PrefixedUrlString
+from .utils import tilde_encode, path_with_format, HASH_LENGTH, PrefixedUrlString
 import urllib


@@ -31,20 +31,20 @@ class Urls:
         db = self.ds.databases[database]
         if self.ds.setting("hash_urls") and db.hash:
             path = self.path(
-                f"{dash_encode(database)}-{db.hash[:HASH_LENGTH]}", format=format
+                f"{tilde_encode(database)}-{db.hash[:HASH_LENGTH]}", format=format
             )
         else:
-            path = self.path(dash_encode(database), format=format)
+            path = self.path(tilde_encode(database), format=format)
         return path

     def table(self, database, table, format=None):
-        path = f"{self.database(database)}/{dash_encode(table)}"
+        path = f"{self.database(database)}/{tilde_encode(table)}"
         if format is not None:
             path = path_with_format(path=path, format=format)
         return PrefixedUrlString(path)

     def query(self, database, query, format=None):
-        path = f"{self.database(database)}/{dash_encode(query)}"
+        path = f"{self.database(database)}/{tilde_encode(query)}"
         if format is not None:
             path = path_with_format(path=path, format=format)
         return PrefixedUrlString(path)
diff --git a/datasette/utils/__init__.py b/datasette/utils/__init__.py
index 79feeef6..bd591459 100644
--- a/datasette/utils/__init__.py
+++ b/datasette/utils/__init__.py
@@ -15,6 +15,7 @@ import tempfile
 import typing
 import time
 import types
+import secrets
 import shutil
 import urllib
 import yaml
@@ -112,12 +113,12 @@ async def await_me_maybe(value: typing.Any) -> typing.Any:


 def urlsafe_components(token):
-    """Splits token on commas and dash-decodes each component"""
-    return [dash_decode(b) for b in token.split(",")]
+    """Splits token on commas and tilde-decodes each component"""
+    return [tilde_decode(b) for b in token.split(",")]


 def path_from_row_pks(row, pks, use_rowid, quote=True):
-    """Generate an optionally dash-quoted unique identifier
+    """Generate an optionally tilde-encoded unique identifier
     for a row from its primary keys."""
     if use_rowid:
         bits = [row["rowid"]]
@@ -126,7 +127,7 @@ def path_from_row_pks(row, pks, use_rowid, quote=True):
             row[pk]["value"] if isinstance(row[pk], dict) else row[pk] for pk in pks
         ]
     if quote:
-        bits = [dash_encode(str(bit)) for bit in bits]
+        bits = [tilde_encode(str(bit)) for bit in bits]
     else:
         bits = [str(bit) for bit in bits]

@@ -1142,34 +1143,38 @@ def add_cors_headers(headers):
     headers["Access-Control-Expose-Headers"] = "Link"


-_DASH_ENCODING_SAFE = frozenset(
+_TILDE_ENCODING_SAFE = frozenset(
     b"ABCDEFGHIJKLMNOPQRSTUVWXYZ"
     b"abcdefghijklmnopqrstuvwxyz"
-    b"0123456789_"
+    b"0123456789_-"
     # This is the same as Python percent-encoding but I removed
-    # '.' and '-' and '~'
+    # '.' and '~'
 )


-class DashEncoder(dict):
+class TildeEncoder(dict):
     # Keeps a cache internally, via __missing__
     def __missing__(self, b):
         # Handle a cache miss, store encoded string in cache and return.
-        res = chr(b) if b in _DASH_ENCODING_SAFE else "-{:02X}".format(b)
+        res = chr(b) if b in _TILDE_ENCODING_SAFE else "~{:02X}".format(b)
         self[b] = res
         return res


-_dash_encoder = DashEncoder().__getitem__
+_tilde_encoder = TildeEncoder().__getitem__


 @documented
-def dash_encode(s: str) -> str:
-    "Returns dash-encoded string - for example ``/foo/bar`` -> ``-2Ffoo-2Fbar``"
-    return "".join(_dash_encoder(char) for char in s.encode("utf-8"))
+def tilde_encode(s: str) -> str:
+    "Returns tilde-encoded string - for example ``/foo/bar`` -> ``~2Ffoo~2Fbar``"
+    return "".join(_tilde_encoder(char) for char in s.encode("utf-8"))


 @documented
-def dash_decode(s: str) -> str:
-    "Decodes a dash-encoded string, so ``-2Ffoo-2Fbar`` -> ``/foo/bar``"
-    return urllib.parse.unquote(s.replace("-", "%"))
+def tilde_decode(s: str) -> str:
+    "Decodes a tilde-encoded string, so ``~2Ffoo~2Fbar`` -> ``/foo/bar``"
+    # Avoid accidentally decoding a %2f style sequence
+    temp = secrets.token_hex(16)
+    s = s.replace("%", temp)
+    decoded = urllib.parse.unquote(s.replace("~", "%"))
+    return decoded.replace(temp, "%")
diff --git a/datasette/views/base.py b/datasette/views/base.py
index 7cd385b7..1c0c3f9b 100644
--- a/datasette/views/base.py
+++ b/datasette/views/base.py
@@ -10,6 +10,7 @@ import pint

 from datasette import __version__
 from datasette.database import QueryInterrupted
+from datasette.utils.asgi import Request
 from datasette.utils import (
     add_cors_headers,
     await_me_maybe,
@@ -17,8 +18,8 @@
     InvalidSql,
     LimitedWriter,
     call_with_supported_arguments,
-    dash_decode,
-    dash_encode,
+    tilde_decode,
+    tilde_encode,
     path_from_row_pks,
     path_with_added_args,
     path_with_removed_args,
@@ -205,14 +206,14 @@ class DataView(BaseView):
     async def resolve_db_name(self, request, db_name, **kwargs):
         hash = None
         name = None
-        decoded_name = dash_decode(db_name)
+        decoded_name = tilde_decode(db_name)
         if decoded_name not in self.ds.databases and "-" in db_name:
             # No matching DB found, maybe it's a name-hash?
             name_bit, hash_bit = db_name.rsplit("-", 1)
-            if dash_decode(name_bit) not in self.ds.databases:
+            if tilde_decode(name_bit) not in self.ds.databases:
                 raise NotFound(f"Database not found: {name}")
             else:
-                name = dash_decode(name_bit)
+                name = tilde_decode(name_bit)
                 hash = hash_bit
         else:
             name = decoded_name
@@ -235,7 +236,7 @@
                 return await db.table_exists(t)

             table, _format = await resolve_table_and_format(
-                table_and_format=dash_decode(kwargs["table_and_format"]),
+                table_and_format=tilde_decode(kwargs["table_and_format"]),
                 table_exists=async_table_exists,
                 allowed_formats=self.ds.renderers.keys(),
             )
@@ -243,11 +244,11 @@
             if _format:
                 kwargs["as_format"] = f".{_format}"
         elif kwargs.get("table"):
-            kwargs["table"] = dash_decode(kwargs["table"])
+            kwargs["table"] = tilde_decode(kwargs["table"])

         should_redirect = self.ds.urls.path(f"{name}-{expected}")
         if kwargs.get("table"):
-            should_redirect += "/" + dash_encode(kwargs["table"])
+            should_redirect += "/" + tilde_encode(kwargs["table"])
         if kwargs.get("pk_path"):
             should_redirect += "/" + kwargs["pk_path"]
         if kwargs.get("as_format"):
@@ -291,6 +292,7 @@
             if not request.args.get(key)
         ]
         if extra_parameters:
+            # Replace request object with a new one with modified scope
             if not request.query_string:
                 new_query_string = "&".join(extra_parameters)
             else:
@@ -300,7 +302,8 @@
             new_scope = dict(
                 request.scope, query_string=new_query_string.encode("latin-1")
             )
-            request.scope = new_scope
+            receive = request.receive
+            request = Request(new_scope, receive)
         if stream:
             # Some quick soundness checks
             if not self.ds.setting("allow_csv_stream"):
@@ -467,7 +470,7 @@
             return await db.table_exists(t)

         table, _ext_format = await resolve_table_and_format(
-            table_and_format=dash_decode(args["table_and_format"]),
+            table_and_format=tilde_decode(args["table_and_format"]),
             table_exists=async_table_exists,
             allowed_formats=self.ds.renderers.keys(),
         )
@@ -475,7 +478,7 @@
             args["table"] = table
             del args["table_and_format"]
         elif "table" in args:
-            args["table"] = dash_decode(args["table"])
+            args["table"] = tilde_decode(args["table"])
         return _format, args

     async def view_get(self, request, database, hash, correct_hash_provided, **kwargs):
diff --git a/datasette/views/table.py b/datasette/views/table.py
index 1d81755e..72b8e9a4 100644
--- a/datasette/views/table.py
+++ b/datasette/views/table.py
@@ -12,7 +12,8 @@ from datasette.utils import (
     MultiParams,
     append_querystring,
     compound_keys_after_sql,
-    dash_encode,
+    tilde_decode,
+    tilde_encode,
     escape_sqlite,
     filters_should_redirect,
     is_url,
@@ -143,7 +144,7 @@ class RowTableShared(DataView):
                     '<a href="{base_url}{database}/{table}/{flat_pks_quoted}">{flat_pks}</a>'.format(
                         base_url=base_url,
                         database=database,
-                        table=dash_encode(table),
+                        table=tilde_encode(table),
                         flat_pks=str(markupsafe.escape(pk_path)),
                         flat_pks_quoted=path_from_row_pks(row, pks, not pks),
                     )
@@ -200,8 +201,8 @@
                             link_template.format(
                                 database=database,
                                 base_url=base_url,
-                                table=dash_encode(other_table),
-                                link_id=dash_encode(str(value)),
+                                table=tilde_encode(other_table),
+                                link_id=tilde_encode(str(value)),
                                 id=str(markupsafe.escape(value)),
                                 label=str(markupsafe.escape(label)) or "-",
                             )
@@ -346,6 +347,8 @@
                 write=bool(canned_query.get("write")),
             )

+        table = tilde_decode(table)
+
         db = self.ds.databases[database]
         is_view = bool(await db.get_view_definition(table))
         table_exists = bool(await db.table_exists(table))
@@ -766,7 +769,7 @@
                 if prefix is None:
                     prefix = "$null"
                 else:
-                    prefix = dash_encode(str(prefix))
+                    prefix = tilde_encode(str(prefix))
                 next_value = f"{prefix},{next_value}"
             added_args = {"_next": next_value}
         if sort:
@@ -938,6 +941,7 @@ class RowView(RowTableShared):
     name = "row"

     async def data(self, request, database, hash, table, pk_path, default_labels=False):
+        table = tilde_decode(table)
         await self.check_permissions(
             request,
             [
diff --git a/docs/csv_export.rst b/docs/csv_export.rst
index b1cc673c..023fa05e 100644
--- a/docs/csv_export.rst
+++ b/docs/csv_export.rst
@@ -59,21 +59,3 @@
 truncation error message.

 You can increase or remove this limit using the :ref:`setting_max_csv_mb` config
 setting. You can also disable the CSV export feature entirely using
 :ref:`setting_allow_csv_stream`.
-
-A note on URLs
---------------
-
-The default URL for the CSV representation of a table is that table with
-``.csv`` appended to it:
-
-* https://latest.datasette.io/fixtures/facetable - HTML interface
-* https://latest.datasette.io/fixtures/facetable.csv - CSV export
-* https://latest.datasette.io/fixtures/facetable.json - JSON API
-
-This pattern doesn't work for tables with names that already end in ``.csv`` or
-``.json``. For those tables, you can instead use the ``_format=`` query string
-parameter:
-
-* https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv - HTML interface
-* https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv?_format=csv - CSV export
-* https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv?_format=json - JSON API
diff --git a/docs/internals.rst b/docs/internals.rst
index d035e1f1..3d223603 100644
--- a/docs/internals.rst
+++ b/docs/internals.rst
@@ -545,7 +545,7 @@
     facetable table
     pragma_cache_size query

-Use the ``format="json"`` (or ``"csv"`` or other formats supported by plugins) arguments to get back URLs to the JSON representation. This is usually the path with ``.json`` added on the end, but it may use ``?_format=json`` in cases where the path already includes ``.json``, for example a URL to a table named ``table.json``.
+Use the ``format="json"`` (or ``"csv"`` or other formats supported by plugins) arguments to get back URLs to the JSON representation. This is the path with ``.json`` added on the end.

 These methods each return a ``datasette.utils.PrefixedUrlString`` object, which is a subclass of the Python ``str`` type. This allows the logic that considers the ``base_url`` setting to detect if that prefix has already been applied to the path.

@@ -876,31 +876,31 @@

 .. autofunction:: datasette.utils.await_me_maybe

-.. _internals_dash_encoding:
+.. _internals_tilde_encoding:

-Dash encoding
--------------
+Tilde encoding
+--------------

-Datasette uses a custom encoding scheme in some places, called **dash encoding**. This is primarily used for table names and row primary keys, to avoid any confusion between ``/`` characters in those values and the Datasette URLs that reference them.
+Datasette uses a custom encoding scheme in some places, called **tilde encoding**. This is primarily used for table names and row primary keys, to avoid any confusion between ``/`` characters in those values and the Datasette URLs that reference them.
-Dash encoding uses the same algorithm as `URL percent-encoding `__, but with the ``-`` hyphen character used in place of ``%``.
+Tilde encoding uses the same algorithm as `URL percent-encoding `__, but with the ``~`` tilde character used in place of ``%``.

-Any character other than ``ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz 0123456789_`` will be replaced by the numeric equivalent preceded by a hyphen. For example:
+Any character other than ``ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz 0123456789_-`` will be replaced by the numeric equivalent preceded by a tilde. For example:

-- ``/`` becomes ``-2F``
-- ``.`` becomes ``-2E``
-- ``%`` becomes ``-25``
-- ``-`` becomes ``-2D``
-- Space character becomes ``-20``
-- ``polls/2022.primary`` becomes ``polls-2F2022-2Eprimary``
+- ``/`` becomes ``~2F``
+- ``.`` becomes ``~2E``
+- ``%`` becomes ``~25``
+- ``~`` becomes ``~7E``
+- Space character becomes ``~20``
+- ``polls/2022.primary`` becomes ``polls~2F2022~2Eprimary``

-.. _internals_utils_dash_encode:
+.. _internals_utils_tilde_encode:

-.. autofunction:: datasette.utils.dash_encode
+.. autofunction:: datasette.utils.tilde_encode

-.. _internals_utils_dash_decode:
+.. _internals_utils_tilde_decode:

-.. autofunction:: datasette.utils.dash_decode
+.. autofunction:: datasette.utils.tilde_decode

 .. _internals_tracer:
diff --git a/tests/test_api.py b/tests/test_api.py
index dd916cf0..87d91e56 100644
--- a/tests/test_api.py
+++ b/tests/test_api.py
@@ -679,18 +679,9 @@ def test_row(app_client):
     assert [{"id": "1", "content": "hello"}] == response.json["rows"]


-def test_row_format_in_querystring(app_client):
-    # regression test for https://github.com/simonw/datasette/issues/563
-    response = app_client.get(
-        "/fixtures/simple_primary_key/1?_format=json&_shape=objects"
-    )
-    assert response.status == 200
-    assert [{"id": "1", "content": "hello"}] == response.json["rows"]
-
-
 def test_row_strange_table_name(app_client):
     response = app_client.get(
-        "/fixtures/table%2Fwith%2Fslashes.csv/3.json?_shape=objects"
+        "/fixtures/table~2Fwith~2Fslashes~2Ecsv/3.json?_shape=objects"
     )
     assert response.status == 200
     assert [{"pk": "3", "content": "hey"}] == response.json["rows"]
@@ -942,7 +933,7 @@ def test_cors(app_client_with_cors, path, status_code):
 )
 def test_database_with_space_in_name(app_client_two_attached_databases, path):
     response = app_client_two_attached_databases.get(
-        "/extra-20database" + path, follow_redirects=True
+        "/extra~20database" + path, follow_redirects=True
     )
     assert response.status == 200
@@ -953,7 +944,7 @@
         d["name"]
         for d in app_client_conflicting_database_names.get("/-/databases.json").json
     ]
-    for db_name, path in (("foo", "/foo.json"), ("foo-bar", "/foo-2Dbar.json")):
+    for db_name, path in (("foo", "/foo.json"), ("foo-bar", "/foo-bar.json")):
         data = app_client_conflicting_database_names.get(path).json
         assert db_name == data["database"]

@@ -996,7 +987,7 @@ async def test_hidden_sqlite_stat1_table():

 @pytest.mark.asyncio
 @pytest.mark.parametrize("db_name", ("foo", r"fo%o", "f~/c.d"))
-async def test_dash_encoded_database_names(db_name):
+async def test_tilde_encoded_database_names(db_name):
     ds = Datasette()
     ds.add_memory_database(db_name)
     response = await ds.client.get("/.json")
diff --git a/tests/test_cli.py b/tests/test_cli.py
index e30c2ad3..5afe72c1 100644
--- a/tests/test_cli.py
+++ b/tests/test_cli.py
@@ -9,7 +9,7 @@ from datasette.app import SETTINGS
 from datasette.plugins import DEFAULT_PLUGINS
 from datasette.cli import cli, serve
 from datasette.version import __version__
-from datasette.utils import dash_encode
+from datasette.utils import tilde_encode
 from datasette.utils.sqlite import sqlite3
 from click.testing import CliRunner
 import io
@@ -295,12 +295,12 @@ def test_weird_database_names(ensure_eventloop, tmpdir, filename):
     assert result1.exit_code == 0, result1.output
     filename_no_stem = filename.rsplit(".", 1)[0]
     expected_link = '<a href="/{}">{}</a>'.format(
-        dash_encode(filename_no_stem), filename_no_stem
+        tilde_encode(filename_no_stem), filename_no_stem
     )
     assert expected_link in result1.output
     # Now try hitting that database page
     result2 = runner.invoke(
-        cli, [db_path, "--get", "/{}".format(dash_encode(filename_no_stem))]
+        cli, [db_path, "--get", "/{}".format(tilde_encode(filename_no_stem))]
     )
     assert result2.exit_code == 0, result2.output
diff --git a/tests/test_html.py b/tests/test_html.py
index de703284..76a8423a 100644
--- a/tests/test_html.py
+++ b/tests/test_html.py
@@ -29,7 +29,7 @@ def test_homepage(app_client_two_attached_databases):
     )
     # Should be two attached databases
     assert [
-        {"href": r"/extra-20database", "text": "extra database"},
+        {"href": "/extra~20database", "text": "extra database"},
         {"href": "/fixtures", "text": "fixtures"},
     ] == [{"href": a["href"], "text": a.text.strip()} for a in soup.select("h2 a")]
     # Database should show count text and attached tables
@@ -44,8 +44,8 @@
         {"href": a["href"], "text": a.text.strip()} for a in links_p.findAll("a")
     ]
     assert [
-        {"href": r"/extra-20database/searchable", "text": "searchable"},
-        {"href": r"/extra-20database/searchable_view", "text": "searchable_view"},
+        {"href": r"/extra~20database/searchable", "text": "searchable"},
+        {"href": r"/extra~20database/searchable_view", "text": "searchable_view"},
     ] == table_links
@@ -139,15 +139,15 @@ def test_database_page(app_client):
     queries_ul = soup.find("h2", text="Queries").find_next_sibling("ul")
     assert queries_ul is not None
     assert [
-        (
-            "/fixtures/-F0-9D-90-9C-F0-9D-90-A2-F0-9D-90-AD-F0-9D-90-A2-F0-9D-90-9E-F0-9D-90-AC",
-            "𝐜𝐢𝐭𝐢𝐞𝐬",
-        ),
         ("/fixtures/from_async_hook", "from_async_hook"),
         ("/fixtures/from_hook", "from_hook"),
         ("/fixtures/magic_parameters", "magic_parameters"),
         ("/fixtures/neighborhood_search#fragment-goes-here", "Search neighborhoods"),
         ("/fixtures/pragma_cache_size", "pragma_cache_size"),
+        (
+            "/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC",
+            "𝐜𝐢𝐭𝐢𝐞𝐬",
+        ),
     ] == sorted(
         [(a["href"], a.text) for a in queries_ul.find_all("a")], key=lambda p: p[0]
     )
@@ -193,11 +193,11 @@

 def test_row_strange_table_name_with_url_hash(app_client_with_hash):
-    response = app_client_with_hash.get("/fixtures/table-2Fwith-2Fslashes-2Ecsv/3")
+    response = app_client_with_hash.get("/fixtures/table~2Fwith~2Fslashes~2Ecsv/3")
     assert response.status == 302
-    assert response.headers["Location"].endswith("/table-2Fwith-2Fslashes-2Ecsv/3")
+    assert response.headers["Location"].endswith("/table~2Fwith~2Fslashes~2Ecsv/3")
     response = app_client_with_hash.get(
-        "/fixtures/table-2Fwith-2Fslashes-2Ecsv/3", follow_redirects=True
+        "/fixtures/table~2Fwith~2Fslashes~2Ecsv/3", follow_redirects=True
     )
     assert response.status == 200

@@ -229,7 +229,7 @@ def test_row_page_does_not_truncate():
         ["query", "db-fixtures", "query-neighborhood_search"],
     ),
     (
-        "/fixtures/table%2Fwith%2Fslashes.csv",
+        "/fixtures/table~2Fwith~2Fslashes~2Ecsv",
         ["table", "db-fixtures", "table-tablewithslashescsv-fa7563"],
     ),
     (
@@ -255,7 +255,7 @@
         "table-fixtures-simple_primary_key.html, *table.html",
     ),
     (
-        "/fixtures/table%2Fwith%2Fslashes.csv",
+        "/fixtures/table~2Fwith~2Fslashes~2Ecsv",
         "table-fixtures-tablewithslashescsv-fa7563.html, *table.html",
     ),
     (
@@ -359,7 +359,7 @@ def test_row_links_from_other_tables(app_client, path, expected_text, expected_l
         ],
     ),
     (
-        "/fixtures/compound_primary_key/a-2Fb,-2Ec-2Dd",
+        "/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd",
         [
             [
                 '<td class="col-pk1 type-str">a/b</td>',
@@ -816,7 +816,8 @@ def test_base_url_affects_metadata_extra_css_urls(app_client_base_url_prefix):
     ),
     ("/fixtures/pragma_cache_size", None),
     (
-        "/fixtures/𝐜𝐢𝐭𝐢𝐞𝐬",
+        # /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬
+        "/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC",
         "/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B",
     ),
     ("/fixtures/magic_parameters", None),
@@ -824,6 +825,7 @@
 )
 def test_edit_sql_link_on_canned_queries(app_client, path, expected):
     response = app_client.get(path)
+    assert response.status == 200
     expected_link = f'Edit SQL'
     if expected:
         assert expected_link in response.text
@@ -898,8 +900,8 @@ def test_trace_correctly_escaped(app_client):
     # Table page
     ("/fixtures/facetable", "http://localhost/fixtures/facetable.json"),
     (
-        "/fixtures/table%2Fwith%2Fslashes.csv",
-        "http://localhost/fixtures/table%2Fwith%2Fslashes.csv?_format=json",
+        "/fixtures/table~2Fwith~2Fslashes~2Ecsv",
+        "http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json",
     ),
     # Row page
     (
@@ -930,6 +932,7 @@
 )
 def test_alternate_url_json(app_client, path, expected):
     response = app_client.get(path)
+    assert response.status == 200
     link = response.headers["link"]
     assert link == '{}; rel="alternate"; type="application/json+datasette"'.format(
         expected
@@ -959,13 +962,17 @@ def test_no_alternate_url_json(app_client, path):
     (
         (
             "/fivethirtyeight/twitter-ratio%2Fsenators",
-            "/fivethirtyeight/twitter-2Dratio-2Fsenators",
+            "/fivethirtyeight/twitter-ratio~2Fsenators",
+        ),
+        (
+            "/fixtures/table%2Fwith%2Fslashes.csv",
+            "/fixtures/table~2Fwith~2Fslashes~2Ecsv",
         ),
         # query string should be preserved
-        ("/foo/bar%2Fbaz?id=5", "/foo/bar-2Fbaz?id=5"),
+        ("/foo/bar%2Fbaz?id=5", "/foo/bar~2Fbaz?id=5"),
     ),
 )
-def test_redirect_percent_encoding_to_dash_encoding(app_client, path, expected):
+def test_redirect_percent_encoding_to_tilde_encoding(app_client, path, expected):
     response = app_client.get(path)
     assert response.status == 302
     assert response.headers["location"] == expected
diff --git a/tests/test_internals_urls.py b/tests/test_internals_urls.py
index 16515ad6..4307789c 100644
--- a/tests/test_internals_urls.py
+++ b/tests/test_internals_urls.py
@@ -121,7 +121,7 @@ def test_database(ds, base_url, format, expected):
         ("/", "name", None, "/_memory/name"),
         ("/prefix/", "name", None, "/prefix/_memory/name"),
         ("/", "name", "json", "/_memory/name.json"),
-        ("/", "name.json", "json", "/_memory/name-2Ejson.json"),
+        ("/", "name.json", "json", "/_memory/name~2Ejson.json"),
     ],
 )
 def test_table_and_query(ds, base_url, name, format, expected):
diff --git a/tests/test_table_api.py b/tests/test_table_api.py
index cc38d392..3ab369b3 100644
--- a/tests/test_table_api.py
+++ b/tests/test_table_api.py
@@ -138,13 +138,13 @@ def test_table_shape_object_compound_primary_key(app_client):
     response = app_client.get("/fixtures/compound_primary_key.json?_shape=object")
     assert response.json == {
         "a,b": {"pk1": "a", "pk2": "b", "content": "c"},
-        "a-2Fb,-2Ec-2Dd": {"pk1": "a/b", "pk2": ".c-d", "content": "c"},
+        "a~2Fb,~2Ec-d": {"pk1": "a/b", "pk2": ".c-d", "content": "c"},
     }


 def test_table_with_slashes_in_name(app_client):
     response = app_client.get(
-        "/fixtures/table%2Fwith%2Fslashes.csv?_shape=objects&_format=json"
+        "/fixtures/table~2Fwith~2Fslashes~2Ecsv.json?_shape=objects"
     )
     assert response.status == 200
     data = response.json
@@ -1032,7 +1032,10 @@ def test_infinity_returned_as_invalid_json_if_requested(app_client):

 def test_custom_query_with_unicode_characters(app_client):
-    response = app_client.get("/fixtures/𝐜𝐢𝐭𝐢𝐞𝐬.json?_shape=array")
+    # /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬.json
+    response = app_client.get(
+        "/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC.json?_shape=array"
+    )
     assert [{"id": 1, "name": "San Francisco"}] == response.json
diff --git a/tests/test_table_html.py b/tests/test_table_html.py
index 77d97d80..d40f017a 100644
--- a/tests/test_table_html.py
+++ b/tests/test_table_html.py
@@ -565,7 +565,7 @@ def test_table_html_compound_primary_key(app_client):
             '<td class="col-content type-str">c</td>',
         ],
         [
-            '<td class="col-Link type-pk"><a href="/fixtures/compound_primary_key/a-2Fb,-2Ec-2Dd">a/b,.c-d</a></td>',
+            '<td class="col-Link type-pk"><a href="/fixtures/compound_primary_key/a~2Fb,~2Ec-d">a/b,.c-d</a></td>',
             '<td class="col-pk1 type-str">a/b</td>',
             '<td class="col-pk2 type-str">.c-d</td>',
             '<td class="col-content type-str">c</td>',
diff --git a/tests/test_utils.py b/tests/test_utils.py
index 1c3ab495..790aadc7 100644
--- a/tests/test_utils.py
+++ b/tests/test_utils.py
@@ -19,8 +19,8 @@ from unittest.mock import patch
         ("foo", ["foo"]),
         ("foo,bar", ["foo", "bar"]),
         ("123,433,112", ["123", "433", "112"]),
-        ("123%2C433,112", ["123,433", "112"]),
-        ("123%2F433%2F112", ["123/433/112"]),
+        ("123~2C433,112", ["123,433", "112"]),
+        ("123~2F433~2F112", ["123/433/112"]),
     ],
 )
 def test_urlsafe_components(path, expected):
@@ -93,7 +93,7 @@ def test_path_with_replaced_args(path, args, expected):
     "row,pks,expected_path",
     [
         ({"A": "foo", "B": "bar"}, ["A", "B"], "foo,bar"),
-        ({"A": "f,o", "B": "bar"}, ["A", "B"], "f-2Co,bar"),
+        ({"A": "f,o", "B": "bar"}, ["A", "B"], "f~2Co,bar"),
         ({"A": 123}, ["A"], "123"),
         (
             utils.CustomRow(
@@ -393,9 +393,7 @@ def test_table_columns():
         ("/foo?sql=select+1", "json", {}, "/foo.json?sql=select+1"),
         ("/foo/bar", "json", {}, "/foo/bar.json"),
         ("/foo/bar", "csv", {}, "/foo/bar.csv"),
-        ("/foo/bar.csv", "json", {}, "/foo/bar.csv?_format=json"),
         ("/foo/bar", "csv", {"_dl": 1}, "/foo/bar.csv?_dl=1"),
-        ("/foo/b.csv", "json", {"_dl": 1}, "/foo/b.csv?_dl=1&_format=json"),
         (
             "/sf-trees/Street_Tree_List?_search=cherry&_size=1000",
             "csv",
@@ -410,18 +408,6 @@ def test_path_with_format(path, format, extra_qs, expected):
     assert expected == actual


-def test_path_with_format_replace_format():
-    request = Request.fake("/foo/bar.csv")
-    assert (
-        utils.path_with_format(request=request, format="blob")
-        == "/foo/bar.csv?_format=blob"
-    )
-    assert (
-        utils.path_with_format(request=request, format="blob", replace_format="csv")
-        == "/foo/bar.blob"
-    )
-
-
 @pytest.mark.parametrize(
     "bytes,expected",
     [
@@ -652,15 +638,15 @@ async def test_derive_named_parameters(sql, expected):
     "original,expected",
     (
         ("abc", "abc"),
-        ("/foo/bar", "-2Ffoo-2Fbar"),
-        ("/-/bar", "-2F-2D-2Fbar"),
-        ("-/db-/table.csv", "-2D-2Fdb-2D-2Ftable-2Ecsv"),
-        (r"%~-/", "-25-7E-2D-2F"),
-        ("-25-7E-2D-2F", "-2D25-2D7E-2D2D-2D2F"),
+        ("/foo/bar", "~2Ffoo~2Fbar"),
+        ("/-/bar", "~2F-~2Fbar"),
+        ("-/db-/table.csv", "-~2Fdb-~2Ftable~2Ecsv"),
+        (r"%~-/", "~25~7E-~2F"),
+        ("~25~7E~2D~2F", "~7E25~7E7E~7E2D~7E2F"),
     ),
 )
-def test_dash_encoding(original, expected):
-    actual = utils.dash_encode(original)
+def test_tilde_encoding(original, expected):
+    actual = utils.tilde_encode(original)
     assert actual == expected
     # And test round-trip
-    assert original == utils.dash_decode(actual)
+    assert original == utils.tilde_decode(actual)
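[Editor's note, not part of the patch series: a quick round-trip demo of the `tilde_encode()`/`tilde_decode()` functions added by the commit above, assuming a Datasette checkout that includes it:]

```python
from datasette.utils import tilde_encode, tilde_decode

examples = [
    "table/with/slashes.csv",  # "/" -> ~2F, "." -> ~2E
    "polls/2022.primary",
    "foo-bar",                 # "-" is in the safe set, left unchanged
    "50%",                     # literal "%" -> ~25, survives the round-trip
]
for original in examples:
    encoded = tilde_encode(original)
    assert tilde_decode(encoded) == original
    print(f"{original!r} -> {encoded!r}")
# "table/with/slashes.csv" -> "table~2Fwith~2Fslashes~2Ecsv"
```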

From 77a904fea14f743560af9cc668146339bdbbd0a9 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue, 15 Mar 2022 11:03:01 -0700
Subject: [PATCH 008/952] Update pytest requirement from <7.1.0,>=5.2.2 to >=5.2.2,<7.2.0 (#1656)

Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.2.2...7.1.0)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot]

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
 setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/setup.py b/setup.py
index e70839d6..4b58b8c4 100644
--- a/setup.py
+++ b/setup.py
@@ -67,7 +67,7 @@ setup(
     extras_require={
         "docs": ["sphinx_rtd_theme", "sphinx-autobuild", "codespell"],
         "test": [
-            "pytest>=5.2.2,<7.1.0",
+            "pytest>=5.2.2,<7.2.0",
             "pytest-xdist>=2.2.1,<2.6",
             "pytest-asyncio>=0.17,<0.19",
             "beautifulsoup4>=4.8.1,<4.11.0",
             "black==22.1.0",
             "pytest-timeout>=1.4.2,<2.2",

From 30e5f0e67c38054a8087a2a4eae3fc4d1779af90 Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Thu, 17 Mar 2022 14:30:02 -0700
Subject: [PATCH 009/952] Documented internals used by datasette-hashed-urls

Closes #1663
---
 docs/internals.rst | 15 ++++++++++++++-
 1 file changed, 14 insertions(+), 1 deletion(-)

diff --git a/docs/internals.rst b/docs/internals.rst
index 3d223603..117cb95c 100644
--- a/docs/internals.rst
+++ b/docs/internals.rst
@@ -217,12 +217,18 @@ You can create your own instance of this - for example to help write tests for
         }
     })

+Constructor parameters include:
+
+- ``files=[...]`` - a list of database files to open
+- ``immutables=[...]`` - a list of database files to open in immutable mode
+- ``metadata={...}`` - a dictionary of :ref:`metadata`
+
 .. _datasette_databases:

 .databases
 ----------

-Property exposing an ordered dictionary of databases currently connected to Datasette.
+Property exposing a ``collections.OrderedDict`` of databases currently connected to Datasette.

 The dictionary keys are the name of the database that is used in the URL - e.g. ``/fixtures`` would have a key of ``"fixtures"``. The values are :ref:`internals_database` instances.

@@ -582,6 +588,13 @@

 The first argument is the ``datasette`` instance you are attaching to, the second is a ``path=``, then ``is_mutable`` and ``is_memory`` are both optional arguments.

+.. _database_hash:
+
+db.hash
+-------
+
+If the database was opened in immutable mode, this property returns the 64 character SHA-256 hash of the database contents as a string. Otherwise it returns ``None``.
+
 .. _database_execute:

 await db.execute(sql, ...)
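[Editor's note, not part of the patch series: a minimal sketch of the constructor parameters and `db.hash` property documented in the commit above; the database file names here are hypothetical:]

```python
from datasette.app import Datasette

ds = Datasette(
    files=["mutable.db"],       # opened read-write
    immutables=["content.db"],  # opened in immutable mode
)

# .databases maps URL database names to Database instances
for name, db in ds.databases.items():
    # db.hash: SHA-256 hex digest of the file contents for immutable
    # databases, None for mutable ones
    print(name, db.hash)
```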

From d4f60c2388c01ddce1b16f95c16d310e037c9912 Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Fri, 18 Mar 2022 17:12:03 -0700
Subject: [PATCH 010/952] Remove hashed URL mode

Also simplified how view class routing works.

Refs #1661
---
 datasette/app.py             |   2 +-
 datasette/views/base.py      | 153 ++++++-----------------------
 datasette/views/database.py  |  19 +++--
 datasette/views/index.py     |   3 +-
 datasette/views/special.py   |   3 +-
 datasette/views/table.py     |  34 ++++----
 tests/fixtures.py            |   6 --
 tests/test_api.py            |  29 -------
 tests/test_custom_pages.py   |  42 +++++-----
 tests/test_html.py           |  28 -------
 tests/test_internals_urls.py |  18 -----
 tests/test_table_api.py      |   8 --
 12 files changed, 79 insertions(+), 266 deletions(-)

diff --git a/datasette/app.py b/datasette/app.py
index b39ef7cd..3099ada7 100644
--- a/datasette/app.py
+++ b/datasette/app.py
@@ -1097,7 +1097,7 @@ class Datasette:
         )
         add_route(
             TableView.as_view(self),
-            r"/(?P<db_name>[^/]+)/(?P<table_and_format>[^/]+?$)",
+            r"/(?P<db_name>[^/]+)/(?P<table>[^\/\.]+)(\.[a-zA-Z0-9_]+)?$",
         )
         add_route(
             RowView.as_view(self),
diff --git a/datasette/views/base.py b/datasette/views/base.py
index 1c0c3f9b..e31beb19 100644
--- a/datasette/views/base.py
+++ b/datasette/views/base.py
@@ -122,11 +122,11 @@ class BaseView:
     async def delete(self, request, *args, **kwargs):
         return Response.text("Method not allowed", status=405)

-    async def dispatch_request(self, request, *args, **kwargs):
+    async def dispatch_request(self, request):
         if self.ds:
             await self.ds.refresh_schemas()
         handler = getattr(self, request.method.lower(), None)
-        return await handler(request, *args, **kwargs)
+        return await handler(request)

     async def render(self, templates, request, context=None):
         context = context or {}
@@ -169,9 +169,7 @@
     def as_view(cls, *class_args, **class_kwargs):
         async def view(request, send):
             self = view.view_class(*class_args, **class_kwargs)
-            return await self.dispatch_request(
-                request, **request.scope["url_route"]["kwargs"]
-            )
+            return await self.dispatch_request(request)

         view.view_class = cls
         view.__doc__ = cls.__doc__
@@ -200,90 +198,14 @@ class DataView(BaseView):
             add_cors_headers(r.headers)
         return r

-    async def data(self, request, database, hash, **kwargs):
+    async def data(self, request):
         raise NotImplementedError

-    async def resolve_db_name(self, request, db_name, **kwargs):
-        hash = None
-        name = None
-        decoded_name = tilde_decode(db_name)
-        if decoded_name not in self.ds.databases and "-" in db_name:
-            # No matching DB found, maybe it's a name-hash?
-            name_bit, hash_bit = db_name.rsplit("-", 1)
-            if tilde_decode(name_bit) not in self.ds.databases:
-                raise NotFound(f"Database not found: {name}")
-            else:
-                name = tilde_decode(name_bit)
-                hash = hash_bit
-        else:
-            name = decoded_name
-
-        try:
-            db = self.ds.databases[name]
-        except KeyError:
-            raise NotFound(f"Database not found: {name}")
-
-        # Verify the hash
-        expected = "000"
-        if db.hash is not None:
-            expected = db.hash[:HASH_LENGTH]
-        correct_hash_provided = expected == hash
-
-        if not correct_hash_provided:
-            if "table_and_format" in kwargs:
-
-                async def async_table_exists(t):
-                    return await db.table_exists(t)
-
-                table, _format = await resolve_table_and_format(
-                    table_and_format=tilde_decode(kwargs["table_and_format"]),
-                    table_exists=async_table_exists,
-                    allowed_formats=self.ds.renderers.keys(),
-                )
-                kwargs["table"] = table
-                if _format:
-                    kwargs["as_format"] = f".{_format}"
-            elif kwargs.get("table"):
-                kwargs["table"] = tilde_decode(kwargs["table"])
-
-            should_redirect = self.ds.urls.path(f"{name}-{expected}")
-            if kwargs.get("table"):
-                should_redirect += "/" + tilde_encode(kwargs["table"])
-            if kwargs.get("pk_path"):
-                should_redirect += "/" + kwargs["pk_path"]
-            if kwargs.get("as_format"):
-                should_redirect += kwargs["as_format"]
-            if kwargs.get("as_db"):
-                should_redirect += kwargs["as_db"]
-
-            if (
-                (self.ds.setting("hash_urls") or "_hash" in request.args)
-                and
-                # Redirect only if database is immutable
-                not self.ds.databases[name].is_mutable
-            ):
-                return name, expected, correct_hash_provided, should_redirect
-
-        return name, expected, correct_hash_provided, None
-
     def get_templates(self, database, table=None):
         assert NotImplemented

-    async def get(self, request, db_name, **kwargs):
-        (
-            database,
-            hash,
-            correct_hash_provided,
-            should_redirect,
-        ) = await self.resolve_db_name(request, db_name, **kwargs)
-        if should_redirect:
-            return self.redirect(request, should_redirect, remove_args={"_hash"})
-
-        return await self.view_get(
-            request, database, hash, correct_hash_provided, **kwargs
-        )
-
-    async def as_csv(self, request, database, hash, **kwargs):
+    async def as_csv(self, request, database):
+        kwargs = {}
         stream = request.args.get("_stream")
         # Do not calculate facets or counts:
         extra_parameters = [
@@ -313,9 +235,7 @@
             kwargs["_size"] = "max"
         # Fetch the first page
         try:
-            response_or_template_contexts = await self.data(
-                request, database, hash, **kwargs
-            )
+            response_or_template_contexts = await self.data(request)
             if isinstance(response_or_template_contexts, Response):
                 return response_or_template_contexts
             elif len(response_or_template_contexts) == 4:
@@ -367,10 +287,11 @@
             next = None
             while first or (next and stream):
                 try:
+                    kwargs = {}
                     if next:
                         kwargs["_next"] = next
                     if not first:
-                        data, _, _ = await self.data(request, database, hash, **kwargs)
+                        data, _, _ = await self.data(request, **kwargs)
                     if first:
                         if request.args.get("_header") != "off":
                             await writer.writerow(headings)
@@ -445,60 +366,39 @@
         if not trace:
             content_type = "text/csv; charset=utf-8"
             disposition = 'attachment; filename="{}.csv"'.format(
-                kwargs.get("table", database)
+                request.url_vars.get("table", database)
             )
             headers["content-disposition"] = disposition

         return AsgiStream(stream_fn, headers=headers, content_type=content_type)

-    async def get_format(self, request, database, args):
-        """Determine the format of the response from the request, from URL
-        parameters or from a file extension.
-
-        `args` is a dict of the path components parsed from the URL by the router.
-        """
-        # If ?_format= is provided, use that as the format
-        _format = request.args.get("_format", None)
-        if not _format:
-            _format = (args.pop("as_format", None) or "").lstrip(".")
+    def get_format(self, request):
+        # Format is the bit from the path following the ., if one exists
+        last_path_component = request.path.split("/")[-1]
+        if "." in last_path_component:
+            return last_path_component.split(".")[-1]
         else:
-            args.pop("as_format", None)
-        if "table_and_format" in args:
-            db = self.ds.databases[database]
+            return None

-            async def async_table_exists(t):
-                return await db.table_exists(t)
-
-            table, _ext_format = await resolve_table_and_format(
-                table_and_format=tilde_decode(args["table_and_format"]),
-                table_exists=async_table_exists,
-                allowed_formats=self.ds.renderers.keys(),
-            )
-            _format = _format or _ext_format
-            args["table"] = table
-            del args["table_and_format"]
-        elif "table" in args:
-            args["table"] = tilde_decode(args["table"])
-        return _format, args
-
-    async def view_get(self, request, database, hash, correct_hash_provided, **kwargs):
-        _format, kwargs = await self.get_format(request, database, kwargs)
+    async def get(self, request):
+        db_name = request.url_vars["db_name"]
+        database = tilde_decode(db_name)
+        _format = self.get_format(request)
+        data_kwargs = {}
         if _format == "csv":
-            return await self.as_csv(request, database, hash, **kwargs)
+            return await self.as_csv(request, database)
         if _format is None:
             # HTML views default to expanding all foreign key labels
-            kwargs["default_labels"] = True
+            data_kwargs["default_labels"] = True
         extra_template_data = {}
         start = time.perf_counter()
         status_code = None
         templates = []
         try:
-            response_or_template_contexts = await self.data(
-                request, database, hash, **kwargs
-            )
+            response_or_template_contexts = await self.data(request, **data_kwargs)
             if isinstance(response_or_template_contexts, Response):
                 return response_or_template_contexts
             # If it has four items, it includes an HTTP status code
@@ -650,10 +550,7 @@

         ttl = request.args.get("_ttl", None)
         if ttl is None or not ttl.isdigit():
-            if correct_hash_provided:
-                ttl = self.ds.setting("default_cache_ttl_hashed")
-            else:
-                ttl = self.ds.setting("default_cache_ttl")
+            ttl = self.ds.setting("default_cache_ttl")

         return self.set_response_headers(r, ttl)
diff --git a/datasette/views/database.py b/datasette/views/database.py
index e26706e7..48635e01 100644
--- a/datasette/views/database.py
+++ b/datasette/views/database.py
@@ -12,6 +12,7 @@ from datasette.utils import (
     await_me_maybe,
     check_visibility,
     derive_named_parameters,
+    tilde_decode,
     to_css_class,
     validate_sql_select,
     is_url,
@@ -21,7 +22,7 @@ from datasette.utils import (
     sqlite3,
     InvalidSql,
 )
-from datasette.utils.asgi import AsgiFileDownload, Response, Forbidden
+from datasette.utils.asgi import AsgiFileDownload, NotFound, Response, Forbidden
 from datasette.plugins import pm

 from .base import DatasetteError, DataView
@@ -30,7 +31,8 @@ from .base import DatasetteError, DataView
 class DatabaseView(DataView):
     name = "database"

-    async def data(self, request, database, hash, default_labels=False, _size=None):
+    async def data(self, request, default_labels=False, _size=None):
+        database = tilde_decode(request.url_vars["db_name"])
         await self.check_permissions(
             request,
             [
@@ -45,10 +47,13 @@
             sql = request.args.get("sql")
             validate_sql_select(sql)
             return await QueryView(self.ds).data(
-                request, database, hash, sql, _size=_size, metadata=metadata
+                request, sql, _size=_size, metadata=metadata
             )

-        db = self.ds.databases[database]
+        try:
+            db = self.ds.databases[database]
+        except KeyError:
+            raise NotFound("Database not found: {}".format(database))
         table_counts = await db.table_counts(5)
         hidden_table_names = set(await db.hidden_table_names())
@@ -156,7 +161,8 @@
 class DatabaseDownload(DataView):
     name = "database_download"

-    async def view_get(self, request, database, hash, correct_hash_present, **kwargs):
+    async def get(self, request):
+        database = tilde_decode(request.url_vars["db_name"])
         await self.check_permissions(
             request,
             [
@@ -191,8 +197,6 @@
     async def data(
         self,
         request,
-        database,
-        hash,
         sql,
         editable=True,
         canned_query=None,
@@ -201,6 +205,7 @@
         named_parameters=None,
         write=False,
     ):
+        database = tilde_decode(request.url_vars["db_name"])
         params = {key: request.args.get(key) for key in request.args}
         if "sql" in params:
             params.pop("sql")
diff --git a/datasette/views/index.py b/datasette/views/index.py
index 18454759..311a49db 100644
--- a/datasette/views/index.py
+++ b/datasette/views/index.py
@@ -18,7 +18,8 @@ COUNT_DB_SIZE_LIMIT = 100 * 1024 * 1024
 class IndexView(BaseView):
     name = "index"

-    async def get(self, request, as_format):
+    async def get(self, request):
+        as_format = request.url_vars["as_format"]
         await self.check_permission(request, "view-instance")
         databases = []
         for name, db in self.ds.databases.items():
diff --git a/datasette/views/special.py b/datasette/views/special.py
index cdd530f0..c7b5061f 100644
--- a/datasette/views/special.py
+++ b/datasette/views/special.py
@@ -14,7 +14,8 @@ class JsonDataView(BaseView):
         self.data_callback = data_callback
         self.needs_request = needs_request

-    async def get(self, request, as_format):
+    async def get(self, request):
+        as_format = request.url_vars["as_format"]
         await self.check_permission(request, "view-instance")
         if self.needs_request:
             data = self.data_callback(request)
diff --git a/datasette/views/table.py b/datasette/views/table.py
index 72b8e9a4..8bdc7417 100644
--- a/datasette/views/table.py
+++ b/datasette/views/table.py
@@ -271,20 +271,18 @@ class RowTableShared(DataView):
 class TableView(RowTableShared):
     name = "table"

-    async def post(self, request, db_name, table_and_format):
+    async def post(self, request):
+        db_name = tilde_decode(request.url_vars["db_name"])
+        table = tilde_decode(request.url_vars["table"])
         # Handle POST to a canned query
-        canned_query = await self.ds.get_canned_query(
-            db_name, table_and_format, request.actor
-        )
+        canned_query = await self.ds.get_canned_query(db_name, table, request.actor)
         assert canned_query, "You may only POST to a canned query"
         return await QueryView(self.ds).data(
             request,
-            db_name,
-            None,
             canned_query["sql"],
             metadata=canned_query,
             editable=False,
-            canned_query=table_and_format,
+            canned_query=table,
             named_parameters=canned_query.get("params"),
             write=bool(canned_query.get("write")),
         )
@@ -325,20 +323,22 @@
     async def data(
         self,
         request,
-        database,
-        hash,
-        table,
         default_labels=False,
         _next=None,
         _size=None,
     ):
+        database = tilde_decode(request.url_vars["db_name"])
+        table = tilde_decode(request.url_vars["table"])
+        try:
+            db = self.ds.databases[database]
+        except KeyError:
+            raise NotFound("Database not found: {}".format(database))
+
         # If this is a canned query, not a table, then dispatch to QueryView instead
         canned_query = await
self.ds.get_canned_query(database, table, request.actor) if canned_query: return await QueryView(self.ds).data( request, - database, - hash, canned_query["sql"], metadata=canned_query, editable=False, @@ -347,9 +347,6 @@ class TableView(RowTableShared): write=bool(canned_query.get("write")), ) - table = tilde_decode(table) - - db = self.ds.databases[database] is_view = bool(await db.get_view_definition(table)) table_exists = bool(await db.table_exists(table)) @@ -940,8 +937,9 @@ async def _sql_params_pks(db, table, pk_values): class RowView(RowTableShared): name = "row" - async def data(self, request, database, hash, table, pk_path, default_labels=False): - table = tilde_decode(table) + async def data(self, request, default_labels=False): + database = tilde_decode(request.url_vars["db_name"]) + table = tilde_decode(request.url_vars["table"]) await self.check_permissions( request, [ @@ -950,7 +948,7 @@ class RowView(RowTableShared): "view-instance", ], ) - pk_values = urlsafe_components(pk_path) + pk_values = urlsafe_components(request.url_vars["pk_path"]) db = self.ds.databases[database] sql, params, pks = await _sql_params_pks(db, table, pk_values) results = await db.execute(sql, params, truncate=True) diff --git a/tests/fixtures.py b/tests/fixtures.py index 11f09c41..342a3020 100644 --- a/tests/fixtures.py +++ b/tests/fixtures.py @@ -214,12 +214,6 @@ def app_client_two_attached_databases_one_immutable(): yield client -@pytest.fixture(scope="session") -def app_client_with_hash(): - with make_app_client(settings={"hash_urls": True}, is_immutable=True) as client: - yield client - - @pytest.fixture(scope="session") def app_client_with_trace(): with make_app_client(settings={"trace_debug": True}, is_immutable=True) as client: diff --git a/tests/test_api.py b/tests/test_api.py index 87d91e56..46e41afb 100644 --- a/tests/test_api.py +++ b/tests/test_api.py @@ -825,35 +825,6 @@ def test_config_redirects_to_settings(app_client, path, expected_redirect): assert response.headers["Location"] == expected_redirect -@pytest.mark.parametrize( - "path,expected_redirect", - [ - ("/fixtures/facetable.json?_hash=1", "/fixtures-HASH/facetable.json"), - ( - "/fixtures/facetable.json?city_id=1&_hash=1", - "/fixtures-HASH/facetable.json?city_id=1", - ), - ], -) -def test_hash_parameter( - app_client_two_attached_databases_one_immutable, path, expected_redirect -): - # First get the current hash for the fixtures database - current_hash = app_client_two_attached_databases_one_immutable.ds.databases[ - "fixtures" - ].hash[:7] - response = app_client_two_attached_databases_one_immutable.get(path) - assert response.status == 302 - location = response.headers["Location"] - assert expected_redirect.replace("HASH", current_hash) == location - - -def test_hash_parameter_ignored_for_mutable_databases(app_client): - path = "/fixtures/facetable.json?_hash=1" - response = app_client.get(path) - assert response.status == 200 - - test_json_columns_default_expected = [ {"intval": 1, "strval": "s", "floatval": 0.5, "jsonval": '{"foo": "bar"}'} ] diff --git a/tests/test_custom_pages.py b/tests/test_custom_pages.py index 66b7437a..f2cfe394 100644 --- a/tests/test_custom_pages.py +++ b/tests/test_custom_pages.py @@ -21,61 +21,61 @@ def custom_pages_client_with_base_url(): def test_custom_pages_view_name(custom_pages_client): response = custom_pages_client.get("/about") - assert 200 == response.status - assert "ABOUT! view_name:page" == response.text + assert response.status == 200 + assert response.text == "ABOUT! 
view_name:page" def test_request_is_available(custom_pages_client): response = custom_pages_client.get("/request") - assert 200 == response.status - assert "path:/request" == response.text + assert response.status == 200 + assert response.text == "path:/request" def test_custom_pages_with_base_url(custom_pages_client_with_base_url): response = custom_pages_client_with_base_url.get("/prefix/request") - assert 200 == response.status - assert "path:/prefix/request" == response.text + assert response.status == 200 + assert response.text == "path:/prefix/request" def test_custom_pages_nested(custom_pages_client): response = custom_pages_client.get("/nested/nest") - assert 200 == response.status - assert "Nest!" == response.text + assert response.status == 200 + assert response.text == "Nest!" response = custom_pages_client.get("/nested/nest2") - assert 404 == response.status + assert response.status == 404 def test_custom_status(custom_pages_client): response = custom_pages_client.get("/202") - assert 202 == response.status - assert "202!" == response.text + assert response.status == 202 + assert response.text == "202!" def test_custom_headers(custom_pages_client): response = custom_pages_client.get("/headers") - assert 200 == response.status - assert "foo" == response.headers["x-this-is-foo"] - assert "bar" == response.headers["x-this-is-bar"] - assert "FOOBAR" == response.text + assert response.status == 200 + assert response.headers["x-this-is-foo"] == "foo" + assert response.headers["x-this-is-bar"] == "bar" + assert response.text == "FOOBAR" def test_custom_content_type(custom_pages_client): response = custom_pages_client.get("/atom") - assert 200 == response.status + assert response.status == 200 assert response.headers["content-type"] == "application/xml" - assert "" == response.text + assert response.text == "" def test_redirect(custom_pages_client): response = custom_pages_client.get("/redirect") - assert 302 == response.status - assert "/example" == response.headers["Location"] + assert response.status == 302 + assert response.headers["Location"] == "/example" def test_redirect2(custom_pages_client): response = custom_pages_client.get("/redirect2") - assert 301 == response.status - assert "/example" == response.headers["Location"] + assert response.status == 301 + assert response.headers["Location"] == "/example" @pytest.mark.parametrize( diff --git a/tests/test_html.py b/tests/test_html.py index 76a8423a..6e4c22b1 100644 --- a/tests/test_html.py +++ b/tests/test_html.py @@ -5,7 +5,6 @@ from .fixtures import ( # noqa app_client_base_url_prefix, app_client_shorter_time_limit, app_client_two_attached_databases, - app_client_with_hash, make_app_client, METADATA, ) @@ -101,13 +100,6 @@ def test_not_allowed_methods(): assert response.status == 405 -def test_database_page_redirects_with_url_hash(app_client_with_hash): - response = app_client_with_hash.get("/fixtures") - assert response.status == 302 - response = app_client_with_hash.get("/fixtures", follow_redirects=True) - assert "fixtures" in response.text - - def test_database_page(app_client): response = app_client.get("/fixtures") soup = Soup(response.body, "html.parser") @@ -182,26 +174,6 @@ def test_sql_time_limit(app_client_shorter_time_limit): assert expected_html_fragment in response.text -def test_row_redirects_with_url_hash(app_client_with_hash): - response = app_client_with_hash.get("/fixtures/simple_primary_key/1") - assert response.status == 302 - assert response.headers["Location"].endswith("/1") - response = 
app_client_with_hash.get( - "/fixtures/simple_primary_key/1", follow_redirects=True - ) - assert response.status == 200 - - -def test_row_strange_table_name_with_url_hash(app_client_with_hash): - response = app_client_with_hash.get("/fixtures/table~2Fwith~2Fslashes~2Ecsv/3") - assert response.status == 302 - assert response.headers["Location"].endswith("/table~2Fwith~2Fslashes~2Ecsv/3") - response = app_client_with_hash.get( - "/fixtures/table~2Fwith~2Fslashes~2Ecsv/3", follow_redirects=True - ) - assert response.status == 200 - - def test_row_page_does_not_truncate(): with make_app_client(settings={"truncate_cells_html": 5}) as client: response = client.get("/fixtures/facetable/1") diff --git a/tests/test_internals_urls.py b/tests/test_internals_urls.py index 4307789c..d60aafcf 100644 --- a/tests/test_internals_urls.py +++ b/tests/test_internals_urls.py @@ -1,6 +1,5 @@ from datasette.app import Datasette from datasette.utils import PrefixedUrlString -from .fixtures import app_client_with_hash import pytest @@ -147,20 +146,3 @@ def test_row(ds, base_url, format, expected): actual = ds.urls.row("_memory", "facetable", "1", format=format) assert actual == expected assert isinstance(actual, PrefixedUrlString) - - -@pytest.mark.parametrize("base_url", ["/", "/prefix/"]) -def test_database_hashed(app_client_with_hash, base_url): - ds = app_client_with_hash.ds - original_base_url = ds._settings["base_url"] - try: - ds._settings["base_url"] = base_url - db_hash = ds.get_database("fixtures").hash - assert len(db_hash) == 64 - expected = f"{base_url}fixtures-{db_hash[:7]}" - assert ds.urls.database("fixtures") == expected - assert ds.urls.table("fixtures", "name") == expected + "/name" - assert ds.urls.query("fixtures", "name") == expected + "/name" - finally: - # Reset this since fixture is shared with other tests - ds._settings["base_url"] = original_base_url diff --git a/tests/test_table_api.py b/tests/test_table_api.py index 3ab369b3..3d0a7fbd 100644 --- a/tests/test_table_api.py +++ b/tests/test_table_api.py @@ -2,7 +2,6 @@ from datasette.utils import detect_json1 from datasette.utils.sqlite import sqlite_version from .fixtures import ( # noqa app_client, - app_client_with_hash, app_client_with_trace, app_client_returned_rows_matches_page_size, generate_compound_rows, @@ -41,13 +40,6 @@ def test_table_not_exists_json(app_client): } == app_client.get("/fixtures/blah.json").json -def test_jsono_redirects_to_shape_objects(app_client_with_hash): - response_1 = app_client_with_hash.get("/fixtures/simple_primary_key.jsono") - response = app_client_with_hash.get(response_1.headers["Location"]) - assert response.status == 302 - assert response.headers["Location"].endswith("?_shape=objects") - - def test_table_shape_arrays(app_client): response = app_client.get("/fixtures/simple_primary_key.json?_shape=arrays") assert [ From 8658c66438ec71edc7e9adc495f4692b937a0f57 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Fri, 18 Mar 2022 17:19:31 -0700 Subject: [PATCH 011/952] Show error if --setting hash_urls 1 used, refs #1661 --- datasette/app.py | 15 ++++++++++----- datasette/cli.py | 21 ++++++++++++++++++--- tests/test_cli.py | 7 +++++++ 3 files changed, 35 insertions(+), 8 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index 3099ada7..c1c0663d 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -118,11 +118,6 @@ SETTINGS = ( 50, "Time limit for calculating a suggested facet", ), - Setting( - "hash_urls", - False, - "Include DB file contents hash in URLs, for far-future caching", - 
), Setting( "allow_facet", True, @@ -177,6 +172,16 @@ SETTINGS = ( ), Setting("base_url", "/", "Datasette URLs should use this base path"), ) +OBSOLETE_SETTINGS = { + option.name: option + for option in ( + Setting( + "hash_urls", + False, + "The hash_urls setting has been removed, try the datasette-hashed-urls plugin instead", + ), + ) +} DEFAULT_SETTINGS = {option.name: option.default for option in SETTINGS} diff --git a/datasette/cli.py b/datasette/cli.py index 61e7ce91..b94ac192 100644 --- a/datasette/cli.py +++ b/datasette/cli.py @@ -12,7 +12,14 @@ from subprocess import call import sys from runpy import run_module import webbrowser -from .app import Datasette, DEFAULT_SETTINGS, SETTINGS, SQLITE_LIMIT_ATTACHED, pm +from .app import ( + OBSOLETE_SETTINGS, + Datasette, + DEFAULT_SETTINGS, + SETTINGS, + SQLITE_LIMIT_ATTACHED, + pm, +) from .utils import ( StartupError, check_connection, @@ -50,8 +57,12 @@ class Config(click.ParamType): return name, value = config.split(":", 1) if name not in DEFAULT_SETTINGS: + if name in OBSOLETE_SETTINGS: + msg = OBSOLETE_SETTINGS[name].help + else: + msg = f"{name} is not a valid option (--help-settings to see all)" self.fail( - f"{name} is not a valid option (--help-settings to see all)", + msg, param, ctx, ) @@ -83,8 +94,12 @@ class Setting(CompositeParamType): def convert(self, config, param, ctx): name, value = config if name not in DEFAULT_SETTINGS: + if name in OBSOLETE_SETTINGS: + msg = OBSOLETE_SETTINGS[name].help + else: + msg = f"{name} is not a valid option (--help-settings to see all)" self.fail( - f"{name} is not a valid option (--help-settings to see all)", + msg, param, ctx, ) diff --git a/tests/test_cli.py b/tests/test_cli.py index 5afe72c1..89e8d044 100644 --- a/tests/test_cli.py +++ b/tests/test_cli.py @@ -310,3 +310,10 @@ def test_help_settings(): result = runner.invoke(cli, ["--help-settings"]) for setting in SETTINGS: assert setting.name in result.output + + +def test_help_error_on_hash_urls_setting(): + runner = CliRunner() + result = runner.invoke(cli, ["--setting", "hash_urls", 1]) + assert result.exit_code == 2 + assert 'The hash_urls setting has been removed' in result.output From 9979dcd07f9921ac30c4c0b5ea60d09cd1e10556 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Fri, 18 Mar 2022 17:25:14 -0700 Subject: [PATCH 012/952] Also remove default_cache_ttl_hashed setting, refs #1661 --- datasette/app.py | 17 +++-------------- datasette/cli.py | 16 ++++++++-------- datasette/url_builder.py | 9 +-------- tests/test_api.py | 2 -- tests/test_cli.py | 7 ++++--- 5 files changed, 16 insertions(+), 35 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index c1c0663d..f52e3283 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -134,11 +134,6 @@ SETTINGS = ( 5, "Default HTTP cache TTL (used in Cache-Control: max-age= header)", ), - Setting( - "default_cache_ttl_hashed", - 365 * 24 * 60 * 60, - "Default HTTP cache TTL for hashed URL pages", - ), Setting("cache_size_kb", 0, "SQLite cache size in KB (0 == use SQLite default)"), Setting( "allow_csv_stream", @@ -172,17 +167,11 @@ SETTINGS = ( ), Setting("base_url", "/", "Datasette URLs should use this base path"), ) +_HASH_URLS_REMOVED = "The hash_urls setting has been removed, try the datasette-hashed-urls plugin instead" OBSOLETE_SETTINGS = { - option.name: option - for option in ( - Setting( - "hash_urls", - False, - "The hash_urls setting has been removed, try the datasette-hashed-urls plugin instead", - ), - ) + "hash_urls": _HASH_URLS_REMOVED, + 
"default_cache_ttl_hashed": _HASH_URLS_REMOVED, } - DEFAULT_SETTINGS = {option.name: option.default for option in SETTINGS} FAVICON_PATH = app_root / "datasette" / "static" / "favicon.png" diff --git a/datasette/cli.py b/datasette/cli.py index b94ac192..3c6e1b2c 100644 --- a/datasette/cli.py +++ b/datasette/cli.py @@ -57,10 +57,10 @@ class Config(click.ParamType): return name, value = config.split(":", 1) if name not in DEFAULT_SETTINGS: - if name in OBSOLETE_SETTINGS: - msg = OBSOLETE_SETTINGS[name].help - else: - msg = f"{name} is not a valid option (--help-settings to see all)" + msg = ( + OBSOLETE_SETTINGS.get(name) + or f"{name} is not a valid option (--help-settings to see all)" + ) self.fail( msg, param, @@ -94,10 +94,10 @@ class Setting(CompositeParamType): def convert(self, config, param, ctx): name, value = config if name not in DEFAULT_SETTINGS: - if name in OBSOLETE_SETTINGS: - msg = OBSOLETE_SETTINGS[name].help - else: - msg = f"{name} is not a valid option (--help-settings to see all)" + msg = ( + OBSOLETE_SETTINGS.get(name) + or f"{name} is not a valid option (--help-settings to see all)" + ) self.fail( msg, param, diff --git a/datasette/url_builder.py b/datasette/url_builder.py index 9f072462..498ec85d 100644 --- a/datasette/url_builder.py +++ b/datasette/url_builder.py @@ -28,14 +28,7 @@ class Urls: return self.path("-/logout") def database(self, database, format=None): - db = self.ds.databases[database] - if self.ds.setting("hash_urls") and db.hash: - path = self.path( - f"{tilde_encode(database)}-{db.hash[:HASH_LENGTH]}", format=format - ) - else: - path = self.path(tilde_encode(database), format=format) - return path + return self.path(tilde_encode(database), format=format) def table(self, database, table, format=None): path = f"{self.database(database)}/{tilde_encode(table)}" diff --git a/tests/test_api.py b/tests/test_api.py index 46e41afb..d3c94023 100644 --- a/tests/test_api.py +++ b/tests/test_api.py @@ -798,14 +798,12 @@ def test_settings_json(app_client): "allow_facet": True, "suggest_facets": True, "default_cache_ttl": 5, - "default_cache_ttl_hashed": 365 * 24 * 60 * 60, "num_sql_threads": 1, "cache_size_kb": 0, "allow_csv_stream": True, "max_csv_mb": 100, "truncate_cells_html": 2048, "force_https_urls": False, - "hash_urls": False, "template_debug": False, "trace_debug": False, "base_url": "/", diff --git a/tests/test_cli.py b/tests/test_cli.py index 89e8d044..dca65f26 100644 --- a/tests/test_cli.py +++ b/tests/test_cli.py @@ -312,8 +312,9 @@ def test_help_settings(): assert setting.name in result.output -def test_help_error_on_hash_urls_setting(): +@pytest.mark.parametrize("setting", ("hash_urls", "default_cache_ttl_hashed")) +def test_help_error_on_hash_urls_setting(setting): runner = CliRunner() - result = runner.invoke(cli, ["--setting", "hash_urls", 1]) + result = runner.invoke(cli, ["--setting", setting, 1]) assert result.exit_code == 2 - assert 'The hash_urls setting has been removed' in result.output + assert "The hash_urls setting has been removed" in result.output From 32963018e7edfab1233de7c7076c428d0e5c7813 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Fri, 18 Mar 2022 17:33:06 -0700 Subject: [PATCH 013/952] Updated documentation to remove hash_urls, refs #1661 --- docs/cli-reference.rst | 4 ---- docs/performance.rst | 18 +++++++++++------- docs/settings.rst | 27 --------------------------- 3 files changed, 11 insertions(+), 38 deletions(-) diff --git a/docs/cli-reference.rst b/docs/cli-reference.rst index 155a005d..69670d8a 100644 --- 
a/docs/cli-reference.rst +++ b/docs/cli-reference.rst @@ -142,8 +142,6 @@ datasette serve --help-settings (default=200) facet_suggest_time_limit_ms Time limit for calculating a suggested facet (default=50) - hash_urls Include DB file contents hash in URLs, for far- - future caching (default=False) allow_facet Allow users to specify columns to facet using ?_facet= parameter (default=True) allow_download Allow users to download the original SQLite @@ -152,8 +150,6 @@ datasette serve --help-settings (default=True) default_cache_ttl Default HTTP cache TTL (used in Cache-Control: max-age= header) (default=5) - default_cache_ttl_hashed Default HTTP cache TTL for hashed URL pages - (default=31536000) cache_size_kb SQLite cache size in KB (0 == use SQLite default) (default=0) allow_csv_stream Allow .csv?_stream=1 to download all rows diff --git a/docs/performance.rst b/docs/performance.rst index bcf3208e..d37f1804 100644 --- a/docs/performance.rst +++ b/docs/performance.rst @@ -60,18 +60,22 @@ The :ref:`setting_default_cache_ttl` setting sets the default HTTP cache TTL for You can also change the cache timeout on a per-request basis using the ``?_ttl=10`` query string parameter. This can be useful when you are working with the Datasette JSON API - you may decide that a specific query can be cached for a longer time, or maybe you need to set ``?_ttl=0`` for some requests for example if you are running a SQL ``order by random()`` query. -Hashed URL mode ---------------- +datasette-hashed-urls +--------------------- -When you open a database file in immutable mode using the ``-i`` option, Datasette calculates a SHA-256 hash of the contents of that file on startup. This content hash can then optionally be used to create URLs that are guaranteed to change if the contents of the file changes in the future. This results in URLs that can then be cached indefinitely by both browsers and caching proxies - an enormous potential performance optimization. +If you open a database file in immutable mode using the ``-i`` option, you can be assured that the content of that database will not change for the lifetime of the Datasette server. -You can enable these hashed URLs in two ways: using the :ref:`setting_hash_urls` configuration setting (which affects all requests to Datasette) or via the ``?_hash=1`` query string parameter (which only applies to the current request). +The `datasette-hashed-urls plugin `__ implements an optimization where your database is served with part of the SHA-256 hash of the database contents baked into the URL. -With hashed URLs enabled, any request to e.g. ``/mydatabase/mytable`` will 302 redirect to ``mydatabase-455fe3a/mytable``. The URL containing the hash will be served with a very long cache expire header - configured using :ref:`setting_default_cache_ttl_hashed` which defaults to 365 days. +A database at ``/fixtures`` will instead be served at ``/fixtures-aa7318b``, and a year-long cache expiry header will be returned with those pages. -Since these responses are cached for a long time, you may wish to build API clients against the non-hashed version of these URLs. These 302 redirects are served extremely quickly, so this should still be a performant way to work against the Datasette API. +This will then be cached by both browsers and caching proxies such as Cloudflare or Fastly, providing a potentially significant performance boost. 
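+
+As a rough illustration (a sketch, assuming a local instance on the default
+port, and remembering that your hash suffix will differ from ``aa7318b``),
+the long cache lifetime can be verified by inspecting the response headers::
+
+    curl -I http://localhost:8001/fixtures-aa7318b/facetable.json
+    # expect: Cache-Control: max-age=31536000 (one year)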
-If you run Datasette behind an `HTTP/2 server push `__ aware proxy such as Cloudflare Datasette will serve the 302 redirects in such a way that the redirected page will be efficiently "pushed" to the browser as part of the response, without the browser needing to make a second HTTP request to fetch the redirected resource. +To install the plugin, run the following:: + + datasette install datasette-hashed-urls .. note:: + Prior to Datasette 0.61 hashed URL mode was a core Datasette feature, enabled using the ``hash_urls`` setting. This implementation has now been removed in favor of the ``datasette-hashed-urls`` plugin. + Prior to Datasette 0.28 hashed URL mode was the default behaviour for Datasette, since all database files were assumed to be immutable and unchanging. From 0.28 onwards the default has been to treat database files as mutable unless explicitly configured otherwise. diff --git a/docs/settings.rst b/docs/settings.rst index da06d6a0..60c4b36d 100644 --- a/docs/settings.rst +++ b/docs/settings.rst @@ -178,17 +178,6 @@ Default HTTP caching max-age header in seconds, used for ``Cache-Control: max-ag datasette mydatabase.db --setting default_cache_ttl 60 -.. _setting_default_cache_ttl_hashed: - -default_cache_ttl_hashed -~~~~~~~~~~~~~~~~~~~~~~~~ - -Default HTTP caching max-age for responses served using using the :ref:`hashed-urls mechanism `. Defaults to 365 days (31536000 seconds). - -:: - - datasette mydatabase.db --setting default_cache_ttl_hashed 10000 - .. _setting_cache_size_kb: cache_size_kb @@ -251,22 +240,6 @@ HTTP but is served to the outside world via a proxy that enables HTTPS. datasette mydatabase.db --setting force_https_urls 1 -.. _setting_hash_urls: - -hash_urls -~~~~~~~~~ - -When enabled, this setting causes Datasette to append a content hash of the -database file to the URL path for every table and query within that database. - -When combined with far-future expire headers this ensures that queries can be -cached forever, safe in the knowledge that any modifications to the database -itself will result in new, uncached URL paths. - -:: - - datasette mydatabase.db --setting hash_urls 1 - .. 
_setting_template_debug: template_debug From 4e47a2d894b96854348343374c8e97c9d7055cf6 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Fri, 18 Mar 2022 18:37:54 -0700 Subject: [PATCH 014/952] Fixed bug where tables with a column called n caused 500 errors Closes #1228 --- datasette/facets.py | 6 +++--- tests/fixtures.py | 33 ++++++++++++++++---------------- tests/test_api.py | 1 + tests/test_csv.py | 32 +++++++++++++++---------------- tests/test_internals_database.py | 10 ++++++++++ tests/test_plugins.py | 6 ++++-- tests/test_table_api.py | 8 ++++++++ 7 files changed, 59 insertions(+), 37 deletions(-) diff --git a/datasette/facets.py b/datasette/facets.py index a1bb4a5f..b15a758c 100644 --- a/datasette/facets.py +++ b/datasette/facets.py @@ -151,10 +151,10 @@ class ColumnFacet(Facet): if column in already_enabled: continue suggested_facet_sql = """ - select {column}, count(*) as n from ( + select {column} as value, count(*) as n from ( {sql} - ) where {column} is not null - group by {column} + ) where value is not null + group by value limit {limit} """.format( column=escape_sqlite(column), sql=self.sql, limit=facet_size + 1 diff --git a/tests/fixtures.py b/tests/fixtures.py index 342a3020..e0e4ec7b 100644 --- a/tests/fixtures.py +++ b/tests/fixtures.py @@ -564,26 +564,27 @@ CREATE TABLE facetable ( tags text, complex_array text, distinct_some_null, + n text, FOREIGN KEY ("_city_id") REFERENCES [facet_cities](id) ); INSERT INTO facetable - (created, planet_int, on_earth, state, _city_id, _neighborhood, tags, complex_array, distinct_some_null) + (created, planet_int, on_earth, state, _city_id, _neighborhood, tags, complex_array, distinct_some_null, n) VALUES - ("2019-01-14 08:00:00", 1, 1, 'CA', 1, 'Mission', '["tag1", "tag2"]', '[{"foo": "bar"}]', 'one'), - ("2019-01-14 08:00:00", 1, 1, 'CA', 1, 'Dogpatch', '["tag1", "tag3"]', '[]', 'two'), - ("2019-01-14 08:00:00", 1, 1, 'CA', 1, 'SOMA', '[]', '[]', null), - ("2019-01-14 08:00:00", 1, 1, 'CA', 1, 'Tenderloin', '[]', '[]', null), - ("2019-01-15 08:00:00", 1, 1, 'CA', 1, 'Bernal Heights', '[]', '[]', null), - ("2019-01-15 08:00:00", 1, 1, 'CA', 1, 'Hayes Valley', '[]', '[]', null), - ("2019-01-15 08:00:00", 1, 1, 'CA', 2, 'Hollywood', '[]', '[]', null), - ("2019-01-15 08:00:00", 1, 1, 'CA', 2, 'Downtown', '[]', '[]', null), - ("2019-01-16 08:00:00", 1, 1, 'CA', 2, 'Los Feliz', '[]', '[]', null), - ("2019-01-16 08:00:00", 1, 1, 'CA', 2, 'Koreatown', '[]', '[]', null), - ("2019-01-16 08:00:00", 1, 1, 'MI', 3, 'Downtown', '[]', '[]', null), - ("2019-01-17 08:00:00", 1, 1, 'MI', 3, 'Greektown', '[]', '[]', null), - ("2019-01-17 08:00:00", 1, 1, 'MI', 3, 'Corktown', '[]', '[]', null), - ("2019-01-17 08:00:00", 1, 1, 'MI', 3, 'Mexicantown', '[]', '[]', null), - ("2019-01-17 08:00:00", 2, 0, 'MC', 4, 'Arcadia Planitia', '[]', '[]', null) + ("2019-01-14 08:00:00", 1, 1, 'CA', 1, 'Mission', '["tag1", "tag2"]', '[{"foo": "bar"}]', 'one', 'n1'), + ("2019-01-14 08:00:00", 1, 1, 'CA', 1, 'Dogpatch', '["tag1", "tag3"]', '[]', 'two', 'n2'), + ("2019-01-14 08:00:00", 1, 1, 'CA', 1, 'SOMA', '[]', '[]', null, null), + ("2019-01-14 08:00:00", 1, 1, 'CA', 1, 'Tenderloin', '[]', '[]', null, null), + ("2019-01-15 08:00:00", 1, 1, 'CA', 1, 'Bernal Heights', '[]', '[]', null, null), + ("2019-01-15 08:00:00", 1, 1, 'CA', 1, 'Hayes Valley', '[]', '[]', null, null), + ("2019-01-15 08:00:00", 1, 1, 'CA', 2, 'Hollywood', '[]', '[]', null, null), + ("2019-01-15 08:00:00", 1, 1, 'CA', 2, 'Downtown', '[]', '[]', null, null), + ("2019-01-16 08:00:00", 1, 1, 'CA', 2, 
'Los Feliz', '[]', '[]', null, null), + ("2019-01-16 08:00:00", 1, 1, 'CA', 2, 'Koreatown', '[]', '[]', null, null), + ("2019-01-16 08:00:00", 1, 1, 'MI', 3, 'Downtown', '[]', '[]', null, null), + ("2019-01-17 08:00:00", 1, 1, 'MI', 3, 'Greektown', '[]', '[]', null, null), + ("2019-01-17 08:00:00", 1, 1, 'MI', 3, 'Corktown', '[]', '[]', null, null), + ("2019-01-17 08:00:00", 1, 1, 'MI', 3, 'Mexicantown', '[]', '[]', null, null), + ("2019-01-17 08:00:00", 2, 0, 'MC', 4, 'Arcadia Planitia', '[]', '[]', null, null) ; CREATE TABLE binary_data ( diff --git a/tests/test_api.py b/tests/test_api.py index d3c94023..421bb1fe 100644 --- a/tests/test_api.py +++ b/tests/test_api.py @@ -210,6 +210,7 @@ def test_database_page(app_client): "tags", "complex_array", "distinct_some_null", + "n", ], "primary_keys": ["pk"], "count": 15, diff --git a/tests/test_csv.py b/tests/test_csv.py index 8749cd8b..7fc25a09 100644 --- a/tests/test_csv.py +++ b/tests/test_csv.py @@ -24,22 +24,22 @@ world ) EXPECTED_TABLE_WITH_LABELS_CSV = """ -pk,created,planet_int,on_earth,state,_city_id,_city_id_label,_neighborhood,tags,complex_array,distinct_some_null -1,2019-01-14 08:00:00,1,1,CA,1,San Francisco,Mission,"[""tag1"", ""tag2""]","[{""foo"": ""bar""}]",one -2,2019-01-14 08:00:00,1,1,CA,1,San Francisco,Dogpatch,"[""tag1"", ""tag3""]",[],two -3,2019-01-14 08:00:00,1,1,CA,1,San Francisco,SOMA,[],[], -4,2019-01-14 08:00:00,1,1,CA,1,San Francisco,Tenderloin,[],[], -5,2019-01-15 08:00:00,1,1,CA,1,San Francisco,Bernal Heights,[],[], -6,2019-01-15 08:00:00,1,1,CA,1,San Francisco,Hayes Valley,[],[], -7,2019-01-15 08:00:00,1,1,CA,2,Los Angeles,Hollywood,[],[], -8,2019-01-15 08:00:00,1,1,CA,2,Los Angeles,Downtown,[],[], -9,2019-01-16 08:00:00,1,1,CA,2,Los Angeles,Los Feliz,[],[], -10,2019-01-16 08:00:00,1,1,CA,2,Los Angeles,Koreatown,[],[], -11,2019-01-16 08:00:00,1,1,MI,3,Detroit,Downtown,[],[], -12,2019-01-17 08:00:00,1,1,MI,3,Detroit,Greektown,[],[], -13,2019-01-17 08:00:00,1,1,MI,3,Detroit,Corktown,[],[], -14,2019-01-17 08:00:00,1,1,MI,3,Detroit,Mexicantown,[],[], -15,2019-01-17 08:00:00,2,0,MC,4,Memnonia,Arcadia Planitia,[],[], +pk,created,planet_int,on_earth,state,_city_id,_city_id_label,_neighborhood,tags,complex_array,distinct_some_null,n +1,2019-01-14 08:00:00,1,1,CA,1,San Francisco,Mission,"[""tag1"", ""tag2""]","[{""foo"": ""bar""}]",one,n1 +2,2019-01-14 08:00:00,1,1,CA,1,San Francisco,Dogpatch,"[""tag1"", ""tag3""]",[],two,n2 +3,2019-01-14 08:00:00,1,1,CA,1,San Francisco,SOMA,[],[],, +4,2019-01-14 08:00:00,1,1,CA,1,San Francisco,Tenderloin,[],[],, +5,2019-01-15 08:00:00,1,1,CA,1,San Francisco,Bernal Heights,[],[],, +6,2019-01-15 08:00:00,1,1,CA,1,San Francisco,Hayes Valley,[],[],, +7,2019-01-15 08:00:00,1,1,CA,2,Los Angeles,Hollywood,[],[],, +8,2019-01-15 08:00:00,1,1,CA,2,Los Angeles,Downtown,[],[],, +9,2019-01-16 08:00:00,1,1,CA,2,Los Angeles,Los Feliz,[],[],, +10,2019-01-16 08:00:00,1,1,CA,2,Los Angeles,Koreatown,[],[],, +11,2019-01-16 08:00:00,1,1,MI,3,Detroit,Downtown,[],[],, +12,2019-01-17 08:00:00,1,1,MI,3,Detroit,Greektown,[],[],, +13,2019-01-17 08:00:00,1,1,MI,3,Detroit,Corktown,[],[],, +14,2019-01-17 08:00:00,1,1,MI,3,Detroit,Mexicantown,[],[],, +15,2019-01-17 08:00:00,2,0,MC,4,Memnonia,Arcadia Planitia,[],[],, """.lstrip().replace( "\n", "\r\n" ) diff --git a/tests/test_internals_database.py b/tests/test_internals_database.py index 31538a24..551f67e1 100644 --- a/tests/test_internals_database.py +++ b/tests/test_internals_database.py @@ -86,6 +86,7 @@ async def test_table_exists(db, tables, exists): "tags", 
"complex_array", "distinct_some_null", + "n", ], ), ( @@ -204,6 +205,15 @@ async def test_table_columns(db, table, expected): is_pk=0, hidden=0, ), + Column( + cid=10, + name="n", + type="text", + notnull=0, + default_value=None, + is_pk=0, + hidden=0, + ), ], ), ( diff --git a/tests/test_plugins.py b/tests/test_plugins.py index 656f39e4..15bde962 100644 --- a/tests/test_plugins.py +++ b/tests/test_plugins.py @@ -442,6 +442,7 @@ def test_hook_register_output_renderer_all_parameters(app_client): "tags", "complex_array", "distinct_some_null", + "n", ], "rows": [ "", @@ -460,7 +461,7 @@ def test_hook_register_output_renderer_all_parameters(app_client): "", "", ], - "sql": "select pk, created, planet_int, on_earth, state, _city_id, _neighborhood, tags, complex_array, distinct_some_null from facetable order by pk limit 51", + "sql": "select pk, created, planet_int, on_earth, state, _city_id, _neighborhood, tags, complex_array, distinct_some_null, n from facetable order by pk limit 51", "query_name": None, "database": "fixtures", "table": "facetable", @@ -531,8 +532,9 @@ def test_hook_register_output_renderer_can_render(app_client): "tags", "complex_array", "distinct_some_null", + "n", ], - "sql": "select pk, created, planet_int, on_earth, state, _city_id, _neighborhood, tags, complex_array, distinct_some_null from facetable order by pk limit 51", + "sql": "select pk, created, planet_int, on_earth, state, _city_id, _neighborhood, tags, complex_array, distinct_some_null, n from facetable order by pk limit 51", "query_name": None, "database": "fixtures", "table": "facetable", diff --git a/tests/test_table_api.py b/tests/test_table_api.py index 3d0a7fbd..9db383c3 100644 --- a/tests/test_table_api.py +++ b/tests/test_table_api.py @@ -532,6 +532,7 @@ def test_table_filter_json_arraycontains(app_client): '["tag1", "tag2"]', '[{"foo": "bar"}]', "one", + "n1", ], [ 2, @@ -544,6 +545,7 @@ def test_table_filter_json_arraycontains(app_client): '["tag1", "tag3"]', "[]", "two", + "n2", ], ] @@ -565,6 +567,7 @@ def test_table_filter_json_arraynotcontains(app_client): '["tag1", "tag2"]', '[{"foo": "bar"}]', "one", + "n1", ] ] @@ -585,6 +588,7 @@ def test_table_filter_extra_where(app_client): '["tag1", "tag3"]', "[]", "two", + "n2", ] ] == response.json["rows"] @@ -958,6 +962,7 @@ def test_expand_labels(app_client): "tags": '["tag1", "tag3"]', "complex_array": "[]", "distinct_some_null": "two", + "n": "n2", }, "13": { "pk": 13, @@ -970,6 +975,7 @@ def test_expand_labels(app_client): "tags": "[]", "complex_array": "[]", "distinct_some_null": None, + "n": None, }, } == response.json @@ -1161,6 +1167,7 @@ def test_generated_columns_are_visible_in_datasette(): "tags", "complex_array", "distinct_some_null", + "n", ], ), ( @@ -1188,6 +1195,7 @@ def test_generated_columns_are_visible_in_datasette(): "tags", "complex_array", "distinct_some_null", + "n", ], ), ( From 711767bcd3c1e76a0861fe7f24069ff1c8efc97a Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Fri, 18 Mar 2022 21:03:08 -0700 Subject: [PATCH 015/952] Refactored URL routing to add tests, closes #1666 Refs #1660 --- datasette/app.py | 54 ++++++++++++++++++++----------------- datasette/utils/__init__.py | 8 ++++++ tests/test_routes.py | 34 +++++++++++++++++++++++ 3 files changed, 72 insertions(+), 24 deletions(-) create mode 100644 tests/test_routes.py diff --git a/datasette/app.py b/datasette/app.py index f52e3283..8987112c 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -60,6 +60,7 @@ from .utils import ( module_from_path, parse_metadata, 
resolve_env_secrets, + resolve_routes, to_css_class, ) from .utils.asgi import ( @@ -974,8 +975,7 @@ class Datasette: output.append(script) return output - def app(self): - """Returns an ASGI app function that serves the whole of Datasette""" + def _routes(self): routes = [] for routes_to_add in pm.hook.register_routes(datasette=self): @@ -1099,6 +1099,15 @@ class Datasette: + renderer_regex + r")?$", ) + return [ + # Compile any strings to regular expressions + ((re.compile(pattern) if isinstance(pattern, str) else pattern), view) + for pattern, view in routes + ] + + def app(self): + """Returns an ASGI app function that serves the whole of Datasette""" + routes = self._routes() self._register_custom_units() async def setup_db(): @@ -1129,12 +1138,7 @@ class Datasette: class DatasetteRouter: def __init__(self, datasette, routes): self.ds = datasette - routes = routes or [] - self.routes = [ - # Compile any strings to regular expressions - ((re.compile(pattern) if isinstance(pattern, str) else pattern), view) - for pattern, view in routes - ] + self.routes = routes or [] # Build a list of pages/blah/{name}.html matching expressions pattern_templates = [ filepath @@ -1187,22 +1191,24 @@ class DatasetteRouter: break scope_modifications["actor"] = actor or default_actor scope = dict(scope, **scope_modifications) - for regex, view in self.routes: - match = regex.match(path) - if match is not None: - new_scope = dict(scope, url_route={"kwargs": match.groupdict()}) - request.scope = new_scope - try: - response = await view(request, send) - if response: - self.ds._write_messages_to_response(request, response) - await response.asgi_send(send) - return - except NotFound as exception: - return await self.handle_404(request, send, exception) - except Exception as exception: - return await self.handle_500(request, send, exception) - return await self.handle_404(request, send) + + match, view = resolve_routes(self.routes, path) + + if match is None: + return await self.handle_404(request, send) + + new_scope = dict(scope, url_route={"kwargs": match.groupdict()}) + request.scope = new_scope + try: + response = await view(request, send) + if response: + self.ds._write_messages_to_response(request, response) + await response.asgi_send(send) + return + except NotFound as exception: + return await self.handle_404(request, send, exception) + except Exception as exception: + return await self.handle_500(request, send, exception) async def handle_404(self, request, send, exception=None): # If path contains % encoding, redirect to tilde encoding diff --git a/datasette/utils/__init__.py b/datasette/utils/__init__.py index bd591459..ccdf8ad4 100644 --- a/datasette/utils/__init__.py +++ b/datasette/utils/__init__.py @@ -1178,3 +1178,11 @@ def tilde_decode(s: str) -> str: s = s.replace("%", temp) decoded = urllib.parse.unquote(s.replace("~", "%")) return decoded.replace(temp, "%") + + +def resolve_routes(routes, path): + for regex, view in routes: + match = regex.match(path) + if match is not None: + return match, view + return None, None diff --git a/tests/test_routes.py b/tests/test_routes.py new file mode 100644 index 00000000..a1960f14 --- /dev/null +++ b/tests/test_routes.py @@ -0,0 +1,34 @@ +from datasette.app import Datasette +from datasette.utils import resolve_routes +import pytest + + +@pytest.fixture(scope="session") +def routes(): + ds = Datasette() + return ds._routes() + + +@pytest.mark.parametrize( + "path,expected", + ( + ("/", "IndexView"), + ("/foo", "DatabaseView"), + ("/foo.csv", 
"DatabaseView"), + ("/foo.json", "DatabaseView"), + ("/foo.humbug", "DatabaseView"), + ("/foo/humbug", "TableView"), + ("/foo/humbug.json", "TableView"), + ("/foo/humbug.blah", "TableView"), + ("/foo/humbug/1", "RowView"), + ("/foo/humbug/1.json", "RowView"), + ("/-/metadata.json", "JsonDataView"), + ("/-/metadata", "JsonDataView"), + ), +) +def test_routes(routes, path, expected): + match, view = resolve_routes(routes, path) + if expected is None: + assert match is None + else: + assert view.view_class.__name__ == expected From 764738dfcb16cd98b0987d443f59d5baa9d3c332 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sat, 19 Mar 2022 09:30:22 -0700 Subject: [PATCH 016/952] test_routes also now asserts matches, refs #1666 --- tests/test_routes.py | 41 +++++++++++++++++++++++++---------------- 1 file changed, 25 insertions(+), 16 deletions(-) diff --git a/tests/test_routes.py b/tests/test_routes.py index a1960f14..6718c232 100644 --- a/tests/test_routes.py +++ b/tests/test_routes.py @@ -10,25 +10,34 @@ def routes(): @pytest.mark.parametrize( - "path,expected", + "path,expected_class,expected_matches", ( - ("/", "IndexView"), - ("/foo", "DatabaseView"), - ("/foo.csv", "DatabaseView"), - ("/foo.json", "DatabaseView"), - ("/foo.humbug", "DatabaseView"), - ("/foo/humbug", "TableView"), - ("/foo/humbug.json", "TableView"), - ("/foo/humbug.blah", "TableView"), - ("/foo/humbug/1", "RowView"), - ("/foo/humbug/1.json", "RowView"), - ("/-/metadata.json", "JsonDataView"), - ("/-/metadata", "JsonDataView"), + ("/", "IndexView", {"as_format": ""}), + ("/foo", "DatabaseView", {"as_format": None, "db_name": "foo"}), + ("/foo.csv", "DatabaseView", {"as_format": ".csv", "db_name": "foo"}), + ("/foo.json", "DatabaseView", {"as_format": ".json", "db_name": "foo"}), + ("/foo.humbug", "DatabaseView", {"as_format": None, "db_name": "foo.humbug"}), + ("/foo/humbug", "TableView", {"db_name": "foo", "table": "humbug"}), + ("/foo/humbug.json", "TableView", {"db_name": "foo", "table": "humbug"}), + ("/foo/humbug.blah", "TableView", {"db_name": "foo", "table": "humbug"}), + ( + "/foo/humbug/1", + "RowView", + {"as_format": None, "db_name": "foo", "pk_path": "1", "table": "humbug"}, + ), + ( + "/foo/humbug/1.json", + "RowView", + {"as_format": ".json", "db_name": "foo", "pk_path": "1", "table": "humbug"}, + ), + ("/-/metadata.json", "JsonDataView", {"as_format": ".json"}), + ("/-/metadata", "JsonDataView", {"as_format": ""}), ), ) -def test_routes(routes, path, expected): +def test_routes(routes, path, expected_class, expected_matches): match, view = resolve_routes(routes, path) - if expected is None: + if expected_class is None: assert match is None else: - assert view.view_class.__name__ == expected + assert view.view_class.__name__ == expected_class + assert match.groupdict() == expected_matches From 61419388c134001118aaf7dfb913562d467d7913 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sat, 19 Mar 2022 09:52:08 -0700 Subject: [PATCH 017/952] Rename route match groups for consistency, refs #1667, #1660 --- datasette/app.py | 28 ++++++++++++---------------- datasette/blob_renderer.py | 4 ++-- datasette/views/base.py | 2 +- datasette/views/database.py | 6 +++--- datasette/views/index.py | 2 +- datasette/views/special.py | 2 +- datasette/views/table.py | 8 ++++---- tests/test_routes.py | 24 ++++++++++++------------ 8 files changed, 36 insertions(+), 40 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index 8987112c..5259c50c 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -988,7 +988,7 @@ 
class Datasette:
 
         # Generate a regex snippet to match all registered renderer file extensions
         renderer_regex = "|".join(r"\." + key for key in self.renderers.keys())
 
-        add_route(IndexView.as_view(self), r"/(?P<as_format>(\.jsono?)?$)")
+        add_route(IndexView.as_view(self), r"/(?P<format>(\.jsono?)?$)")
 
         # TODO: /favicon.ico and /-/static/ deserve far-future cache expires
         add_route(favicon, "/favicon.ico")
@@ -1020,21 +1020,21 @@ class Datasette:
         )
         add_route(
             JsonDataView.as_view(self, "metadata.json", lambda: self.metadata()),
-            r"/-/metadata(?P<as_format>(\.json)?)$",
+            r"/-/metadata(?P<format>(\.json)?)$",
         )
         add_route(
             JsonDataView.as_view(self, "versions.json", self._versions),
-            r"/-/versions(?P<as_format>(\.json)?)$",
+            r"/-/versions(?P<format>(\.json)?)$",
         )
         add_route(
             JsonDataView.as_view(
                 self, "plugins.json", self._plugins, needs_request=True
             ),
-            r"/-/plugins(?P<as_format>(\.json)?)$",
+            r"/-/plugins(?P<format>(\.json)?)$",
         )
         add_route(
             JsonDataView.as_view(self, "settings.json", lambda: self._settings),
-            r"/-/settings(?P<as_format>(\.json)?)$",
+            r"/-/settings(?P<format>(\.json)?)$",
         )
         add_route(
             permanent_redirect("/-/settings.json"),
@@ -1046,15 +1046,15 @@ class Datasette:
         )
         add_route(
             JsonDataView.as_view(self, "threads.json", self._threads),
-            r"/-/threads(?P<as_format>(\.json)?)$",
+            r"/-/threads(?P<format>(\.json)?)$",
         )
         add_route(
             JsonDataView.as_view(self, "databases.json", self._connected_databases),
-            r"/-/databases(?P<as_format>(\.json)?)$",
+            r"/-/databases(?P<format>(\.json)?)$",
         )
         add_route(
             JsonDataView.as_view(self, "actor.json", self._actor, needs_request=True),
-            r"/-/actor(?P<as_format>(\.json)?)$",
+            r"/-/actor(?P<format>(\.json)?)$",
         )
         add_route(
             AuthTokenView.as_view(self),
@@ -1080,22 +1080,18 @@ class Datasette:
             PatternPortfolioView.as_view(self),
             r"/-/patterns$",
         )
-        add_route(
-            DatabaseDownload.as_view(self), r"/(?P<db_name>[^/]+?)(?P<as_db>\.db)$"
-        )
+        add_route(DatabaseDownload.as_view(self), r"/(?P<database>[^/]+?)\.db$")
         add_route(
             DatabaseView.as_view(self),
-            r"/(?P<db_name>[^/]+?)(?P<as_format>" + renderer_regex + r"|.jsono|\.csv)?$",
+            r"/(?P<database>[^/]+?)(?P<format>" + renderer_regex + r"|.jsono|\.csv)?$",
         )
         add_route(
             TableView.as_view(self),
-            r"/(?P<db_name>[^/]+)/(?P<table>
[^\/\.]+)(\.[a-zA-Z0-9_]+)?$",
+            r"/(?P<database>[^/]+)/(?P<table>
[^\/\.]+)(\.[a-zA-Z0-9_]+)?$",
         )
         add_route(
             RowView.as_view(self),
-            r"/(?P<db_name>[^/]+)/(?P<table>
[^/]+?)/(?P<pk_path>[^/]+?)(?P<as_format>" + renderer_regex + r")?$",
+            r"/(?P<database>[^/]+)/(?P<table>
[^/]+?)/(?P<pks>[^/]+?)(?P<format>" + renderer_regex + r")?$",
         )
         return [
             # Compile any strings to regular expressions
diff --git a/datasette/blob_renderer.py b/datasette/blob_renderer.py
index 217b3638..4d8c6bea 100644
--- a/datasette/blob_renderer.py
+++ b/datasette/blob_renderer.py
@@ -34,8 +34,8 @@ async def render_blob(datasette, database, rows, columns, request, table, view_n
     filename_bits = []
     if table:
         filename_bits.append(to_css_class(table))
-    if "pk_path" in request.url_vars:
-        filename_bits.append(request.url_vars["pk_path"])
+    if "pks" in request.url_vars:
+        filename_bits.append(request.url_vars["pks"])
     filename_bits.append(to_css_class(blob_column))
     if blob_hash:
         filename_bits.append(blob_hash[:6])
diff --git a/datasette/views/base.py b/datasette/views/base.py
index e31beb19..0bbf98bb 100644
--- a/datasette/views/base.py
+++ b/datasette/views/base.py
@@ -381,7 +381,7 @@ class DataView(BaseView):
         return None
 
     async def get(self, request):
-        db_name = request.url_vars["db_name"]
+        db_name = request.url_vars["database"]
         database = tilde_decode(db_name)
         _format = self.get_format(request)
         data_kwargs = {}
diff --git a/datasette/views/database.py b/datasette/views/database.py
index 48635e01..93bd1011 100644
--- a/datasette/views/database.py
+++ b/datasette/views/database.py
@@ -32,7 +32,7 @@ class DatabaseView(DataView):
     name = "database"
 
     async def data(self, request, default_labels=False, _size=None):
-        database = tilde_decode(request.url_vars["db_name"])
+        database = tilde_decode(request.url_vars["database"])
         await self.check_permissions(
             request,
             [
@@ -162,7 +162,7 @@ class DatabaseDownload(DataView):
     name = "database_download"
 
     async def get(self, request):
-        database = tilde_decode(request.url_vars["db_name"])
+        database = tilde_decode(request.url_vars["database"])
         await self.check_permissions(
             request,
             [
@@ -205,7 +205,7 @@ class QueryView(DataView):
         named_parameters=None,
         write=False,
     ):
-        database = tilde_decode(request.url_vars["db_name"])
+        database = tilde_decode(request.url_vars["database"])
         params = {key: request.args.get(key) for key in request.args}
         if "sql" in params:
             params.pop("sql")
diff --git a/datasette/views/index.py b/datasette/views/index.py
index 311a49db..f5e31181 100644
--- a/datasette/views/index.py
+++ b/datasette/views/index.py
@@ -19,7 +19,7 @@ class IndexView(BaseView):
     name = "index"
 
     async def get(self, request):
-        as_format = request.url_vars["as_format"]
+        as_format = request.url_vars["format"]
         await self.check_permission(request, "view-instance")
         databases = []
         for name, db in self.ds.databases.items():
diff --git a/datasette/views/special.py b/datasette/views/special.py
index c7b5061f..395ee587 100644
--- a/datasette/views/special.py
+++ b/datasette/views/special.py
@@ -15,7 +15,7 @@ class JsonDataView(BaseView):
         self.needs_request = needs_request
 
     async def get(self, request):
-        as_format = request.url_vars["as_format"]
+        as_format = request.url_vars["format"]
         await self.check_permission(request, "view-instance")
         if self.needs_request:
             data = self.data_callback(request)
diff --git a/datasette/views/table.py b/datasette/views/table.py
index 8bdc7417..ea4f24b7 100644
--- a/datasette/views/table.py
+++ b/datasette/views/table.py
@@ -272,7 +272,7 @@ class
TableView(RowTableShared): _next=None, _size=None, ): - database = tilde_decode(request.url_vars["db_name"]) + database = tilde_decode(request.url_vars["database"]) table = tilde_decode(request.url_vars["table"]) try: db = self.ds.databases[database] @@ -938,7 +938,7 @@ class RowView(RowTableShared): name = "row" async def data(self, request, default_labels=False): - database = tilde_decode(request.url_vars["db_name"]) + database = tilde_decode(request.url_vars["database"]) table = tilde_decode(request.url_vars["table"]) await self.check_permissions( request, @@ -948,7 +948,7 @@ class RowView(RowTableShared): "view-instance", ], ) - pk_values = urlsafe_components(request.url_vars["pk_path"]) + pk_values = urlsafe_components(request.url_vars["pks"]) db = self.ds.databases[database] sql, params, pks = await _sql_params_pks(db, table, pk_values) results = await db.execute(sql, params, truncate=True) diff --git a/tests/test_routes.py b/tests/test_routes.py index 6718c232..349ac302 100644 --- a/tests/test_routes.py +++ b/tests/test_routes.py @@ -12,26 +12,26 @@ def routes(): @pytest.mark.parametrize( "path,expected_class,expected_matches", ( - ("/", "IndexView", {"as_format": ""}), - ("/foo", "DatabaseView", {"as_format": None, "db_name": "foo"}), - ("/foo.csv", "DatabaseView", {"as_format": ".csv", "db_name": "foo"}), - ("/foo.json", "DatabaseView", {"as_format": ".json", "db_name": "foo"}), - ("/foo.humbug", "DatabaseView", {"as_format": None, "db_name": "foo.humbug"}), - ("/foo/humbug", "TableView", {"db_name": "foo", "table": "humbug"}), - ("/foo/humbug.json", "TableView", {"db_name": "foo", "table": "humbug"}), - ("/foo/humbug.blah", "TableView", {"db_name": "foo", "table": "humbug"}), + ("/", "IndexView", {"format": ""}), + ("/foo", "DatabaseView", {"format": None, "database": "foo"}), + ("/foo.csv", "DatabaseView", {"format": ".csv", "database": "foo"}), + ("/foo.json", "DatabaseView", {"format": ".json", "database": "foo"}), + ("/foo.humbug", "DatabaseView", {"format": None, "database": "foo.humbug"}), + ("/foo/humbug", "TableView", {"database": "foo", "table": "humbug"}), + ("/foo/humbug.json", "TableView", {"database": "foo", "table": "humbug"}), + ("/foo/humbug.blah", "TableView", {"database": "foo", "table": "humbug"}), ( "/foo/humbug/1", "RowView", - {"as_format": None, "db_name": "foo", "pk_path": "1", "table": "humbug"}, + {"format": None, "database": "foo", "pks": "1", "table": "humbug"}, ), ( "/foo/humbug/1.json", "RowView", - {"as_format": ".json", "db_name": "foo", "pk_path": "1", "table": "humbug"}, + {"format": ".json", "database": "foo", "pks": "1", "table": "humbug"}, ), - ("/-/metadata.json", "JsonDataView", {"as_format": ".json"}), - ("/-/metadata", "JsonDataView", {"as_format": ""}), + ("/-/metadata.json", "JsonDataView", {"format": ".json"}), + ("/-/metadata", "JsonDataView", {"format": ""}), ), ) def test_routes(routes, path, expected_class, expected_matches): From b9c2b1cfc8692b9700416db98721fa3ec982f6be Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sat, 19 Mar 2022 13:29:10 -0700 Subject: [PATCH 018/952] Consistent treatment of format in route capturing, refs #1667 Also refs #1660 --- datasette/app.py | 30 ++++++++++++------------------ tests/test_api.py | 4 ++-- tests/test_routes.py | 32 ++++++++++++++++++++++---------- 3 files changed, 36 insertions(+), 30 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index 5259c50c..edef34e9 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -985,10 +985,7 @@ class Datasette: def add_route(view, 
regex):
            routes.append((regex, view))

-        # Generate a regex snippet to match all registered renderer file extensions
-        renderer_regex = "|".join(r"\." + key for key in self.renderers.keys())
-
-        add_route(IndexView.as_view(self), r"/(?P<format>(\.jsono?)?$)")
+        add_route(IndexView.as_view(self), r"/(\.(?P<format>jsono?))?$")

         # TODO: /favicon.ico and /-/static/ deserve far-future cache expires
         add_route(favicon, "/favicon.ico")
         add_route(
@@ -1020,21 +1017,21 @@ class Datasette:
         )
         add_route(
             JsonDataView.as_view(self, "metadata.json", lambda: self.metadata()),
-            r"/-/metadata(?P<format>(\.json)?)$",
+            r"/-/metadata(\.(?P<format>json))?$",
         )
         add_route(
             JsonDataView.as_view(self, "versions.json", self._versions),
-            r"/-/versions(?P<format>(\.json)?)$",
+            r"/-/versions(\.(?P<format>json))?$",
         )
         add_route(
             JsonDataView.as_view(
                 self, "plugins.json", self._plugins, needs_request=True
             ),
-            r"/-/plugins(?P<format>(\.json)?)$",
+            r"/-/plugins(\.(?P<format>json))?$",
         )
         add_route(
             JsonDataView.as_view(self, "settings.json", lambda: self._settings),
-            r"/-/settings(?P<format>(\.json)?)$",
+            r"/-/settings(\.(?P<format>json))?$",
         )
         add_route(
             permanent_redirect("/-/settings.json"),
@@ -1046,15 +1043,15 @@ class Datasette:
         )
         add_route(
             JsonDataView.as_view(self, "threads.json", self._threads),
-            r"/-/threads(?P<format>(\.json)?)$",
+            r"/-/threads(\.(?P<format>json))?$",
         )
         add_route(
             JsonDataView.as_view(self, "databases.json", self._connected_databases),
-            r"/-/databases(?P<format>(\.json)?)$",
+            r"/-/databases(\.(?P<format>json))?$",
         )
         add_route(
             JsonDataView.as_view(self, "actor.json", self._actor, needs_request=True),
-            r"/-/actor(?P<format>(\.json)?)$",
+            r"/-/actor(\.(?P<format>json))?$",
         )
         add_route(
             AuthTokenView.as_view(self),
@@ -1080,20 +1077,17 @@ class Datasette:
             PatternPortfolioView.as_view(self),
             r"/-/patterns$",
         )
-        add_route(DatabaseDownload.as_view(self), r"/(?P<database>[^/]+?)\.db$")
+        add_route(DatabaseDownload.as_view(self), r"/(?P<database>[^\/\.]+)\.db$")
         add_route(
-            DatabaseView.as_view(self),
-            r"/(?P<database>[^/]+?)(?P<format>" + renderer_regex + r"|.jsono|\.csv)?$",
+            DatabaseView.as_view(self), r"/(?P<database>[^\/\.]+)(\.(?P<format>\w+))?$"
         )
         add_route(
             TableView.as_view(self),
-            r"/(?P<database>[^/]+)/(?P<table>[^\/\.]+)(\.[a-zA-Z0-9_]+)?$",
+            r"/(?P<database>[^\/\.]+)/(?P<table>[^\/\.]+)(\.(?P<format>\w+))?$",
         )
         add_route(
             RowView.as_view(self),
-            r"/(?P<database>[^\/\.]+)/(?P<table>[^/]+?)/(?P<pks>[^/]+?)(?P<format>"
-            + renderer_regex
-            + r")?$",
+            r"/(?P<database>[^\/\.]+)/(?P<table>[^/]+?)/(?P<pks>[^/]+?)(\.(?P<format>\w+))?$",
         )
         return [
             # Compile any strings to regular expressions
diff --git a/tests/test_api.py b/tests/test_api.py
index 421bb1fe..253c1718 100644
--- a/tests/test_api.py
+++ b/tests/test_api.py
@@ -629,8 +629,8 @@ def test_old_memory_urls_redirect(app_client_no_files, path, expected_redirect):

 def test_database_page_for_database_with_dot_in_name(app_client_with_dot):
-    response = app_client_with_dot.get("/fixtures.dot.json")
-    assert 200 == response.status
+    response = app_client_with_dot.get("/fixtures~2Edot.json")
+    assert response.status == 200

 def test_custom_sql(app_client):
diff --git a/tests/test_routes.py b/tests/test_routes.py
index 349ac302..1fa55018 100644
--- a/tests/test_routes.py
+++ b/tests/test_routes.py
@@ -12,14 +12,26 @@ def routes():
 @pytest.mark.parametrize(
     "path,expected_class,expected_matches",
     (
-        ("/", "IndexView", {"format": ""}),
+        ("/", "IndexView", {"format": None}),
         ("/foo", "DatabaseView", {"format": None, "database": "foo"}),
-        ("/foo.csv", "DatabaseView", {"format": ".csv", "database": "foo"}),
-        ("/foo.json", "DatabaseView", {"format": ".json", "database": "foo"}),
-        ("/foo.humbug", "DatabaseView", {"format": None, "database": "foo.humbug"}),
-        ("/foo/humbug", "TableView", {"database": "foo", "table": "humbug"}),
-        ("/foo/humbug.json", "TableView", {"database": "foo", "table": "humbug"}),
-        ("/foo/humbug.blah", "TableView", {"database": "foo", "table": "humbug"}),
+        ("/foo.csv", "DatabaseView", {"format": "csv", "database": "foo"}),
+        ("/foo.json", "DatabaseView", {"format": "json", "database": "foo"}),
+        ("/foo.humbug", "DatabaseView", {"format": "humbug", "database": "foo"}),
+        (
+            "/foo/humbug",
+            "TableView",
+            {"database": "foo", "table": "humbug", "format": None},
+        ),
+        (
+            "/foo/humbug.json",
+            "TableView",
+            {"database": "foo", "table": "humbug", "format": "json"},
+        ),
+        (
+            "/foo/humbug.blah",
+            "TableView",
+            {"database": "foo", "table": "humbug", "format": "blah"},
+        ),
         (
             "/foo/humbug/1",
             "RowView",
@@ -28,10 +40,10 @@ def routes():
         (
             "/foo/humbug/1.json",
             "RowView",
-            {"format": ".json", "database": "foo", "pks": "1", "table": "humbug"},
+            {"format": "json", "database": "foo", "pks": "1", "table": "humbug"},
         ),
-        ("/-/metadata.json", "JsonDataView", {"format": ".json"}),
-        ("/-/metadata", "JsonDataView", {"format": ""}),
+        ("/-/metadata.json", "JsonDataView", {"format": "json"}),
+        ("/-/metadata", "JsonDataView", {"format": None}),
     ),
 )
 def test_routes(routes, path, expected_class, expected_matches):

From 798f075ef9b98819fdb564f9f79c78975a0f71e8 Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Sat, 19 Mar 2022 13:32:29 -0700
Subject: [PATCH 019/952] Read format from route captures, closes #1667

Refs #1660
---
 datasette/utils/__init__.py | 20 --------------------
 datasette/views/base.py     | 12 +-----------
 tests/test_utils.py         | 25 -------------------------
 3 files changed, 1 insertion(+), 56 deletions(-)

diff --git a/datasette/utils/__init__.py b/datasette/utils/__init__.py
index ccdf8ad4..c89b9d23 100644
--- a/datasette/utils/__init__.py
+++ b/datasette/utils/__init__.py
@@ -731,26 +731,6 @@ def module_from_path(path, name):
     return mod

-async def resolve_table_and_format(
-    table_and_format, table_exists, allowed_formats=None
-):
-    if allowed_formats is None:
-        allowed_formats = []
-    if "." 
in table_and_format: - # Check if a table exists with this exact name - it_exists = await table_exists(table_and_format) - if it_exists: - return table_and_format, None - - # Check if table ends with a known format - formats = list(allowed_formats) + ["csv", "jsono"] - for _format in formats: - if table_and_format.endswith(f".{_format}"): - table = table_and_format[: -(len(_format) + 1)] - return table, _format - return table_and_format, None - - def path_with_format( *, request=None, path=None, format=None, extra_qs=None, replace_format=None ): diff --git a/datasette/views/base.py b/datasette/views/base.py index 0bbf98bb..24e97d95 100644 --- a/datasette/views/base.py +++ b/datasette/views/base.py @@ -19,12 +19,10 @@ from datasette.utils import ( LimitedWriter, call_with_supported_arguments, tilde_decode, - tilde_encode, path_from_row_pks, path_with_added_args, path_with_removed_args, path_with_format, - resolve_table_and_format, sqlite3, HASH_LENGTH, ) @@ -372,18 +370,10 @@ class DataView(BaseView): return AsgiStream(stream_fn, headers=headers, content_type=content_type) - def get_format(self, request): - # Format is the bit from the path following the ., if one exists - last_path_component = request.path.split("/")[-1] - if "." in last_path_component: - return last_path_component.split(".")[-1] - else: - return None - async def get(self, request): db_name = request.url_vars["database"] database = tilde_decode(db_name) - _format = self.get_format(request) + _format = request.url_vars["format"] data_kwargs = {} if _format == "csv": diff --git a/tests/test_utils.py b/tests/test_utils.py index 790aadc7..7b41a87f 100644 --- a/tests/test_utils.py +++ b/tests/test_utils.py @@ -351,31 +351,6 @@ def test_compound_keys_after_sql(): ) -async def table_exists(table): - return table == "exists.csv" - - -@pytest.mark.asyncio -@pytest.mark.parametrize( - "table_and_format,expected_table,expected_format", - [ - ("blah", "blah", None), - ("blah.csv", "blah", "csv"), - ("blah.json", "blah", "json"), - ("blah.baz", "blah.baz", None), - ("exists.csv", "exists.csv", None), - ], -) -async def test_resolve_table_and_format( - table_and_format, expected_table, expected_format -): - actual_table, actual_format = await utils.resolve_table_and_format( - table_and_format, table_exists, ["json"] - ) - assert expected_table == actual_table - assert expected_format == actual_format - - def test_table_columns(): conn = sqlite3.connect(":memory:") conn.executescript( From 7a6654a253dee243518dc542ce4c06dbb0d0801d Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sat, 19 Mar 2022 17:11:17 -0700 Subject: [PATCH 020/952] Databases can now have a .route separate from their .name, refs #1668 --- datasette/app.py | 13 ++++++-- datasette/database.py | 1 + datasette/views/base.py | 12 +++++-- datasette/views/database.py | 18 ++++++----- datasette/views/table.py | 29 ++++++++++++----- docs/internals.rst | 11 ++++--- tests/test_internals_datasette.py | 1 + tests/test_routes.py | 52 ++++++++++++++++++++++++++++++- 8 files changed, 111 insertions(+), 26 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index edef34e9..5c8101a3 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -388,13 +388,18 @@ class Datasette: def unsign(self, signed, namespace="default"): return URLSafeSerializer(self._secret, namespace).loads(signed) - def get_database(self, name=None): + def get_database(self, name=None, route=None): + if route is not None: + matches = [db for db in self.databases.values() if db.route == route] + if not 
matches: + raise KeyError + return matches[0] if name is None: - # Return first no-_schemas database + # Return first database that isn't "_internal" name = [key for key in self.databases.keys() if key != "_internal"][0] return self.databases[name] - def add_database(self, db, name=None): + def add_database(self, db, name=None, route=None): new_databases = self.databases.copy() if name is None: # Pick a unique name for this database @@ -407,6 +412,7 @@ class Datasette: name = "{}_{}".format(suggestion, i) i += 1 db.name = name + db.route = route or name new_databases[name] = db # don't mutate! that causes race conditions with live import self.databases = new_databases @@ -693,6 +699,7 @@ class Datasette: return [ { "name": d.name, + "route": d.route, "path": d.path, "size": d.size, "is_mutable": d.is_mutable, diff --git a/datasette/database.py b/datasette/database.py index 6ce87215..ba594a8c 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -31,6 +31,7 @@ class Database: self, ds, path=None, is_mutable=False, is_memory=False, memory_name=None ): self.name = None + self.route = None self.ds = ds self.path = path self.is_mutable = is_mutable diff --git a/datasette/views/base.py b/datasette/views/base.py index 24e97d95..afa9eaa6 100644 --- a/datasette/views/base.py +++ b/datasette/views/base.py @@ -371,13 +371,19 @@ class DataView(BaseView): return AsgiStream(stream_fn, headers=headers, content_type=content_type) async def get(self, request): - db_name = request.url_vars["database"] - database = tilde_decode(db_name) + database_route = tilde_decode(request.url_vars["database"]) + + try: + db = self.ds.get_database(route=database_route) + except KeyError: + raise NotFound("Database not found: {}".format(database_route)) + database = db.name + _format = request.url_vars["format"] data_kwargs = {} if _format == "csv": - return await self.as_csv(request, database) + return await self.as_csv(request, database_route) if _format is None: # HTML views default to expanding all foreign key labels diff --git a/datasette/views/database.py b/datasette/views/database.py index 93bd1011..2563c5b2 100644 --- a/datasette/views/database.py +++ b/datasette/views/database.py @@ -32,7 +32,13 @@ class DatabaseView(DataView): name = "database" async def data(self, request, default_labels=False, _size=None): - database = tilde_decode(request.url_vars["database"]) + database_route = tilde_decode(request.url_vars["database"]) + try: + db = self.ds.get_database(route=database_route) + except KeyError: + raise NotFound("Database not found: {}".format(database_route)) + database = db.name + await self.check_permissions( request, [ @@ -50,11 +56,6 @@ class DatabaseView(DataView): request, sql, _size=_size, metadata=metadata ) - try: - db = self.ds.databases[database] - except KeyError: - raise NotFound("Database not found: {}".format(database)) - table_counts = await db.table_counts(5) hidden_table_names = set(await db.hidden_table_names()) all_foreign_keys = await db.get_all_foreign_keys() @@ -171,9 +172,10 @@ class DatabaseDownload(DataView): "view-instance", ], ) - if database not in self.ds.databases: + try: + db = self.ds.get_database(route=database) + except KeyError: raise DatasetteError("Invalid database", status=404) - db = self.ds.databases[database] if db.is_memory: raise DatasetteError("Cannot download in-memory databases", status=404) if not self.ds.setting("allow_download") or db.is_mutable: diff --git a/datasette/views/table.py b/datasette/views/table.py index ea4f24b7..7fa1da3a 100644 --- 
a/datasette/views/table.py +++ b/datasette/views/table.py @@ -272,10 +272,15 @@ class TableView(RowTableShared): name = "table" async def post(self, request): - db_name = tilde_decode(request.url_vars["database"]) + database_route = tilde_decode(request.url_vars["database"]) + try: + db = self.ds.get_database(route=database_route) + except KeyError: + raise NotFound("Database not found: {}".format(database_route)) + database = db.name table = tilde_decode(request.url_vars["table"]) # Handle POST to a canned query - canned_query = await self.ds.get_canned_query(db_name, table, request.actor) + canned_query = await self.ds.get_canned_query(database, table, request.actor) assert canned_query, "You may only POST to a canned query" return await QueryView(self.ds).data( request, @@ -327,12 +332,13 @@ class TableView(RowTableShared): _next=None, _size=None, ): - database = tilde_decode(request.url_vars["database"]) + database_route = tilde_decode(request.url_vars["database"]) table = tilde_decode(request.url_vars["table"]) try: - db = self.ds.databases[database] + db = self.ds.get_database(route=database_route) except KeyError: - raise NotFound("Database not found: {}".format(database)) + raise NotFound("Database not found: {}".format(database_route)) + database = db.name # If this is a canned query, not a table, then dispatch to QueryView instead canned_query = await self.ds.get_canned_query(database, table, request.actor) @@ -938,8 +944,13 @@ class RowView(RowTableShared): name = "row" async def data(self, request, default_labels=False): - database = tilde_decode(request.url_vars["database"]) + database_route = tilde_decode(request.url_vars["database"]) table = tilde_decode(request.url_vars["table"]) + try: + db = self.ds.get_database(route=database_route) + except KeyError: + raise NotFound("Database not found: {}".format(database_route)) + database = db.name await self.check_permissions( request, [ @@ -949,7 +960,11 @@ class RowView(RowTableShared): ], ) pk_values = urlsafe_components(request.url_vars["pks"]) - db = self.ds.databases[database] + try: + db = self.ds.get_database(route=database_route) + except KeyError: + raise NotFound("Database not found: {}".format(database_route)) + database = db.name sql, params, pks = await _sql_params_pks(db, table, pk_values) results = await db.execute(sql, params, truncate=True) columns = [r[0] for r in results.description] diff --git a/docs/internals.rst b/docs/internals.rst index 117cb95c..323256c7 100644 --- a/docs/internals.rst +++ b/docs/internals.rst @@ -307,14 +307,17 @@ Returns the specified database object. Raises a ``KeyError`` if the database doe .. _datasette_add_database: -.add_database(db, name=None) ----------------------------- +.add_database(db, name=None, route=None) +---------------------------------------- ``db`` - datasette.database.Database instance The database to be attached. ``name`` - string, optional - The name to be used for this database - this will be used in the URL path, e.g. ``/dbname``. If not specified Datasette will pick one based on the filename or memory name. + The name to be used for this database . If not specified Datasette will pick one based on the filename or memory name. + +``route`` - string, optional + This will be used in the URL path. If not specified, it will default to the same thing as the ``name``. The ``datasette.add_database(db)`` method lets you add a new database to the current Datasette instance. 
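As a quick illustration of the ``route`` parameter documented in the hunk above, here is a minimal sketch of attaching an in-memory database under a custom URL path. It follows the test fixture pattern used later in this series; the specific names are hypothetical:

.. code-block:: python

    from datasette.app import Datasette, Database

    ds = Datasette()
    db = Database(ds, is_memory=True, memory_name="stats-db")
    # "stats" is the internal database name; "2022-stats" is what
    # appears in URLs, so tables are served from /2022-stats/...
    ds.add_database(db, name="stats", route="2022-stats")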
@@ -371,7 +374,7 @@ Using either of these pattern will result in the in-memory database being served ``name`` - string The name of the database to be removed. -This removes a database that has been previously added. ``name=`` is the unique name of that database, used in its URL path. +This removes a database that has been previously added. ``name=`` is the unique name of that database. .. _datasette_sign: diff --git a/tests/test_internals_datasette.py b/tests/test_internals_datasette.py index adf84be9..cc200a2d 100644 --- a/tests/test_internals_datasette.py +++ b/tests/test_internals_datasette.py @@ -55,6 +55,7 @@ async def test_datasette_constructor(): assert databases == [ { "name": "_memory", + "route": "_memory", "path": None, "size": 0, "is_mutable": False, diff --git a/tests/test_routes.py b/tests/test_routes.py index 1fa55018..dd3bc644 100644 --- a/tests/test_routes.py +++ b/tests/test_routes.py @@ -1,6 +1,7 @@ -from datasette.app import Datasette +from datasette.app import Datasette, Database from datasette.utils import resolve_routes import pytest +import pytest_asyncio @pytest.fixture(scope="session") @@ -53,3 +54,52 @@ def test_routes(routes, path, expected_class, expected_matches): else: assert view.view_class.__name__ == expected_class assert match.groupdict() == expected_matches + + +@pytest_asyncio.fixture +async def ds_with_route(): + ds = Datasette() + ds.remove_database("_memory") + db = Database(ds, is_memory=True, memory_name="route-name-db") + ds.add_database(db, name="name", route="route-name") + await db.execute_write_script( + """ + create table if not exists t (id integer primary key); + insert or replace into t (id) values (1); + """ + ) + return ds + + +@pytest.mark.asyncio +async def test_db_with_route_databases(ds_with_route): + response = await ds_with_route.client.get("/-/databases.json") + assert response.json()[0] == { + "name": "name", + "route": "route-name", + "path": None, + "size": 0, + "is_mutable": True, + "is_memory": True, + "hash": None, + } + + +@pytest.mark.asyncio +@pytest.mark.parametrize( + "path,expected_status", + ( + ("/", 200), + ("/name", 404), + ("/name/t", 404), + ("/name/t/1", 404), + ("/route-name", 200), + ("/route-name/t", 200), + ("/route-name/t/1", 200), + ), +) +async def test_db_with_route_that_does_not_match_name( + ds_with_route, path, expected_status +): + response = await ds_with_route.client.get(path) + assert response.status_code == expected_status From e10da9af3595c0a4e09c6f370103571aa4ea106e Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sat, 19 Mar 2022 17:21:56 -0700 Subject: [PATCH 021/952] alternative-route demo, refs #1668 --- .github/workflows/deploy-latest.yml | 13 ++++++++++++- 1 file changed, 12 insertions(+), 1 deletion(-) diff --git a/.github/workflows/deploy-latest.yml b/.github/workflows/deploy-latest.yml index 1ae96e89..92aa1c6b 100644 --- a/.github/workflows/deploy-latest.yml +++ b/.github/workflows/deploy-latest.yml @@ -42,6 +42,17 @@ jobs: sphinx-build -b xml . _build sphinx-to-sqlite ../docs.db _build cd .. 
+ - name: Set up the alternate-route demo + run: | + echo ' + from datasette import hookimpl + + @hookimpl + def startup(datasette): + db = datasette.get_database("fixtures2") + db.route = "alternative-route" + ' > plugins/alternative_route.py + cp fixtures.db fixtures2.db - name: Set up Cloud Run uses: google-github-actions/setup-gcloud@master with: @@ -54,7 +65,7 @@ jobs: gcloud config set project datasette-222320 export SUFFIX="-${GITHUB_REF#refs/heads/}" export SUFFIX=${SUFFIX#-main} - datasette publish cloudrun fixtures.db extra_database.db \ + datasette publish cloudrun fixtures.db fixtures2.db extra_database.db \ -m fixtures.json \ --plugins-dir=plugins \ --branch=$GITHUB_SHA \ From cdbae2b93f441653616dd889644c63e4150ceec1 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sat, 19 Mar 2022 17:31:23 -0700 Subject: [PATCH 022/952] Fixed internal links to respect db.route, refs #1668 --- datasette/url_builder.py | 3 ++- datasette/views/table.py | 5 ++--- tests/test_routes.py | 22 +++++++++++++--------- 3 files changed, 17 insertions(+), 13 deletions(-) diff --git a/datasette/url_builder.py b/datasette/url_builder.py index 498ec85d..574bf3c1 100644 --- a/datasette/url_builder.py +++ b/datasette/url_builder.py @@ -28,7 +28,8 @@ class Urls: return self.path("-/logout") def database(self, database, format=None): - return self.path(tilde_encode(database), format=format) + db = self.ds.get_database(database) + return self.path(tilde_encode(db.route), format=format) def table(self, database, table, format=None): path = f"{self.database(database)}/{tilde_encode(table)}" diff --git a/datasette/views/table.py b/datasette/views/table.py index 7fa1da3a..8745c28a 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -141,10 +141,9 @@ class RowTableShared(DataView): "is_special_link_column": is_special_link_column, "raw": pk_path, "value": markupsafe.Markup( - '{flat_pks}'.format( + '{flat_pks}'.format( base_url=base_url, - database=database, - table=tilde_encode(table), + table_path=self.ds.urls.table(database, table), flat_pks=str(markupsafe.escape(pk_path)), flat_pks_quoted=path_from_row_pks(row, pks, not pks), ) diff --git a/tests/test_routes.py b/tests/test_routes.py index dd3bc644..211b77b5 100644 --- a/tests/test_routes.py +++ b/tests/test_routes.py @@ -61,7 +61,7 @@ async def ds_with_route(): ds = Datasette() ds.remove_database("_memory") db = Database(ds, is_memory=True, memory_name="route-name-db") - ds.add_database(db, name="name", route="route-name") + ds.add_database(db, name="original-name", route="custom-route-name") await db.execute_write_script( """ create table if not exists t (id integer primary key); @@ -75,8 +75,8 @@ async def ds_with_route(): async def test_db_with_route_databases(ds_with_route): response = await ds_with_route.client.get("/-/databases.json") assert response.json()[0] == { - "name": "name", - "route": "route-name", + "name": "original-name", + "route": "custom-route-name", "path": None, "size": 0, "is_mutable": True, @@ -90,12 +90,12 @@ async def test_db_with_route_databases(ds_with_route): "path,expected_status", ( ("/", 200), - ("/name", 404), - ("/name/t", 404), - ("/name/t/1", 404), - ("/route-name", 200), - ("/route-name/t", 200), - ("/route-name/t/1", 200), + ("/original-name", 404), + ("/original-name/t", 404), + ("/original-name/t/1", 404), + ("/custom-route-name", 200), + ("/custom-route-name/t", 200), + ("/custom-route-name/t/1", 200), ), ) async def test_db_with_route_that_does_not_match_name( @@ -103,3 +103,7 @@ async def 
test_db_with_route_that_does_not_match_name(
 ):
     response = await ds_with_route.client.get(path)
     assert response.status_code == expected_status
+    # There should be links to custom-route-name but none to original-name
+    if response.status_code == 200:
+        assert "/custom-route-name" in response.text
+        assert "/original-name" not in response.text

From 5471e3c4914837de957e206d8fb80c9ec383bc2e Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Sat, 19 Mar 2022 18:14:40 -0700
Subject: [PATCH 023/952] Release 0.61a0

Refs #957, #1533, #1545, #1576, #1577, #1587, #1601, #1603, #1607, #1612,
 #1621, #1649, #1654, #1657, #1661, #1668
---
 datasette/version.py |  2 +-
 docs/changelog.rst   | 29 +++++++++++++++++++++++++++--
 docs/performance.rst |  2 ++
 3 files changed, 30 insertions(+), 3 deletions(-)

diff --git a/datasette/version.py b/datasette/version.py
index 91224615..ccc1e04b 100644
--- a/datasette/version.py
+++ b/datasette/version.py
@@ -1,2 +1,2 @@
-__version__ = "0.60.2"
+__version__ = "0.61a0"
 __version_info__ = tuple(__version__.split("."))

diff --git a/docs/changelog.rst b/docs/changelog.rst
index c58c8444..0f3d3aff 100644
--- a/docs/changelog.rst
+++ b/docs/changelog.rst
@@ -4,14 +4,39 @@ Changelog
 =========

-.. _v0_60.2:
+.. _v0_61_a0:
+
+0.61a0 (2022-03-19)
+-------------------
+
+- Removed hashed URL mode from Datasette. The new ``datasette-hashed-urls`` plugin can be used to achieve the same result, see :ref:`performance_hashed_urls` for details. (:issue:`1661`)
+- Databases can now have a custom path within the Datasette instance that is indpendent of the database name, using the ``db.route`` property. (:issue:`1668`)
+- URLs within Datasette now use a different encoding scheme for tables or databases that include "special" characters outside of the range of ``a-zA-Z0-9_-``. This scheme is explained here: :ref:`internals_tilde_encoding`. (:issue:`1657`)
+- Table and row HTML pages now include a ``<link rel="alternate" type="application/json+datasette" href="...">`` element and return a ``Link: URL; rel="alternate"; type="application/json+datasette"`` HTTP header pointing to the JSON version of those pages. (:issue:`1533`)
+- ``Access-Control-Expose-Headers: Link`` is now added to the CORS headers, allowing remote JavaScript to access that header.
+- Canned queries are now shown at the top of the database page, directly below the SQL editor. Previously they were shown at the bottom, below the list of tables. (:issue:`1612`)
+- Datasette now has a default favicon. (:issue:`1603`)
+- ``sqlite_stat`` tables are now hidden by default. (:issue:`1587`)
+- SpatiaLite tables ``data_licenses``, ``KNN`` and ``KNN2`` are now hidden by default. (:issue:`1601`)
+- Python 3.6 is no longer supported. (:issue:`1577`)
+- Tests now run against Python 3.11-dev. (:issue:`1621`)
+- Fixed bug where :ref:`custom pages <custom_pages>` did not work on Windows. Thanks, Robert Christie. (:issue:`1545`)
+- SQL query tracing mechanism now works for queries executed in ``asyncio`` sub-tasks, such as those created by ``asyncio.gather()``. (:issue:`1576`)
+- :ref:`internals_tracer` mechanism is now documented.
+- Common Datasette symbols can now be imported directly from the top-level ``datasette`` package, see :ref:`internals_shortcuts`. Those symbols are ``Response``, ``Forbidden``, ``NotFound``, ``hookimpl``, ``actor_matches_allow``. (:issue:`957`)
+- ``/-/versions`` page now returns additional details for libraries used by SpatiaLite. (:issue:`1607`)
+- Documentation now links to the `Datasette Tutorials <https://datasette.io/tutorials>`__. 
+- Datasette will now also look for SpatiaLite in ``/opt/homebrew`` - thanks, Dan Peterson. (`#1649 `__) +- Datasette is now covered by a `Code of Conduct `__. (:issue:`1654`) + +.. _v0_60_2: 0.60.2 (2022-02-07) ------------------- - Fixed a bug where Datasette would open the same file twice with two different database names if you ran ``datasette file.db file.db``. (:issue:`1632`) -.. _v0_60.1: +.. _v0_60_1: 0.60.1 (2022-01-20) ------------------- diff --git a/docs/performance.rst b/docs/performance.rst index d37f1804..89bbf5ae 100644 --- a/docs/performance.rst +++ b/docs/performance.rst @@ -60,6 +60,8 @@ The :ref:`setting_default_cache_ttl` setting sets the default HTTP cache TTL for You can also change the cache timeout on a per-request basis using the ``?_ttl=10`` query string parameter. This can be useful when you are working with the Datasette JSON API - you may decide that a specific query can be cached for a longer time, or maybe you need to set ``?_ttl=0`` for some requests for example if you are running a SQL ``order by random()`` query. +.. _performance_hashed_urls: + datasette-hashed-urls --------------------- From cb4854a435cc1418665edec2a73664ad74a32017 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sat, 19 Mar 2022 18:17:58 -0700 Subject: [PATCH 024/952] Fixed typo --- docs/changelog.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/changelog.rst b/docs/changelog.rst index 0f3d3aff..9f5a143c 100644 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -10,7 +10,7 @@ Changelog ------------------- - Removed hashed URL mode from Datasette. The new ``datasette-hashed-urls`` plugin can be used to achieve the same result, see :ref:`performance_hashed_urls` for details. (:issue:`1661`) -- Databases can now have a custom path within the Datasette instance that is indpendent of the database name, using the ``db.route`` property. (:issue:`1668`) +- Databases can now have a custom path within the Datasette instance that is independent of the database name, using the ``db.route`` property. (:issue:`1668`) - URLs within Datasette now use a different encoding scheme for tables or databases that include "special" characters outside of the range of ``a-zA-Z0-9_-``. This scheme is explained here: :ref:`internals_tilde_encoding`. (:issue:`1657`) - Table and row HTML pages now include a ```` element and return a ``Link: URL; rel="alternate"; type="application/json+datasette"`` HTTP header pointing to the JSON version of those pages. (:issue:`1533`) - ``Access-Control-Expose-Headers: Link`` is now added to the CORS headers, allowing remote JavaScript to access that header. From 4a4164b81191dec35e423486a208b05a9edc65e4 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sat, 19 Mar 2022 18:23:03 -0700 Subject: [PATCH 025/952] Added another note to the 0.61a0 release notes, refs #1228 --- docs/changelog.rst | 1 + 1 file changed, 1 insertion(+) diff --git a/docs/changelog.rst b/docs/changelog.rst index 9f5a143c..05ad85f2 100644 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -28,6 +28,7 @@ Changelog - Documentation now links to the `Datasette Tutorials `__. - Datasette will now also look for SpatiaLite in ``/opt/homebrew`` - thanks, Dan Peterson. (`#1649 `__) - Datasette is now covered by a `Code of Conduct `__. (:issue:`1654`) +- Fixed error caused when a table had a column named ``n``. (:issue:`1228`) .. 
_v0_60_2: From e627510b760198ccedba9e5af47a771e847785c9 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 21 Mar 2022 10:13:16 -0700 Subject: [PATCH 026/952] BaseView.check_permissions is now datasette.ensure_permissions, closes #1675 Refs #1660 --- datasette/app.py | 35 +++++++++++++++++++++++++++++++++++ datasette/views/base.py | 26 -------------------------- datasette/views/database.py | 12 ++++++------ datasette/views/table.py | 8 ++++---- docs/internals.rst | 26 ++++++++++++++++++++++++++ 5 files changed, 71 insertions(+), 36 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index 5c8101a3..9e509e96 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -1,4 +1,5 @@ import asyncio +from typing import Sequence, Union, Tuple import asgi_csrf import collections import datetime @@ -628,6 +629,40 @@ class Datasette: ) return result + async def ensure_permissions( + self, + actor: dict, + permissions: Sequence[Union[Tuple[str, Union[str, Tuple[str, str]]], str]], + ): + """ + permissions is a list of (action, resource) tuples or 'action' strings + + Raises datasette.Forbidden() if any of the checks fail + """ + for permission in permissions: + if isinstance(permission, str): + action = permission + resource = None + elif isinstance(permission, (tuple, list)) and len(permission) == 2: + action, resource = permission + else: + assert ( + False + ), "permission should be string or tuple of two items: {}".format( + repr(permission) + ) + ok = await self.permission_allowed( + actor, + action, + resource=resource, + default=None, + ) + if ok is not None: + if ok: + return + else: + raise Forbidden(action) + async def execute( self, db_name, diff --git a/datasette/views/base.py b/datasette/views/base.py index afa9eaa6..d1e684a2 100644 --- a/datasette/views/base.py +++ b/datasette/views/base.py @@ -76,32 +76,6 @@ class BaseView: if not ok: raise Forbidden(action) - async def check_permissions(self, request, permissions): - """permissions is a list of (action, resource) tuples or 'action' strings""" - for permission in permissions: - if isinstance(permission, str): - action = permission - resource = None - elif isinstance(permission, (tuple, list)) and len(permission) == 2: - action, resource = permission - else: - assert ( - False - ), "permission should be string or tuple of two items: {}".format( - repr(permission) - ) - ok = await self.ds.permission_allowed( - request.actor, - action, - resource=resource, - default=None, - ) - if ok is not None: - if ok: - return - else: - raise Forbidden(action) - def database_color(self, database): return "ff0000" diff --git a/datasette/views/database.py b/datasette/views/database.py index 2563c5b2..69ed1233 100644 --- a/datasette/views/database.py +++ b/datasette/views/database.py @@ -39,8 +39,8 @@ class DatabaseView(DataView): raise NotFound("Database not found: {}".format(database_route)) database = db.name - await self.check_permissions( - request, + await self.ds.ensure_permissions( + request.actor, [ ("view-database", database), "view-instance", @@ -164,8 +164,8 @@ class DatabaseDownload(DataView): async def get(self, request): database = tilde_decode(request.url_vars["database"]) - await self.check_permissions( - request, + await self.ds.ensure_permissions( + request.actor, [ ("view-database-download", database), ("view-database", database), @@ -217,8 +217,8 @@ class QueryView(DataView): private = False if canned_query: # Respect canned query permissions - await self.check_permissions( - request, + await self.ds.ensure_permissions( + 
request.actor,
             [
                 ("view-query", (database, canned_query)),
                 ("view-database", database),
diff --git a/datasette/views/table.py b/datasette/views/table.py
index 8745c28a..84169820 100644
--- a/datasette/views/table.py
+++ b/datasette/views/table.py
@@ -360,8 +360,8 @@ class TableView(RowTableShared):
             raise NotFound(f"Table not found: {table}")

         # Ensure user has permission to view this table
-        await self.check_permissions(
-            request,
+        await self.ds.ensure_permissions(
+            request.actor,
             [
                 ("view-table", (database, table)),
                 ("view-database", database),
@@ -950,8 +950,8 @@ class RowView(RowTableShared):
         except KeyError:
             raise NotFound("Database not found: {}".format(database_route))
         database = db.name
-        await self.check_permissions(
-            request,
+        await self.ds.ensure_permissions(
+            request.actor,
             [
                 ("view-table", (database, table)),
                 ("view-database", database),
diff --git a/docs/internals.rst b/docs/internals.rst
index 323256c7..12adde00 100644
--- a/docs/internals.rst
+++ b/docs/internals.rst
@@ -295,6 +295,32 @@ If neither ``metadata.json`` nor any of the plugins provide an answer to the per

 See :ref:`permissions` for a full list of permission actions included in Datasette core.

+.. _datasette_permission_allowed:
+
+await .ensure_permissions(actor, permissions)
+---------------------------------------------
+
+``actor`` - dictionary
+    The authenticated actor. This is usually ``request.actor``.
+
+``permissions`` - list
+    A list of permissions to check. Each permission in that list can be a string ``action`` name or a 2-tuple of ``(action, resource)``.
+
+This method allows multiple permissions to be checked at once. It raises a ``datasette.Forbidden`` exception if any of the checks are denied before one of them is explicitly granted.
+
+This is useful when you need to check multiple permissions at once. For example, an actor should be able to view a table if either one of the following checks returns ``True`` or not a single one of them returns ``False``:
+
+.. code-block:: python
+
+    await self.ds.ensure_permissions(
+        request.actor,
+        [
+            ("view-table", (database, table)),
+            ("view-database", database),
+            "view-instance",
+        ]
+    )
+
 .. 
_datasette_get_database: .get_database(name) From dfafce6d962d615d98a7080e546c7b3662ae7d34 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 21 Mar 2022 11:37:27 -0700 Subject: [PATCH 027/952] Display no-opinion permission checks on /-/permissions --- datasette/templates/permissions_debug.html | 5 +++++ 1 file changed, 5 insertions(+) diff --git a/datasette/templates/permissions_debug.html b/datasette/templates/permissions_debug.html index d898ea8c..db709c14 100644 --- a/datasette/templates/permissions_debug.html +++ b/datasette/templates/permissions_debug.html @@ -10,6 +10,9 @@ .check-result-false { color: red; } +.check-result-no-opinion { + color: #aaa; +} .check h2 { font-size: 1em } @@ -38,6 +41,8 @@ {{ check.when }} {% if check.result %} + {% elif check.result is none %} + none {% else %} {% endif %} From 194e4f6c3fffde69eb196f8535ca45386b40ec2d Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 21 Mar 2022 11:41:56 -0700 Subject: [PATCH 028/952] Removed check_permission() from BaseView, closes #1677 Refs #1660 --- datasette/app.py | 1 + datasette/views/base.py | 10 ---------- datasette/views/database.py | 2 +- datasette/views/index.py | 2 +- datasette/views/special.py | 10 +++++----- tests/test_permissions.py | 13 ++++++++----- 6 files changed, 16 insertions(+), 22 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index 9e509e96..22ae211f 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -639,6 +639,7 @@ class Datasette: Raises datasette.Forbidden() if any of the checks fail """ + assert actor is None or isinstance(actor, dict) for permission in permissions: if isinstance(permission, str): action = permission diff --git a/datasette/views/base.py b/datasette/views/base.py index d1e684a2..221e1882 100644 --- a/datasette/views/base.py +++ b/datasette/views/base.py @@ -66,16 +66,6 @@ class BaseView: response.body = b"" return response - async def check_permission(self, request, action, resource=None): - ok = await self.ds.permission_allowed( - request.actor, - action, - resource=resource, - default=True, - ) - if not ok: - raise Forbidden(action) - def database_color(self, database): return "ff0000" diff --git a/datasette/views/database.py b/datasette/views/database.py index 69ed1233..31a1839f 100644 --- a/datasette/views/database.py +++ b/datasette/views/database.py @@ -229,7 +229,7 @@ class QueryView(DataView): None, "view-query", (database, canned_query), default=True ) else: - await self.check_permission(request, "execute-sql", database) + await self.ds.ensure_permissions(request.actor, [("execute-sql", database)]) # Extract any :named parameters named_parameters = named_parameters or await derive_named_parameters( diff --git a/datasette/views/index.py b/datasette/views/index.py index f5e31181..1c391e26 100644 --- a/datasette/views/index.py +++ b/datasette/views/index.py @@ -20,7 +20,7 @@ class IndexView(BaseView): async def get(self, request): as_format = request.url_vars["format"] - await self.check_permission(request, "view-instance") + await self.ds.ensure_permissions(request.actor, ["view-instance"]) databases = [] for name, db in self.ds.databases.items(): visible, database_private = await check_visibility( diff --git a/datasette/views/special.py b/datasette/views/special.py index 395ee587..dd834528 100644 --- a/datasette/views/special.py +++ b/datasette/views/special.py @@ -16,7 +16,7 @@ class JsonDataView(BaseView): async def get(self, request): as_format = request.url_vars["format"] - await self.check_permission(request, "view-instance") + 
await self.ds.ensure_permissions(request.actor, ["view-instance"]) if self.needs_request: data = self.data_callback(request) else: @@ -47,7 +47,7 @@ class PatternPortfolioView(BaseView): has_json_alternate = False async def get(self, request): - await self.check_permission(request, "view-instance") + await self.ds.ensure_permissions(request.actor, ["view-instance"]) return await self.render(["patterns.html"], request=request) @@ -95,7 +95,7 @@ class PermissionsDebugView(BaseView): has_json_alternate = False async def get(self, request): - await self.check_permission(request, "view-instance") + await self.ds.ensure_permissions(request.actor, ["view-instance"]) if not await self.ds.permission_allowed(request.actor, "permissions-debug"): raise Forbidden("Permission denied") return await self.render( @@ -146,11 +146,11 @@ class MessagesDebugView(BaseView): has_json_alternate = False async def get(self, request): - await self.check_permission(request, "view-instance") + await self.ds.ensure_permissions(request.actor, ["view-instance"]) return await self.render(["messages_debug.html"], request) async def post(self, request): - await self.check_permission(request, "view-instance") + await self.ds.ensure_permissions(request.actor, ["view-instance"]) post = await request.post_vars() message = post.get("message", "") message_type = post.get("message_type") or "INFO" diff --git a/tests/test_permissions.py b/tests/test_permissions.py index 788523b0..f4169dbe 100644 --- a/tests/test_permissions.py +++ b/tests/test_permissions.py @@ -321,17 +321,20 @@ def test_permissions_debug(app_client): checks = [ { "action": div.select_one(".check-action").text, - "result": bool(div.select(".check-result-true")), + # True = green tick, False = red cross, None = gray None + "result": None + if div.select(".check-result-no-opinion") + else bool(div.select(".check-result-true")), "used_default": bool(div.select(".check-used-default")), } for div in check_divs ] - assert [ + assert checks == [ {"action": "permissions-debug", "result": True, "used_default": False}, - {"action": "view-instance", "result": True, "used_default": True}, + {"action": "view-instance", "result": None, "used_default": True}, {"action": "permissions-debug", "result": False, "used_default": True}, - {"action": "view-instance", "result": True, "used_default": True}, - ] == checks + {"action": "view-instance", "result": None, "used_default": True}, + ] @pytest.mark.parametrize( From 1a7750eb29fd15dd2eea3b9f6e33028ce441b143 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 21 Mar 2022 12:01:37 -0700 Subject: [PATCH 029/952] Documented datasette.check_visibility() method, closes #1678 --- datasette/app.py | 18 ++++++++++++++++++ datasette/utils/__init__.py | 19 ------------------- datasette/views/database.py | 10 +++------- datasette/views/index.py | 11 ++++------- docs/internals.rst | 28 +++++++++++++++++++++++++++- 5 files changed, 52 insertions(+), 34 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index 22ae211f..c9eede26 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -664,6 +664,24 @@ class Datasette: else: raise Forbidden(action) + async def check_visibility(self, actor, action, resource): + """Returns (visible, private) - visible = can you see it, private = can others see it too""" + visible = await self.permission_allowed( + actor, + action, + resource=resource, + default=True, + ) + if not visible: + return False, False + private = not await self.permission_allowed( + None, + action, + resource=resource, + 
default=True, + ) + return visible, private + async def execute( self, db_name, diff --git a/datasette/utils/__init__.py b/datasette/utils/__init__.py index c89b9d23..cd8e3d61 100644 --- a/datasette/utils/__init__.py +++ b/datasette/utils/__init__.py @@ -1002,25 +1002,6 @@ def actor_matches_allow(actor, allow): return False -async def check_visibility(datasette, actor, action, resource, default=True): - """Returns (visible, private) - visible = can you see it, private = can others see it too""" - visible = await datasette.permission_allowed( - actor, - action, - resource=resource, - default=default, - ) - if not visible: - return False, False - private = not await datasette.permission_allowed( - None, - action, - resource=resource, - default=default, - ) - return visible, private - - def resolve_env_secrets(config, environ): """Create copy that recursively replaces {"$env": "NAME"} with values from environ""" if isinstance(config, dict): diff --git a/datasette/views/database.py b/datasette/views/database.py index 31a1839f..103bd575 100644 --- a/datasette/views/database.py +++ b/datasette/views/database.py @@ -10,7 +10,6 @@ import markupsafe from datasette.utils import ( add_cors_headers, await_me_maybe, - check_visibility, derive_named_parameters, tilde_decode, to_css_class, @@ -62,8 +61,7 @@ class DatabaseView(DataView): views = [] for view_name in await db.view_names(): - visible, private = await check_visibility( - self.ds, + visible, private = await self.ds.check_visibility( request.actor, "view-table", (database, view_name), @@ -78,8 +76,7 @@ class DatabaseView(DataView): tables = [] for table in table_counts: - visible, private = await check_visibility( - self.ds, + visible, private = await self.ds.check_visibility( request.actor, "view-table", (database, table), @@ -105,8 +102,7 @@ class DatabaseView(DataView): for query in ( await self.ds.get_canned_queries(database, request.actor) ).values(): - visible, private = await check_visibility( - self.ds, + visible, private = await self.ds.check_visibility( request.actor, "view-query", (database, query["name"]), diff --git a/datasette/views/index.py b/datasette/views/index.py index 1c391e26..aec78814 100644 --- a/datasette/views/index.py +++ b/datasette/views/index.py @@ -1,7 +1,7 @@ import hashlib import json -from datasette.utils import add_cors_headers, check_visibility, CustomJSONEncoder +from datasette.utils import add_cors_headers, CustomJSONEncoder from datasette.utils.asgi import Response from datasette.version import __version__ @@ -23,8 +23,7 @@ class IndexView(BaseView): await self.ds.ensure_permissions(request.actor, ["view-instance"]) databases = [] for name, db in self.ds.databases.items(): - visible, database_private = await check_visibility( - self.ds, + visible, database_private = await self.ds.check_visibility( request.actor, "view-database", name, @@ -36,8 +35,7 @@ class IndexView(BaseView): views = [] for view_name in await db.view_names(): - visible, private = await check_visibility( - self.ds, + visible, private = await self.ds.check_visibility( request.actor, "view-table", (name, view_name), @@ -55,8 +53,7 @@ class IndexView(BaseView): tables = {} for table in table_names: - visible, private = await check_visibility( - self.ds, + visible, private = await self.ds.check_visibility( request.actor, "view-table", (name, table), diff --git a/docs/internals.rst b/docs/internals.rst index 12adde00..f9a24fea 100644 --- a/docs/internals.rst +++ b/docs/internals.rst @@ -295,7 +295,7 @@ If neither ``metadata.json`` nor any of 
the plugins provide an answer to the per See :ref:`permissions` for a full list of permission actions included in Datasette core. -.. _datasette_permission_allowed: +.. _datasette_ensure_permissions: await .ensure_permissions(actor, permissions) --------------------------------------------- @@ -321,6 +321,32 @@ This is useful when you need to check multiple permissions at once. For example, ] ) +.. _datasette_check_visibilty: + +await .check_visibility(actor, action, resource=None) +----------------------------------------------------- + +``actor`` - dictionary + The authenticated actor. This is usually ``request.actor``. + +``action`` - string + The name of the action that is being permission checked. + +``resource`` - string or tuple, optional + The resource, e.g. the name of the database, or a tuple of two strings containing the name of the database and the name of the table. Only some permissions apply to a resource. + +This convenience method can be used to answer the question "should this item be considered private, in that it is visible to me but it is not visible to anonymous users?" + +It returns a tuple of two booleans, ``(visible, private)``. ``visible`` indicates if the actor can see this resource. ``private`` will be ``True`` if an anonymous user would not be able to view the resource. + +This example checks if the user can access a specific table, and sets ``private`` so that a padlock icon can later be displayed: + +.. code-block:: python + + visible, private = await self.ds.check_visibility( + request.actor, "view-table", (database, table) + ) + .. _datasette_get_database: .get_database(name) From 72bfd75fb7241893c931348e6aca712edc67ab04 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 21 Mar 2022 14:55:50 -0700 Subject: [PATCH 030/952] Drop n=1 threshold down to <= 20ms, closes #1679 --- datasette/utils/__init__.py | 9 +++++---- 1 file changed, 5 insertions(+), 4 deletions(-) diff --git a/datasette/utils/__init__.py b/datasette/utils/__init__.py index cd8e3d61..9109f823 100644 --- a/datasette/utils/__init__.py +++ b/datasette/utils/__init__.py @@ -182,15 +182,16 @@ class CustomJSONEncoder(json.JSONEncoder): def sqlite_timelimit(conn, ms): deadline = time.perf_counter() + (ms / 1000) # n is the number of SQLite virtual machine instructions that will be - # executed between each check. It's hard to know what to pick here. - # After some experimentation, I've decided to go with 1000 by default and - # 1 for time limits that are less than 50ms + # executed between each check. It takes about 0.08ms to execute 1000. 
+ # https://github.com/simonw/datasette/issues/1679 n = 1000 - if ms < 50: + if ms <= 20: + # This mainly happens while executing our test suite n = 1 def handler(): if time.perf_counter() >= deadline: + # Returning 1 terminates the query with an error return 1 conn.set_progress_handler(handler, n) From 12f3ca79956ed9003c874f67748432adcacc6fd2 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 21 Mar 2022 18:42:03 -0700 Subject: [PATCH 031/952] google-github-actions/setup-gcloud@v0 --- .github/workflows/deploy-latest.yml | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/.github/workflows/deploy-latest.yml b/.github/workflows/deploy-latest.yml index 92aa1c6b..a61f6629 100644 --- a/.github/workflows/deploy-latest.yml +++ b/.github/workflows/deploy-latest.yml @@ -14,7 +14,7 @@ jobs: - name: Set up Python uses: actions/setup-python@v2 with: - python-version: 3.9 + python-version: "3.10" - uses: actions/cache@v2 name: Configure pip caching with: @@ -54,7 +54,7 @@ jobs: ' > plugins/alternative_route.py cp fixtures.db fixtures2.db - name: Set up Cloud Run - uses: google-github-actions/setup-gcloud@master + uses: google-github-actions/setup-gcloud@v0 with: version: '275.0.0' service_account_email: ${{ secrets.GCP_SA_EMAIL }} From c4c9dbd0386e46d2bf199f0ed34e4895c98cb78c Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 22 Mar 2022 09:49:26 -0700 Subject: [PATCH 032/952] google-github-actions/setup-gcloud@v0 --- .github/workflows/publish.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/publish.yml b/.github/workflows/publish.yml index 3cfc67da..3e4f8146 100644 --- a/.github/workflows/publish.yml +++ b/.github/workflows/publish.yml @@ -85,7 +85,7 @@ jobs: sphinx-to-sqlite ../docs.db _build cd .. - name: Set up Cloud Run - uses: google-github-actions/setup-gcloud@master + uses: google-github-actions/setup-gcloud@v0 with: version: '275.0.0' service_account_email: ${{ secrets.GCP_SA_EMAIL }} From d7c793d7998388d915f8d270079c68a77a785051 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Wed, 23 Mar 2022 11:12:26 -0700 Subject: [PATCH 033/952] Release 0.61 Refs #957, #1228, #1533, #1545, #1576, #1577, #1587, #1601, #1603, #1607, #1612, #1621, #1649, #1654, #1657, #1661, #1668, #1675, #1678 --- datasette/version.py | 2 +- docs/changelog.rst | 22 ++++++++++++++-------- 2 files changed, 15 insertions(+), 9 deletions(-) diff --git a/datasette/version.py b/datasette/version.py index ccc1e04b..f9b10696 100644 --- a/datasette/version.py +++ b/datasette/version.py @@ -1,2 +1,2 @@ -__version__ = "0.61a0" +__version__ = "0.61" __version_info__ = tuple(__version__.split(".")) diff --git a/docs/changelog.rst b/docs/changelog.rst index 05ad85f2..d2de8da1 100644 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -4,30 +4,36 @@ Changelog ========= -.. _v0_61_a0: +.. _v0_61: -0.61a0 (2022-03-19) -------------------- +0.61 (2022-03-23) +----------------- +In preparation for Datasette 1.0, this release includes two potentially backwards-incompatible changes. Hashed URL mode has been moved to a separate plugin, and the way Datasette generates URLs to databases and tables with special characters in their name such as ``/`` and ``.`` has changed. + +Datasette also now requires Python 3.7 or higher. + +- URLs within Datasette now use a different encoding scheme for tables or databases that include "special" characters outside of the range of ``a-zA-Z0-9_-``. This scheme is explained here: :ref:`internals_tilde_encoding`. 
(:issue:`1657`)
+- Removed hashed URL mode from Datasette. The new ``datasette-hashed-urls`` plugin can be used to achieve the same result, see :ref:`performance_hashed_urls` for details. (:issue:`1661`)
+- Databases can now have a custom path within the Datasette instance that is independent of the database name, using the ``db.route`` property. (:issue:`1668`)
+- Datasette is now covered by a `Code of Conduct <https://github.com/simonw/datasette/blob/main/CODE_OF_CONDUCT.md>`__. (:issue:`1654`)
+- Python 3.6 is no longer supported. (:issue:`1577`)
+- Tests now run against Python 3.11-dev. (:issue:`1621`)
+- New :ref:`datasette.ensure_permissions(actor, permissions) <datasette_ensure_permissions>` internal method for checking multiple permissions at once. (:issue:`1675`)
+- New :ref:`datasette.check_visibility(actor, action, resource=None) <datasette_check_visibilty>` internal method for checking if a user can see a resource that would otherwise be invisible to unauthenticated users. (:issue:`1678`)
+- Table and row HTML pages now include a ``<link rel="alternate" type="application/json+datasette" href="...">`` element and return a ``Link: URL; rel="alternate"; type="application/json+datasette"`` HTTP header pointing to the JSON version of those pages. (:issue:`1533`)
+- ``Access-Control-Expose-Headers: Link`` is now added to the CORS headers, allowing remote JavaScript to access that header.
+- Canned queries are now shown at the top of the database page, directly below the SQL editor. Previously they were shown at the bottom, below the list of tables. (:issue:`1612`)
+- Datasette now has a default favicon. (:issue:`1603`)
+- ``sqlite_stat`` tables are now hidden by default. (:issue:`1587`)
+- SpatiaLite tables ``data_licenses``, ``KNN`` and ``KNN2`` are now hidden by default. (:issue:`1601`)
+- SQL query tracing mechanism now works for queries executed in ``asyncio`` sub-tasks, such as those created by ``asyncio.gather()``. (:issue:`1576`)
+- :ref:`internals_tracer` mechanism is now documented.
+- Common Datasette symbols can now be imported directly from the top-level ``datasette`` package, see :ref:`internals_shortcuts`. Those symbols are ``Response``, ``Forbidden``, ``NotFound``, ``hookimpl``, ``actor_matches_allow``. (:issue:`957`)
+- ``/-/versions`` page now returns additional details for libraries used by SpatiaLite. (:issue:`1607`)
+- Documentation now links to the `Datasette Tutorials <https://datasette.io/tutorials>`__.
+- Datasette will now also look for SpatiaLite in ``/opt/homebrew`` - thanks, Dan Peterson. (`#1649 <https://github.com/simonw/datasette/issues/1649>`__)
+- Fixed bug where :ref:`custom pages <custom_pages>` did not work on Windows. Thanks, Robert Christie. (:issue:`1545`)
+- Fixed error caused when a table had a column named ``n``. (:issue:`1228`)
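One clarifying aside on the tilde encoding these release notes describe: the ``/fixtures~2Edot.json`` path in the updated test earlier in this series comes from encoding ``.`` as ``~2E``, its hex code point. A minimal sketch of the round trip, assuming the ``datasette.utils`` helpers that these patches import elsewhere:

.. code-block:: python

    from datasette.utils import tilde_encode, tilde_decode

    # "." and "/" fall outside a-zA-Z0-9_- so they become ~2E and ~2F
    assert tilde_encode("fixtures.dot") == "fixtures~2Edot"
    assert tilde_decode("fixtures~2Edot") == "fixtures.dot"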
.. _v0_60_2:

From 0159662ab8ccb363c59647861360e0cb7a6f930d Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Wed, 23 Mar 2022 11:48:10 -0700
Subject: [PATCH 034/952] Fix for bug running ?sql= against databases with a different route, closes #1682

---
 datasette/views/database.py | 7 ++++++-
 tests/test_routes.py        | 1 +
 2 files changed, 7 insertions(+), 1 deletion(-)

diff --git a/datasette/views/database.py b/datasette/views/database.py
index 103bd575..bdd433cc 100644
--- a/datasette/views/database.py
+++ b/datasette/views/database.py
@@ -203,7 +203,12 @@ class QueryView(DataView):
         named_parameters=None,
         write=False,
     ):
-        database = tilde_decode(request.url_vars["database"])
+        database_route = tilde_decode(request.url_vars["database"])
+        try:
+            db = self.ds.get_database(route=database_route)
+        except KeyError:
+            raise NotFound("Database not found: {}".format(database_route))
+        database = db.name
         params = {key: request.args.get(key) for key in request.args}
         if "sql" in params:
             params.pop("sql")
diff --git a/tests/test_routes.py b/tests/test_routes.py
index 211b77b5..5ae55d21 100644
--- a/tests/test_routes.py
+++ b/tests/test_routes.py
@@ -94,6 +94,7 @@ async def test_db_with_route_databases(ds_with_route):
         ("/original-name/t", 404),
         ("/original-name/t/1", 404),
         ("/custom-route-name", 200),
+        ("/custom-route-name?sql=select+id+from+t", 200),
         ("/custom-route-name/t", 200),
         ("/custom-route-name/t/1", 200),
     ),
 )

From d431a9055e977aefe48689a2e5866ea8d3558a6c Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Wed, 23 Mar 2022 11:54:10 -0700
Subject: [PATCH 035/952] Release 0.61.1

Refs #1682
Refs https://github.com/simonw/datasette-hashed-urls/issues/13

---
 datasette/version.py | 2 +-
 docs/changelog.rst   | 7 +++++++
 2 files changed, 8 insertions(+), 1 deletion(-)

diff --git a/datasette/version.py b/datasette/version.py
index f9b10696..02451a1e 100644
--- a/datasette/version.py
+++ b/datasette/version.py
@@ -1,2 +1,2 @@
-__version__ = "0.61"
+__version__ = "0.61.1"
 __version_info__ = tuple(__version__.split("."))

diff --git a/docs/changelog.rst b/docs/changelog.rst
index d2de8da1..03cf62b6 100644
--- a/docs/changelog.rst
+++ b/docs/changelog.rst
@@ -4,6 +4,13 @@ Changelog
 =========

+.. _v0_61_1:
+
+0.61.1 (2022-03-23)
+-------------------
+
+- Fixed a bug where databases with a different route from their name (as used by the `datasette-hashed-urls plugin <https://datasette.io/plugins/datasette-hashed-urls>`__) returned errors when executing custom SQL queries. (:issue:`1682`)
+
 .. 
_v0_61: 0.61 (2022-03-23) From c496f2b663ff0cef908ffaaa68b8cb63111fb5f2 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 24 Mar 2022 12:16:19 -0700 Subject: [PATCH 036/952] Don't show facet in cog menu if not allow_facet, closes #1683 --- datasette/static/table.js | 10 ++++++++-- datasette/templates/table.html | 1 + datasette/views/table.py | 3 +++ tests/test_table_html.py | 14 ++++++++++++++ 4 files changed, 26 insertions(+), 2 deletions(-) diff --git a/datasette/static/table.js b/datasette/static/table.js index 3c88cc40..096a27ac 100644 --- a/datasette/static/table.js +++ b/datasette/static/table.js @@ -128,7 +128,8 @@ var DROPDOWN_ICON_SVG = ` el.dataset.column); @@ -137,7 +138,12 @@ var DROPDOWN_ICON_SVG = ` + `_, `datasette-publish-vercel `_ @@ -400,7 +422,9 @@ If the value matches that pattern, the plugin returns an HTML link element: if not isinstance(value, str): return None stripped = value.strip() - if not stripped.startswith("{") and stripped.endswith("}"): + if not stripped.startswith("{") and stripped.endswith( + "}" + ): return None try: data = json.loads(value) @@ -412,14 +436,18 @@ If the value matches that pattern, the plugin returns an HTML link element: return None href = data["href"] if not ( - href.startswith("/") or href.startswith("http://") + href.startswith("/") + or href.startswith("http://") or href.startswith("https://") ): return None - return markupsafe.Markup('{label}'.format( - href=markupsafe.escape(data["href"]), - label=markupsafe.escape(data["label"] or "") or " " - )) + return markupsafe.Markup( + '{label}'.format( + href=markupsafe.escape(data["href"]), + label=markupsafe.escape(data["label"] or "") + or " ", + ) + ) Examples: `datasette-render-binary `_, `datasette-render-markdown `__, `datasette-json-html `__ @@ -516,7 +544,7 @@ Here is a more complex example: return Response( "\n".join(lines), content_type="text/plain; charset=utf-8", - headers={"x-sqlite-version": result.first()[0]} + headers={"x-sqlite-version": result.first()[0]}, ) And here is an example ``can_render`` function which returns ``True`` only if the query results contain the columns ``atom_id``, ``atom_title`` and ``atom_updated``: @@ -524,7 +552,11 @@ And here is an example ``can_render`` function which returns ``True`` only if th .. code-block:: python def can_render_demo(columns): - return {"atom_id", "atom_title", "atom_updated"}.issubset(columns) + return { + "atom_id", + "atom_title", + "atom_updated", + }.issubset(columns) Examples: `datasette-atom `_, `datasette-ics `_, `datasette-geojson `__ @@ -548,16 +580,14 @@ Return a list of ``(regex, view_function)`` pairs, something like this: async def hello_from(request): name = request.url_vars["name"] - return Response.html("Hello from {}".format( - html.escape(name) - )) + return Response.html( + "Hello from {}".format(html.escape(name)) + ) @hookimpl def register_routes(): - return [ - (r"^/hello-from/(?P.*)$", hello_from) - ] + return [(r"^/hello-from/(?P.*)$", hello_from)] The view functions can take a number of different optional arguments. The corresponding argument will be passed to your function depending on its named parameters - a form of dependency injection. 
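For example, a view registered this way can receive the ``datasette`` and ``request`` objects simply by naming them in its signature. A minimal sketch (the route path and response shape here are editorial illustrations, not part of Datasette; only the injected parameter names are documented behavior):

.. code-block:: python

    from datasette import hookimpl, Response


    async def count_databases(datasette, request):
        # Both arguments are injected because the signature names them:
        # "datasette" is the Datasette instance, "request" the incoming request
        return Response.json({"databases": len(datasette.databases)})


    @hookimpl
    def register_routes():
        return [(r"^/-/count-databases$", count_databases)]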
@@ -606,10 +636,13 @@ This example registers a new ``datasette verify file1.db file2.db`` command that import click import sqlite3 + @hookimpl def register_commands(cli): @cli.command() - @click.argument("files", type=click.Path(exists=True), nargs=-1) + @click.argument( + "files", type=click.Path(exists=True), nargs=-1 + ) def verify(files): "Verify that files can be opened by Datasette" for file in files: @@ -617,7 +650,9 @@ This example registers a new ``datasette verify file1.db file2.db`` command that try: conn.execute("select * from sqlite_master") except sqlite3.DatabaseError: - raise click.ClickException("Invalid database: {}".format(file)) + raise click.ClickException( + "Invalid database: {}".format(file) + ) The new command can then be executed like so:: @@ -656,15 +691,18 @@ Each Facet subclass implements a new type of facet operation. The class should l async def suggest(self): # Use self.sql and self.params to suggest some facets suggested_facets = [] - suggested_facets.append({ - "name": column, # Or other unique name - # Construct the URL that will enable this facet: - "toggle_url": self.ds.absolute_url( - self.request, path_with_added_args( - self.request, {"_facet": column} - ) - ), - }) + suggested_facets.append( + { + "name": column, # Or other unique name + # Construct the URL that will enable this facet: + "toggle_url": self.ds.absolute_url( + self.request, + path_with_added_args( + self.request, {"_facet": column} + ), + ), + } + ) return suggested_facets async def facet_results(self): @@ -678,18 +716,25 @@ Each Facet subclass implements a new type of facet operation. The class should l try: facet_results_values = [] # More calculations... - facet_results_values.append({ - "value": value, - "label": label, - "count": count, - "toggle_url": self.ds.absolute_url(self.request, toggle_path), - "selected": selected, - }) - facet_results.append({ - "name": column, - "results": facet_results_values, - "truncated": len(facet_rows_results) > facet_size, - }) + facet_results_values.append( + { + "value": value, + "label": label, + "count": count, + "toggle_url": self.ds.absolute_url( + self.request, toggle_path + ), + "selected": selected, + } + ) + facet_results.append( + { + "name": column, + "results": facet_results_values, + "truncated": len(facet_rows_results) + > facet_size, + } + ) except QueryInterrupted: facets_timed_out.append(column) @@ -728,21 +773,33 @@ This example plugin adds a ``x-databases`` HTTP header listing the currently att def asgi_wrapper(datasette): def wrap_with_databases_header(app): @wraps(app) - async def add_x_databases_header(scope, receive, send): + async def add_x_databases_header( + scope, receive, send + ): async def wrapped_send(event): if event["type"] == "http.response.start": - original_headers = event.get("headers") or [] + original_headers = ( + event.get("headers") or [] + ) event = { "type": event["type"], "status": event["status"], - "headers": original_headers + [ - [b"x-databases", - ", ".join(datasette.databases.keys()).encode("utf-8")] + "headers": original_headers + + [ + [ + b"x-databases", + ", ".join( + datasette.databases.keys() + ).encode("utf-8"), + ] ], } await send(event) + await app(scope, receive, wrapped_send) + return add_x_databases_header + return wrap_with_databases_header Examples: `datasette-cors `__, `datasette-pyinstrument `__ @@ -759,7 +816,9 @@ This hook fires when the Datasette application server first starts up. 
You can i @hookimpl def startup(datasette): config = datasette.plugin_config("my-plugin") or {} - assert "required-setting" in config, "my-plugin requires setting required-setting" + assert ( + "required-setting" in config + ), "my-plugin requires setting required-setting" Or you can return an async function which will be awaited on startup. Use this option if you need to make any database queries: @@ -770,9 +829,12 @@ Or you can return an async function which will be awaited on startup. Use this o async def inner(): db = datasette.get_database() if "my_table" not in await db.table_names(): - await db.execute_write(""" + await db.execute_write( + """ create table my_table (mycol text) - """) + """ + ) + return inner Potential use-cases: @@ -815,6 +877,7 @@ Ues this hook to return a dictionary of additional :ref:`canned query `__ @@ -888,9 +960,12 @@ Here's an example that authenticates the actor based on an incoming API key: SECRET_KEY = "this-is-a-secret" + @hookimpl def actor_from_request(datasette, request): - authorization = request.headers.get("authorization") or "" + authorization = ( + request.headers.get("authorization") or "" + ) expected = "Bearer {}".format(SECRET_KEY) if secrets.compare_digest(authorization, expected): @@ -906,6 +981,7 @@ Instead of returning a dictionary, this function can return an awaitable functio from datasette import hookimpl + @hookimpl def actor_from_request(datasette, request): async def inner(): @@ -914,7 +990,8 @@ Instead of returning a dictionary, this function can return an awaitable functio return None # Look up ?_token=xxx in sessions table result = await datasette.get_database().execute( - "select count(*) from sessions where token = ?", [token] + "select count(*) from sessions where token = ?", + [token], ) if result.first()[0]: return {"token": token} @@ -952,7 +1029,7 @@ The hook should return an instance of ``datasette.filters.FilterArguments`` whic where_clauses=["id > :max_id"], params={"max_id": 5}, human_descriptions=["max_id is greater than 5"], - extra_context={} + extra_context={}, ) The arguments to the ``FilterArguments`` class constructor are as follows: @@ -973,10 +1050,13 @@ This example plugin causes 0 results to be returned if ``?_nothing=1`` is added from datasette import hookimpl from datasette.filters import FilterArguments + @hookimpl def filters_from_request(self, request): if request.args.get("_nothing"): - return FilterArguments(["1 = 0"], human_descriptions=["NOTHING"]) + return FilterArguments( + ["1 = 0"], human_descriptions=["NOTHING"] + ) Example: `datasette-leaflet-freedraw `_ @@ -1006,6 +1086,7 @@ Here's an example plugin which randomly selects if a permission should be allowe from datasette import hookimpl import random + @hookimpl def permission_allowed(action): if action != "view-instance": @@ -1024,11 +1105,16 @@ Here's an example that allows users to view the ``admin_log`` table only if thei async def inner(): if action == "execute-sql" and resource == "staff": return False - if action == "view-table" and resource == ("staff", "admin_log"): + if action == "view-table" and resource == ( + "staff", + "admin_log", + ): if not actor: return False user_id = actor["id"] - return await datasette.get_database("staff").execute( + return await datasette.get_database( + "staff" + ).execute( "select count(*) from admin_users where user_id = :user_id", {"user_id": user_id}, ) @@ -1059,18 +1145,21 @@ This example registers two new magic parameters: ``:_request_http_version`` retu from uuid import uuid4 + def uuid(key, 
request): if key == "new": return str(uuid4()) else: raise KeyError + def request(key, request): if key == "http_version": return request.scope["http_version"] else: raise KeyError + @hookimpl def register_magic_parameters(datasette): return [ @@ -1103,9 +1192,12 @@ This example returns a redirect to a ``/-/login`` page: from datasette import hookimpl from urllib.parse import urlencode + @hookimpl def forbidden(request, message): - return Response.redirect("/-/login?=" + urlencode({"message": message})) + return Response.redirect( + "/-/login?=" + urlencode({"message": message}) + ) The function can alternatively return an awaitable function if it needs to make any asynchronous method calls. This example renders a template: @@ -1114,10 +1206,15 @@ The function can alternatively return an awaitable function if it needs to make from datasette import hookimpl from datasette.utils.asgi import Response + @hookimpl def forbidden(datasette): async def inner(): - return Response.html(await datasette.render_template("forbidden.html")) + return Response.html( + await datasette.render_template( + "forbidden.html" + ) + ) return inner @@ -1147,11 +1244,17 @@ This example adds a new menu item but only if the signed in user is ``"root"``: from datasette import hookimpl + @hookimpl def menu_links(datasette, actor): if actor and actor.get("id") == "root": return [ - {"href": datasette.urls.path("/-/edit-schema"), "label": "Edit schema"}, + { + "href": datasette.urls.path( + "/-/edit-schema" + ), + "label": "Edit schema", + }, ] Using :ref:`internals_datasette_urls` here ensures that links in the menu will take the :ref:`setting_base_url` setting into account. @@ -1188,13 +1291,20 @@ This example adds a new table action if the signed in user is ``"root"``: from datasette import hookimpl + @hookimpl def table_actions(datasette, actor): if actor and actor.get("id") == "root": - return [{ - "href": datasette.urls.path("/-/edit-schema/{}/{}".format(database, table)), - "label": "Edit schema for this table", - }] + return [ + { + "href": datasette.urls.path( + "/-/edit-schema/{}/{}".format( + database, table + ) + ), + "label": "Edit schema for this table", + } + ] Example: `datasette-graphql `_ @@ -1238,6 +1348,7 @@ This example will disable CSRF protection for that specific URL path: from datasette import hookimpl + @hookimpl def skip_csrf(scope): return scope["path"] == "/submit-comment" @@ -1278,7 +1389,9 @@ This hook is responsible for returning a dictionary corresponding to Datasette : "description": get_instance_description(datasette), "databases": [], } - for db_name, db_data_dict in get_my_database_meta(datasette, database, table, key): + for db_name, db_data_dict in get_my_database_meta( + datasette, database, table, key + ): metadata["databases"][db_name] = db_data_dict # whatever we return here will be merged with any other plugins using this hook and # will be overwritten by a local metadata.yaml if one exists! 
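Note that the helpers in that example (``get_instance_description()`` and ``get_my_database_meta()``) are placeholders rather than real Datasette APIs, and that it initializes ``"databases"`` as a list but then indexes it by name - a dictionary is presumably intended. A minimal implementation that only contributes a single key could look like this sketch:

.. code-block:: python

    from datasette import hookimpl


    @hookimpl
    def get_metadata(datasette, key, database, table):
        # Contribute just a title; the returned dictionary is merged
        # with metadata from other plugins, and a local metadata.yaml
        # still takes precedence
        return {"title": "Title set by my plugin"}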
From 498e1536f5f3e69c50934c0c031055e0af770bf6 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 24 Apr 2022 09:08:56 -0700 Subject: [PATCH 055/952] One more blacken-docs test, refs #1718 --- docs/testing_plugins.rst | 45 ++++++++++++++++++++++++---------------- 1 file changed, 27 insertions(+), 18 deletions(-) diff --git a/docs/testing_plugins.rst b/docs/testing_plugins.rst index 8e4e3f91..6361d744 100644 --- a/docs/testing_plugins.rst +++ b/docs/testing_plugins.rst @@ -19,7 +19,10 @@ If you use the template described in :ref:`writing_plugins_cookiecutter` your pl response = await datasette.client.get("/-/plugins.json") assert response.status_code == 200 installed_plugins = {p["name"] for p in response.json()} - assert "datasette-plugin-template-demo" in installed_plugins + assert ( + "datasette-plugin-template-demo" + in installed_plugins + ) This test uses the :ref:`internals_datasette_client` object to exercise a test instance of Datasette. ``datasette.client`` is a wrapper around the `HTTPX `__ Python library which can imitate HTTP requests using ASGI. This is the recommended way to write tests against a Datasette instance. @@ -37,9 +40,7 @@ If you are building an installable package you can add them as test dependencies setup( name="datasette-my-plugin", # ... - extras_require={ - "test": ["pytest", "pytest-asyncio"] - }, + extras_require={"test": ["pytest", "pytest-asyncio"]}, tests_require=["datasette-my-plugin[test]"], ) @@ -87,31 +88,34 @@ Here's an example that uses the `sqlite-utils library Date: Sun, 24 Apr 2022 09:17:59 -0700 Subject: [PATCH 056/952] Finished applying blacken-docs, closes #1718 --- .github/workflows/test.yml | 3 +-- docs/testing_plugins.rst | 4 +++- 2 files changed, 4 insertions(+), 3 deletions(-) diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml index 38b62995..8d916e49 100644 --- a/.github/workflows/test.yml +++ b/.github/workflows/test.yml @@ -34,6 +34,5 @@ jobs: cog --check docs/*.rst - name: Check if blacken-docs needs to be run run: | + # This fails on syntax errors, or a diff was applied blacken-docs -l 60 docs/*.rst - # This fails if a diff was generated: - git diff-index --quiet HEAD -- diff --git a/docs/testing_plugins.rst b/docs/testing_plugins.rst index 6361d744..1bbaaac1 100644 --- a/docs/testing_plugins.rst +++ b/docs/testing_plugins.rst @@ -118,7 +118,9 @@ Here's an example that uses the `sqlite-utils library Date: Sun, 24 Apr 2022 09:59:20 -0700 Subject: [PATCH 057/952] Cosmetic tweaks after blacken-docs, refs #1718 --- docs/plugin_hooks.rst | 9 ++++----- 1 file changed, 4 insertions(+), 5 deletions(-) diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index ace206b7..4560ec9a 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -162,9 +162,8 @@ And here's an example which adds a ``sql_first(sql_query)`` function which execu or database or next(iter(datasette.databases.keys())) ) - return (await datasette.execute(dbname, sql)).rows[ - 0 - ][0] + result = await datasette.execute(dbname, sql) + return result.rows[0][0] return {"sql_first": sql_first} @@ -422,8 +421,8 @@ If the value matches that pattern, the plugin returns an HTML link element: if not isinstance(value, str): return None stripped = value.strip() - if not stripped.startswith("{") and stripped.endswith( - "}" + if not ( + stripped.startswith("{") and stripped.endswith("}") ): return None try: From 579f59dcec43a91dd7d404e00b87a00afd8515f2 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 25 Apr 2022 11:33:35 -0700 Subject: [PATCH 
058/952] Refactor to remove RowTableShared class, closes #1719 Refs #1715 --- datasette/app.py | 3 +- datasette/views/row.py | 142 +++++++++++ datasette/views/table.py | 497 +++++++++++++++------------------------ 3 files changed, 328 insertions(+), 314 deletions(-) create mode 100644 datasette/views/row.py diff --git a/datasette/app.py b/datasette/app.py index c9eede26..d269372c 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -40,7 +40,8 @@ from .views.special import ( PermissionsDebugView, MessagesDebugView, ) -from .views.table import RowView, TableView +from .views.table import TableView +from .views.row import RowView from .renderer import json_renderer from .url_builder import Urls from .database import Database, QueryInterrupted diff --git a/datasette/views/row.py b/datasette/views/row.py new file mode 100644 index 00000000..b1c7362d --- /dev/null +++ b/datasette/views/row.py @@ -0,0 +1,142 @@ +from datasette.utils.asgi import NotFound +from datasette.database import QueryInterrupted +from .base import DataView +from datasette.utils import ( + tilde_decode, + urlsafe_components, + to_css_class, + escape_sqlite, +) +from .table import _sql_params_pks, display_columns_and_rows + + +class RowView(DataView): + name = "row" + + async def data(self, request, default_labels=False): + database_route = tilde_decode(request.url_vars["database"]) + table = tilde_decode(request.url_vars["table"]) + try: + db = self.ds.get_database(route=database_route) + except KeyError: + raise NotFound("Database not found: {}".format(database_route)) + database = db.name + await self.ds.ensure_permissions( + request.actor, + [ + ("view-table", (database, table)), + ("view-database", database), + "view-instance", + ], + ) + pk_values = urlsafe_components(request.url_vars["pks"]) + try: + db = self.ds.get_database(route=database_route) + except KeyError: + raise NotFound("Database not found: {}".format(database_route)) + database = db.name + sql, params, pks = await _sql_params_pks(db, table, pk_values) + results = await db.execute(sql, params, truncate=True) + columns = [r[0] for r in results.description] + rows = list(results.rows) + if not rows: + raise NotFound(f"Record not found: {pk_values}") + + async def template_data(): + display_columns, display_rows = await display_columns_and_rows( + self.ds, + database, + table, + results.description, + rows, + link_column=False, + truncate_cells=0, + ) + for column in display_columns: + column["sortable"] = False + return { + "foreign_key_tables": await self.foreign_key_tables( + database, table, pk_values + ), + "display_columns": display_columns, + "display_rows": display_rows, + "custom_table_templates": [ + f"_table-{to_css_class(database)}-{to_css_class(table)}.html", + f"_table-row-{to_css_class(database)}-{to_css_class(table)}.html", + "_table.html", + ], + "metadata": (self.ds.metadata("databases") or {}) + .get(database, {}) + .get("tables", {}) + .get(table, {}), + } + + data = { + "database": database, + "table": table, + "rows": rows, + "columns": columns, + "primary_keys": pks, + "primary_key_values": pk_values, + "units": self.ds.table_metadata(database, table).get("units", {}), + } + + if "foreign_key_tables" in (request.args.get("_extras") or "").split(","): + data["foreign_key_tables"] = await self.foreign_key_tables( + database, table, pk_values + ) + + return ( + data, + template_data, + ( + f"row-{to_css_class(database)}-{to_css_class(table)}.html", + "row.html", + ), + ) + + async def foreign_key_tables(self, database, table, 
pk_values): + if len(pk_values) != 1: + return [] + db = self.ds.databases[database] + all_foreign_keys = await db.get_all_foreign_keys() + foreign_keys = all_foreign_keys[table]["incoming"] + if len(foreign_keys) == 0: + return [] + + sql = "select " + ", ".join( + [ + "(select count(*) from {table} where {column}=:id)".format( + table=escape_sqlite(fk["other_table"]), + column=escape_sqlite(fk["other_column"]), + ) + for fk in foreign_keys + ] + ) + try: + rows = list(await db.execute(sql, {"id": pk_values[0]})) + except QueryInterrupted: + # Almost certainly hit the timeout + return [] + + foreign_table_counts = dict( + zip( + [(fk["other_table"], fk["other_column"]) for fk in foreign_keys], + list(rows[0]), + ) + ) + foreign_key_tables = [] + for fk in foreign_keys: + count = ( + foreign_table_counts.get((fk["other_table"], fk["other_column"])) or 0 + ) + key = fk["other_column"] + if key.startswith("_"): + key += "__exact" + link = "{}?{}={}".format( + self.ds.urls.table(database, fk["other_table"]), + key, + ",".join(pk_values), + ) + foreign_key_tables.append({**fk, **{"count": count, "link": link}}) + return foreign_key_tables diff --git a/datasette/views/table.py b/datasette/views/table.py index dc85165e..37fb2ebb 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -1,4 +1,3 @@ -import urllib import itertools import json @@ -9,7 +8,6 @@ from datasette.database import QueryInterrupted from datasette.utils import ( await_me_maybe, CustomRow, - MultiParams, append_querystring, compound_keys_after_sql, format_bytes, @@ -21,7 +19,6 @@ from datasette.utils import ( is_url, path_from_row_pks, path_with_added_args, - path_with_format, path_with_removed_args, path_with_replaced_args, to_css_class, @@ -68,7 +65,9 @@ class Row: return json.dumps(d, default=repr, indent=2) -class RowTableShared(DataView): +class TableView(DataView): + name = "table" + async def sortable_columns_for_table(self, database, table, use_rowid): db = self.ds.databases[database] table_metadata = self.ds.table_metadata(database, table) @@ -89,193 +88,6 @@ class RowTableShared(DataView): expandables.append((fk, label_column)) return expandables - async def display_columns_and_rows( - self, database, table, description, rows, link_column=False, truncate_cells=0 - ): - """Returns columns, rows for specified table - including fancy foreign key treatment""" - db = self.ds.databases[database] - table_metadata = self.ds.table_metadata(database, table) - column_descriptions = table_metadata.get("columns") or {} - column_details = {col.name: col for col in await db.table_column_details(table)} - sortable_columns = await self.sortable_columns_for_table(database, table, True) - pks = await db.primary_keys(table) - pks_for_display = pks - if not pks_for_display: - pks_for_display = ["rowid"] - - columns = [] - for r in description: - if r[0] == "rowid" and "rowid" not in column_details: - type_ = "integer" - notnull = 0 - else: - type_ = column_details[r[0]].type - notnull = column_details[r[0]].notnull - columns.append( - { - "name": r[0], - "sortable": r[0] in sortable_columns, - "is_pk": r[0] in pks_for_display, - "type": type_, - "notnull": notnull, - "description": column_descriptions.get(r[0]), - } - ) - - column_to_foreign_key_table = { - fk["column"]: fk["other_table"] - for fk in await db.foreign_keys_for_table(table) - } - - cell_rows = [] - base_url = self.ds.setting("base_url") - for row in rows: - cells = [] - # Unless we are a view, the first column is a link - either to the rowid - # or to the 
simple or compound primary key - if link_column: - is_special_link_column = len(pks) != 1 - pk_path = path_from_row_pks(row, pks, not pks, False) - cells.append( - { - "column": pks[0] if len(pks) == 1 else "Link", - "value_type": "pk", - "is_special_link_column": is_special_link_column, - "raw": pk_path, - "value": markupsafe.Markup( - '{flat_pks}'.format( - base_url=base_url, - table_path=self.ds.urls.table(database, table), - flat_pks=str(markupsafe.escape(pk_path)), - flat_pks_quoted=path_from_row_pks(row, pks, not pks), - ) - ), - } - ) - - for value, column_dict in zip(row, columns): - column = column_dict["name"] - if link_column and len(pks) == 1 and column == pks[0]: - # If there's a simple primary key, don't repeat the value as it's - # already shown in the link column. - continue - - # First let the plugins have a go - # pylint: disable=no-member - plugin_display_value = None - for candidate in pm.hook.render_cell( - value=value, - column=column, - table=table, - database=database, - datasette=self.ds, - ): - candidate = await await_me_maybe(candidate) - if candidate is not None: - plugin_display_value = candidate - break - if plugin_display_value: - display_value = plugin_display_value - elif isinstance(value, bytes): - formatted = format_bytes(len(value)) - display_value = markupsafe.Markup( - '<Binary: {:,} byte{}>'.format( - self.ds.urls.row_blob( - database, - table, - path_from_row_pks(row, pks, not pks), - column, - ), - ' title="{}"'.format(formatted) - if "bytes" not in formatted - else "", - len(value), - "" if len(value) == 1 else "s", - ) - ) - elif isinstance(value, dict): - # It's an expanded foreign key - display link to other row - label = value["label"] - value = value["value"] - # The table we link to depends on the column - other_table = column_to_foreign_key_table[column] - link_template = ( - LINK_WITH_LABEL if (label != value) else LINK_WITH_VALUE - ) - display_value = markupsafe.Markup( - link_template.format( - database=database, - base_url=base_url, - table=tilde_encode(other_table), - link_id=tilde_encode(str(value)), - id=str(markupsafe.escape(value)), - label=str(markupsafe.escape(label)) or "-", - ) - ) - elif value in ("", None): - display_value = markupsafe.Markup(" ") - elif is_url(str(value).strip()): - display_value = markupsafe.Markup( - '{url}'.format( - url=markupsafe.escape(value.strip()) - ) - ) - elif column in table_metadata.get("units", {}) and value != "": - # Interpret units using pint - value = value * ureg(table_metadata["units"][column]) - # Pint uses floating point which sometimes introduces errors in the compact - # representation, which we have to round off to avoid ugliness. In the vast - # majority of cases this rounding will be inconsequential. I hope. - value = round(value.to_compact(), 6) - display_value = markupsafe.Markup( - f"{value:~P}".replace(" ", " ") - ) - else: - display_value = str(value) - if truncate_cells and len(display_value) > truncate_cells: - display_value = display_value[:truncate_cells] + "\u2026" - - cells.append( - { - "column": column, - "value": display_value, - "raw": value, - "value_type": "none" - if value is None - else str(type(value).__name__), - } - ) - cell_rows.append(Row(cells)) - - if link_column: - # Add the link column header. - # If it's a simple primary key, we have to remove and re-add that column name at - # the beginning of the header row. 
- first_column = None - if len(pks) == 1: - columns = [col for col in columns if col["name"] != pks[0]] - first_column = { - "name": pks[0], - "sortable": len(pks) == 1, - "is_pk": True, - "type": column_details[pks[0]].type, - "notnull": column_details[pks[0]].notnull, - } - else: - first_column = { - "name": "Link", - "sortable": False, - "is_pk": False, - "type": "", - "notnull": 0, - } - columns = [first_column] + columns - return columns, cell_rows - - -class TableView(RowTableShared): - name = "table" - async def post(self, request): database_route = tilde_decode(request.url_vars["database"]) try: @@ -807,13 +619,17 @@ class TableView(RowTableShared): async def extra_template(): nonlocal sort - display_columns, display_rows = await self.display_columns_and_rows( + display_columns, display_rows = await display_columns_and_rows( + self.ds, database, table, results.description, rows, link_column=not is_view, truncate_cells=self.ds.setting("truncate_cells_html"), + sortable_columns=await self.sortable_columns_for_table( + database, table, use_rowid=True + ), ) metadata = ( (self.ds.metadata("databases") or {}) @@ -948,132 +764,187 @@ async def _sql_params_pks(db, table, pk_values): return sql, params, pks -class RowView(RowTableShared): - name = "row" +async def display_columns_and_rows( + datasette, + database, + table, + description, + rows, + link_column=False, + truncate_cells=0, + sortable_columns=None, +): + """Returns columns, rows for specified table - including fancy foreign key treatment""" + sortable_columns = sortable_columns or set() + db = datasette.databases[database] + table_metadata = datasette.table_metadata(database, table) + column_descriptions = table_metadata.get("columns") or {} + column_details = {col.name: col for col in await db.table_column_details(table)} + pks = await db.primary_keys(table) + pks_for_display = pks + if not pks_for_display: + pks_for_display = ["rowid"] - async def data(self, request, default_labels=False): - database_route = tilde_decode(request.url_vars["database"]) - table = tilde_decode(request.url_vars["table"]) - try: - db = self.ds.get_database(route=database_route) - except KeyError: - raise NotFound("Database not found: {}".format(database_route)) - database = db.name - await self.ds.ensure_permissions( - request.actor, - [ - ("view-table", (database, table)), - ("view-database", database), - "view-instance", - ], - ) - pk_values = urlsafe_components(request.url_vars["pks"]) - try: - db = self.ds.get_database(route=database_route) - except KeyError: - raise NotFound("Database not found: {}".format(database_route)) - database = db.name - sql, params, pks = await _sql_params_pks(db, table, pk_values) - results = await db.execute(sql, params, truncate=True) - columns = [r[0] for r in results.description] - rows = list(results.rows) - if not rows: - raise NotFound(f"Record not found: {pk_values}") - - async def template_data(): - display_columns, display_rows = await self.display_columns_and_rows( - database, - table, - results.description, - rows, - link_column=False, - truncate_cells=0, - ) - for column in display_columns: - column["sortable"] = False - return { - "foreign_key_tables": await self.foreign_key_tables( - database, table, pk_values - ), - "display_columns": display_columns, - "display_rows": display_rows, - "custom_table_templates": [ - f"_table-{to_css_class(database)}-{to_css_class(table)}.html", - f"_table-row-{to_css_class(database)}-{to_css_class(table)}.html", - "_table.html", - ], - "metadata": 
(self.ds.metadata("databases") or {}) - .get(database, {}) - .get("tables", {}) - .get(table, {}), + columns = [] + for r in description: + if r[0] == "rowid" and "rowid" not in column_details: + type_ = "integer" + notnull = 0 + else: + type_ = column_details[r[0]].type + notnull = column_details[r[0]].notnull + columns.append( + { + "name": r[0], + "sortable": r[0] in sortable_columns, + "is_pk": r[0] in pks_for_display, + "type": type_, + "notnull": notnull, + "description": column_descriptions.get(r[0]), } - - data = { - "database": database, - "table": table, - "rows": rows, - "columns": columns, - "primary_keys": pks, - "primary_key_values": pk_values, - "units": self.ds.table_metadata(database, table).get("units", {}), - } - - if "foreign_key_tables" in (request.args.get("_extras") or "").split(","): - data["foreign_key_tables"] = await self.foreign_key_tables( - database, table, pk_values - ) - - return ( - data, - template_data, - ( - f"row-{to_css_class(database)}-{to_css_class(table)}.html", - "row.html", - ), ) - async def foreign_key_tables(self, database, table, pk_values): - if len(pk_values) != 1: - return [] - db = self.ds.databases[database] - all_foreign_keys = await db.get_all_foreign_keys() - foreign_keys = all_foreign_keys[table]["incoming"] - if len(foreign_keys) == 0: - return [] + column_to_foreign_key_table = { + fk["column"]: fk["other_table"] for fk in await db.foreign_keys_for_table(table) + } - sql = "select " + ", ".join( - [ - "(select count(*) from {table} where {column}=:id)".format( - table=escape_sqlite(fk["other_table"]), - column=escape_sqlite(fk["other_column"]), + cell_rows = [] + base_url = datasette.setting("base_url") + for row in rows: + cells = [] + # Unless we are a view, the first column is a link - either to the rowid + # or to the simple or compound primary key + if link_column: + is_special_link_column = len(pks) != 1 + pk_path = path_from_row_pks(row, pks, not pks, False) + cells.append( + { + "column": pks[0] if len(pks) == 1 else "Link", + "value_type": "pk", + "is_special_link_column": is_special_link_column, + "raw": pk_path, + "value": markupsafe.Markup( + '{flat_pks}'.format( + base_url=base_url, + table_path=datasette.urls.table(database, table), + flat_pks=str(markupsafe.escape(pk_path)), + flat_pks_quoted=path_from_row_pks(row, pks, not pks), + ) + ), + } + ) + + for value, column_dict in zip(row, columns): + column = column_dict["name"] + if link_column and len(pks) == 1 and column == pks[0]: + # If there's a simple primary key, don't repeat the value as it's + # already shown in the link column. 
+                continue
+
+            # First let the plugins have a go
+            # pylint: disable=no-member
+            plugin_display_value = None
+            for candidate in pm.hook.render_cell(
+                value=value,
+                column=column,
+                table=table,
+                database=database,
+                datasette=datasette,
+            ):
+                candidate = await await_me_maybe(candidate)
+                if candidate is not None:
+                    plugin_display_value = candidate
+                    break
+            if plugin_display_value:
+                display_value = plugin_display_value
+            elif isinstance(value, bytes):
+                formatted = format_bytes(len(value))
+                display_value = markupsafe.Markup(
+                    '<a class="blob-download" href="{}"{}>&lt;Binary:&nbsp;{:,}&nbsp;byte{}&gt;</a>'.format(
+                        datasette.urls.row_blob(
+                            database,
+                            table,
+                            path_from_row_pks(row, pks, not pks),
+                            column,
+                        ),
+                        ' title="{}"'.format(formatted)
+                        if "bytes" not in formatted
+                        else "",
+                        len(value),
+                        "" if len(value) == 1 else "s",
+                    )
+                )
+            elif isinstance(value, dict):
+                # It's an expanded foreign key - display link to other row
+                label = value["label"]
+                value = value["value"]
+                # The table we link to depends on the column
+                other_table = column_to_foreign_key_table[column]
+                link_template = LINK_WITH_LABEL if (label != value) else LINK_WITH_VALUE
+                display_value = markupsafe.Markup(
+                    link_template.format(
+                        database=database,
+                        base_url=base_url,
+                        table=tilde_encode(other_table),
+                        link_id=tilde_encode(str(value)),
+                        id=str(markupsafe.escape(value)),
+                        label=str(markupsafe.escape(label)) or "-",
+                    )
+                )
+            elif value in ("", None):
+                display_value = markupsafe.Markup("&nbsp;")
+            elif is_url(str(value).strip()):
+                display_value = markupsafe.Markup(
+                    '<a href="{url}">{url}</a>'.format(
+                        url=markupsafe.escape(value.strip())
+                    )
+                )
+            elif column in table_metadata.get("units", {}) and value != "":
+                # Interpret units using pint
+                value = value * ureg(table_metadata["units"][column])
+                # Pint uses floating point which sometimes introduces errors in the compact
+                # representation, which we have to round off to avoid ugliness. In the vast
+                # majority of cases this rounding will be inconsequential. I hope.
+                value = round(value.to_compact(), 6)
+                display_value = markupsafe.Markup(f"{value:~P}".replace(" ", "&nbsp;"))
+            else:
+                display_value = str(value)
+                if truncate_cells and len(display_value) > truncate_cells:
+                    display_value = display_value[:truncate_cells] + "\u2026"
+
+            cells.append(
+                {
+                    "column": column,
+                    "value": display_value,
+                    "raw": value,
+                    "value_type": "none"
+                    if value is None
+                    else str(type(value).__name__),
+                }
+            )
+        cell_rows.append(Row(cells))
+
+    if link_column:
+        # Add the link column header.
+        # If it's a simple primary key, we have to remove and re-add that column name at
+        # the beginning of the header row.
+ first_column = None + if len(pks) == 1: + columns = [col for col in columns if col["name"] != pks[0]] + first_column = { + "name": pks[0], + "sortable": len(pks) == 1, + "is_pk": True, + "type": column_details[pks[0]].type, + "notnull": column_details[pks[0]].notnull, + } + else: + first_column = { + "name": "Link", + "sortable": False, + "is_pk": False, + "type": "", + "notnull": 0, + } + columns = [first_column] + columns + return columns, cell_rows From c101f0efeec4f6e49298a542c5e2b59236cfa0ff Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 26 Apr 2022 15:34:29 -0700 Subject: [PATCH 059/952] datasette-total-page-time example of asgi_wrapper --- docs/plugin_hooks.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index 4560ec9a..3c9ae2e2 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -801,7 +801,7 @@ This example plugin adds a ``x-databases`` HTTP header listing the currently att return wrap_with_databases_header -Examples: `datasette-cors `__, `datasette-pyinstrument `__ +Examples: `datasette-cors `__, `datasette-pyinstrument `__, `datasette-total-page-time `__ .. _plugin_hook_startup: From 8a0c38f0b89543e652a968a90d480859cb102510 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 26 Apr 2022 13:56:27 -0700 Subject: [PATCH 060/952] Rename database->database_name and table-> table_name, refs #1715 --- datasette/views/table.py | 143 +++++++++++++++++++++------------------ 1 file changed, 76 insertions(+), 67 deletions(-) diff --git a/datasette/views/table.py b/datasette/views/table.py index 37fb2ebb..d66adb82 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -68,22 +68,22 @@ class Row: class TableView(DataView): name = "table" - async def sortable_columns_for_table(self, database, table, use_rowid): - db = self.ds.databases[database] - table_metadata = self.ds.table_metadata(database, table) + async def sortable_columns_for_table(self, database_name, table_name, use_rowid): + db = self.ds.databases[database_name] + table_metadata = self.ds.table_metadata(database_name, table_name) if "sortable_columns" in table_metadata: sortable_columns = set(table_metadata["sortable_columns"]) else: - sortable_columns = set(await db.table_columns(table)) + sortable_columns = set(await db.table_columns(table_name)) if use_rowid: sortable_columns.add("rowid") return sortable_columns - async def expandable_columns(self, database, table): + async def expandable_columns(self, database_name, table_name): # Returns list of (fk_dict, label_column-or-None) pairs for that table expandables = [] - db = self.ds.databases[database] - for fk in await db.foreign_keys_for_table(table): + db = self.ds.databases[database_name] + for fk in await db.foreign_keys_for_table(table_name): label_column = await db.label_column_for_table(fk["other_table"]) expandables.append((fk, label_column)) return expandables @@ -94,17 +94,19 @@ class TableView(DataView): db = self.ds.get_database(route=database_route) except KeyError: raise NotFound("Database not found: {}".format(database_route)) - database = db.name - table = tilde_decode(request.url_vars["table"]) + database_name = db.name + table_name = tilde_decode(request.url_vars["table"]) # Handle POST to a canned query - canned_query = await self.ds.get_canned_query(database, table, request.actor) + canned_query = await self.ds.get_canned_query( + database_name, table_name, request.actor + ) assert canned_query, "You may only POST to a canned query" return await 
QueryView(self.ds).data( request, canned_query["sql"], metadata=canned_query, editable=False, - canned_query=table, + canned_query=table_name, named_parameters=canned_query.get("params"), write=bool(canned_query.get("write")), ) @@ -150,45 +152,47 @@ class TableView(DataView): _size=None, ): database_route = tilde_decode(request.url_vars["database"]) - table = tilde_decode(request.url_vars["table"]) + table_name = tilde_decode(request.url_vars["table"]) try: db = self.ds.get_database(route=database_route) except KeyError: raise NotFound("Database not found: {}".format(database_route)) - database = db.name + database_name = db.name # If this is a canned query, not a table, then dispatch to QueryView instead - canned_query = await self.ds.get_canned_query(database, table, request.actor) + canned_query = await self.ds.get_canned_query( + database_name, table_name, request.actor + ) if canned_query: return await QueryView(self.ds).data( request, canned_query["sql"], metadata=canned_query, editable=False, - canned_query=table, + canned_query=table_name, named_parameters=canned_query.get("params"), write=bool(canned_query.get("write")), ) - is_view = bool(await db.get_view_definition(table)) - table_exists = bool(await db.table_exists(table)) + is_view = bool(await db.get_view_definition(table_name)) + table_exists = bool(await db.table_exists(table_name)) # If table or view not found, return 404 if not is_view and not table_exists: - raise NotFound(f"Table not found: {table}") + raise NotFound(f"Table not found: {table_name}") # Ensure user has permission to view this table await self.ds.ensure_permissions( request.actor, [ - ("view-table", (database, table)), - ("view-database", database), + ("view-table", (database_name, table_name)), + ("view-database", database_name), "view-instance", ], ) private = not await self.ds.permission_allowed( - None, "view-table", (database, table), default=True + None, "view-table", (database_name, table_name), default=True ) # Handle ?_filter_column and redirect, if present @@ -216,8 +220,8 @@ class TableView(DataView): ) # Introspect columns and primary keys for table - pks = await db.primary_keys(table) - table_columns = await db.table_columns(table) + pks = await db.primary_keys(table_name) + table_columns = await db.table_columns(table_name) # Take ?_col= and ?_nocol= into account specified_columns = await self.columns_to_select(table_columns, pks, request) @@ -248,7 +252,7 @@ class TableView(DataView): nocount = True nofacet = True - table_metadata = self.ds.table_metadata(database, table) + table_metadata = self.ds.table_metadata(database_name, table_name) units = table_metadata.get("units", {}) # Arguments that start with _ and don't contain a __ are @@ -262,7 +266,7 @@ class TableView(DataView): # Build where clauses from query string arguments filters = Filters(sorted(filter_args), units, ureg) - where_clauses, params = filters.build_where_clauses(table) + where_clauses, params = filters.build_where_clauses(table_name) # Execute filters_from_request plugin hooks - including the default # ones that live in datasette/filters.py @@ -271,8 +275,8 @@ class TableView(DataView): for hook in pm.hook.filters_from_request( request=request, - table=table, - database=database, + table=table_name, + database=database_name, datasette=self.ds, ): filter_arguments = await await_me_maybe(hook) @@ -284,7 +288,7 @@ class TableView(DataView): # Deal with custom sort orders sortable_columns = await self.sortable_columns_for_table( - database, table, use_rowid + 
database_name, table_name, use_rowid ) sort = request.args.get("_sort") sort_desc = request.args.get("_sort_desc") @@ -309,7 +313,7 @@ class TableView(DataView): order_by = f"{escape_sqlite(sort_desc)} desc" from_sql = "from {table_name} {where}".format( - table_name=escape_sqlite(table), + table_name=escape_sqlite(table_name), where=("where {} ".format(" and ".join(where_clauses))) if where_clauses else "", @@ -422,7 +426,7 @@ class TableView(DataView): sql_no_order_no_limit = ( "select {select_all_columns} from {table_name} {where}".format( select_all_columns=select_all_columns, - table_name=escape_sqlite(table), + table_name=escape_sqlite(table_name), where=where_clause, ) ) @@ -430,7 +434,7 @@ class TableView(DataView): # This is the SQL that populates the main table on the page sql = "select {select_specified_columns} from {table_name} {where}{order_by} limit {page_size}{offset}".format( select_specified_columns=select_specified_columns, - table_name=escape_sqlite(table), + table_name=escape_sqlite(table_name), where=where_clause, order_by=order_by, page_size=page_size + 1, @@ -448,13 +452,13 @@ class TableView(DataView): if ( not db.is_mutable and self.ds.inspect_data - and count_sql == f"select count(*) from {table} " + and count_sql == f"select count(*) from {table_name} " ): # We can use a previously cached table row count try: - filtered_table_rows_count = self.ds.inspect_data[database]["tables"][ - table - ]["count"] + filtered_table_rows_count = self.ds.inspect_data[database_name][ + "tables" + ][table_name]["count"] except KeyError: pass @@ -484,10 +488,10 @@ class TableView(DataView): klass( self.ds, request, - database, + database_name, sql=sql_no_order_no_limit, params=params, - table=table, + table=table_name, metadata=table_metadata, row_count=filtered_table_rows_count, ) @@ -527,7 +531,7 @@ class TableView(DataView): # Expand labeled columns if requested expanded_columns = [] - expandable_columns = await self.expandable_columns(database, table) + expandable_columns = await self.expandable_columns(database_name, table_name) columns_to_expand = None try: all_labels = value_as_boolean(request.args.get("_labels", "")) @@ -554,7 +558,9 @@ class TableView(DataView): values = [row[column_index] for row in rows] # Expand them expanded_labels.update( - await self.ds.expand_foreign_keys(database, table, column, values) + await self.ds.expand_foreign_keys( + database_name, table_name, column, values + ) ) if expanded_labels: # Rewrite the rows @@ -621,21 +627,21 @@ class TableView(DataView): display_columns, display_rows = await display_columns_and_rows( self.ds, - database, - table, + database_name, + table_name, results.description, rows, link_column=not is_view, truncate_cells=self.ds.setting("truncate_cells_html"), sortable_columns=await self.sortable_columns_for_table( - database, table, use_rowid=True + database_name, table_name, use_rowid=True ), ) metadata = ( (self.ds.metadata("databases") or {}) - .get(database, {}) + .get(database_name, {}) .get("tables", {}) - .get(table, {}) + .get(table_name, {}) ) self.ds.update_with_inherited_metadata(metadata) @@ -661,8 +667,8 @@ class TableView(DataView): links = [] for hook in pm.hook.table_actions( datasette=self.ds, - table=table, - database=database, + table=table_name, + database=database_name, actor=request.actor, request=request, ): @@ -703,13 +709,13 @@ class TableView(DataView): "sort_desc": sort_desc, "disable_sort": is_view, "custom_table_templates": [ - f"_table-{to_css_class(database)}-{to_css_class(table)}.html", - 
f"_table-table-{to_css_class(database)}-{to_css_class(table)}.html", + f"_table-{to_css_class(database_name)}-{to_css_class(table_name)}.html", + f"_table-table-{to_css_class(database_name)}-{to_css_class(table_name)}.html", "_table.html", ], "metadata": metadata, - "view_definition": await db.get_view_definition(table), - "table_definition": await db.get_table_definition(table), + "view_definition": await db.get_view_definition(table_name), + "table_definition": await db.get_table_definition(table_name), "datasette_allow_facet": "true" if self.ds.setting("allow_facet") else "false", @@ -719,8 +725,8 @@ class TableView(DataView): return ( { - "database": database, - "table": table, + "database": database_name, + "table": table_name, "is_view": is_view, "human_description_en": human_description_en, "rows": rows[:page_size], @@ -738,12 +744,12 @@ class TableView(DataView): "next_url": next_url, "private": private, "allow_execute_sql": await self.ds.permission_allowed( - request.actor, "execute-sql", database, default=True + request.actor, "execute-sql", database_name, default=True ), }, extra_template, ( - f"table-{to_css_class(database)}-{to_css_class(table)}.html", + f"table-{to_css_class(database_name)}-{to_css_class(table_name)}.html", "table.html", ), ) @@ -766,8 +772,8 @@ async def _sql_params_pks(db, table, pk_values): async def display_columns_and_rows( datasette, - database, - table, + database_name, + table_name, description, rows, link_column=False, @@ -776,11 +782,13 @@ async def display_columns_and_rows( ): """Returns columns, rows for specified table - including fancy foreign key treatment""" sortable_columns = sortable_columns or set() - db = datasette.databases[database] - table_metadata = datasette.table_metadata(database, table) + db = datasette.databases[database_name] + table_metadata = datasette.table_metadata(database_name, table_name) column_descriptions = table_metadata.get("columns") or {} - column_details = {col.name: col for col in await db.table_column_details(table)} - pks = await db.primary_keys(table) + column_details = { + col.name: col for col in await db.table_column_details(table_name) + } + pks = await db.primary_keys(table_name) pks_for_display = pks if not pks_for_display: pks_for_display = ["rowid"] @@ -805,7 +813,8 @@ async def display_columns_and_rows( ) column_to_foreign_key_table = { - fk["column"]: fk["other_table"] for fk in await db.foreign_keys_for_table(table) + fk["column"]: fk["other_table"] + for fk in await db.foreign_keys_for_table(table_name) } cell_rows = [] @@ -826,7 +835,7 @@ async def display_columns_and_rows( "value": markupsafe.Markup( '{flat_pks}'.format( base_url=base_url, - table_path=datasette.urls.table(database, table), + table_path=datasette.urls.table(database_name, table_name), flat_pks=str(markupsafe.escape(pk_path)), flat_pks_quoted=path_from_row_pks(row, pks, not pks), ) @@ -847,8 +856,8 @@ async def display_columns_and_rows( for candidate in pm.hook.render_cell( value=value, column=column, - table=table, - database=database, + table=table_name, + database=database_name, datasette=datasette, ): candidate = await await_me_maybe(candidate) @@ -862,8 +871,8 @@ async def display_columns_and_rows( display_value = markupsafe.Markup( '<Binary: {:,} byte{}>'.format( datasette.urls.row_blob( - database, - table, + database_name, + table_name, path_from_row_pks(row, pks, not pks), column, ), @@ -883,7 +892,7 @@ async def display_columns_and_rows( link_template = LINK_WITH_LABEL if (label != value) else LINK_WITH_VALUE 
display_value = markupsafe.Markup( link_template.format( - database=database, + database=database_name, base_url=base_url, table=tilde_encode(other_table), link_id=tilde_encode(str(value)), From 942411ef946e9a34a2094944d3423cddad27efd3 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 26 Apr 2022 15:48:56 -0700 Subject: [PATCH 061/952] Execute some TableView queries in parallel Use ?_noparallel=1 to opt out (undocumented, useful for benchmark comparisons) Refs #1723, #1715 --- datasette/views/table.py | 91 +++++++++++++++++++++++++++++----------- 1 file changed, 66 insertions(+), 25 deletions(-) diff --git a/datasette/views/table.py b/datasette/views/table.py index d66adb82..23289b29 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -1,3 +1,4 @@ +import asyncio import itertools import json @@ -5,6 +6,7 @@ import markupsafe from datasette.plugins import pm from datasette.database import QueryInterrupted +from datasette import tracer from datasette.utils import ( await_me_maybe, CustomRow, @@ -150,6 +152,16 @@ class TableView(DataView): default_labels=False, _next=None, _size=None, + ): + with tracer.trace_child_tasks(): + return await self._data_traced(request, default_labels, _next, _size) + + async def _data_traced( + self, + request, + default_labels=False, + _next=None, + _size=None, ): database_route = tilde_decode(request.url_vars["database"]) table_name = tilde_decode(request.url_vars["table"]) @@ -159,6 +171,20 @@ class TableView(DataView): raise NotFound("Database not found: {}".format(database_route)) database_name = db.name + # For performance profiling purposes, ?_noparallel=1 turns off asyncio.gather + async def _gather_parallel(*args): + return await asyncio.gather(*args) + + async def _gather_sequential(*args): + results = [] + for fn in args: + results.append(await fn) + return results + + gather = ( + _gather_sequential if request.args.get("_noparallel") else _gather_parallel + ) + # If this is a canned query, not a table, then dispatch to QueryView instead canned_query = await self.ds.get_canned_query( database_name, table_name, request.actor @@ -174,8 +200,12 @@ class TableView(DataView): write=bool(canned_query.get("write")), ) - is_view = bool(await db.get_view_definition(table_name)) - table_exists = bool(await db.table_exists(table_name)) + is_view, table_exists = map( + bool, + await gather( + db.get_view_definition(table_name), db.table_exists(table_name) + ), + ) # If table or view not found, return 404 if not is_view and not table_exists: @@ -497,33 +527,44 @@ class TableView(DataView): ) ) - if not nofacet: - for facet in facet_instances: - ( + async def execute_facets(): + if not nofacet: + # Run them in parallel + facet_awaitables = [facet.facet_results() for facet in facet_instances] + facet_awaitable_results = await gather(*facet_awaitables) + for ( instance_facet_results, instance_facets_timed_out, - ) = await facet.facet_results() - for facet_info in instance_facet_results: - base_key = facet_info["name"] - key = base_key - i = 1 - while key in facet_results: - i += 1 - key = f"{base_key}_{i}" - facet_results[key] = facet_info - facets_timed_out.extend(instance_facets_timed_out) + ) in facet_awaitable_results: + for facet_info in instance_facet_results: + base_key = facet_info["name"] + key = base_key + i = 1 + while key in facet_results: + i += 1 + key = f"{base_key}_{i}" + facet_results[key] = facet_info + facets_timed_out.extend(instance_facets_timed_out) - # Calculate suggested facets suggested_facets = [] - if ( - 
self.ds.setting("suggest_facets") - and self.ds.setting("allow_facet") - and not _next - and not nofacet - and not nosuggest - ): - for facet in facet_instances: - suggested_facets.extend(await facet.suggest()) + + async def execute_suggested_facets(): + # Calculate suggested facets + if ( + self.ds.setting("suggest_facets") + and self.ds.setting("allow_facet") + and not _next + and not nofacet + and not nosuggest + ): + # Run them in parallel + facet_suggest_awaitables = [ + facet.suggest() for facet in facet_instances + ] + for suggest_result in await gather(*facet_suggest_awaitables): + suggested_facets.extend(suggest_result) + + await gather(execute_facets(), execute_suggested_facets()) # Figure out columns and rows for the query columns = [r[0] for r in results.description] From 94a3171b01fde5c52697aeeff052e3ad4bab5391 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 28 Apr 2022 13:29:11 -0700 Subject: [PATCH 062/952] .plugin_config() can return None --- docs/internals.rst | 4 ++++ docs/writing_plugins.rst | 2 ++ 2 files changed, 6 insertions(+) diff --git a/docs/internals.rst b/docs/internals.rst index aad608dc..18822d47 100644 --- a/docs/internals.rst +++ b/docs/internals.rst @@ -288,6 +288,10 @@ All databases are listed, irrespective of user permissions. This means that the This method lets you read plugin configuration values that were set in ``metadata.json``. See :ref:`writing_plugins_configuration` for full details of how this method should be used. +The return value will be the value from the configuration file - usually a dictionary. + +If the plugin is not configured the return value will be ``None``. + .. _datasette_render_template: await .render_template(template, context=None, request=None) diff --git a/docs/writing_plugins.rst b/docs/writing_plugins.rst index 89f7f5eb..9aee70f6 100644 --- a/docs/writing_plugins.rst +++ b/docs/writing_plugins.rst @@ -182,6 +182,8 @@ When you are writing plugins, you can access plugin configuration like this usin This will return the ``{"latitude_column": "lat", "longitude_column": "lng"}`` in the above example. +If there is no configuration for that plugin, the method will return ``None``. + If it cannot find the requested configuration at the table layer, it will fall back to the database layer and then the root layer. 
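In code, a table-level lookup that can fall back like this is just a matter of passing the optional ``database`` and ``table`` keyword arguments - a sketch reusing the cluster-map names from this page's example:

.. code-block:: python

    plugin_config = (
        datasette.plugin_config(
            "datasette-cluster-map",
            database="sf-trees",
            table="Street_Tree_List",
        )
        or {}  # plugin_config() returns None if nothing is configured
    )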
For example, a user may have set the plugin configuration option like so:: { From 4afc1afc721ac0d14f58b0f8339c1bf431d5313c Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 2 May 2022 12:13:11 -0700 Subject: [PATCH 063/952] Depend on click-default-group-wheel>=1.2.2 Refs #1733 --- setup.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/setup.py b/setup.py index 7f0562fd..fcb43aa1 100644 --- a/setup.py +++ b/setup.py @@ -44,7 +44,7 @@ setup( install_requires=[ "asgiref>=3.2.10,<3.6.0", "click>=7.1.1,<8.2.0", - "click-default-group~=1.2.2", + "click-default-group-wheel>=1.2.2", "Jinja2>=2.10.3,<3.1.0", "hupper~=1.9", "httpx>=0.20", From 7e03394734307a5761e4c98d902b6a8cab188562 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 2 May 2022 12:20:14 -0700 Subject: [PATCH 064/952] Optional uvicorn import for Pyodide, refs #1733 --- datasette/app.py | 12 ++++++++++-- 1 file changed, 10 insertions(+), 2 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index d269372c..a5330458 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -26,7 +26,6 @@ from itsdangerous import URLSafeSerializer from jinja2 import ChoiceLoader, Environment, FileSystemLoader, PrefixLoader from jinja2.environment import Template from jinja2.exceptions import TemplateNotFound -import uvicorn from .views.base import DatasetteError, ureg from .views.database import DatabaseDownload, DatabaseView @@ -806,6 +805,15 @@ class Datasette: datasette_version = {"version": __version__} if self.version_note: datasette_version["note"] = self.version_note + + try: + # Optional import to avoid breaking Pyodide + # https://github.com/simonw/datasette/issues/1733#issuecomment-1115268245 + import uvicorn + + uvicorn_version = uvicorn.__version__ + except ImportError: + uvicorn_version = None info = { "python": { "version": ".".join(map(str, sys.version_info[:3])), @@ -813,7 +821,7 @@ class Datasette: }, "datasette": datasette_version, "asgi": "3.0", - "uvicorn": uvicorn.__version__, + "uvicorn": uvicorn_version, "sqlite": { "version": sqlite_version, "fts_versions": fts_versions, From 687907aa2b1bde4de6ae7155b0e2a949ca015ca9 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 2 May 2022 12:39:06 -0700 Subject: [PATCH 065/952] Remove python-baseconv dependency, refs #1733, closes #1734 --- datasette/actor_auth_cookie.py | 2 +- datasette/utils/baseconv.py | 59 ++++++++++++++++++++++++++++++++++ docs/authentication.rst | 4 +-- setup.py | 1 - tests/test_auth.py | 2 +- 5 files changed, 63 insertions(+), 5 deletions(-) create mode 100644 datasette/utils/baseconv.py diff --git a/datasette/actor_auth_cookie.py b/datasette/actor_auth_cookie.py index 15ecd331..368213af 100644 --- a/datasette/actor_auth_cookie.py +++ b/datasette/actor_auth_cookie.py @@ -1,6 +1,6 @@ from datasette import hookimpl from itsdangerous import BadSignature -import baseconv +from datasette.utils import baseconv import time diff --git a/datasette/utils/baseconv.py b/datasette/utils/baseconv.py new file mode 100644 index 00000000..27e4fb00 --- /dev/null +++ b/datasette/utils/baseconv.py @@ -0,0 +1,59 @@ +""" +Convert numbers from base 10 integers to base X strings and back again. 
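+
+(The string of digits passed to BaseConverter defines the base: its
+length is the radix and its characters are the digit symbols.)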
+ +Sample usage: + +>>> base20 = BaseConverter('0123456789abcdefghij') +>>> base20.from_decimal(1234) +'31e' +>>> base20.to_decimal('31e') +1234 + +Originally shared here: https://www.djangosnippets.org/snippets/1431/ +""" + + +class BaseConverter(object): + decimal_digits = "0123456789" + + def __init__(self, digits): + self.digits = digits + + def from_decimal(self, i): + return self.convert(i, self.decimal_digits, self.digits) + + def to_decimal(self, s): + return int(self.convert(s, self.digits, self.decimal_digits)) + + def convert(number, fromdigits, todigits): + # Based on http://code.activestate.com/recipes/111286/ + if str(number)[0] == "-": + number = str(number)[1:] + neg = 1 + else: + neg = 0 + + # make an integer out of the number + x = 0 + for digit in str(number): + x = x * len(fromdigits) + fromdigits.index(digit) + + # create the result in base 'len(todigits)' + if x == 0: + res = todigits[0] + else: + res = "" + while x > 0: + digit = x % len(todigits) + res = todigits[digit] + res + x = int(x / len(todigits)) + if neg: + res = "-" + res + return res + + convert = staticmethod(convert) + + +bin = BaseConverter("01") +hexconv = BaseConverter("0123456789ABCDEF") +base62 = BaseConverter("ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789abcdefghijklmnopqrstuvwxyz") diff --git a/docs/authentication.rst b/docs/authentication.rst index 24960733..685dab15 100644 --- a/docs/authentication.rst +++ b/docs/authentication.rst @@ -401,12 +401,12 @@ Including an expiry time ``ds_actor`` cookies can optionally include a signed expiry timestamp, after which the cookies will no longer be valid. Authentication plugins may chose to use this mechanism to limit the lifetime of the cookie. For example, if a plugin implements single-sign-on against another source it may decide to set short-lived cookies so that if the user is removed from the SSO system their existing Datasette cookies will stop working shortly afterwards. -To include an expiry, add a ``"e"`` key to the cookie value containing a `base62-encoded integer `__ representing the timestamp when the cookie should expire. For example, here's how to set a cookie that expires after 24 hours: +To include an expiry, add a ``"e"`` key to the cookie value containing a base62-encoded integer representing the timestamp when the cookie should expire. For example, here's how to set a cookie that expires after 24 hours: .. 
code-block:: python import time - import baseconv + from datasette.utils import baseconv expires_at = int(time.time()) + (24 * 60 * 60) diff --git a/setup.py b/setup.py index fcb43aa1..ca449f02 100644 --- a/setup.py +++ b/setup.py @@ -57,7 +57,6 @@ setup( "PyYAML>=5.3,<7.0", "mergedeep>=1.1.1,<1.4.0", "itsdangerous>=1.1,<3.0", - "python-baseconv==1.2.2", ], entry_points=""" [console_scripts] diff --git a/tests/test_auth.py b/tests/test_auth.py index 974f89ea..4ef35a76 100644 --- a/tests/test_auth.py +++ b/tests/test_auth.py @@ -1,5 +1,5 @@ from .fixtures import app_client -import baseconv +from datasette.utils import baseconv import pytest import time From a29c1277896b6a7905ef5441c42a37bc15f67599 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 2 May 2022 12:44:09 -0700 Subject: [PATCH 066/952] Rename to_decimal/from_decimal to decode/encode, refs #1734 --- datasette/utils/baseconv.py | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/datasette/utils/baseconv.py b/datasette/utils/baseconv.py index 27e4fb00..c4b64908 100644 --- a/datasette/utils/baseconv.py +++ b/datasette/utils/baseconv.py @@ -19,10 +19,10 @@ class BaseConverter(object): def __init__(self, digits): self.digits = digits - def from_decimal(self, i): + def encode(self, i): return self.convert(i, self.decimal_digits, self.digits) - def to_decimal(self, s): + def decode(self, s): return int(self.convert(s, self.digits, self.decimal_digits)) def convert(number, fromdigits, todigits): From 3f00a29141bdea5be747f6d1c93871ccdb792167 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 2 May 2022 13:15:27 -0700 Subject: [PATCH 067/952] Clean up compatibility with Pyodide (#1736) * Optional uvicorn import for Pyodide, refs #1733 * --setting num_sql_threads 0 to disable threading, refs #1735 --- datasette/app.py | 11 ++++++++--- datasette/database.py | 19 +++++++++++++++++++ docs/settings.rst | 2 ++ tests/test_internals_datasette.py | 14 +++++++++++++- 4 files changed, 42 insertions(+), 4 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index a5330458..b7b84371 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -288,9 +288,12 @@ class Datasette: self._settings = dict(DEFAULT_SETTINGS, **(settings or {})) self.renderers = {} # File extension -> (renderer, can_render) functions self.version_note = version_note - self.executor = futures.ThreadPoolExecutor( - max_workers=self.setting("num_sql_threads") - ) + if self.setting("num_sql_threads") == 0: + self.executor = None + else: + self.executor = futures.ThreadPoolExecutor( + max_workers=self.setting("num_sql_threads") + ) self.max_returned_rows = self.setting("max_returned_rows") self.sql_time_limit_ms = self.setting("sql_time_limit_ms") self.page_size = self.setting("default_page_size") @@ -862,6 +865,8 @@ class Datasette: ] def _threads(self): + if self.setting("num_sql_threads") == 0: + return {"num_threads": 0, "threads": []} threads = list(threading.enumerate()) d = { "num_threads": len(threads), diff --git a/datasette/database.py b/datasette/database.py index ba594a8c..44d32667 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -45,6 +45,9 @@ class Database: self._cached_table_counts = None self._write_thread = None self._write_queue = None + # These are used when in non-threaded mode: + self._read_connection = None + self._write_connection = None if not self.is_mutable and not self.is_memory: p = Path(path) self.hash = inspect_hash(p) @@ -134,6 +137,14 @@ class Database: return results async def execute_write_fn(self, 
fn, block=True): + if self.ds.executor is None: + # non-threaded mode + if self._write_connection is None: + self._write_connection = self.connect(write=True) + self.ds._prepare_connection(self._write_connection, self.name) + return fn(self._write_connection) + + # threaded mode task_id = uuid.uuid5(uuid.NAMESPACE_DNS, "datasette.io") if self._write_queue is None: self._write_queue = queue.Queue() @@ -177,6 +188,14 @@ class Database: task.reply_queue.sync_q.put(result) async def execute_fn(self, fn): + if self.ds.executor is None: + # non-threaded mode + if self._read_connection is None: + self._read_connection = self.connect() + self.ds._prepare_connection(self._read_connection, self.name) + return fn(self._read_connection) + + # threaded mode def in_thread(): conn = getattr(connections, self.name, None) if not conn: diff --git a/docs/settings.rst b/docs/settings.rst index 60c4b36d..8437fb04 100644 --- a/docs/settings.rst +++ b/docs/settings.rst @@ -107,6 +107,8 @@ Maximum number of threads in the thread pool Datasette uses to execute SQLite qu datasette mydatabase.db --setting num_sql_threads 10 +Setting this to 0 turns off threaded SQL queries entirely - useful for environments that do not support threading such as `Pyodide `__. + .. _setting_allow_facet: allow_facet diff --git a/tests/test_internals_datasette.py b/tests/test_internals_datasette.py index cc200a2d..1dc14cab 100644 --- a/tests/test_internals_datasette.py +++ b/tests/test_internals_datasette.py @@ -1,7 +1,7 @@ """ Tests for the datasette.app.Datasette class """ -from datasette.app import Datasette +from datasette.app import Datasette, Database from itsdangerous import BadSignature from .fixtures import app_client import pytest @@ -63,3 +63,15 @@ async def test_datasette_constructor(): "hash": None, } ] + + +@pytest.mark.asyncio +async def test_num_sql_threads_zero(): + ds = Datasette([], memory=True, settings={"num_sql_threads": 0}) + db = ds.add_database(Database(ds, memory_name="test_num_sql_threads_zero")) + await db.execute_write("create table t(id integer primary key)") + await db.execute_write("insert into t (id) values (1)") + response = await ds.client.get("/-/threads.json") + assert response.json() == {"num_threads": 0, "threads": []} + response2 = await ds.client.get("/test_num_sql_threads_zero/t.json?_shape=array") + assert response2.json() == [{"id": 1}] From 943aa2e1f7341cb51e60332cde46bde650c64217 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 2 May 2022 14:38:34 -0700 Subject: [PATCH 068/952] Release 0.62a0 Refs #1683, #1701, #1712, #1717, #1718, #1733 --- datasette/version.py | 2 +- docs/changelog.rst | 14 ++++++++++++++ 2 files changed, 15 insertions(+), 1 deletion(-) diff --git a/datasette/version.py b/datasette/version.py index 02451a1e..cf18c441 100644 --- a/datasette/version.py +++ b/datasette/version.py @@ -1,2 +1,2 @@ -__version__ = "0.61.1" +__version__ = "0.62a0" __version_info__ = tuple(__version__.split(".")) diff --git a/docs/changelog.rst b/docs/changelog.rst index 03cf62b6..74814fcb 100644 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -4,6 +4,20 @@ Changelog ========= +.. _v0_62a0: + +0.62a0 (2022-05-02) +------------------- + +- Datasette now runs some SQL queries in parallel. This has limited impact on performance, see `this research issue `__ for details. +- Datasette should now be compatible with Pyodide. 
(:issue:`1733`) +- ``datasette publish cloudrun`` has a new ``--timeout`` option which can be used to increase the time limit applied by the Google Cloud build environment. Thanks, Tim Sherratt. (`#1717 `__) +- Spaces in database names are now encoded as ``+`` rather than ``~20``. (:issue:`1701`) +- ```` is now displayed as ```` and is accompanied by tooltip showing "2.3MB". (:issue:`1712`) +- Don't show the facet option in the cog menu if faceting is not allowed. (:issue:`1683`) +- Code examples in the documentation are now all formatted using Black. (:issue:`1718`) +- ``Request.fake()`` method is now documented, see :ref:`internals_request`. + .. _v0_61_1: 0.61.1 (2022-03-23) From 847d6b1aac38c3e776e8c600eed07ba4c9ac9942 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 2 May 2022 16:32:24 -0700 Subject: [PATCH 069/952] Test wheel against Pyodide, refs #1737, #1733 --- .github/workflows/test-pyodide.yml | 28 ++++++++++++++++++ test-in-pyodide-with-shot-scraper.sh | 43 ++++++++++++++++++++++++++++ 2 files changed, 71 insertions(+) create mode 100644 .github/workflows/test-pyodide.yml create mode 100755 test-in-pyodide-with-shot-scraper.sh diff --git a/.github/workflows/test-pyodide.yml b/.github/workflows/test-pyodide.yml new file mode 100644 index 00000000..3715d055 --- /dev/null +++ b/.github/workflows/test-pyodide.yml @@ -0,0 +1,28 @@ +name: Test in Pyodide with shot-scraper + +on: + workflow_dispatch: + +jobs: + test: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v3 + - name: Set up Python 3.10 + uses: actions/setup-python@v3 + with: + python-version: "3.10" + cache: 'pip' + cache-dependency-path: '**/setup.py' + - name: Cache Playwright browsers + uses: actions/cache@v2 + with: + path: ~/.cache/ms-playwright/ + key: ${{ runner.os }}-browsers + - name: Install Playwright dependencies + run: | + pip install shot-scraper + shot-scraper install + - name: Run test + run: | + ./test-in-pyodide-with-shot-scraper.sh diff --git a/test-in-pyodide-with-shot-scraper.sh b/test-in-pyodide-with-shot-scraper.sh new file mode 100755 index 00000000..0f29c0e0 --- /dev/null +++ b/test-in-pyodide-with-shot-scraper.sh @@ -0,0 +1,43 @@ +#!/bin/bash + +# Build the wheel +python3 -m build + +# Find name of wheel +wheel=$(basename $(ls dist/*.whl)) +# strip off the dist/ + + +# Create a blank index page +echo ' + +' > dist/index.html + +# Run a server for that dist/ folder +cd dist +python3 -m http.server 8529 & +cd .. 
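+
+# shot-scraper drives a headless browser, so the freshly built wheel below is
+# installed and queried inside a real Pyodide (WebAssembly) environment rather
+# than under regular CPython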
+ +shot-scraper javascript http://localhost:8529/ " +async () => { + let pyodide = await loadPyodide(); + await pyodide.loadPackage(['micropip', 'ssl', 'setuptools']); + let output = await pyodide.runPythonAsync(\` + import micropip + await micropip.install('h11==0.12.0') + await micropip.install('http://localhost:8529/$wheel') + import ssl + import setuptools + from datasette.app import Datasette + ds = Datasette(memory=True, settings={'num_sql_threads': 0}) + (await ds.client.get('/_memory.json?sql=select+55+as+itworks&_shape=array')).text + \`); + if (JSON.parse(output)[0].itworks != 55) { + throw 'Got ' + output + ', expected itworks: 55'; + } + return 'Test passed!'; +} +" + +# Shut down the server +pkill -f 'http.server 8529' From c0cbcf2aba0d8393ba464acc515803ebf2eeda12 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 2 May 2022 16:36:58 -0700 Subject: [PATCH 070/952] Tweaks to test scripts, refs #1737 --- .github/workflows/test-pyodide.yml | 2 +- test-in-pyodide-with-shot-scraper.sh | 6 +++--- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/.github/workflows/test-pyodide.yml b/.github/workflows/test-pyodide.yml index 3715d055..beb6a5fb 100644 --- a/.github/workflows/test-pyodide.yml +++ b/.github/workflows/test-pyodide.yml @@ -21,7 +21,7 @@ jobs: key: ${{ runner.os }}-browsers - name: Install Playwright dependencies run: | - pip install shot-scraper + pip install shot-scraper build shot-scraper install - name: Run test run: | diff --git a/test-in-pyodide-with-shot-scraper.sh b/test-in-pyodide-with-shot-scraper.sh index 0f29c0e0..e5df7398 100755 --- a/test-in-pyodide-with-shot-scraper.sh +++ b/test-in-pyodide-with-shot-scraper.sh @@ -1,12 +1,12 @@ #!/bin/bash +set -e +# So the script fails if there are any errors # Build the wheel python3 -m build -# Find name of wheel +# Find name of wheel, strip off the dist/ wheel=$(basename $(ls dist/*.whl)) -# strip off the dist/ - # Create a blank index page echo ' From d60f163528f466b1127b2935c3b6869c34fd6545 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 2 May 2022 16:40:49 -0700 Subject: [PATCH 071/952] Run on push and PR, closes #1737 --- .github/workflows/test-pyodide.yml | 2 ++ 1 file changed, 2 insertions(+) diff --git a/.github/workflows/test-pyodide.yml b/.github/workflows/test-pyodide.yml index beb6a5fb..1b75aade 100644 --- a/.github/workflows/test-pyodide.yml +++ b/.github/workflows/test-pyodide.yml @@ -1,6 +1,8 @@ name: Test in Pyodide with shot-scraper on: + push: + pull_request: workflow_dispatch: jobs: From 280ff372ab30df244f6c54f6f3002da57334b3d7 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 3 May 2022 07:59:18 -0700 Subject: [PATCH 072/952] ETag support for .db downloads, closes #1739 --- datasette/utils/testing.py | 20 ++++++++++++++++++-- datasette/views/database.py | 7 +++++++ tests/test_html.py | 10 +++++++++- 3 files changed, 34 insertions(+), 3 deletions(-) diff --git a/datasette/utils/testing.py b/datasette/utils/testing.py index 94750b1f..640c94e6 100644 --- a/datasette/utils/testing.py +++ b/datasette/utils/testing.py @@ -55,10 +55,21 @@ class TestClient: @async_to_sync async def get( - self, path, follow_redirects=False, redirect_count=0, method="GET", cookies=None + self, + path, + follow_redirects=False, + redirect_count=0, + method="GET", + cookies=None, + if_none_match=None, ): return await self._request( - path, follow_redirects, redirect_count, method, cookies + path=path, + follow_redirects=follow_redirects, + redirect_count=redirect_count, + method=method, + 
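+            # if_none_match is threaded through to _request() so tests can
+            # exercise the ETag / If-None-Match handling added further below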
cookies=cookies, + if_none_match=if_none_match, ) @async_to_sync @@ -110,6 +121,7 @@ class TestClient: headers=None, post_body=None, content_type=None, + if_none_match=None, ): return await self._request( path, @@ -120,6 +132,7 @@ class TestClient: headers=headers, post_body=post_body, content_type=content_type, + if_none_match=if_none_match, ) async def _request( @@ -132,10 +145,13 @@ class TestClient: headers=None, post_body=None, content_type=None, + if_none_match=None, ): headers = headers or {} if content_type: headers["content-type"] = content_type + if if_none_match: + headers["if-none-match"] = if_none_match httpx_response = await self.ds.client.request( method, path, diff --git a/datasette/views/database.py b/datasette/views/database.py index 9a8aca32..bc08ba05 100644 --- a/datasette/views/database.py +++ b/datasette/views/database.py @@ -183,6 +183,13 @@ class DatabaseDownload(DataView): headers = {} if self.ds.cors: add_cors_headers(headers) + if db.hash: + etag = '"{}"'.format(db.hash) + headers["Etag"] = etag + # Has user seen this already? + if_none_match = request.headers.get("if-none-match") + if if_none_match and if_none_match == etag: + return Response("", status=304) headers["Transfer-Encoding"] = "chunked" return AsgiFileDownload( filepath, diff --git a/tests/test_html.py b/tests/test_html.py index 42f1a3ee..409fec68 100644 --- a/tests/test_html.py +++ b/tests/test_html.py @@ -401,7 +401,7 @@ def test_database_download_for_immutable(): assert len(soup.findAll("a", {"href": re.compile(r"\.db$")})) # Check we can actually download it download_response = client.get("/fixtures.db") - assert 200 == download_response.status + assert download_response.status == 200 # Check the content-length header exists assert "content-length" in download_response.headers content_length = download_response.headers["content-length"] @@ -413,6 +413,14 @@ def test_database_download_for_immutable(): == 'attachment; filename="fixtures.db"' ) assert download_response.headers["transfer-encoding"] == "chunked" + # ETag header should be present and match db.hash + assert "etag" in download_response.headers + etag = download_response.headers["etag"] + assert etag == '"{}"'.format(client.ds.databases["fixtures"].hash) + # Try a second download with If-None-Match: current-etag + download_response2 = client.get("/fixtures.db", if_none_match=etag) + assert download_response2.body == b"" + assert download_response2.status == 304 def test_database_download_disallowed_for_mutable(app_client): From a5acfff4bd364d30ce8878e19f9839890371ef14 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 16 May 2022 17:06:40 -0700 Subject: [PATCH 073/952] Empty Datasette([]) list is no longer required --- docs/testing_plugins.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/testing_plugins.rst b/docs/testing_plugins.rst index 1bbaaac1..41046bfb 100644 --- a/docs/testing_plugins.rst +++ b/docs/testing_plugins.rst @@ -15,7 +15,7 @@ If you use the template described in :ref:`writing_plugins_cookiecutter` your pl @pytest.mark.asyncio async def test_plugin_is_installed(): - datasette = Datasette([], memory=True) + datasette = Datasette(memory=True) response = await datasette.client.get("/-/plugins.json") assert response.status_code == 200 installed_plugins = {p["name"] for p in response.json()} From 3508bf7875f8d62b2725222f3b07747974d54b97 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 17 May 2022 12:40:05 -0700 Subject: [PATCH 074/952] --nolock mode to ignore locked files, closes #1744 
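
Opening a SQLite file that another process holds a lock on - such as the
Chrome History database while Chrome is running - normally fails. With
--nolock, Datasette appends nolock=1 to the mode=ro connection string, so
the file is still opened read-only despite the lock:

    datasette ~/Library/Application\ Support/Google/Chrome/Default/History --nolock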
--- datasette/app.py | 2 ++ datasette/cli.py | 7 +++++++ datasette/database.py | 2 ++ docs/cli-reference.rst | 1 + docs/getting_started.rst | 4 +++- 5 files changed, 15 insertions(+), 1 deletion(-) diff --git a/datasette/app.py b/datasette/app.py index b7b84371..f43700d4 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -213,6 +213,7 @@ class Datasette: config_dir=None, pdb=False, crossdb=False, + nolock=False, ): assert config_dir is None or isinstance( config_dir, Path @@ -238,6 +239,7 @@ class Datasette: self.databases = collections.OrderedDict() self._refresh_schemas_lock = asyncio.Lock() self.crossdb = crossdb + self.nolock = nolock if memory or crossdb or not self.files: self.add_database(Database(self, is_memory=True), name="_memory") # memory_name is a random string so that each Datasette instance gets its own diff --git a/datasette/cli.py b/datasette/cli.py index 3c6e1b2c..8781747c 100644 --- a/datasette/cli.py +++ b/datasette/cli.py @@ -452,6 +452,11 @@ def uninstall(packages, yes): is_flag=True, help="Enable cross-database joins using the /_memory database", ) +@click.option( + "--nolock", + is_flag=True, + help="Ignore locking, open locked files in read-only mode", +) @click.option( "--ssl-keyfile", help="SSL key file", @@ -486,6 +491,7 @@ def serve( open_browser, create, crossdb, + nolock, ssl_keyfile, ssl_certfile, return_instance=False, @@ -545,6 +551,7 @@ def serve( version_note=version_note, pdb=pdb, crossdb=crossdb, + nolock=nolock, ) # if files is a single directory, use that as config_dir= diff --git a/datasette/database.py b/datasette/database.py index 44d32667..fa558045 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -89,6 +89,8 @@ class Database: # mode=ro or immutable=1? if self.is_mutable: qs = "?mode=ro" + if self.ds.nolock: + qs += "&nolock=1" else: qs = "?immutable=1" assert not (write and not self.is_mutable) diff --git a/docs/cli-reference.rst b/docs/cli-reference.rst index 2a6fbfc8..1c1aff15 100644 --- a/docs/cli-reference.rst +++ b/docs/cli-reference.rst @@ -115,6 +115,7 @@ datasette serve --help --create Create database files if they do not exist --crossdb Enable cross-database joins using the /_memory database + --nolock Ignore locking, open locked files in read-only mode --ssl-keyfile TEXT SSL key file --ssl-certfile TEXT SSL certificate file --help Show this message and exit. diff --git a/docs/getting_started.rst b/docs/getting_started.rst index 3e357afb..502a9e5a 100644 --- a/docs/getting_started.rst +++ b/docs/getting_started.rst @@ -56,7 +56,9 @@ like so: :: - datasette ~/Library/Application\ Support/Google/Chrome/Default/History + datasette ~/Library/Application\ Support/Google/Chrome/Default/History --nolock + +The `--nolock` option ignores any file locks. This is safe as Datasette will open the file in read-only mode. Now visiting http://localhost:8001/History/downloads will show you a web interface to browse your downloads data: From 5555bc8aef043f75d2200f66de90c54aeeaa08c3 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 17 May 2022 12:43:44 -0700 Subject: [PATCH 075/952] How to run cog, closes #1745 --- docs/contributing.rst | 11 +++++++++++ 1 file changed, 11 insertions(+) diff --git a/docs/contributing.rst b/docs/contributing.rst index c193ba49..bddceafe 100644 --- a/docs/contributing.rst +++ b/docs/contributing.rst @@ -211,6 +211,17 @@ For added productivity, you can use use `sphinx-autobuild `__. + +To update these pages, run the following command:: + + cog -r docs/*.rst + .. 
_contributing_continuous_deployment: Continuously deployed demo instances From b393e164dc9e962702546d6f1ad9c857b5788dc0 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 17 May 2022 12:45:28 -0700 Subject: [PATCH 076/952] ReST fix --- docs/getting_started.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/getting_started.rst b/docs/getting_started.rst index 502a9e5a..af3a1385 100644 --- a/docs/getting_started.rst +++ b/docs/getting_started.rst @@ -58,7 +58,7 @@ like so: datasette ~/Library/Application\ Support/Google/Chrome/Default/History --nolock -The `--nolock` option ignores any file locks. This is safe as Datasette will open the file in read-only mode. +The ``--nolock`` option ignores any file locks. This is safe as Datasette will open the file in read-only mode. Now visiting http://localhost:8001/History/downloads will show you a web interface to browse your downloads data: From 7d1e004ff679b3fb4dca36d1d751a1ad16688fe6 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 17 May 2022 12:59:28 -0700 Subject: [PATCH 077/952] Fix test I broke in #1744 --- tests/test_cli.py | 1 + 1 file changed, 1 insertion(+) diff --git a/tests/test_cli.py b/tests/test_cli.py index dca65f26..d0f6e26c 100644 --- a/tests/test_cli.py +++ b/tests/test_cli.py @@ -150,6 +150,7 @@ def test_metadata_yaml(): help_settings=False, pdb=False, crossdb=False, + nolock=False, open_browser=False, create=False, ssl_keyfile=None, From 0e2f6f1f82f4445a63f1251470a7778a34f5c8b9 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Wed, 18 May 2022 17:37:46 -0700 Subject: [PATCH 078/952] datasette-copyable is an example of register_output_renderer --- docs/plugin_hooks.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index 3c9ae2e2..c0d88964 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -557,7 +557,7 @@ And here is an example ``can_render`` function which returns ``True`` only if th "atom_updated", }.issubset(columns) -Examples: `datasette-atom `_, `datasette-ics `_, `datasette-geojson `__ +Examples: `datasette-atom `_, `datasette-ics `_, `datasette-geojson `__, `datasette-copyable `__ .. _plugin_register_routes: From 18a6e05887abf1ac946a6e0d36ce662dfd8aeff1 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Fri, 20 May 2022 12:05:33 -0700 Subject: [PATCH 079/952] Added "follow a tutorial" to getting started docs Closes #1747 --- docs/getting_started.rst | 12 ++++++++++++ 1 file changed, 12 insertions(+) diff --git a/docs/getting_started.rst b/docs/getting_started.rst index af3a1385..00b753a9 100644 --- a/docs/getting_started.rst +++ b/docs/getting_started.rst @@ -1,6 +1,8 @@ Getting started =============== +.. _getting_started_demo: + Play with a live demo --------------------- @@ -9,6 +11,16 @@ The best way to experience Datasette for the first time is with a demo: * `global-power-plants.datasettes.com `__ provides a searchable database of power plants around the world, using data from the `World Resources Institude `__ rendered using the `datasette-cluster-map `__ plugin. * `fivethirtyeight.datasettes.com `__ shows Datasette running against over 400 datasets imported from the `FiveThirtyEight GitHub repository `__. +.. _getting_started_tutorial: + +Follow a tutorial +----------------- + +Datasette has several `tutorials `__ to help you get started with the tool. Try one of the following: + +- `Exploring a database with Datasette `__ shows how to use the Datasette web interface to explore a new database. 
+- `Learn SQL with Datasette `__ introduces SQL, and shows how to use that query language to ask questions of your data. + .. _getting_started_glitch: Try Datasette without installing anything using Glitch From 1465fea4798599eccfe7e8f012bd8d9adfac3039 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Fri, 20 May 2022 12:11:08 -0700 Subject: [PATCH 080/952] sphinx-copybutton for docs, closes #1748 --- docs/conf.py | 2 +- setup.py | 8 +++++++- 2 files changed, 8 insertions(+), 2 deletions(-) diff --git a/docs/conf.py b/docs/conf.py index d114bc52..351cb1b1 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -31,7 +31,7 @@ # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. -extensions = ["sphinx.ext.extlinks", "sphinx.ext.autodoc"] +extensions = ["sphinx.ext.extlinks", "sphinx.ext.autodoc", "sphinx_copybutton"] extlinks = { "issue": ("https://github.com/simonw/datasette/issues/%s", "#"), diff --git a/setup.py b/setup.py index ca449f02..aad05840 100644 --- a/setup.py +++ b/setup.py @@ -64,7 +64,13 @@ setup( """, setup_requires=["pytest-runner"], extras_require={ - "docs": ["sphinx_rtd_theme", "sphinx-autobuild", "codespell", "blacken-docs"], + "docs": [ + "sphinx_rtd_theme", + "sphinx-autobuild", + "codespell", + "blacken-docs", + "sphinx-copybutton", + ], "test": [ "pytest>=5.2.2,<7.2.0", "pytest-xdist>=2.2.1,<2.6", From 1d33fd03b3c211e0f48a8f3bde83880af89e4e69 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Fri, 20 May 2022 13:34:51 -0700 Subject: [PATCH 081/952] Switch docs theme to Furo, refs #1746 --- docs/_static/css/custom.css | 7 ++-- .../layout.html => _static/js/custom.js} | 34 ------------------- docs/_templates/base.html | 6 ++++ docs/_templates/sidebar/brand.html | 16 +++++++++ docs/_templates/sidebar/navigation.html | 11 ++++++ docs/conf.py | 24 +++---------- docs/installation.rst | 1 + docs/plugin_hooks.rst | 1 + setup.py | 2 +- 9 files changed, 45 insertions(+), 57 deletions(-) rename docs/{_templates/layout.html => _static/js/custom.js} (55%) create mode 100644 docs/_templates/base.html create mode 100644 docs/_templates/sidebar/brand.html create mode 100644 docs/_templates/sidebar/navigation.html diff --git a/docs/_static/css/custom.css b/docs/_static/css/custom.css index 4dabb725..0a6f8799 100644 --- a/docs/_static/css/custom.css +++ b/docs/_static/css/custom.css @@ -1,7 +1,8 @@ a.external { overflow-wrap: anywhere; } - -div .wy-side-nav-search > div.version { - color: rgba(0,0,0,0.75); +body[data-theme="dark"] .sidebar-logo-container { + background-color: white; + padding: 5px; + opacity: 0.6; } diff --git a/docs/_templates/layout.html b/docs/_static/js/custom.js similarity index 55% rename from docs/_templates/layout.html rename to docs/_static/js/custom.js index 785cdc7c..efca33ed 100644 --- a/docs/_templates/layout.html +++ b/docs/_static/js/custom.js @@ -1,35 +1,3 @@ -{%- extends "!layout.html" %} - -{% block htmltitle %} -{{ super() }} - -{% endblock %} - -{% block sidebartitle %} - - - - - -{% if theme_display_version %} - {%- set nav_version = version %} - {% if READTHEDOCS and current_version %} - {%- set nav_version = current_version %} - {% endif %} - {% if nav_version %} -
-      {{ nav_version }}
-    </div>
- {% endif %} -{% endif %} - -{% include "searchbox.html" %} - -{% endblock %} - -{% block footer %} -{{ super() }} - -{% endblock %} diff --git a/docs/_templates/base.html b/docs/_templates/base.html new file mode 100644 index 00000000..969de5ab --- /dev/null +++ b/docs/_templates/base.html @@ -0,0 +1,6 @@ +{%- extends "!base.html" %} + +{% block site_meta %} +{{ super() }} + +{% endblock %} diff --git a/docs/_templates/sidebar/brand.html b/docs/_templates/sidebar/brand.html new file mode 100644 index 00000000..8be9e8ee --- /dev/null +++ b/docs/_templates/sidebar/brand.html @@ -0,0 +1,16 @@ + diff --git a/docs/_templates/sidebar/navigation.html b/docs/_templates/sidebar/navigation.html new file mode 100644 index 00000000..c460a17e --- /dev/null +++ b/docs/_templates/sidebar/navigation.html @@ -0,0 +1,11 @@ + \ No newline at end of file diff --git a/docs/conf.py b/docs/conf.py index 351cb1b1..25d2acfe 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -90,18 +90,15 @@ todo_include_todos = False # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. # -html_theme = "sphinx_rtd_theme" +html_theme = "furo" # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. html_theme_options = { - "logo_only": True, - "style_nav_header_background": "white", - "prev_next_buttons_location": "both", + "sidebar_hide_name": True, } - # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". @@ -112,20 +109,9 @@ html_logo = "datasette-logo.svg" html_css_files = [ "css/custom.css", ] - - -# Custom sidebar templates, must be a dictionary that maps document names -# to template names. -# -# This is required for the alabaster theme -# refs: http://alabaster.readthedocs.io/en/latest/installation.html#sidebars -html_sidebars = { - "**": [ - "relations.html", # needs 'show_related': True theme option to display - "searchbox.html", - ] -} - +html_js_files = [ + "js/custom.js" +] # -- Options for HTMLHelp output ------------------------------------------ diff --git a/docs/installation.rst b/docs/installation.rst index e8bef9cd..a4757736 100644 --- a/docs/installation.rst +++ b/docs/installation.rst @@ -13,6 +13,7 @@ If you want to start making contributions to the Datasette project by installing .. contents:: :local: + :class: this-will-duplicate-information-and-it-is-still-useful-here .. _installation_basic: diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index c0d88964..7d10fe37 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -20,6 +20,7 @@ For example, you can implement the ``render_cell`` plugin hook like this even th .. contents:: List of plugin hooks :local: + :class: this-will-duplicate-information-and-it-is-still-useful-here .. 
_plugin_hook_prepare_connection: diff --git a/setup.py b/setup.py index aad05840..d3fcdbd1 100644 --- a/setup.py +++ b/setup.py @@ -65,7 +65,7 @@ setup( setup_requires=["pytest-runner"], extras_require={ "docs": [ - "sphinx_rtd_theme", + "furo==2022.4.7", "sphinx-autobuild", "codespell", "blacken-docs", From 4446075334ea7231beb56b630bc7ec363afc2d08 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Fri, 20 May 2022 13:44:23 -0700 Subject: [PATCH 082/952] Append warning to the write element, refs #1746 --- docs/_static/js/custom.js | 6 +----- 1 file changed, 1 insertion(+), 5 deletions(-) diff --git a/docs/_static/js/custom.js b/docs/_static/js/custom.js index efca33ed..91c3e306 100644 --- a/docs/_static/js/custom.js +++ b/docs/_static/js/custom.js @@ -17,11 +17,7 @@ jQuery(function ($) { ` ); warning.find("a").attr("href", stableUrl); - var body = $("div.body"); - if (!body.length) { - body = $("div.document"); - } - body.prepend(warning); + $("article[role=main]").prepend(warning); } }); }); From b010af7bb85856aeb44f69e7e980f617c1fc0db1 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Fri, 20 May 2022 15:23:09 -0700 Subject: [PATCH 083/952] Updated copyright years in documentation footer --- docs/conf.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/conf.py b/docs/conf.py index 25d2acfe..7ffeedd0 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -51,7 +51,7 @@ master_doc = "index" # General information about the project. project = "Datasette" -copyright = "2017-2021, Simon Willison" +copyright = "2017-2022, Simon Willison" author = "Simon Willison" # Disable -- turning into – From adedd85b68ec66e03b97fb62ff4da8987734436e Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sat, 28 May 2022 18:42:31 -0700 Subject: [PATCH 084/952] Clarify that request.headers names are converted to lowercase --- docs/internals.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/internals.rst b/docs/internals.rst index 18822d47..da135282 100644 --- a/docs/internals.rst +++ b/docs/internals.rst @@ -26,7 +26,7 @@ The request object is passed to various plugin hooks. It represents an incoming The request scheme - usually ``https`` or ``http``. ``.headers`` - dictionary (str -> str) - A dictionary of incoming HTTP request headers. + A dictionary of incoming HTTP request headers. Header names have been converted to lowercase. 
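
    Because the names are lowercased, look headers up using lowercase keys - an illustrative sketch::

        content_type = request.headers.get("content-type", "")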
``.cookies`` - dictionary (str -> str) A dictionary of incoming cookies From 8dd816bc76937f1e37f86acce10dc2cb4fa31e52 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 30 May 2022 15:42:38 -0700 Subject: [PATCH 085/952] Applied Black --- docs/conf.py | 4 +--- 1 file changed, 1 insertion(+), 3 deletions(-) diff --git a/docs/conf.py b/docs/conf.py index 7ffeedd0..4ef6b768 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -109,9 +109,7 @@ html_logo = "datasette-logo.svg" html_css_files = [ "css/custom.css", ] -html_js_files = [ - "js/custom.js" -] +html_js_files = ["js/custom.js"] # -- Options for HTMLHelp output ------------------------------------------ From 2e9751672d4fe329b3c359d5b7b1992283185820 Mon Sep 17 00:00:00 2001 From: Naveen <172697+naveensrinivasan@users.noreply.github.com> Date: Tue, 31 May 2022 14:28:40 -0500 Subject: [PATCH 086/952] chore: Set permissions for GitHub actions (#1740) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Restrict the GitHub token permissions only to the required ones; this way, even if the attackers will succeed in compromising your workflow, they won’t be able to do much. - Included permissions for the action. https://github.com/ossf/scorecard/blob/main/docs/checks.md#token-permissions https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#permissions https://docs.github.com/en/actions/using-jobs/assigning-permissions-to-jobs [Keeping your GitHub Actions and workflows secure Part 1: Preventing pwn requests](https://securitylab.github.com/research/github-actions-preventing-pwn-requests/) Signed-off-by: naveen <172697+naveensrinivasan@users.noreply.github.com> --- .github/workflows/deploy-latest.yml | 3 +++ .github/workflows/prettier.yml | 3 +++ .github/workflows/publish.yml | 3 +++ .github/workflows/push_docker_tag.yml | 3 +++ .github/workflows/spellcheck.yml | 3 +++ .github/workflows/test-coverage.yml | 3 +++ .github/workflows/test-pyodide.yml | 3 +++ .github/workflows/test.yml | 3 +++ .github/workflows/tmate-mac.yml | 3 +++ .github/workflows/tmate.yml | 3 +++ 10 files changed, 30 insertions(+) diff --git a/.github/workflows/deploy-latest.yml b/.github/workflows/deploy-latest.yml index a61f6629..2b94a7f1 100644 --- a/.github/workflows/deploy-latest.yml +++ b/.github/workflows/deploy-latest.yml @@ -5,6 +5,9 @@ on: branches: - main +permissions: + contents: read + jobs: deploy: runs-on: ubuntu-latest diff --git a/.github/workflows/prettier.yml b/.github/workflows/prettier.yml index 9dfe7ee0..ded41040 100644 --- a/.github/workflows/prettier.yml +++ b/.github/workflows/prettier.yml @@ -2,6 +2,9 @@ name: Check JavaScript for conformance with Prettier on: [push] +permissions: + contents: read + jobs: prettier: runs-on: ubuntu-latest diff --git a/.github/workflows/publish.yml b/.github/workflows/publish.yml index 3e4f8146..9ef09d2e 100644 --- a/.github/workflows/publish.yml +++ b/.github/workflows/publish.yml @@ -4,6 +4,9 @@ on: release: types: [created] +permissions: + contents: read + jobs: test: runs-on: ubuntu-latest diff --git a/.github/workflows/push_docker_tag.yml b/.github/workflows/push_docker_tag.yml index 9a3969f0..afe8d6b2 100644 --- a/.github/workflows/push_docker_tag.yml +++ b/.github/workflows/push_docker_tag.yml @@ -6,6 +6,9 @@ on: version_tag: description: Tag to build and push +permissions: + contents: read + jobs: deploy_docker: runs-on: ubuntu-latest diff --git a/.github/workflows/spellcheck.yml b/.github/workflows/spellcheck.yml index 2e24d3eb..a2621ecc 100644 
--- a/.github/workflows/spellcheck.yml +++ b/.github/workflows/spellcheck.yml @@ -2,6 +2,9 @@ name: Check spelling in documentation on: [push, pull_request] +permissions: + contents: read + jobs: spellcheck: runs-on: ubuntu-latest diff --git a/.github/workflows/test-coverage.yml b/.github/workflows/test-coverage.yml index 1d1cf332..bd720664 100644 --- a/.github/workflows/test-coverage.yml +++ b/.github/workflows/test-coverage.yml @@ -7,6 +7,9 @@ on: pull_request: branches: - main +permissions: + contents: read + jobs: test: runs-on: ubuntu-latest diff --git a/.github/workflows/test-pyodide.yml b/.github/workflows/test-pyodide.yml index 1b75aade..bc9593a8 100644 --- a/.github/workflows/test-pyodide.yml +++ b/.github/workflows/test-pyodide.yml @@ -5,6 +5,9 @@ on: pull_request: workflow_dispatch: +permissions: + contents: read + jobs: test: runs-on: ubuntu-latest diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml index 8d916e49..90b6555e 100644 --- a/.github/workflows/test.yml +++ b/.github/workflows/test.yml @@ -2,6 +2,9 @@ name: Test on: [push, pull_request] +permissions: + contents: read + jobs: test: runs-on: ubuntu-latest diff --git a/.github/workflows/tmate-mac.yml b/.github/workflows/tmate-mac.yml index 46be117e..fcee0f21 100644 --- a/.github/workflows/tmate-mac.yml +++ b/.github/workflows/tmate-mac.yml @@ -3,6 +3,9 @@ name: tmate session mac on: workflow_dispatch: +permissions: + contents: read + jobs: build: runs-on: macos-latest diff --git a/.github/workflows/tmate.yml b/.github/workflows/tmate.yml index 02e7bd33..9792245d 100644 --- a/.github/workflows/tmate.yml +++ b/.github/workflows/tmate.yml @@ -3,6 +3,9 @@ name: tmate session on: workflow_dispatch: +permissions: + contents: read + jobs: build: runs-on: ubuntu-latest From e780b2f5d662ef3579d801d33567440055d4e84d Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 20 Jun 2022 10:54:23 -0700 Subject: [PATCH 087/952] Trying out one-sentence-per-line As suggested here: https://sive.rs/1s Markdown and reStructuredText will display this as if it is a single paragraph, even though the sentences themselves are separated by newlines. This could result in more useful diffs. Trying it out on this page first. --- docs/facets.rst | 22 +++++++++++++++------- 1 file changed, 15 insertions(+), 7 deletions(-) diff --git a/docs/facets.rst b/docs/facets.rst index 0228aa84..2a2eb039 100644 --- a/docs/facets.rst +++ b/docs/facets.rst @@ -3,7 +3,9 @@ Facets ====== -Datasette facets can be used to add a faceted browse interface to any database table. With facets, tables are displayed along with a summary showing the most common values in specified columns. These values can be selected to further filter the table. +Datasette facets can be used to add a faceted browse interface to any database table. +With facets, tables are displayed along with a summary showing the most common values in specified columns. +These values can be selected to further filter the table. .. image:: facets.png @@ -12,11 +14,13 @@ Facets can be specified in two ways: using query string parameters, or in ``meta Facets in query strings ----------------------- -To turn on faceting for specific columns on a Datasette table view, add one or more ``_facet=COLUMN`` parameters to the URL. For example, if you want to turn on facets for the ``city_id`` and ``state`` columns, construct a URL that looks like this:: +To turn on faceting for specific columns on a Datasette table view, add one or more ``_facet=COLUMN`` parameters to the URL. 
+For example, if you want to turn on facets for the ``city_id`` and ``state`` columns, construct a URL that looks like this:: /dbname/tablename?_facet=state&_facet=city_id -This works for both the HTML interface and the ``.json`` view. When enabled, facets will cause a ``facet_results`` block to be added to the JSON output, looking something like this: +This works for both the HTML interface and the ``.json`` view. +When enabled, facets will cause a ``facet_results`` block to be added to the JSON output, looking something like this: .. code-block:: json @@ -86,7 +90,8 @@ This works for both the HTML interface and the ``.json`` view. When enabled, fac If Datasette detects that a column is a foreign key, the ``"label"`` property will be automatically derived from the detected label column on the referenced table. -The default number of facet results returned is 30, controlled by the :ref:`setting_default_facet_size` setting. You can increase this on an individual page by adding ``?_facet_size=100`` to the query string, up to a maximum of :ref:`setting_max_returned_rows` (which defaults to 1000). +The default number of facet results returned is 30, controlled by the :ref:`setting_default_facet_size` setting. +You can increase this on an individual page by adding ``?_facet_size=100`` to the query string, up to a maximum of :ref:`setting_max_returned_rows` (which defaults to 1000). .. _facets_metadata: @@ -137,12 +142,14 @@ For the currently filtered data are there any columns which, if applied as a fac * Will return less unique options than the total number of filtered rows * And the query used to evaluate this criteria can be completed in under 50ms -That last point is particularly important: Datasette runs a query for every column that is displayed on a page, which could get expensive - so to avoid slow load times it sets a time limit of just 50ms for each of those queries. This means suggested facets are unlikely to appear for tables with millions of records in them. +That last point is particularly important: Datasette runs a query for every column that is displayed on a page, which could get expensive - so to avoid slow load times it sets a time limit of just 50ms for each of those queries. +This means suggested facets are unlikely to appear for tables with millions of records in them. Speeding up facets with indexes ------------------------------- -The performance of facets can be greatly improved by adding indexes on the columns you wish to facet by. Adding indexes can be performed using the ``sqlite3`` command-line utility. Here's how to add an index on the ``state`` column in a table called ``Food_Trucks``:: +The performance of facets can be greatly improved by adding indexes on the columns you wish to facet by. +Adding indexes can be performed using the ``sqlite3`` command-line utility. Here's how to add an index on the ``state`` column in a table called ``Food_Trucks``:: $ sqlite3 mydatabase.db SQLite version 3.19.3 2017-06-27 16:48:08 @@ -169,6 +176,7 @@ Example here: `latest.datasette.io/fixtures/facetable?_facet_array=tags `__ From 00e59ec461dc0150772b999c7cc15fcb9b507d58 Mon Sep 17 00:00:00 2001 From: "M. 
Nasimul Haque" Date: Mon, 20 Jun 2022 19:05:44 +0100 Subject: [PATCH 088/952] Extract facet pieces of table.html into included templates Thanks, @nsmgr8 --- datasette/templates/_facet_results.html | 28 ++++++++++++++++++ datasette/templates/_suggested_facets.html | 3 ++ datasette/templates/table.html | 33 ++-------------------- 3 files changed, 33 insertions(+), 31 deletions(-) create mode 100644 datasette/templates/_facet_results.html create mode 100644 datasette/templates/_suggested_facets.html diff --git a/datasette/templates/_facet_results.html b/datasette/templates/_facet_results.html new file mode 100644 index 00000000..d0cbcf77 --- /dev/null +++ b/datasette/templates/_facet_results.html @@ -0,0 +1,28 @@ +
+ {% for facet_info in sorted_facet_results %} +
+

+ {{ facet_info.name }}{% if facet_info.type != "column" %} ({{ facet_info.type }}){% endif %} + {% if facet_info.truncated %}>{% endif %}{{ facet_info.results|length }} + + {% if facet_info.hideable %} + + {% endif %} +

+
    + {% for facet_value in facet_info.results %} + {% if not facet_value.selected %} +
  • {{ (facet_value.label | string()) or "-" }} {{ "{:,}".format(facet_value.count) }}
  • + {% else %} +
  • {{ facet_value.label or "-" }} · {{ "{:,}".format(facet_value.count) }}
  • + {% endif %} + {% endfor %} + {% if facet_info.truncated %} +
  • {% if request.args._facet_size != "max" -%} + {% else -%}…{% endif %} +
  • + {% endif %} +
+
+ {% endfor %} +
diff --git a/datasette/templates/_suggested_facets.html b/datasette/templates/_suggested_facets.html new file mode 100644 index 00000000..ec98fb36 --- /dev/null +++ b/datasette/templates/_suggested_facets.html @@ -0,0 +1,3 @@ +

+ Suggested facets: {% for facet in suggested_facets %}{{ facet.name }}{% if facet.type %} ({{ facet.type }}){% endif %}{% if not loop.last %}, {% endif %}{% endfor %} +

diff --git a/datasette/templates/table.html b/datasette/templates/table.html index a9e88330..a86398ea 100644 --- a/datasette/templates/table.html +++ b/datasette/templates/table.html @@ -142,9 +142,7 @@ {% if suggested_facets %} -

- Suggested facets: {% for facet in suggested_facets %}{{ facet.name }}{% if facet.type %} ({{ facet.type }}){% endif %}{% if not loop.last %}, {% endif %}{% endfor %} -

+ {% include "_suggested_facets.html" %} {% endif %} {% if facets_timed_out %} @@ -152,34 +150,7 @@ {% endif %} {% if facet_results %} -
- {% for facet_info in sorted_facet_results %} -
-

- {{ facet_info.name }}{% if facet_info.type != "column" %} ({{ facet_info.type }}){% endif %} - {% if facet_info.truncated %}>{% endif %}{{ facet_info.results|length }} - - {% if facet_info.hideable %} - - {% endif %} -

-
    - {% for facet_value in facet_info.results %} - {% if not facet_value.selected %} -
  • {{ (facet_value.label | string()) or "-" }} {{ "{:,}".format(facet_value.count) }}
  • - {% else %} -
  • {{ facet_value.label or "-" }} · {{ "{:,}".format(facet_value.count) }}
  • - {% endif %} - {% endfor %} - {% if facet_info.truncated %} -
  • {% if request.args._facet_size != "max" -%} - {% else -%}…{% endif %} -
  • - {% endif %} -
-
- {% endfor %} -
+ {% include "_facet_results.html" %} {% endif %} {% include custom_table_templates %} From 9f1eb0d4eac483b953392157bd9fd6cc4df37de7 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 28 Jun 2022 10:40:24 -0700 Subject: [PATCH 089/952] Bump black from 22.1.0 to 22.6.0 (#1763) Bumps [black](https://github.com/psf/black) from 22.1.0 to 22.6.0. - [Release notes](https://github.com/psf/black/releases) - [Changelog](https://github.com/psf/black/blob/main/CHANGES.md) - [Commits](https://github.com/psf/black/compare/22.1.0...22.6.0) --- updated-dependencies: - dependency-name: black dependency-type: direct:development update-type: version-update:semver-minor ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- setup.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/setup.py b/setup.py index d3fcdbd1..29cb77bf 100644 --- a/setup.py +++ b/setup.py @@ -76,7 +76,7 @@ setup( "pytest-xdist>=2.2.1,<2.6", "pytest-asyncio>=0.17,<0.19", "beautifulsoup4>=4.8.1,<4.12.0", - "black==22.1.0", + "black==22.6.0", "blacken-docs==1.12.1", "pytest-timeout>=1.4.2,<2.2", "trustme>=0.7,<0.10", From 6373bb341457e5becfd5b67792ac2c8b9ed7c384 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 7 Jul 2022 09:30:49 -0700 Subject: [PATCH 090/952] Expose current SQLite row to render_cell hook, closes #1300 --- datasette/hookspecs.py | 2 +- datasette/views/database.py | 1 + datasette/views/table.py | 1 + docs/plugin_hooks.rst | 9 ++++++--- tests/plugins/my_plugin.py | 3 ++- tests/test_plugins.py | 5 +++-- 6 files changed, 14 insertions(+), 7 deletions(-) diff --git a/datasette/hookspecs.py b/datasette/hookspecs.py index 8f4fecab..c84db0a3 100644 --- a/datasette/hookspecs.py +++ b/datasette/hookspecs.py @@ -60,7 +60,7 @@ def publish_subcommand(publish): @hookspec -def render_cell(value, column, table, database, datasette): +def render_cell(row, value, column, table, database, datasette): """Customize rendering of HTML table cell values""" diff --git a/datasette/views/database.py b/datasette/views/database.py index bc08ba05..42058752 100644 --- a/datasette/views/database.py +++ b/datasette/views/database.py @@ -375,6 +375,7 @@ class QueryView(DataView): # pylint: disable=no-member plugin_display_value = None for candidate in pm.hook.render_cell( + row=row, value=value, column=column, table=None, diff --git a/datasette/views/table.py b/datasette/views/table.py index 23289b29..cd4be823 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -895,6 +895,7 @@ async def display_columns_and_rows( # pylint: disable=no-member plugin_display_value = None for candidate in pm.hook.render_cell( + row=row, value=value, column=column, table=table_name, diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index 7d10fe37..f5c3ee83 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -373,12 +373,15 @@ Examples: `datasette-publish-fly Date: Sat, 9 Jul 2022 10:25:37 -0700 Subject: [PATCH 091/952] More than 90 plugins now --- docs/writing_plugins.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/writing_plugins.rst b/docs/writing_plugins.rst index 9aee70f6..01ee8c90 100644 --- a/docs/writing_plugins.rst +++ b/docs/writing_plugins.rst @@ -5,7 +5,7 @@ Writing plugins You can write one-off plugins that apply to just one Datasette instance, or you can write plugins which can be installed using ``pip`` and can be shipped to the Python Package 
Index (`PyPI `__) for other people to install. -Want to start by looking at an example? The `Datasette plugins directory `__ lists more than 50 open source plugins with code you can explore. The :ref:`plugin hooks ` page includes links to example plugins for each of the documented hooks. +Want to start by looking at an example? The `Datasette plugins directory `__ lists more than 90 open source plugins with code you can explore. The :ref:`plugin hooks ` page includes links to example plugins for each of the documented hooks. .. _writing_plugins_one_off: From 5d76c1f81b2d978f48b85c70d041a2142cf8ee26 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 14 Jul 2022 15:03:33 -0700 Subject: [PATCH 092/952] Discord badge Refs https://github.com/simonw/datasette.io/issues/112 --- README.md | 1 + docs/index.rst | 2 ++ 2 files changed, 3 insertions(+) diff --git a/README.md b/README.md index 557d9290..c57ee604 100644 --- a/README.md +++ b/README.md @@ -7,6 +7,7 @@ [![Documentation Status](https://readthedocs.org/projects/datasette/badge/?version=latest)](https://docs.datasette.io/en/latest/?badge=latest) [![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/datasette/blob/main/LICENSE) [![docker: datasette](https://img.shields.io/badge/docker-datasette-blue)](https://hub.docker.com/r/datasetteproject/datasette) +[![discord](https://img.shields.io/discord/823971286308356157?label=Discord)](https://discord.gg/ktd74dm5mw) *An open source multi-tool for exploring and publishing data* diff --git a/docs/index.rst b/docs/index.rst index a2888822..62ed70f8 100644 --- a/docs/index.rst +++ b/docs/index.rst @@ -16,6 +16,8 @@ datasette| :target: https://github.com/simonw/datasette/blob/main/LICENSE .. |docker: datasette| image:: https://img.shields.io/badge/docker-datasette-blue :target: https://hub.docker.com/r/datasetteproject/datasette +.. |discord| image:: https://img.shields.io/discord/823971286308356157?label=Discord + :target: https://discord.gg/ktd74dm5mw *An open source multi-tool for exploring and publishing data* From c133545fe9c7ac2d509e55bf4bf6164bfbe892ad Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 14 Jul 2022 15:04:38 -0700 Subject: [PATCH 093/952] Make discord badge lowercase Refs https://github.com/simonw/datasette.io/issues/112 --- README.md | 2 +- docs/index.rst | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index c57ee604..032180aa 100644 --- a/README.md +++ b/README.md @@ -7,7 +7,7 @@ [![Documentation Status](https://readthedocs.org/projects/datasette/badge/?version=latest)](https://docs.datasette.io/en/latest/?badge=latest) [![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/datasette/blob/main/LICENSE) [![docker: datasette](https://img.shields.io/badge/docker-datasette-blue)](https://hub.docker.com/r/datasetteproject/datasette) -[![discord](https://img.shields.io/discord/823971286308356157?label=Discord)](https://discord.gg/ktd74dm5mw) +[![discord](https://img.shields.io/discord/823971286308356157?label=discord)](https://discord.gg/ktd74dm5mw) *An open source multi-tool for exploring and publishing data* diff --git a/docs/index.rst b/docs/index.rst index 62ed70f8..051898b1 100644 --- a/docs/index.rst +++ b/docs/index.rst @@ -16,7 +16,7 @@ datasette| :target: https://github.com/simonw/datasette/blob/main/LICENSE .. 
|docker: datasette| image:: https://img.shields.io/badge/docker-datasette-blue :target: https://hub.docker.com/r/datasetteproject/datasette -.. |discord| image:: https://img.shields.io/discord/823971286308356157?label=Discord +.. |discord| image:: https://img.shields.io/discord/823971286308356157?label=discord :target: https://discord.gg/ktd74dm5mw *An open source multi-tool for exploring and publishing data* From 950cc7677f65aa2543067b3bbfc2b6acb98b62c8 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 14 Jul 2022 15:18:28 -0700 Subject: [PATCH 094/952] Fix missing Discord image Refs https://github.com/simonw/datasette.io/issues/112 --- docs/index.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/index.rst b/docs/index.rst index 051898b1..efe196b3 100644 --- a/docs/index.rst +++ b/docs/index.rst @@ -2,7 +2,7 @@ Datasette ========= |PyPI| |Changelog| |Python 3.x| |Tests| |License| |docker: -datasette| +datasette| |discord| .. |PyPI| image:: https://img.shields.io/pypi/v/datasette.svg :target: https://pypi.org/project/datasette/ From 8188f55efc0fcca1be692b0d0c875f2d1ee99f17 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 17 Jul 2022 15:24:16 -0700 Subject: [PATCH 095/952] Rename handle_500 to handle_exception, refs #1770 --- datasette/app.py | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index f43700d4..43e60dbc 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -1275,7 +1275,7 @@ class DatasetteRouter: except NotFound as exception: return await self.handle_404(request, send, exception) except Exception as exception: - return await self.handle_500(request, send, exception) + return await self.handle_exception(request, send, exception) async def handle_404(self, request, send, exception=None): # If path contains % encoding, redirect to tilde encoding @@ -1354,7 +1354,7 @@ class DatasetteRouter: view_name="page", ) except NotFoundExplicit as e: - await self.handle_500(request, send, e) + await self.handle_exception(request, send, e) return # Pull content-type out into separate parameter content_type = "text/html; charset=utf-8" @@ -1369,9 +1369,9 @@ class DatasetteRouter: content_type=content_type, ) else: - await self.handle_500(request, send, exception or NotFound("404")) + await self.handle_exception(request, send, exception or NotFound("404")) - async def handle_500(self, request, send, exception): + async def handle_exception(self, request, send, exception): if self.ds.pdb: import pdb From c09c53f3455a7b9574cf7695478f2b87d20897db Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 17 Jul 2022 16:24:39 -0700 Subject: [PATCH 096/952] New handle_exception plugin hook, refs #1770 Also refs: - https://github.com/simonw/datasette-sentry/issues/1 - https://github.com/simonw/datasette-show-errors/issues/2 --- datasette/app.py | 97 +++++++++-------------------------- datasette/forbidden.py | 20 ++++++++ datasette/handle_exception.py | 74 ++++++++++++++++++++++++++ datasette/hookspecs.py | 5 ++ datasette/plugins.py | 2 + docs/plugin_hooks.rst | 78 ++++++++++++++++++++-------- tests/fixtures.py | 1 + tests/plugins/my_plugin_2.py | 18 +++++++ tests/test_permissions.py | 1 + tests/test_plugins.py | 14 +++++ 10 files changed, 215 insertions(+), 95 deletions(-) create mode 100644 datasette/forbidden.py create mode 100644 datasette/handle_exception.py diff --git a/datasette/app.py b/datasette/app.py index 43e60dbc..edd05bb3 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -16,7 +16,6 
@@ import re import secrets import sys import threading -import traceback import urllib.parse from concurrent import futures from pathlib import Path @@ -27,7 +26,7 @@ from jinja2 import ChoiceLoader, Environment, FileSystemLoader, PrefixLoader from jinja2.environment import Template from jinja2.exceptions import TemplateNotFound -from .views.base import DatasetteError, ureg +from .views.base import ureg from .views.database import DatabaseDownload, DatabaseView from .views.index import IndexView from .views.special import ( @@ -49,7 +48,6 @@ from .utils import ( PrefixedUrlString, SPATIALITE_FUNCTIONS, StartupError, - add_cors_headers, async_call_with_supported_arguments, await_me_maybe, call_with_supported_arguments, @@ -87,11 +85,6 @@ from .tracer import AsgiTracer from .plugins import pm, DEFAULT_PLUGINS, get_plugins from .version import __version__ -try: - import rich -except ImportError: - rich = None - app_root = Path(__file__).parent.parent # https://github.com/simonw/datasette/issues/283#issuecomment-781591015 @@ -1274,6 +1267,16 @@ class DatasetteRouter: return except NotFound as exception: return await self.handle_404(request, send, exception) + except Forbidden as exception: + # Try the forbidden() plugin hook + for custom_response in pm.hook.forbidden( + datasette=self.ds, request=request, message=exception.args[0] + ): + custom_response = await await_me_maybe(custom_response) + assert ( + custom_response + ), "Default forbidden() hook should have been called" + return await custom_response.asgi_send(send) except Exception as exception: return await self.handle_exception(request, send, exception) @@ -1372,72 +1375,20 @@ class DatasetteRouter: await self.handle_exception(request, send, exception or NotFound("404")) async def handle_exception(self, request, send, exception): - if self.ds.pdb: - import pdb + responses = [] + for hook in pm.hook.handle_exception( + datasette=self.ds, + request=request, + exception=exception, + ): + response = await await_me_maybe(hook) + if response is not None: + responses.append(response) - pdb.post_mortem(exception.__traceback__) - - if rich is not None: - rich.get_console().print_exception(show_locals=True) - - title = None - if isinstance(exception, Forbidden): - status = 403 - info = {} - message = exception.args[0] - # Try the forbidden() plugin hook - for custom_response in pm.hook.forbidden( - datasette=self.ds, request=request, message=message - ): - custom_response = await await_me_maybe(custom_response) - if custom_response is not None: - await custom_response.asgi_send(send) - return - elif isinstance(exception, Base400): - status = exception.status - info = {} - message = exception.args[0] - elif isinstance(exception, DatasetteError): - status = exception.status - info = exception.error_dict - message = exception.message - if exception.message_is_html: - message = Markup(message) - title = exception.title - else: - status = 500 - info = {} - message = str(exception) - traceback.print_exc() - templates = [f"{status}.html", "error.html"] - info.update( - { - "ok": False, - "error": message, - "status": status, - "title": title, - } - ) - headers = {} - if self.ds.cors: - add_cors_headers(headers) - if request.path.split("?")[0].endswith(".json"): - await asgi_send_json(send, info, status=status, headers=headers) - else: - template = self.ds.jinja_env.select_template(templates) - await asgi_send_html( - send, - await template.render_async( - dict( - info, - urls=self.ds.urls, - app_css_hash=self.ds.app_css_hash(), - menu_links=lambda: 
[], - ) - ), - status=status, - headers=headers, - ) + assert responses, "Default exception handler should have returned something" + # Even if there are multiple responses use just the first one + response = responses[0] + await response.asgi_send(send) _cleaner_task_str_re = re.compile(r"\S*site-packages/") diff --git a/datasette/forbidden.py b/datasette/forbidden.py new file mode 100644 index 00000000..156a44d4 --- /dev/null +++ b/datasette/forbidden.py @@ -0,0 +1,20 @@ +from os import stat +from datasette import hookimpl, Response + + +@hookimpl(trylast=True) +def forbidden(datasette, request, message): + async def inner(): + return Response.html( + await datasette.render_template( + "error.html", + { + "title": "Forbidden", + "error": message, + }, + request=request, + ), + status=403, + ) + + return inner diff --git a/datasette/handle_exception.py b/datasette/handle_exception.py new file mode 100644 index 00000000..8b7e83e3 --- /dev/null +++ b/datasette/handle_exception.py @@ -0,0 +1,74 @@ +from datasette import hookimpl, Response +from .utils import await_me_maybe, add_cors_headers +from .utils.asgi import ( + Base400, + Forbidden, +) +from .views.base import DatasetteError +from markupsafe import Markup +import pdb +import traceback +from .plugins import pm + +try: + import rich +except ImportError: + rich = None + + +@hookimpl(trylast=True) +def handle_exception(datasette, request, exception): + async def inner(): + if datasette.pdb: + pdb.post_mortem(exception.__traceback__) + + if rich is not None: + rich.get_console().print_exception(show_locals=True) + + title = None + if isinstance(exception, Base400): + status = exception.status + info = {} + message = exception.args[0] + elif isinstance(exception, DatasetteError): + status = exception.status + info = exception.error_dict + message = exception.message + if exception.message_is_html: + message = Markup(message) + title = exception.title + else: + status = 500 + info = {} + message = str(exception) + traceback.print_exc() + templates = [f"{status}.html", "error.html"] + info.update( + { + "ok": False, + "error": message, + "status": status, + "title": title, + } + ) + headers = {} + if datasette.cors: + add_cors_headers(headers) + if request.path.split("?")[0].endswith(".json"): + return Response.json(info, status=status, headers=headers) + else: + template = datasette.jinja_env.select_template(templates) + return Response.html( + await template.render_async( + dict( + info, + urls=datasette.urls, + app_css_hash=datasette.app_css_hash(), + menu_links=lambda: [], + ) + ), + status=status, + headers=headers, + ) + + return inner diff --git a/datasette/hookspecs.py b/datasette/hookspecs.py index c84db0a3..a5fb536f 100644 --- a/datasette/hookspecs.py +++ b/datasette/hookspecs.py @@ -138,3 +138,8 @@ def database_actions(datasette, actor, database, request): @hookspec def skip_csrf(datasette, scope): """Mechanism for skipping CSRF checks for certain requests""" + + +@hookspec +def handle_exception(datasette, request, exception): + """Handle an uncaught exception. 
Can return a Response or None.""" diff --git a/datasette/plugins.py b/datasette/plugins.py index 76b46a47..fef0c8e9 100644 --- a/datasette/plugins.py +++ b/datasette/plugins.py @@ -15,6 +15,8 @@ DEFAULT_PLUGINS = ( "datasette.default_magic_parameters", "datasette.blob_renderer", "datasette.default_menu_links", + "datasette.handle_exception", + "datasette.forbidden", ) pm = pluggy.PluginManager("datasette") diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index f5c3ee83..6020a941 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -107,8 +107,8 @@ Extra template variables that should be made available in the rendered template ``view_name`` - string The name of the view being displayed. (``index``, ``database``, ``table``, and ``row`` are the most important ones.) -``request`` - object or None - The current HTTP :ref:`internals_request`. This can be ``None`` if the request object is not available. +``request`` - :ref:`internals_request` or None + The current HTTP request. This can be ``None`` if the request object is not available. ``datasette`` - :ref:`internals_datasette` You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)`` @@ -504,7 +504,7 @@ When a request is received, the ``"render"`` callback function is called with ze The table or view, if one is being rendered. ``request`` - :ref:`internals_request` - The incoming HTTP request. + The current HTTP request. ``view_name`` - string The name of the current view being called. ``index``, ``database``, ``table``, and ``row`` are the most important ones. @@ -599,8 +599,8 @@ The optional view function arguments are as follows: ``datasette`` - :ref:`internals_datasette` You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries. -``request`` - Request object - The current HTTP :ref:`internals_request`. +``request`` - :ref:`internals_request` + The current HTTP request. ``scope`` - dictionary The incoming ASGI scope dictionary. @@ -947,8 +947,8 @@ actor_from_request(datasette, request) ``datasette`` - :ref:`internals_datasette` You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries. -``request`` - object - The current HTTP :ref:`internals_request`. +``request`` - :ref:`internals_request` + The current HTTP request. This is part of Datasette's :ref:`authentication and permissions system `. The function should attempt to authenticate an actor (either a user or an API actor of some sort) based on information in the request. @@ -1010,8 +1010,8 @@ Example: `datasette-auth-tokens `__ and then renders a custom error page: + +.. code-block:: python + + from datasette import hookimpl, Response + import sentry_sdk + + + @hookimpl + def handle_exception(datasette, exception): + sentry_sdk.capture_exception(exception) + async def inner(): + return Response.html( + await datasette.render_template("custom_error.html", request=request) + ) + return inner + .. _plugin_hook_menu_links: menu_links(datasette, actor, request) @@ -1232,8 +1266,8 @@ menu_links(datasette, actor, request) ``actor`` - dictionary or None The currently authenticated :ref:`actor `. -``request`` - object or None - The current HTTP :ref:`internals_request`. This can be ``None`` if the request object is not available. +``request`` - :ref:`internals_request` + The current HTTP request. This can be ``None`` if the request object is not available. 
This hook allows additional items to be included in the menu displayed by Datasette's top right menu icon. @@ -1281,8 +1315,8 @@ table_actions(datasette, actor, database, table, request) ``table`` - string The name of the table. -``request`` - object - The current HTTP :ref:`internals_request`. This can be ``None`` if the request object is not available. +``request`` - :ref:`internals_request` + The current HTTP request. This can be ``None`` if the request object is not available. This hook allows table actions to be displayed in a menu accessed via an action icon at the top of the table page. It should return a list of ``{"href": "...", "label": "..."}`` menu items. @@ -1325,8 +1359,8 @@ database_actions(datasette, actor, database, request) ``database`` - string The name of the database. -``request`` - object - The current HTTP :ref:`internals_request`. +``request`` - :ref:`internals_request` + The current HTTP request. This hook is similar to :ref:`plugin_hook_table_actions` but populates an actions menu on the database page. diff --git a/tests/fixtures.py b/tests/fixtures.py index e0e4ec7b..c145ac78 100644 --- a/tests/fixtures.py +++ b/tests/fixtures.py @@ -68,6 +68,7 @@ EXPECTED_PLUGINS = [ "canned_queries", "extra_js_urls", "extra_template_vars", + "handle_exception", "menu_links", "permission_allowed", "register_routes", diff --git a/tests/plugins/my_plugin_2.py b/tests/plugins/my_plugin_2.py index f5ce36b3..4df02343 100644 --- a/tests/plugins/my_plugin_2.py +++ b/tests/plugins/my_plugin_2.py @@ -185,3 +185,21 @@ def register_routes(datasette): # Also serves to demonstrate over-ride of default paths: (r"/(?P[^/]+)/(?P[^/]+?$)", new_table), ] + + +@hookimpl +def handle_exception(datasette, request, exception): + datasette._exception_hook_fired = (request, exception) + if request.args.get("_custom_error"): + return Response.text("_custom_error") + elif request.args.get("_custom_error_async"): + + async def inner(): + return Response.text("_custom_error_async") + + return inner + + +@hookimpl(specname="register_routes") +def register_triger_error(): + return ((r"/trigger-error", lambda: 1 / 0),) diff --git a/tests/test_permissions.py b/tests/test_permissions.py index f4169dbe..2a519e76 100644 --- a/tests/test_permissions.py +++ b/tests/test_permissions.py @@ -332,6 +332,7 @@ def test_permissions_debug(app_client): assert checks == [ {"action": "permissions-debug", "result": True, "used_default": False}, {"action": "view-instance", "result": None, "used_default": True}, + {"action": "debug-menu", "result": False, "used_default": True}, {"action": "permissions-debug", "result": False, "used_default": True}, {"action": "view-instance", "result": None, "used_default": True}, ] diff --git a/tests/test_plugins.py b/tests/test_plugins.py index 4a7ad7c6..948a40b8 100644 --- a/tests/test_plugins.py +++ b/tests/test_plugins.py @@ -824,6 +824,20 @@ def test_hook_forbidden(restore_working_directory): assert "view-database" == client.ds._last_forbidden_message +def test_hook_handle_exception(app_client): + app_client.get("/trigger-error?x=123") + assert hasattr(app_client.ds, "_exception_hook_fired") + request, exception = app_client.ds._exception_hook_fired + assert request.url == "http://localhost/trigger-error?x=123" + assert isinstance(exception, ZeroDivisionError) + + +@pytest.mark.parametrize("param", ("_custom_error", "_custom_error_async")) +def test_hook_handle_exception_custom_response(app_client, param): + response = app_client.get("/trigger-error?{}=1".format(param)) + assert 
response.text == param + + def test_hook_menu_links(app_client): def get_menu_links(html): soup = Soup(html, "html.parser") From 58fd1e33ec7ac5ed85431d5c86d60600cd5280fb Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 17 Jul 2022 16:30:58 -0700 Subject: [PATCH 097/952] Hint that you can render templates for these hooks, refs #1770 --- docs/plugin_hooks.rst | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index 6020a941..b4869606 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -1176,7 +1176,7 @@ forbidden(datasette, request, message) -------------------------------------- ``datasette`` - :ref:`internals_datasette` - You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries. + You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to render templates or execute SQL queries. ``request`` - :ref:`internals_request` The current HTTP request. @@ -1224,7 +1224,7 @@ handle_exception(datasette, request, exception) ----------------------------------------------- ``datasette`` - :ref:`internals_datasette` - You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries. + You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to render templates or execute SQL queries. ``request`` - :ref:`internals_request` The current HTTP request. From e543a095cc4c1ca895b082cfd1263ca25203a7c0 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 17 Jul 2022 17:57:41 -0700 Subject: [PATCH 098/952] Updated default plugins in docs, refs #1770 --- docs/plugins.rst | 18 ++++++++++++++++++ 1 file changed, 18 insertions(+) diff --git a/docs/plugins.rst b/docs/plugins.rst index f2ed02f7..29078054 100644 --- a/docs/plugins.rst +++ b/docs/plugins.rst @@ -172,6 +172,24 @@ If you run ``datasette plugins --all`` it will include default plugins that ship "filters_from_request" ] }, + { + "name": "datasette.forbidden", + "static": false, + "templates": false, + "version": null, + "hooks": [ + "forbidden" + ] + }, + { + "name": "datasette.handle_exception", + "static": false, + "templates": false, + "version": null, + "hooks": [ + "handle_exception" + ] + }, { "name": "datasette.publish.cloudrun", "static": false, From 6d5e1955470424cf4faf5d35788d328ebdd6d463 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 17 Jul 2022 17:59:20 -0700 Subject: [PATCH 099/952] Release 0.62a1 Refs #1300, #1739, #1744, #1746, #1748, #1759, #1770 --- datasette/version.py | 2 +- docs/changelog.rst | 14 ++++++++++++++ 2 files changed, 15 insertions(+), 1 deletion(-) diff --git a/datasette/version.py b/datasette/version.py index cf18c441..86f4cf7e 100644 --- a/datasette/version.py +++ b/datasette/version.py @@ -1,2 +1,2 @@ -__version__ = "0.62a0" +__version__ = "0.62a1" __version_info__ = tuple(__version__.split(".")) diff --git a/docs/changelog.rst b/docs/changelog.rst index 74814fcb..3f105811 100644 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -4,6 +4,20 @@ Changelog ========= +.. _v0_62a1: + +0.62a1 (2022-07-17) +------------------- + +- New plugin hook: :ref:`handle_exception() `, for custom handling of exceptions caught by Datasette. (:issue:`1770`) +- The :ref:`render_cell() ` plugin hook is now also passed a ``row`` argument, representing the ``sqlite3.Row`` object that is being rendered. 
(:issue:`1300`) +- New ``--nolock`` option for ignoring file locks when opening read-only databases. (:issue:`1744`) +- Documentation now uses the `Furo `__ Sphinx theme. (:issue:`1746`) +- Datasette now has a `Discord community `__. +- Database file downloads now implement conditional GET using ETags. (:issue:`1739`) +- Examples in the documentation now include a copy-to-clipboard button. (:issue:`1748`) +- HTML for facet results and suggested results has been extracted out into new templates ``_facet_results.html`` and ``_suggested_facets.html``. Thanks, M. Nasimul Haque. (`#1759 `__) + .. _v0_62a0: 0.62a0 (2022-05-02) From ed1ebc0f1d4153e3e0934f2af19f82e5fdf137d3 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 17 Jul 2022 18:03:33 -0700 Subject: [PATCH 100/952] Run blacken-docs, refs #1770 --- docs/plugin_hooks.rst | 10 ++++++++-- 1 file changed, 8 insertions(+), 2 deletions(-) diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index b4869606..aec1df56 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -1213,7 +1213,9 @@ The function can alternatively return an awaitable function if it needs to make def forbidden(datasette): async def inner(): return Response.html( - await datasette.render_template("render_message.html", request=request) + await datasette.render_template( + "render_message.html", request=request + ) ) return inner @@ -1249,10 +1251,14 @@ This example logs an error to `Sentry `__ and then renders a @hookimpl def handle_exception(datasette, exception): sentry_sdk.capture_exception(exception) + async def inner(): return Response.html( - await datasette.render_template("custom_error.html", request=request) + await datasette.render_template( + "custom_error.html", request=request + ) ) + return inner .. _plugin_hook_menu_links: From ea6161f8475d9fa41c4879049511c58f692cce04 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Sun, 17 Jul 2022 18:06:26 -0700 Subject: [PATCH 101/952] Bump furo from 2022.4.7 to 2022.6.21 (#1760) Bumps [furo](https://github.com/pradyunsg/furo) from 2022.4.7 to 2022.6.21. - [Release notes](https://github.com/pradyunsg/furo/releases) - [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md) - [Commits](https://github.com/pradyunsg/furo/compare/2022.04.07...2022.06.21) --- updated-dependencies: - dependency-name: furo dependency-type: direct:development update-type: version-update:semver-minor ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- setup.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/setup.py b/setup.py index 29cb77bf..558b5c87 100644 --- a/setup.py +++ b/setup.py @@ -65,7 +65,7 @@ setup( setup_requires=["pytest-runner"], extras_require={ "docs": [ - "furo==2022.4.7", + "furo==2022.6.21", "sphinx-autobuild", "codespell", "blacken-docs", From 22354c48ce4d514d7a1b321e5651c7f1340e3f5e Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Sun, 17 Jul 2022 18:06:37 -0700 Subject: [PATCH 102/952] Update pytest-asyncio requirement from <0.19,>=0.17 to >=0.17,<0.20 (#1769) Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version. 
- [Release notes](https://github.com/pytest-dev/pytest-asyncio/releases) - [Changelog](https://github.com/pytest-dev/pytest-asyncio/blob/master/CHANGELOG.rst) - [Commits](https://github.com/pytest-dev/pytest-asyncio/compare/v0.17.0...v0.19.0) --- updated-dependencies: - dependency-name: pytest-asyncio dependency-type: direct:development ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- setup.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/setup.py b/setup.py index 558b5c87..a1c51d0b 100644 --- a/setup.py +++ b/setup.py @@ -74,7 +74,7 @@ setup( "test": [ "pytest>=5.2.2,<7.2.0", "pytest-xdist>=2.2.1,<2.6", - "pytest-asyncio>=0.17,<0.19", + "pytest-asyncio>=0.17,<0.20", "beautifulsoup4>=4.8.1,<4.12.0", "black==22.6.0", "blacken-docs==1.12.1", From 01369176b0a8943ab45292ffc6f9c929b80a00e8 Mon Sep 17 00:00:00 2001 From: Chris Amico Date: Sun, 17 Jul 2022 21:12:45 -0400 Subject: [PATCH 103/952] Keep track of datasette.config_dir (#1766) Thanks, @eyeseast - closes #1764 --- datasette/app.py | 1 + tests/test_config_dir.py | 9 +++++++++ 2 files changed, 10 insertions(+) diff --git a/datasette/app.py b/datasette/app.py index edd05bb3..1a9afc10 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -211,6 +211,7 @@ class Datasette: assert config_dir is None or isinstance( config_dir, Path ), "config_dir= should be a pathlib.Path" + self.config_dir = config_dir self.pdb = pdb self._secret = secret or secrets.token_hex(32) self.files = tuple(files or []) + tuple(immutables or []) diff --git a/tests/test_config_dir.py b/tests/test_config_dir.py index 015c6ace..fe927c42 100644 --- a/tests/test_config_dir.py +++ b/tests/test_config_dir.py @@ -1,4 +1,5 @@ import json +import pathlib import pytest from datasette.app import Datasette @@ -150,3 +151,11 @@ def test_metadata_yaml(tmp_path_factory, filename): response = client.get("/-/metadata.json") assert 200 == response.status assert {"title": "Title from metadata"} == response.json + + +def test_store_config_dir(config_dir_client): + ds = config_dir_client.ds + + assert hasattr(ds, "config_dir") + assert ds.config_dir is not None + assert isinstance(ds.config_dir, pathlib.Path) From 7af67b54b7d9bca43e948510fc62f6db2b748fa8 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 18 Jul 2022 14:31:09 -0700 Subject: [PATCH 104/952] How to register temporary plugins in tests, closes #903 --- docs/testing_plugins.rst | 36 ++++++++++++++++++++++++++++++++++++ 1 file changed, 36 insertions(+) diff --git a/docs/testing_plugins.rst b/docs/testing_plugins.rst index 41046bfb..d02003a9 100644 --- a/docs/testing_plugins.rst +++ b/docs/testing_plugins.rst @@ -219,3 +219,39 @@ Here's a test for that plugin that mocks the HTTPX outbound request: assert ( outbound_request.url == "https://www.example.com/" ) + +.. _testing_plugins_register_in_test: + +Registering a plugin for the duration of a test +----------------------------------------------- + +When writing tests for plugins you may find it useful to register a test plugin just for the duration of a single test. You can do this using ``pm.register()`` and ``pm.unregister()`` like this: + +.. 
code-block:: python + + from datasette import hookimpl + from datasette.app import Datasette + from datasette.plugins import pm + import pytest + + + @pytest.mark.asyncio + async def test_using_test_plugin(): + class TestPlugin: + __name__ = "TestPlugin" + + # Use hookimpl and method names to register hooks + @hookimpl + def register_routes(self): + return [ + (r"^/error$", lambda: 1/0), + ] + + pm.register(TestPlugin(), name="undo") + try: + # The test implementation goes here + datasette = Datasette() + response = await datasette.client.get("/error") + assert response.status_code == 500 + finally: + pm.unregister(name="undo") From bca2d95d0228f80a108e13408f8e72b2c06c2c7b Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 2 Aug 2022 16:38:02 -0700 Subject: [PATCH 105/952] Configure readthedocs/readthedocs-preview --- .github/workflows/documentation-links.yml | 16 ++++++++++++++++ 1 file changed, 16 insertions(+) create mode 100644 .github/workflows/documentation-links.yml diff --git a/.github/workflows/documentation-links.yml b/.github/workflows/documentation-links.yml new file mode 100644 index 00000000..e7062a46 --- /dev/null +++ b/.github/workflows/documentation-links.yml @@ -0,0 +1,16 @@ +name: Read the Docs Pull Request Preview +on: + pull_request_target: + types: + - opened + +permissions: + pull-requests: write + +jobs: + documentation-links: + runs-on: ubuntu-latest + steps: + - uses: readthedocs/readthedocs-preview@main + with: + project-slug: "datasette" From 8cfc72336878dd846d149658e99cc598e835b661 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 9 Aug 2022 11:21:53 -0700 Subject: [PATCH 106/952] Ran blacken-docs --- docs/testing_plugins.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/testing_plugins.rst b/docs/testing_plugins.rst index d02003a9..992b4b0e 100644 --- a/docs/testing_plugins.rst +++ b/docs/testing_plugins.rst @@ -244,7 +244,7 @@ When writing tests for plugins you may find it useful to register a test plugin @hookimpl def register_routes(self): return [ - (r"^/error$", lambda: 1/0), + (r"^/error$", lambda: 1 / 0), ] pm.register(TestPlugin(), name="undo") From 05d9c682689a0f1d23cbb502e027364ab3363910 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 14 Aug 2022 08:16:53 -0700 Subject: [PATCH 107/952] Promote Discord more in the README --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 032180aa..7ebbca57 100644 --- a/README.md +++ b/README.md @@ -22,7 +22,7 @@ Datasette is aimed at data journalists, museum curators, archivists, local gover * Comprehensive documentation: https://docs.datasette.io/ * Examples: https://datasette.io/examples * Live demo of current main: https://latest.datasette.io/ -* Support questions, feedback? Join our [GitHub Discussions forum](https://github.com/simonw/datasette/discussions) +* Questions, feedback or want to talk about the project? Join our [Discord](https://discord.gg/ktd74dm5mw) Want to stay up-to-date with the project? Subscribe to the [Datasette newsletter](https://datasette.substack.com/) for tips, tricks and news on what's new in the Datasette ecosystem. 
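The register/unregister pattern shown in the testing documentation above also works well as a reusable pytest fixture. A minimal sketch, not part of these patches (the plugin class, fixture name and route are illustrative):

.. code-block:: python

    from datasette import hookimpl
    from datasette.plugins import pm
    import pytest


    class ErrorRoutePlugin:
        # Minimal test plugin: registers a route that always raises,
        # useful for exercising error handling.
        __name__ = "ErrorRoutePlugin"

        @hookimpl
        def register_routes(self):
            return [
                (r"^/error$", lambda: 1 / 0),
            ]


    @pytest.fixture
    def error_route_plugin():
        # Registered for the duration of a single test, then cleaned up
        plugin = ErrorRoutePlugin()
        pm.register(plugin, name="error-route")
        try:
            yield plugin
        finally:
            pm.unregister(name="error-route")
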
From db00c00f6397287749331e8042fe998ee7f3b919 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 14 Aug 2022 08:19:30 -0700 Subject: [PATCH 108/952] Promote Datasette Lite in the README, refs #1781 --- README.md | 6 +++++- 1 file changed, 5 insertions(+), 1 deletion(-) diff --git a/README.md b/README.md index 7ebbca57..1af20129 100644 --- a/README.md +++ b/README.md @@ -21,7 +21,7 @@ Datasette is aimed at data journalists, museum curators, archivists, local gover * Latest [Datasette News](https://datasette.io/news) * Comprehensive documentation: https://docs.datasette.io/ * Examples: https://datasette.io/examples -* Live demo of current main: https://latest.datasette.io/ +* Live demo of current `main` branch: https://latest.datasette.io/ * Questions, feedback or want to talk about the project? Join our [Discord](https://discord.gg/ktd74dm5mw) Want to stay up-to-date with the project? Subscribe to the [Datasette newsletter](https://datasette.substack.com/) for tips, tricks and news on what's new in the Datasette ecosystem. @@ -85,3 +85,7 @@ Or: This will create a docker image containing both the datasette application and the specified SQLite database files. It will then deploy that image to Heroku or Cloud Run and give you a URL to access the resulting website and API. See [Publishing data](https://docs.datasette.io/en/stable/publish.html) in the documentation for more details. + +## Datasette Lite + +[Datasette Lite](https://lite.datasette.io/) is Datasette packaged using WebAssembly so that it runs entirely in your browser, no Python web application server required. Read more about that in the [Datasette Lite documentation](https://github.com/simonw/datasette-lite/blob/main/README.md). From 8eb699de7becdefc6d72555d9fb17c9f06235dc4 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 14 Aug 2022 08:24:39 -0700 Subject: [PATCH 109/952] Datasette Lite in Getting Started docs, closes #1781 --- docs/getting_started.rst | 11 +++++++++++ 1 file changed, 11 insertions(+) diff --git a/docs/getting_started.rst b/docs/getting_started.rst index 00b753a9..571540cf 100644 --- a/docs/getting_started.rst +++ b/docs/getting_started.rst @@ -21,6 +21,17 @@ Datasette has several `tutorials `__ to help you - `Exploring a database with Datasette `__ shows how to use the Datasette web interface to explore a new database. - `Learn SQL with Datasette `__ introduces SQL, and shows how to use that query language to ask questions of your data. +.. _getting_started_datasette_lite: + +Datasette in your browser with Datasette Lite +--------------------------------------------- + +`Datasette Lite `__ is Datasette packaged using WebAssembly so that it runs entirely in your browser, no Python web application server required. + +You can pass a URL to a CSV, SQLite or raw SQL file directly to Datasette Lite to explore that data in your browser. + +This `example link `__ opens Datasette Lite and loads the SQL Murder Mystery example database from `Northwestern University Knight Lab `__. + .. 
_getting_started_glitch:
 
 Try Datasette without installing anything using Glitch
From df4fd2d7ddca8956d8a51c72ce007b8c75227f32 Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Sun, 14 Aug 2022 08:44:02 -0700
Subject: [PATCH 110/952] _sort= works even if sort column not selected,
 closes #1773

---
 datasette/views/table.py | 22 +++++++++++++++++++++-
 tests/test_table_api.py | 2 ++
 2 files changed, 23 insertions(+), 1 deletion(-)

diff --git a/datasette/views/table.py b/datasette/views/table.py
index cd4be823..94d2673b 100644
--- a/datasette/views/table.py
+++ b/datasette/views/table.py
@@ -630,7 +630,27 @@ class TableView(DataView):
                 next_value = path_from_row_pks(rows[-2], pks, use_rowid)
                 # If there's a sort or sort_desc, add that value as a prefix
                 if (sort or sort_desc) and not is_view:
-                    prefix = rows[-2][sort or sort_desc]
+                    try:
+                        prefix = rows[-2][sort or sort_desc]
+                    except IndexError:
+                        # sort/sort_desc column missing from SELECT - look up value by PK instead
+                        prefix_where_clause = " and ".join(
+                            "[{}] = :pk{}".format(pk, i) for i, pk in enumerate(pks)
+                        )
+                        prefix_lookup_sql = "select [{}] from [{}] where {}".format(
+                            sort or sort_desc, table_name, prefix_where_clause
+                        )
+                        prefix = (
+                            await db.execute(
+                                prefix_lookup_sql,
+                                {
+                                    **{
+                                        "pk{}".format(i): rows[-2][pk]
+                                        for i, pk in enumerate(pks)
+                                    }
+                                },
+                            )
+                        ).single_value()
                     if isinstance(prefix, dict) and "value" in prefix:
                         prefix = prefix["value"]
                     if prefix is None:
diff --git a/tests/test_table_api.py b/tests/test_table_api.py
index 9db383c3..e56a72b5 100644
--- a/tests/test_table_api.py
+++ b/tests/test_table_api.py
@@ -288,6 +288,8 @@ def test_paginate_compound_keys_with_extra_filters(app_client):
         ),
         # text column contains '$null' - ensure it doesn't confuse pagination:
         ("_sort=text", lambda row: row["text"], "sorted by text"),
+        # Still works if sort column removed using _col=
+        ("_sort=text&_col=content", lambda row: row["text"], "sorted by text"),
     ],
 )
 def test_sortable(app_client, query_string, sort_key, human_description_en):
From 668415df9f6334bd255c22ab02018bed5bc14edd Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Sun, 14 Aug 2022 08:47:17 -0700
Subject: [PATCH 111/952] Upgrade Docker base to 3.10.6-slim-bullseye - refs
 #1768

---
 Dockerfile | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/Dockerfile b/Dockerfile
index 42f5529b..ee7ed957 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -1,4 +1,4 @@
-FROM python:3.9.7-slim-bullseye as build
+FROM python:3.10.6-slim-bullseye as build
 
 # Version of Datasette to install, e.g. 0.55
 # docker build . -t datasette --build-arg VERSION=0.55
From 080d4b3e065d78faf977c6ded6ead31aae24e2ae Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Sun, 14 Aug 2022 08:49:14 -0700
Subject: [PATCH 112/952] Switch to python:3.10.6-slim-bullseye for datasette
 publish - refs #1768

---
 datasette/utils/__init__.py | 2 +-
 demos/apache-proxy/Dockerfile | 2 +-
 docs/publish.rst | 2 +-
 tests/test_package.py | 2 +-
 tests/test_publish_cloudrun.py | 4 ++--
 5 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/datasette/utils/__init__.py b/datasette/utils/__init__.py
index 77768112..d148cc2c 100644
--- a/datasette/utils/__init__.py
+++ b/datasette/utils/__init__.py
@@ -390,7 +390,7 @@ def make_dockerfile(
             "SQLITE_EXTENSIONS"
         ] = "/usr/lib/x86_64-linux-gnu/mod_spatialite.so"
     return """
-FROM python:3.8
+FROM python:3.10.6-slim-bullseye
 COPY .
/app WORKDIR /app {apt_get_extras} diff --git a/demos/apache-proxy/Dockerfile b/demos/apache-proxy/Dockerfile index 6c921963..70b33bec 100644 --- a/demos/apache-proxy/Dockerfile +++ b/demos/apache-proxy/Dockerfile @@ -1,4 +1,4 @@ -FROM python:3.9.7-slim-bullseye +FROM python:3.10.6-slim-bullseye RUN apt-get update && \ apt-get install -y apache2 supervisor && \ diff --git a/docs/publish.rst b/docs/publish.rst index 166f2883..9c7c99cc 100644 --- a/docs/publish.rst +++ b/docs/publish.rst @@ -144,7 +144,7 @@ Here's example output for the package command:: $ datasette package parlgov.db --extra-options="--setting sql_time_limit_ms 2500" Sending build context to Docker daemon 4.459MB - Step 1/7 : FROM python:3 + Step 1/7 : FROM python:3.10.6-slim-bullseye ---> 79e1dc9af1c1 Step 2/7 : COPY . /app ---> Using cache diff --git a/tests/test_package.py b/tests/test_package.py index 02ed1775..ac15e61e 100644 --- a/tests/test_package.py +++ b/tests/test_package.py @@ -12,7 +12,7 @@ class CaptureDockerfile: EXPECTED_DOCKERFILE = """ -FROM python:3.8 +FROM python:3.10.6-slim-bullseye COPY . /app WORKDIR /app diff --git a/tests/test_publish_cloudrun.py b/tests/test_publish_cloudrun.py index 3427f4f7..60079ab3 100644 --- a/tests/test_publish_cloudrun.py +++ b/tests/test_publish_cloudrun.py @@ -223,7 +223,7 @@ def test_publish_cloudrun_plugin_secrets( ) expected = textwrap.dedent( r""" - FROM python:3.8 + FROM python:3.10.6-slim-bullseye COPY . /app WORKDIR /app @@ -290,7 +290,7 @@ def test_publish_cloudrun_apt_get_install( ) expected = textwrap.dedent( r""" - FROM python:3.8 + FROM python:3.10.6-slim-bullseye COPY . /app WORKDIR /app From 1563c22a8c65e6cff5194aa07df54d0ab8d4eecb Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 14 Aug 2022 09:13:12 -0700 Subject: [PATCH 113/952] Don't duplicate _sort_desc, refs #1738 --- datasette/views/table.py | 2 +- tests/test_table_html.py | 1 + 2 files changed, 2 insertions(+), 1 deletion(-) diff --git a/datasette/views/table.py b/datasette/views/table.py index 94d2673b..49c30c9c 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -710,7 +710,7 @@ class TableView(DataView): for key in request.args: if ( key.startswith("_") - and key not in ("_sort", "_search", "_next") + and key not in ("_sort", "_sort_desc", "_search", "_next") and "__" not in key ): for value in request.args.getlist(key): diff --git a/tests/test_table_html.py b/tests/test_table_html.py index d3cb3e17..f3808ea3 100644 --- a/tests/test_table_html.py +++ b/tests/test_table_html.py @@ -828,6 +828,7 @@ def test_other_hidden_form_fields(app_client, path, expected_hidden): [ ("/fixtures/searchable?_search=terry", []), ("/fixtures/searchable?_sort=text2", []), + ("/fixtures/searchable?_sort_desc=text2", []), ("/fixtures/searchable?_sort=text2&_where=1", [("_where", "1")]), ], ) From c1396bf86033a7bd99fa0c0431f585475391a11a Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 14 Aug 2022 09:34:31 -0700 Subject: [PATCH 114/952] Don't allow canned write queries on immutable DBs, closes #1728 --- datasette/templates/query.html | 6 ++++- datasette/views/database.py | 4 ++++ tests/test_canned_queries.py | 40 ++++++++++++++++++++++++++++++++++ 3 files changed, 49 insertions(+), 1 deletion(-) diff --git a/datasette/templates/query.html b/datasette/templates/query.html index 8c920527..cee779fc 100644 --- a/datasette/templates/query.html +++ b/datasette/templates/query.html @@ -28,6 +28,10 @@ {% block content %} +{% if canned_write and db_is_immutable %} +
    <p class="message-error">This query cannot be executed because the database is immutable.</p>
+{% endif %}
+
 <h1>{{ metadata.title or database }}{% if canned_query and not metadata.title %}: {{ canned_query }}{% endif %}{% if private %} 🔒{% endif %}</h1>
 
 {% block description_source_license %}{% include "_description_source_license.html" %}{% endblock %}
@@ -61,7 +65,7 @@
         {% if not hide_sql %}{% endif %}
         {% if canned_write %}{% endif %}
-        <input type="submit" value="Run SQL">
+        <input type="submit" value="Run SQL"{% if canned_write and db_is_immutable %} disabled{% endif %}>
         {{ show_hide_hidden }}
         {% if canned_query and edit_sql_url %}<a href="{{ edit_sql_url }}">Edit SQL</a>{% endif %}
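The template change above pairs with a view-level guard in ``datasette/views/database.py``, shown in the next diff. Both paths are exercised by defining a canned write query against a database opened with ``-i``; a minimal metadata sketch, with names borrowed from the test fixture later in this patch:

.. code-block:: python

    # Served with: datasette -i fixtures.db --metadata metadata.json
    # The query page then renders the "database is immutable" message with a
    # disabled submit button, and POST requests receive a 403 Forbidden.
    metadata = {
        "databases": {
            "fixtures": {
                "queries": {
                    "add": {
                        "sql": "insert into sortable (text) values (:text)",
                        "write": True,
                    }
                }
            }
        }
    }
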
diff --git a/datasette/views/database.py b/datasette/views/database.py index 42058752..77632b9d 100644 --- a/datasette/views/database.py +++ b/datasette/views/database.py @@ -273,6 +273,9 @@ class QueryView(DataView): # Execute query - as write or as read if write: if request.method == "POST": + # If database is immutable, return an error + if not db.is_mutable: + raise Forbidden("Database is immutable") body = await request.post_body() body = body.decode("utf-8").strip() if body.startswith("{") and body.endswith("}"): @@ -326,6 +329,7 @@ class QueryView(DataView): async def extra_template(): return { "request": request, + "db_is_immutable": not db.is_mutable, "path_with_added_args": path_with_added_args, "path_with_removed_args": path_with_removed_args, "named_parameter_values": named_parameter_values, diff --git a/tests/test_canned_queries.py b/tests/test_canned_queries.py index 5abffdcc..976aa0db 100644 --- a/tests/test_canned_queries.py +++ b/tests/test_canned_queries.py @@ -53,6 +53,26 @@ def canned_write_client(tmpdir): yield client +@pytest.fixture +def canned_write_immutable_client(): + with make_app_client( + is_immutable=True, + metadata={ + "databases": { + "fixtures": { + "queries": { + "add": { + "sql": "insert into sortable (text) values (:text)", + "write": True, + }, + } + } + } + }, + ) as client: + yield client + + def test_canned_query_with_named_parameter(app_client): response = app_client.get("/fixtures/neighborhood_search.json?text=town") assert [ @@ -373,3 +393,23 @@ def test_canned_write_custom_template(canned_write_client): response.headers["link"] == 'http://localhost/data/update_name.json; rel="alternate"; type="application/json+datasette"' ) + + +def test_canned_write_query_disabled_for_immutable_database( + canned_write_immutable_client, +): + response = canned_write_immutable_client.get("/fixtures/add") + assert response.status == 200 + assert ( + "This query cannot be executed because the database is immutable." 
+ in response.text + ) + assert '' in response.text + # Submitting form should get a forbidden error + response = canned_write_immutable_client.post( + "/fixtures/add", + {"text": "text"}, + csrftoken_from=True, + ) + assert response.status == 403 + assert "Database is immutable" in response.text From 82167105ee699c850cc106ea927de1ad09276cfe Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 14 Aug 2022 10:07:30 -0700 Subject: [PATCH 115/952] --min-instances and --max-instances Cloud Run publish options, closes #1779 --- datasette/publish/cloudrun.py | 26 +++++++++++++++++--- docs/cli-reference.rst | 2 ++ tests/test_publish_cloudrun.py | 43 ++++++++++++++++++++++++---------- 3 files changed, 56 insertions(+), 15 deletions(-) diff --git a/datasette/publish/cloudrun.py b/datasette/publish/cloudrun.py index 50b2b2fd..77274eb0 100644 --- a/datasette/publish/cloudrun.py +++ b/datasette/publish/cloudrun.py @@ -52,6 +52,16 @@ def publish_subcommand(publish): multiple=True, help="Additional packages to apt-get install", ) + @click.option( + "--max-instances", + type=int, + help="Maximum Cloud Run instances", + ) + @click.option( + "--min-instances", + type=int, + help="Minimum Cloud Run instances", + ) def cloudrun( files, metadata, @@ -79,6 +89,8 @@ def publish_subcommand(publish): cpu, timeout, apt_get_extras, + max_instances, + min_instances, ): "Publish databases to Datasette running on Cloud Run" fail_if_publish_binary_not_installed( @@ -168,12 +180,20 @@ def publish_subcommand(publish): ), shell=True, ) + extra_deploy_options = [] + for option, value in ( + ("--memory", memory), + ("--cpu", cpu), + ("--max-instances", max_instances), + ("--min-instances", min_instances), + ): + if value: + extra_deploy_options.append("{} {}".format(option, value)) check_call( - "gcloud run deploy --allow-unauthenticated --platform=managed --image {} {}{}{}".format( + "gcloud run deploy --allow-unauthenticated --platform=managed --image {} {}{}".format( image_id, service, - " --memory {}".format(memory) if memory else "", - " --cpu {}".format(cpu) if cpu else "", + " " + " ".join(extra_deploy_options) if extra_deploy_options else "", ), shell=True, ) diff --git a/docs/cli-reference.rst b/docs/cli-reference.rst index 1c1aff15..415af13c 100644 --- a/docs/cli-reference.rst +++ b/docs/cli-reference.rst @@ -251,6 +251,8 @@ datasette publish cloudrun --help --cpu [1|2|4] Number of vCPUs to allocate in Cloud Run --timeout INTEGER Build timeout in seconds --apt-get-install TEXT Additional packages to apt-get install + --max-instances INTEGER Maximum Cloud Run instances + --min-instances INTEGER Minimum Cloud Run instances --help Show this message and exit. 
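The option-assembly loop added to ``datasette/publish/cloudrun.py`` above skips any option whose value is falsy and turns the rest into ``--flag value`` pairs appended to the ``gcloud run deploy`` command. A small standalone sketch of that behavior (the values here are illustrative, not from the patch):

.. code-block:: python

    extra_deploy_options = []
    for option, value in (
        ("--memory", "2G"),
        ("--cpu", 2),
        ("--max-instances", 4),
        ("--min-instances", None),  # not supplied, so omitted
    ):
        if value:
            extra_deploy_options.append("{} {}".format(option, value))

    assert " ".join(extra_deploy_options) == "--memory 2G --cpu 2 --max-instances 4"
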
diff --git a/tests/test_publish_cloudrun.py b/tests/test_publish_cloudrun.py index 60079ab3..e64534d2 100644 --- a/tests/test_publish_cloudrun.py +++ b/tests/test_publish_cloudrun.py @@ -105,19 +105,36 @@ def test_publish_cloudrun(mock_call, mock_output, mock_which, tmp_path_factory): @mock.patch("datasette.publish.cloudrun.check_output") @mock.patch("datasette.publish.cloudrun.check_call") @pytest.mark.parametrize( - "memory,cpu,timeout,expected_gcloud_args", + "memory,cpu,timeout,min_instances,max_instances,expected_gcloud_args", [ - ["1Gi", None, None, "--memory 1Gi"], - ["2G", None, None, "--memory 2G"], - ["256Mi", None, None, "--memory 256Mi"], - ["4", None, None, None], - ["GB", None, None, None], - [None, 1, None, "--cpu 1"], - [None, 2, None, "--cpu 2"], - [None, 3, None, None], - [None, 4, None, "--cpu 4"], - ["2G", 4, None, "--memory 2G --cpu 4"], - [None, None, 1800, "--timeout 1800"], + ["1Gi", None, None, None, None, "--memory 1Gi"], + ["2G", None, None, None, None, "--memory 2G"], + ["256Mi", None, None, None, None, "--memory 256Mi"], + [ + "4", + None, + None, + None, + None, + None, + ], + [ + "GB", + None, + None, + None, + None, + None, + ], + [None, 1, None, None, None, "--cpu 1"], + [None, 2, None, None, None, "--cpu 2"], + [None, 3, None, None, None, None], + [None, 4, None, None, None, "--cpu 4"], + ["2G", 4, None, None, None, "--memory 2G --cpu 4"], + [None, None, 1800, None, None, "--timeout 1800"], + [None, None, None, 2, None, "--min-instances 2"], + [None, None, None, 2, 4, "--min-instances 2 --max-instances 4"], + [None, 2, None, None, 4, "--cpu 2 --max-instances 4"], ], ) def test_publish_cloudrun_memory_cpu( @@ -127,6 +144,8 @@ def test_publish_cloudrun_memory_cpu( memory, cpu, timeout, + min_instances, + max_instances, expected_gcloud_args, tmp_path_factory, ): From 5e6c5c9e3191a80f17a91c5205d9d69efdebb73f Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 14 Aug 2022 10:18:47 -0700 Subject: [PATCH 116/952] Document datasette.config_dir, refs #1766 --- docs/internals.rst | 1 + 1 file changed, 1 insertion(+) diff --git a/docs/internals.rst b/docs/internals.rst index da135282..20797e98 100644 --- a/docs/internals.rst +++ b/docs/internals.rst @@ -260,6 +260,7 @@ Constructor parameters include: - ``files=[...]`` - a list of database files to open - ``immutables=[...]`` - a list of database files to open in immutable mode - ``metadata={...}`` - a dictionary of :ref:`metadata` +- ``config_dir=...`` - the :ref:`configuration directory ` to use, stored in ``datasette.config_dir`` .. _datasette_databases: From 815162cf029fab9f1c9308c1d6ecdba7ee369ebe Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 14 Aug 2022 10:32:42 -0700 Subject: [PATCH 117/952] Release 0.62 Refs #903, #1300, #1683, #1701, #1712, #1717, #1718, #1728, #1733, #1738, #1739, #1744, #1746, #1748, #1759, #1766, #1768, #1770, #1773, #1779 Closes #1782 --- datasette/version.py | 2 +- docs/changelog.rst | 53 ++++++++++++++++++++++++++++++-------------- 2 files changed, 37 insertions(+), 18 deletions(-) diff --git a/datasette/version.py b/datasette/version.py index 86f4cf7e..0453346c 100644 --- a/datasette/version.py +++ b/datasette/version.py @@ -1,2 +1,2 @@ -__version__ = "0.62a1" +__version__ = "0.62" __version_info__ = tuple(__version__.split(".")) diff --git a/docs/changelog.rst b/docs/changelog.rst index 3f105811..1225c63f 100644 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -4,33 +4,52 @@ Changelog ========= -.. _v0_62a1: +.. 
_v0_62: -0.62a1 (2022-07-17) +0.62 (2022-08-14) ------------------- +Datasette can now run entirely in your browser using WebAssembly. Try out `Datasette Lite `__, take a look `at the code `__ or read more about it in `Datasette Lite: a server-side Python web application running in a browser `__. + +Datasette now has a `Discord community `__ for questions and discussions about Datasette and its ecosystem of projects. + +Features +~~~~~~~~ + +- Datasette is now compatible with `Pyodide `__. This is the enabling technology behind `Datasette Lite `__. (:issue:`1733`) +- Database file downloads now implement conditional GET using ETags. (:issue:`1739`) +- HTML for facet results and suggested results has been extracted out into new templates ``_facet_results.html`` and ``_suggested_facets.html``. Thanks, M. Nasimul Haque. (`#1759 `__) +- Datasette now runs some SQL queries in parallel. This has limited impact on performance, see `this research issue `__ for details. +- New ``--nolock`` option for ignoring file locks when opening read-only databases. (:issue:`1744`) +- Spaces in the database names in URLs are now encoded as ``+`` rather than ``~20``. (:issue:`1701`) +- ```` is now displayed as ```` and is accompanied by tooltip showing "2.3MB". (:issue:`1712`) +- The base Docker image used by ``datasette publish cloudrun``, ``datasette package`` and the `official Datasette image `__ has been upgraded to ``3.10.6-slim-bullseye``. (:issue:`1768`) +- Canned writable queries against immutable databases now show a warning message. (:issue:`1728`) +- ``datasette publish cloudrun`` has a new ``--timeout`` option which can be used to increase the time limit applied by the Google Cloud build environment. Thanks, Tim Sherratt. (`#1717 `__) +- ``datasette publish cloudrun`` has new ``--min-instances`` and ``--max-instances`` options. (:issue:`1779`) + +Plugin hooks +~~~~~~~~~~~~ + - New plugin hook: :ref:`handle_exception() `, for custom handling of exceptions caught by Datasette. (:issue:`1770`) - The :ref:`render_cell() ` plugin hook is now also passed a ``row`` argument, representing the ``sqlite3.Row`` object that is being rendered. (:issue:`1300`) -- New ``--nolock`` option for ignoring file locks when opening read-only databases. (:issue:`1744`) -- Documentation now uses the `Furo `__ Sphinx theme. (:issue:`1746`) -- Datasette now has a `Discord community `__. -- Database file downloads now implement conditional GET using ETags. (:issue:`1739`) -- Examples in the documentation now include a copy-to-clipboard button. (:issue:`1748`) -- HTML for facet results and suggested results has been extracted out into new templates ``_facet_results.html`` and ``_suggested_facets.html``. Thanks, M. Nasimul Haque. (`#1759 `__) +- The :ref:`configuration directory ` is now stored in ``datasette.config_dir``, making it available to plugins. Thanks, Chris Amico. (`#1766 `__) -.. _v0_62a0: +Bug fixes +~~~~~~~~~ -0.62a0 (2022-05-02) -------------------- - -- Datasette now runs some SQL queries in parallel. This has limited impact on performance, see `this research issue `__ for details. -- Datasette should now be compatible with Pyodide. (:issue:`1733`) -- ``datasette publish cloudrun`` has a new ``--timeout`` option which can be used to increase the time limit applied by the Google Cloud build environment. Thanks, Tim Sherratt. (`#1717 `__) -- Spaces in database names are now encoded as ``+`` rather than ``~20``. (:issue:`1701`) -- ```` is now displayed as ```` and is accompanied by tooltip showing "2.3MB". 
(:issue:`1712`) - Don't show the facet option in the cog menu if faceting is not allowed. (:issue:`1683`) +- ``?_sort`` and ``?_sort_desc`` now work if the column that is being sorted has been excluded from the query using ``?_col=`` or ``?_nocol=``. (:issue:`1773`) +- Fixed bug where ``?_sort_desc`` was duplicated in the URL every time the Apply button was clicked. (:issue:`1738`) + +Documentation +~~~~~~~~~~~~~ + +- Examples in the documentation now include a copy-to-clipboard button. (:issue:`1748`) +- Documentation now uses the `Furo `__ Sphinx theme. (:issue:`1746`) - Code examples in the documentation are now all formatted using Black. (:issue:`1718`) - ``Request.fake()`` method is now documented, see :ref:`internals_request`. +- New documentation for plugin authors: :ref:`testing_plugins_register_in_test`. (:issue:`903`) .. _v0_61_1: From a107e3a028923c1ab3911c0f880011283f93f368 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 14 Aug 2022 16:07:46 -0700 Subject: [PATCH 118/952] datasette-sentry is an example of handle_exception --- docs/plugin_hooks.rst | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index aec1df56..c6f35d06 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -1261,6 +1261,8 @@ This example logs an error to `Sentry `__ and then renders a return inner +Example: `datasette-sentry `_ + .. _plugin_hook_menu_links: menu_links(datasette, actor, request) From 481eb96d85291cdfa5767a83884a1525dfc382d8 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 15 Aug 2022 13:17:28 -0700 Subject: [PATCH 119/952] https://datasette.io/tutorials/clean-data tutorial Refs #1783 --- docs/getting_started.rst | 1 + 1 file changed, 1 insertion(+) diff --git a/docs/getting_started.rst b/docs/getting_started.rst index 571540cf..a9eaa404 100644 --- a/docs/getting_started.rst +++ b/docs/getting_started.rst @@ -20,6 +20,7 @@ Datasette has several `tutorials `__ to help you - `Exploring a database with Datasette `__ shows how to use the Datasette web interface to explore a new database. - `Learn SQL with Datasette `__ introduces SQL, and shows how to use that query language to ask questions of your data. +- `Cleaning data with sqlite-utils and Datasette `__ guides you through using `sqlite-utils `__ to turn a CSV file into a database that you can explore using Datasette. .. _getting_started_datasette_lite: From a3e6f1b16757fb2d39e7ddba4e09eda2362508bf Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 18 Aug 2022 09:06:02 -0700 Subject: [PATCH 120/952] Increase height of non-JS textarea to fit query Closes #1786 --- datasette/templates/query.html | 3 ++- tests/test_html.py | 6 ++---- 2 files changed, 4 insertions(+), 5 deletions(-) diff --git a/datasette/templates/query.html b/datasette/templates/query.html index cee779fc..a35e3afe 100644 --- a/datasette/templates/query.html +++ b/datasette/templates/query.html @@ -45,7 +45,8 @@ {% endif %} {% if not hide_sql %} {% if editable and allow_execute_sql %} -

+

 {% else %}
     <pre>{% if query %}{{ query.sql }}{% endif %}</pre>
{% endif %} diff --git a/tests/test_html.py b/tests/test_html.py index 409fec68..be21bd84 100644 --- a/tests/test_html.py +++ b/tests/test_html.py @@ -695,10 +695,8 @@ def test_query_error(app_client): response = app_client.get("/fixtures?sql=select+*+from+notatable") html = response.text assert '
<p class="message-error">no such table: notatable</p>
' in html - assert ( - '' - in html - ) + assert '" in html assert "0 results" not in html From 09a41662e70b788469157bb58ed9ca4acdf2f904 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 18 Aug 2022 09:10:48 -0700 Subject: [PATCH 121/952] Fix typo --- docs/plugin_hooks.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index c6f35d06..30bd75b7 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -874,7 +874,7 @@ canned_queries(datasette, database, actor) ``actor`` - dictionary or None The currently authenticated :ref:`actor `. -Ues this hook to return a dictionary of additional :ref:`canned query ` definitions for the specified database. The return value should be the same shape as the JSON described in the :ref:`canned query ` documentation. +Use this hook to return a dictionary of additional :ref:`canned query ` definitions for the specified database. The return value should be the same shape as the JSON described in the :ref:`canned query ` documentation. .. code-block:: python From 6c0ba7c00c2ae3ecbb5309efa59079cea1c850b3 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 18 Aug 2022 14:52:04 -0700 Subject: [PATCH 122/952] Improved CLI reference documentation, refs #1787 --- datasette/cli.py | 2 +- docs/changelog.rst | 2 +- docs/cli-reference.rst | 325 ++++++++++++++++++++++++++++++--------- docs/getting_started.rst | 50 ------ docs/index.rst | 2 +- docs/publish.rst | 2 + 6 files changed, 259 insertions(+), 124 deletions(-) diff --git a/datasette/cli.py b/datasette/cli.py index 8781747c..f2a03d53 100644 --- a/datasette/cli.py +++ b/datasette/cli.py @@ -282,7 +282,7 @@ def package( port, **extra_metadata, ): - """Package specified SQLite files into a new datasette Docker container""" + """Package SQLite files into a Datasette Docker container""" if not shutil.which("docker"): click.secho( ' The package command requires "docker" to be installed and configured ', diff --git a/docs/changelog.rst b/docs/changelog.rst index 1225c63f..f9dcc980 100644 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -621,7 +621,7 @@ See also `Datasette 0.49: The annotated release notes `__ for conversations about the project that go beyond just bug reports and issues. - Datasette can now be installed on macOS using Homebrew! Run ``brew install simonw/datasette/datasette``. See :ref:`installation_homebrew`. (:issue:`335`) - Two new commands: ``datasette install name-of-plugin`` and ``datasette uninstall name-of-plugin``. These are equivalent to ``pip install`` and ``pip uninstall`` but automatically run in the same virtual environment as Datasette, so users don't have to figure out where that virtual environment is - useful for installations created using Homebrew or ``pipx``. See :ref:`plugins_installing`. (:issue:`925`) -- A new command-line option, ``datasette --get``, accepts a path to a URL within the Datasette instance. It will run that request through Datasette (without starting a web server) and print out the response. See :ref:`getting_started_datasette_get` for an example. (:issue:`926`) +- A new command-line option, ``datasette --get``, accepts a path to a URL within the Datasette instance. It will run that request through Datasette (without starting a web server) and print out the response. See :ref:`cli_datasette_get` for an example. (:issue:`926`) .. 
_v0_46: diff --git a/docs/cli-reference.rst b/docs/cli-reference.rst index 415af13c..a1e56774 100644 --- a/docs/cli-reference.rst +++ b/docs/cli-reference.rst @@ -4,44 +4,34 @@ CLI reference =============== -This page lists the ``--help`` for every ``datasette`` CLI command. +The ``datasette`` CLI tool provides a number of commands. + +Running ``datasette`` without specifying a command runs the default command, ``datasette serve``. See :ref:`cli_help_serve___help` for the full list of options for that command. .. [[[cog from datasette import cli from click.testing import CliRunner import textwrap - commands = [ - ["--help"], - ["serve", "--help"], - ["serve", "--help-settings"], - ["plugins", "--help"], - ["publish", "--help"], - ["publish", "cloudrun", "--help"], - ["publish", "heroku", "--help"], - ["package", "--help"], - ["inspect", "--help"], - ["install", "--help"], - ["uninstall", "--help"], - ] - cog.out("\n") - for command in commands: - title = "datasette " + " ".join(command) - ref = "_cli_help_" + ("_".join(command).replace("-", "_")) - cog.out(".. {}:\n\n".format(ref)) - cog.out(title + "\n") - cog.out(("=" * len(title)) + "\n\n") + def help(args): + title = "datasette " + " ".join(args) cog.out("::\n\n") - result = CliRunner().invoke(cli.cli, command) + result = CliRunner().invoke(cli.cli, args) output = result.output.replace("Usage: cli ", "Usage: datasette ") cog.out(textwrap.indent(output, ' ')) cog.out("\n\n") .. ]]] +.. [[[end]]] .. _cli_help___help: datasette --help ================ +Running ``datasette --help`` shows a list of all of the available commands. + +.. [[[cog + help(["--help"]) +.. ]]] :: Usage: datasette [OPTIONS] COMMAND [ARGS]... @@ -59,17 +49,34 @@ datasette --help serve* Serve up specified SQLite database files with a web UI inspect Generate JSON summary of provided database files install Install plugins and packages from PyPI into the same... - package Package specified SQLite files into a new datasette Docker... + package Package SQLite files into a Datasette Docker container plugins List currently installed plugins publish Publish specified SQLite database files to the internet along... uninstall Uninstall plugins and Python packages from the Datasette... +.. [[[end]]] + +Additional commands added by plugins that use the :ref:`plugin_hook_register_commands` hook will be listed here as well. + .. _cli_help_serve___help: -datasette serve --help -====================== +datasette serve +=============== +This command starts the Datasette web application running on your machine:: + + datasette serve mydatabase.db + +Or since this is the default command you can run this instead:: + + datasette mydatabase.db + +Once started you can access it at ``http://localhost:8001`` + +.. [[[cog + help(["serve", "--help"]) +.. ]]] :: Usage: datasette serve [OPTIONS] [FILES]... @@ -121,11 +128,75 @@ datasette serve --help --help Show this message and exit. +.. [[[end]]] + + +.. _cli_datasette_get: + +datasette --get +--------------- + +The ``--get`` option to ``datasette serve`` (or just ``datasette``) specifies the path to a page within Datasette and causes Datasette to output the content from that path without starting the web server. + +This means that all of Datasette's functionality can be accessed directly from the command-line. + +For example:: + + $ datasette --get '/-/versions.json' | jq . 
+ { + "python": { + "version": "3.8.5", + "full": "3.8.5 (default, Jul 21 2020, 10:48:26) \n[Clang 11.0.3 (clang-1103.0.32.62)]" + }, + "datasette": { + "version": "0.46+15.g222a84a.dirty" + }, + "asgi": "3.0", + "uvicorn": "0.11.8", + "sqlite": { + "version": "3.32.3", + "fts_versions": [ + "FTS5", + "FTS4", + "FTS3" + ], + "extensions": { + "json1": null + }, + "compile_options": [ + "COMPILER=clang-11.0.3", + "ENABLE_COLUMN_METADATA", + "ENABLE_FTS3", + "ENABLE_FTS3_PARENTHESIS", + "ENABLE_FTS4", + "ENABLE_FTS5", + "ENABLE_GEOPOLY", + "ENABLE_JSON1", + "ENABLE_PREUPDATE_HOOK", + "ENABLE_RTREE", + "ENABLE_SESSION", + "MAX_VARIABLE_NUMBER=250000", + "THREADSAFE=1" + ] + } + } + +The exit code will be 0 if the request succeeds and 1 if the request produced an HTTP status code other than 200 - e.g. a 404 or 500 error. + +This lets you use ``datasette --get /`` to run tests against a Datasette application in a continuous integration environment such as GitHub Actions. + .. _cli_help_serve___help_settings: datasette serve --help-settings -=============================== +------------------------------- +This command outputs all of the available Datasette :ref:`settings `. + +These can be passed to ``datasette serve`` using ``datasette serve --setting name value``. + +.. [[[cog + help(["--help-settings"]) +.. ]]] :: Settings: @@ -170,11 +241,18 @@ datasette serve --help-settings +.. [[[end]]] + .. _cli_help_plugins___help: -datasette plugins --help -======================== +datasette plugins +================= +Output JSON showing all currently installed plugins, their versions, whether they include static files or templates and which :ref:`plugin_hooks` they use. + +.. [[[cog + help(["plugins", "--help"]) +.. ]]] :: Usage: datasette plugins [OPTIONS] @@ -187,11 +265,110 @@ datasette plugins --help --help Show this message and exit. +.. [[[end]]] + +Example output: + +.. code-block:: json + + [ + { + "name": "datasette-geojson", + "static": false, + "templates": false, + "version": "0.3.1", + "hooks": [ + "register_output_renderer" + ] + }, + { + "name": "datasette-geojson-map", + "static": true, + "templates": false, + "version": "0.4.0", + "hooks": [ + "extra_body_script", + "extra_css_urls", + "extra_js_urls" + ] + }, + { + "name": "datasette-leaflet", + "static": true, + "templates": false, + "version": "0.2.2", + "hooks": [ + "extra_body_script", + "extra_template_vars" + ] + } + ] + + +.. _cli_help_install___help: + +datasette install +================= + +Install new Datasette plugins. This command works like ``pip install`` but ensures that your plugins will be installed into the same environment as Datasette. + +This command:: + + datasette install datasette-cluster-map + +Would install the `datasette-cluster-map `__ plugin. + +.. [[[cog + help(["install", "--help"]) +.. ]]] +:: + + Usage: datasette install [OPTIONS] PACKAGES... + + Install plugins and packages from PyPI into the same environment as Datasette + + Options: + -U, --upgrade Upgrade packages to latest version + --help Show this message and exit. + + +.. [[[end]]] + +.. _cli_help_uninstall___help: + +datasette uninstall +=================== + +Uninstall one or more plugins. + +.. [[[cog + help(["uninstall", "--help"]) +.. ]]] +:: + + Usage: datasette uninstall [OPTIONS] PACKAGES... + + Uninstall plugins and Python packages from the Datasette environment + + Options: + -y, --yes Don't ask for confirmation + --help Show this message and exit. + + +.. [[[end]]] + .. 
_cli_help_publish___help: -datasette publish --help -======================== +datasette publish +================= +Shows a list of available deployment targets for :ref:`publishing data ` with Datasette. + +Additional deployment targets can be added by plugins that use the :ref:`plugin_hook_publish_subcommand` hook. + +.. [[[cog + help(["publish", "--help"]) +.. ]]] :: Usage: datasette publish [OPTIONS] COMMAND [ARGS]... @@ -207,11 +384,19 @@ datasette publish --help heroku Publish databases to Datasette running on Heroku +.. [[[end]]] + + .. _cli_help_publish_cloudrun___help: -datasette publish cloudrun --help -================================= +datasette publish cloudrun +========================== +See :ref:`publish_cloud_run`. + +.. [[[cog + help(["publish", "cloudrun", "--help"]) +.. ]]] :: Usage: datasette publish cloudrun [OPTIONS] [FILES]... @@ -256,11 +441,19 @@ datasette publish cloudrun --help --help Show this message and exit. +.. [[[end]]] + + .. _cli_help_publish_heroku___help: -datasette publish heroku --help -=============================== +datasette publish heroku +======================== +See :ref:`publish_heroku`. + +.. [[[cog + help(["publish", "heroku", "--help"]) +.. ]]] :: Usage: datasette publish heroku [OPTIONS] [FILES]... @@ -297,16 +490,23 @@ datasette publish heroku --help --help Show this message and exit. +.. [[[end]]] + .. _cli_help_package___help: -datasette package --help -======================== +datasette package +================= +Package SQLite files into a Datasette Docker container, see :ref:`cli_package`. + +.. [[[cog + help(["package", "--help"]) +.. ]]] :: Usage: datasette package [OPTIONS] FILES... - Package specified SQLite files into a new datasette Docker container + Package SQLite files into a Datasette Docker container Options: -t, --tag TEXT Name for the resulting Docker container, can @@ -335,11 +535,26 @@ datasette package --help --help Show this message and exit. +.. [[[end]]] + + .. _cli_help_inspect___help: -datasette inspect --help -======================== +datasette inspect +================= +Outputs JSON representing introspected data about one or more SQLite database files. + +If you are opening an immutable database, you can pass this file to the ``--inspect-data`` option to improve Datasette's performance by allowing it to skip running row counts against the database when it first starts running:: + + datasette inspect mydatabase.db > inspect-data.json + datasette serve -i mydatabase.db --inspect-file inspect-data.json + +This performance optimization is used automatically by some of the ``datasette publish`` commands. You are unlikely to need to apply this optimization manually. + +.. [[[cog + help(["inspect", "--help"]) +.. ]]] :: Usage: datasette inspect [OPTIONS] [FILES]... @@ -355,36 +570,4 @@ datasette inspect --help --help Show this message and exit. -.. _cli_help_install___help: - -datasette install --help -======================== - -:: - - Usage: datasette install [OPTIONS] PACKAGES... - - Install plugins and packages from PyPI into the same environment as Datasette - - Options: - -U, --upgrade Upgrade packages to latest version - --help Show this message and exit. - - -.. _cli_help_uninstall___help: - -datasette uninstall --help -========================== - -:: - - Usage: datasette uninstall [OPTIONS] PACKAGES... - - Uninstall plugins and Python packages from the Datasette environment - - Options: - -y, --yes Don't ask for confirmation - --help Show this message and exit. - - .. 
[[[end]]] diff --git a/docs/getting_started.rst b/docs/getting_started.rst index a9eaa404..6515ef8d 100644 --- a/docs/getting_started.rst +++ b/docs/getting_started.rst @@ -138,53 +138,3 @@ JSON in a more convenient format: } ] } - -.. _getting_started_datasette_get: - -datasette --get ---------------- - -The ``--get`` option can specify the path to a page within Datasette and cause Datasette to output the content from that path without starting the web server. This means that all of Datasette's functionality can be accessed directly from the command-line. For example:: - - $ datasette --get '/-/versions.json' | jq . - { - "python": { - "version": "3.8.5", - "full": "3.8.5 (default, Jul 21 2020, 10:48:26) \n[Clang 11.0.3 (clang-1103.0.32.62)]" - }, - "datasette": { - "version": "0.46+15.g222a84a.dirty" - }, - "asgi": "3.0", - "uvicorn": "0.11.8", - "sqlite": { - "version": "3.32.3", - "fts_versions": [ - "FTS5", - "FTS4", - "FTS3" - ], - "extensions": { - "json1": null - }, - "compile_options": [ - "COMPILER=clang-11.0.3", - "ENABLE_COLUMN_METADATA", - "ENABLE_FTS3", - "ENABLE_FTS3_PARENTHESIS", - "ENABLE_FTS4", - "ENABLE_FTS5", - "ENABLE_GEOPOLY", - "ENABLE_JSON1", - "ENABLE_PREUPDATE_HOOK", - "ENABLE_RTREE", - "ENABLE_SESSION", - "MAX_VARIABLE_NUMBER=250000", - "THREADSAFE=1" - ] - } - } - -The exit code will be 0 if the request succeeds and 1 if the request produced an HTTP status code other than 200 - e.g. a 404 or 500 error. This means you can use ``datasette --get /`` to run tests against a Datasette application in a continuous integration environment such as GitHub Actions. - -Running ``datasette`` without specifying a command runs the default command, ``datasette serve``. See :ref:`cli_help_serve___help` for the full list of options for that command. diff --git a/docs/index.rst b/docs/index.rst index efe196b3..5a9cc7ed 100644 --- a/docs/index.rst +++ b/docs/index.rst @@ -40,6 +40,7 @@ Contents getting_started installation ecosystem + cli-reference pages publish deploying @@ -61,6 +62,5 @@ Contents plugin_hooks testing_plugins internals - cli-reference contributing changelog diff --git a/docs/publish.rst b/docs/publish.rst index 9c7c99cc..dd8566ed 100644 --- a/docs/publish.rst +++ b/docs/publish.rst @@ -56,6 +56,8 @@ Cloud Run provides a URL on the ``.run.app`` domain, but you can also point your See :ref:`cli_help_publish_cloudrun___help` for the full list of options for this command. +.. 
_publish_heroku: + Publishing to Heroku -------------------- From aff3df03d4fe0806ce432d1818f6643cdb2a854e Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 18 Aug 2022 14:55:08 -0700 Subject: [PATCH 123/952] Ignore ro which stands for read only Refs #1787 where it caused tests to break --- docs/codespell-ignore-words.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/codespell-ignore-words.txt b/docs/codespell-ignore-words.txt index a625cde5..d6744d05 100644 --- a/docs/codespell-ignore-words.txt +++ b/docs/codespell-ignore-words.txt @@ -1 +1 @@ -AddWordsToIgnoreHere +ro From 0d9d33955b503c88a2c712144d97f094baa5d46d Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 18 Aug 2022 16:06:12 -0700 Subject: [PATCH 124/952] Clarify you can publish multiple files, closes #1788 --- docs/publish.rst | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/docs/publish.rst b/docs/publish.rst index dd8566ed..d817ed31 100644 --- a/docs/publish.rst +++ b/docs/publish.rst @@ -31,7 +31,7 @@ Publishing to Google Cloud Run You will first need to install and configure the Google Cloud CLI tools by following `these instructions `__. -You can then publish a database to Google Cloud Run using the following command:: +You can then publish one or more SQLite database files to Google Cloud Run using the following command:: datasette publish cloudrun mydatabase.db --service=my-database @@ -63,7 +63,7 @@ Publishing to Heroku To publish your data using `Heroku `__, first create an account there and install and configure the `Heroku CLI tool `_. -You can publish a database to Heroku using the following command:: +You can publish one or more databases to Heroku using the following command:: datasette publish heroku mydatabase.db @@ -138,7 +138,7 @@ If a plugin has any :ref:`plugins_configuration_secret` you can use the ``--plug datasette package ================= -If you have docker installed (e.g. using `Docker for Mac `_) you can use the ``datasette package`` command to create a new Docker image in your local repository containing the datasette app bundled together with your selected SQLite databases:: +If you have docker installed (e.g. using `Docker for Mac `_) you can use the ``datasette package`` command to create a new Docker image in your local repository containing the datasette app bundled together with one or more SQLite databases:: datasette package mydatabase.db From 663ac431fe7202c85967568d82b2034f92b9aa43 Mon Sep 17 00:00:00 2001 From: Manuel Kaufmann Date: Sat, 20 Aug 2022 02:04:16 +0200 Subject: [PATCH 125/952] Use Read the Docs action v1 (#1778) Read the Docs repository was renamed from `readthedocs/readthedocs-preview` to `readthedocs/actions/`. Now, the `preview` action is under `readthedocs/actions/preview` and is tagged as `v1` --- .github/workflows/documentation-links.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/documentation-links.yml b/.github/workflows/documentation-links.yml index e7062a46..a54bd83a 100644 --- a/.github/workflows/documentation-links.yml +++ b/.github/workflows/documentation-links.yml @@ -11,6 +11,6 @@ jobs: documentation-links: runs-on: ubuntu-latest steps: - - uses: readthedocs/readthedocs-preview@main + - uses: readthedocs/actions/preview@v1 with: project-slug: "datasette" From 1d64c9a8dac45b9a3452acf8e76dfadea2b0bc49 Mon Sep 17 00:00:00 2001 From: Alex Garcia Date: Tue, 23 Aug 2022 11:34:30 -0700 Subject: [PATCH 126/952] Add new entrypoint option to --load-extensions. 
(#1789) Thanks, @asg017 --- .gitignore | 6 ++++ datasette/app.py | 8 ++++- datasette/cli.py | 4 ++- datasette/utils/__init__.py | 11 ++++++ tests/ext.c | 48 ++++++++++++++++++++++++++ tests/test_load_extensions.py | 65 +++++++++++++++++++++++++++++++++++ 6 files changed, 140 insertions(+), 2 deletions(-) create mode 100644 tests/ext.c create mode 100644 tests/test_load_extensions.py diff --git a/.gitignore b/.gitignore index 066009f0..277ff653 100644 --- a/.gitignore +++ b/.gitignore @@ -118,3 +118,9 @@ ENV/ .DS_Store node_modules .*.swp + +# In case someone compiled tests/ext.c for test_load_extensions, don't +# include it in source control. +tests/*.dylib +tests/*.so +tests/*.dll \ No newline at end of file diff --git a/datasette/app.py b/datasette/app.py index 1a9afc10..bb9232c9 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -559,7 +559,13 @@ class Datasette: if self.sqlite_extensions: conn.enable_load_extension(True) for extension in self.sqlite_extensions: - conn.execute("SELECT load_extension(?)", [extension]) + # "extension" is either a string path to the extension + # or a 2-item tuple that specifies which entrypoint to load. + if isinstance(extension, tuple): + path, entrypoint = extension + conn.execute("SELECT load_extension(?, ?)", [path, entrypoint]) + else: + conn.execute("SELECT load_extension(?)", [extension]) if self.setting("cache_size_kb"): conn.execute(f"PRAGMA cache_size=-{self.setting('cache_size_kb')}") # pylint: disable=no-member diff --git a/datasette/cli.py b/datasette/cli.py index f2a03d53..6eb42712 100644 --- a/datasette/cli.py +++ b/datasette/cli.py @@ -21,6 +21,7 @@ from .app import ( pm, ) from .utils import ( + LoadExtension, StartupError, check_connection, find_spatialite, @@ -128,9 +129,10 @@ def sqlite_extensions(fn): return click.option( "sqlite_extensions", "--load-extension", + type=LoadExtension(), envvar="SQLITE_EXTENSIONS", multiple=True, - help="Path to a SQLite extension to load", + help="Path to a SQLite extension to load, and optional entrypoint", )(fn) diff --git a/datasette/utils/__init__.py b/datasette/utils/__init__.py index d148cc2c..0fc87d51 100644 --- a/datasette/utils/__init__.py +++ b/datasette/utils/__init__.py @@ -833,6 +833,17 @@ class StaticMount(click.ParamType): self.fail(f"{value} is not a valid directory path", param, ctx) return path, dirpath +# The --load-extension parameter can optionally include a specific entrypoint. +# This is done by appending ":entrypoint_name" after supplying the path to the extension +class LoadExtension(click.ParamType): + name = "path:entrypoint?" + + def convert(self, value, param, ctx): + if ":" not in value: + return value + path, entrypoint = value.split(":", 1) + return path, entrypoint + def format_bytes(bytes): current = float(bytes) diff --git a/tests/ext.c b/tests/ext.c new file mode 100644 index 00000000..5fe970d9 --- /dev/null +++ b/tests/ext.c @@ -0,0 +1,48 @@ +/* +** This file implements a SQLite extension with multiple entrypoints. +** +** The default entrypoint, sqlite3_ext_init, has a single function "a". +** The 1st alternate entrypoint, sqlite3_ext_b_init, has a single function "b". +** The 2nd alternate entrypoint, sqlite3_ext_c_init, has a single function "c". 
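+**
+** For example, the shared object can be built with (this is the same
+** command the test suite's CI workflow later uses):
+**
+**   gcc ext.c -fPIC -shared -o ext.so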
+** +** Compiling instructions: +** https://www.sqlite.org/loadext.html#compiling_a_loadable_extension +** +*/ + +#include "sqlite3ext.h" + +SQLITE_EXTENSION_INIT1 + +// SQL function that returns back the value supplied during sqlite3_create_function() +static void func(sqlite3_context *context, int argc, sqlite3_value **argv) { + sqlite3_result_text(context, (char *) sqlite3_user_data(context), -1, SQLITE_STATIC); +} + + +// The default entrypoint, since it matches the "ext.dylib"/"ext.so" name +#ifdef _WIN32 +__declspec(dllexport) +#endif +int sqlite3_ext_init(sqlite3 *db, char **pzErrMsg, const sqlite3_api_routines *pApi) { + SQLITE_EXTENSION_INIT2(pApi); + return sqlite3_create_function(db, "a", 0, 0, "a", func, 0, 0); +} + +// Alternate entrypoint #1 +#ifdef _WIN32 +__declspec(dllexport) +#endif +int sqlite3_ext_b_init(sqlite3 *db, char **pzErrMsg, const sqlite3_api_routines *pApi) { + SQLITE_EXTENSION_INIT2(pApi); + return sqlite3_create_function(db, "b", 0, 0, "b", func, 0, 0); +} + +// Alternate entrypoint #2 +#ifdef _WIN32 +__declspec(dllexport) +#endif +int sqlite3_ext_c_init(sqlite3 *db, char **pzErrMsg, const sqlite3_api_routines *pApi) { + SQLITE_EXTENSION_INIT2(pApi); + return sqlite3_create_function(db, "c", 0, 0, "c", func, 0, 0); +} diff --git a/tests/test_load_extensions.py b/tests/test_load_extensions.py new file mode 100644 index 00000000..360bc8f3 --- /dev/null +++ b/tests/test_load_extensions.py @@ -0,0 +1,65 @@ +from datasette.app import Datasette +import pytest +from pathlib import Path + +# not necessarily a full path - the full compiled path looks like "ext.dylib" +# or another suffix, but sqlite will, under the hood, decide which file +# extension to use based on the operating system (apple=dylib, windows=dll etc) +# this resolves to "./ext", which is enough for SQLite to calculate the rest +COMPILED_EXTENSION_PATH = str(Path(__file__).parent / "ext") + +# See if ext.c has been compiled, based off the different possible suffixes. +def has_compiled_ext(): + for ext in ["dylib", "so", "dll"]: + path = Path(__file__).parent / f"ext.{ext}" + if path.is_file(): + return True + return False + + +@pytest.mark.asyncio +@pytest.mark.skipif(not has_compiled_ext(), reason="Requires compiled ext.c") +async def test_load_extension_default_entrypoint(): + + # The default entrypoint only loads a() and NOT b() or c(), so those + # should fail. + ds = Datasette(sqlite_extensions=[COMPILED_EXTENSION_PATH]) + + response = await ds.client.get("/_memory.json?sql=select+a()") + assert response.status_code == 200 + assert response.json()["rows"][0][0] == "a" + + response = await ds.client.get("/_memory.json?sql=select+b()") + assert response.status_code == 400 + assert response.json()["error"] == "no such function: b" + + response = await ds.client.get("/_memory.json?sql=select+c()") + assert response.status_code == 400 + assert response.json()["error"] == "no such function: c" + + +@pytest.mark.asyncio +@pytest.mark.skipif(not has_compiled_ext(), reason="Requires compiled ext.c") +async def test_load_extension_multiple_entrypoints(): + + # Load in the default entrypoint and the other 2 custom entrypoints, now + # all a(), b(), and c() should run successfully. 
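+    # Each alternate entrypoint is passed as a (path, entrypoint) tuple; the
+    # CLI equivalent (with a hypothetical compiled path) would be:
+    #   datasette data.db --load-extension=tests/ext.so:sqlite3_ext_b_init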
+ ds = Datasette( + sqlite_extensions=[ + COMPILED_EXTENSION_PATH, + (COMPILED_EXTENSION_PATH, "sqlite3_ext_b_init"), + (COMPILED_EXTENSION_PATH, "sqlite3_ext_c_init"), + ] + ) + + response = await ds.client.get("/_memory.json?sql=select+a()") + assert response.status_code == 200 + assert response.json()["rows"][0][0] == "a" + + response = await ds.client.get("/_memory.json?sql=select+b()") + assert response.status_code == 200 + assert response.json()["rows"][0][0] == "b" + + response = await ds.client.get("/_memory.json?sql=select+c()") + assert response.status_code == 200 + assert response.json()["rows"][0][0] == "c" From fd1086c6867f3e3582b1eca456e4ea95f6cecf8b Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 23 Aug 2022 11:35:41 -0700 Subject: [PATCH 127/952] Applied Black, refs #1789 --- datasette/app.py | 4 ++-- datasette/utils/__init__.py | 1 + 2 files changed, 3 insertions(+), 2 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index bb9232c9..f2a6763a 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -559,8 +559,8 @@ class Datasette: if self.sqlite_extensions: conn.enable_load_extension(True) for extension in self.sqlite_extensions: - # "extension" is either a string path to the extension - # or a 2-item tuple that specifies which entrypoint to load. + # "extension" is either a string path to the extension + # or a 2-item tuple that specifies which entrypoint to load. if isinstance(extension, tuple): path, entrypoint = extension conn.execute("SELECT load_extension(?, ?)", [path, entrypoint]) diff --git a/datasette/utils/__init__.py b/datasette/utils/__init__.py index 0fc87d51..bbaa0510 100644 --- a/datasette/utils/__init__.py +++ b/datasette/utils/__init__.py @@ -833,6 +833,7 @@ class StaticMount(click.ParamType): self.fail(f"{value} is not a valid directory path", param, ctx) return path, dirpath + # The --load-extension parameter can optionally include a specific entrypoint. # This is done by appending ":entrypoint_name" after supplying the path to the extension class LoadExtension(click.ParamType): From 456dc155d491a009942ace71a4e1827cddc6b93d Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 23 Aug 2022 11:40:36 -0700 Subject: [PATCH 128/952] Ran cog, refs #1789 --- docs/cli-reference.rst | 95 +++++++++++++++++++++++------------------- 1 file changed, 51 insertions(+), 44 deletions(-) diff --git a/docs/cli-reference.rst b/docs/cli-reference.rst index a1e56774..f8419d58 100644 --- a/docs/cli-reference.rst +++ b/docs/cli-reference.rst @@ -84,48 +84,53 @@ Once started you can access it at ``http://localhost:8001`` Serve up specified SQLite database files with a web UI Options: - -i, --immutable PATH Database files to open in immutable mode - -h, --host TEXT Host for server. Defaults to 127.0.0.1 which means - only connections from the local machine will be - allowed. Use 0.0.0.0 to listen to all IPs and allow - access from other machines. - -p, --port INTEGER RANGE Port for server, defaults to 8001. Use -p 0 to - automatically assign an available port. 
- [0<=x<=65535] - --uds TEXT Bind to a Unix domain socket - --reload Automatically reload if code or metadata change - detected - useful for development - --cors Enable CORS by serving Access-Control-Allow-Origin: - * - --load-extension TEXT Path to a SQLite extension to load - --inspect-file TEXT Path to JSON file created using "datasette inspect" - -m, --metadata FILENAME Path to JSON/YAML file containing license/source - metadata - --template-dir DIRECTORY Path to directory containing custom templates - --plugins-dir DIRECTORY Path to directory containing custom plugins - --static MOUNT:DIRECTORY Serve static files from this directory at /MOUNT/... - --memory Make /_memory database available - --config CONFIG Deprecated: set config option using - configname:value. Use --setting instead. - --setting SETTING... Setting, see - docs.datasette.io/en/stable/settings.html - --secret TEXT Secret used for signing secure values, such as - signed cookies - --root Output URL that sets a cookie authenticating the - root user - --get TEXT Run an HTTP GET request against this path, print - results and exit - --version-note TEXT Additional note to show on /-/versions - --help-settings Show available settings - --pdb Launch debugger on any errors - -o, --open Open Datasette in your web browser - --create Create database files if they do not exist - --crossdb Enable cross-database joins using the /_memory - database - --nolock Ignore locking, open locked files in read-only mode - --ssl-keyfile TEXT SSL key file - --ssl-certfile TEXT SSL certificate file - --help Show this message and exit. + -i, --immutable PATH Database files to open in immutable mode + -h, --host TEXT Host for server. Defaults to 127.0.0.1 which + means only connections from the local machine + will be allowed. Use 0.0.0.0 to listen to all + IPs and allow access from other machines. + -p, --port INTEGER RANGE Port for server, defaults to 8001. Use -p 0 to + automatically assign an available port. + [0<=x<=65535] + --uds TEXT Bind to a Unix domain socket + --reload Automatically reload if code or metadata + change detected - useful for development + --cors Enable CORS by serving Access-Control-Allow- + Origin: * + --load-extension PATH:ENTRYPOINT? + Path to a SQLite extension to load, and + optional entrypoint + --inspect-file TEXT Path to JSON file created using "datasette + inspect" + -m, --metadata FILENAME Path to JSON/YAML file containing + license/source metadata + --template-dir DIRECTORY Path to directory containing custom templates + --plugins-dir DIRECTORY Path to directory containing custom plugins + --static MOUNT:DIRECTORY Serve static files from this directory at + /MOUNT/... + --memory Make /_memory database available + --config CONFIG Deprecated: set config option using + configname:value. Use --setting instead. + --setting SETTING... 
Setting, see + docs.datasette.io/en/stable/settings.html + --secret TEXT Secret used for signing secure values, such as + signed cookies + --root Output URL that sets a cookie authenticating + the root user + --get TEXT Run an HTTP GET request against this path, + print results and exit + --version-note TEXT Additional note to show on /-/versions + --help-settings Show available settings + --pdb Launch debugger on any errors + -o, --open Open Datasette in your web browser + --create Create database files if they do not exist + --crossdb Enable cross-database joins using the /_memory + database + --nolock Ignore locking, open locked files in read-only + mode + --ssl-keyfile TEXT SSL key file + --ssl-certfile TEXT SSL certificate file + --help Show this message and exit. .. [[[end]]] @@ -566,8 +571,10 @@ This performance optimization is used automatically by some of the ``datasette p Options: --inspect-file TEXT - --load-extension TEXT Path to a SQLite extension to load - --help Show this message and exit. + --load-extension PATH:ENTRYPOINT? + Path to a SQLite extension to load, and + optional entrypoint + --help Show this message and exit. .. [[[end]]] From ba35105eee2d3ba620e4f230028a02b2e2571df2 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 23 Aug 2022 17:11:45 -0700 Subject: [PATCH 129/952] Test `--load-extension` in GitHub Actions (#1792) * Run the --load-extension test, refs #1789 * Ran cog, refs #1789 --- .github/workflows/test.yml | 3 +++ tests/test_api.py | 2 +- tests/test_html.py | 4 ++-- 3 files changed, 6 insertions(+), 3 deletions(-) diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml index 90b6555e..e38d5ee9 100644 --- a/.github/workflows/test.yml +++ b/.github/workflows/test.yml @@ -24,6 +24,9 @@ jobs: key: ${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }} restore-keys: | ${{ runner.os }}-pip- + - name: Build extension for --load-extension test + run: |- + (cd tests && gcc ext.c -fPIC -shared -o ext.so) - name: Install dependencies run: | pip install -e '.[test]' diff --git a/tests/test_api.py b/tests/test_api.py index 253c1718..f6db2f9d 100644 --- a/tests/test_api.py +++ b/tests/test_api.py @@ -36,7 +36,7 @@ def test_homepage(app_client): # 4 hidden FTS tables + no_primary_key (hidden in metadata) assert d["hidden_tables_count"] == 6 # 201 in no_primary_key, plus 6 in other hidden tables: - assert d["hidden_table_rows_sum"] == 207 + assert d["hidden_table_rows_sum"] == 207, response.json assert d["views_count"] == 4 diff --git a/tests/test_html.py b/tests/test_html.py index be21bd84..d6e969ad 100644 --- a/tests/test_html.py +++ b/tests/test_html.py @@ -115,7 +115,7 @@ def test_database_page(app_client): assert fragment in response.text # And views - views_ul = soup.find("h2", text="Views").find_next_sibling("ul") + views_ul = soup.find("h2", string="Views").find_next_sibling("ul") assert views_ul is not None assert [ ("/fixtures/paginated_view", "paginated_view"), @@ -128,7 +128,7 @@ def test_database_page(app_client): ] == sorted([(a["href"], a.text) for a in views_ul.find_all("a")]) # And a list of canned queries - queries_ul = soup.find("h2", text="Queries").find_next_sibling("ul") + queries_ul = soup.find("h2", string="Queries").find_next_sibling("ul") assert queries_ul is not None assert [ ("/fixtures/from_async_hook", "from_async_hook"), From 51030df1869b3b574dd3584d1563415776b9cd4e Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 5 Sep 2022 11:35:40 -0700 Subject: [PATCH 130/952] Don't use upper bound dependencies any more See 
https://iscinumpy.dev/post/bound-version-constraints/ for the rationale behind this change. Closes #1800 --- setup.py | 36 ++++++++++++++++++------------------ 1 file changed, 18 insertions(+), 18 deletions(-) diff --git a/setup.py b/setup.py index a1c51d0b..b2e50b38 100644 --- a/setup.py +++ b/setup.py @@ -42,21 +42,21 @@ setup( include_package_data=True, python_requires=">=3.7", install_requires=[ - "asgiref>=3.2.10,<3.6.0", - "click>=7.1.1,<8.2.0", + "asgiref>=3.2.10", + "click>=7.1.1", "click-default-group-wheel>=1.2.2", - "Jinja2>=2.10.3,<3.1.0", - "hupper~=1.9", + "Jinja2>=2.10.3", + "hupper>=1.9", "httpx>=0.20", - "pint~=0.9", - "pluggy>=1.0,<1.1", - "uvicorn~=0.11", - "aiofiles>=0.4,<0.9", - "janus>=0.6.2,<1.1", + "pint>=0.9", + "pluggy>=1.0", + "uvicorn>=0.11", + "aiofiles>=0.4", + "janus>=0.6.2", "asgi-csrf>=0.9", - "PyYAML>=5.3,<7.0", - "mergedeep>=1.1.1,<1.4.0", - "itsdangerous>=1.1,<3.0", + "PyYAML>=5.3", + "mergedeep>=1.1.1", + "itsdangerous>=1.1", ], entry_points=""" [console_scripts] @@ -72,14 +72,14 @@ setup( "sphinx-copybutton", ], "test": [ - "pytest>=5.2.2,<7.2.0", - "pytest-xdist>=2.2.1,<2.6", - "pytest-asyncio>=0.17,<0.20", - "beautifulsoup4>=4.8.1,<4.12.0", + "pytest>=5.2.2", + "pytest-xdist>=2.2.1", + "pytest-asyncio>=0.17", + "beautifulsoup4>=4.8.1", "black==22.6.0", "blacken-docs==1.12.1", - "pytest-timeout>=1.4.2,<2.2", - "trustme>=0.7,<0.10", + "pytest-timeout>=1.4.2", + "trustme>=0.7", "cogapp>=3.3.0", ], "rich": ["rich"], From 294ecd45f7801971dbeef383d0c5456ee95ab839 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Mon, 5 Sep 2022 11:51:51 -0700 Subject: [PATCH 131/952] Bump black from 22.6.0 to 22.8.0 (#1797) Bumps [black](https://github.com/psf/black) from 22.6.0 to 22.8.0. - [Release notes](https://github.com/psf/black/releases) - [Changelog](https://github.com/psf/black/blob/main/CHANGES.md) - [Commits](https://github.com/psf/black/compare/22.6.0...22.8.0) --- updated-dependencies: - dependency-name: black dependency-type: direct:development update-type: version-update:semver-minor ... 
Signed-off-by: dependabot[bot] Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- setup.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/setup.py b/setup.py index b2e50b38..92fa60d0 100644 --- a/setup.py +++ b/setup.py @@ -76,7 +76,7 @@ setup( "pytest-xdist>=2.2.1", "pytest-asyncio>=0.17", "beautifulsoup4>=4.8.1", - "black==22.6.0", + "black==22.8.0", "blacken-docs==1.12.1", "pytest-timeout>=1.4.2", "trustme>=0.7", From b91e17280c05bbb9cf97432081bdcea8665879f9 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 5 Sep 2022 16:50:53 -0700 Subject: [PATCH 132/952] Run tests in serial, refs #1802 --- .github/workflows/test.yml | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml index e38d5ee9..9c8c48ef 100644 --- a/.github/workflows/test.yml +++ b/.github/workflows/test.yml @@ -33,8 +33,7 @@ jobs: pip freeze - name: Run tests run: | - pytest -n auto -m "not serial" - pytest -m "serial" + pytest - name: Check if cog needs to be run run: | cog --check docs/*.rst From b2b901e8c4b939e50ee1117ffcd2881ed8a8e3bf Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 5 Sep 2022 17:05:23 -0700 Subject: [PATCH 133/952] Skip SpatiaLite test if no conn.enable_load_extension() Ran into this problem while working on #1802 --- tests/test_spatialite.py | 2 ++ tests/utils.py | 8 ++++++++ 2 files changed, 10 insertions(+) diff --git a/tests/test_spatialite.py b/tests/test_spatialite.py index 8b98c5d6..c07a30e8 100644 --- a/tests/test_spatialite.py +++ b/tests/test_spatialite.py @@ -1,5 +1,6 @@ from datasette.app import Datasette from datasette.utils import find_spatialite, SpatialiteNotFound, SPATIALITE_FUNCTIONS +from .utils import has_load_extension import pytest @@ -13,6 +14,7 @@ def has_spatialite(): @pytest.mark.asyncio @pytest.mark.skipif(not has_spatialite(), reason="Requires SpatiaLite") +@pytest.mark.skipif(not has_load_extension(), reason="Requires enable_load_extension") async def test_spatialite_version_info(): ds = Datasette(sqlite_extensions=["spatialite"]) response = await ds.client.get("/-/versions.json") diff --git a/tests/utils.py b/tests/utils.py index 972300db..191ead9b 100644 --- a/tests/utils.py +++ b/tests/utils.py @@ -1,3 +1,6 @@ +from datasette.utils.sqlite import sqlite3 + + def assert_footer_links(soup): footer_links = soup.find("footer").findAll("a") assert 4 == len(footer_links) @@ -22,3 +25,8 @@ def inner_html(soup): # This includes the parent tag - so remove that inner_html = html.split(">", 1)[1].rsplit("<", 1)[0] return inner_html.strip() + + +def has_load_extension(): + conn = sqlite3.connect(":memory:") + return hasattr(conn, "enable_load_extension") From 1c29b925d300d1ee17047504473f2517767aa05b Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 5 Sep 2022 17:10:52 -0700 Subject: [PATCH 134/952] Run tests in serial again Because this didn't fix the issue I'm seeing in #1802 Revert "Run tests in serial, refs #1802" This reverts commit b91e17280c05bbb9cf97432081bdcea8665879f9. 
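For reference, the error being chased in #1802 comes from creating an asyncio
primitive in a thread that has no event loop yet. A minimal sketch of that
failure mode (a standalone illustration, not Datasette code):

    import asyncio
    import threading

    def build_lock():
        # On Python versions where Lock() binds a loop at construction time,
        # this raises:
        # RuntimeError: There is no current event loop in thread ...
        asyncio.Lock()

    threading.Thread(target=build_lock).start()

The commit that follows catches exactly that RuntimeError and installs a
fresh event loop before retrying.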
--- .github/workflows/test.yml | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml index 9c8c48ef..e38d5ee9 100644 --- a/.github/workflows/test.yml +++ b/.github/workflows/test.yml @@ -33,7 +33,8 @@ jobs: pip freeze - name: Run tests run: | - pytest + pytest -n auto -m "not serial" + pytest -m "serial" - name: Check if cog needs to be run run: | cog --check docs/*.rst From 64288d827f7ff97f825e10f714da3f781ecf9345 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 5 Sep 2022 17:40:19 -0700 Subject: [PATCH 135/952] Workaround for test failure: RuntimeError: There is no current event loop (#1803) * Remove ensure_eventloop hack * Hack to recover from intermittent RuntimeError calling asyncio.Lock() --- datasette/app.py | 10 +++++++++- tests/test_cli.py | 27 ++++++++++----------------- 2 files changed, 19 insertions(+), 18 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index f2a6763a..c6bbdaf0 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -231,7 +231,15 @@ class Datasette: self.inspect_data = inspect_data self.immutables = set(immutables or []) self.databases = collections.OrderedDict() - self._refresh_schemas_lock = asyncio.Lock() + try: + self._refresh_schemas_lock = asyncio.Lock() + except RuntimeError as rex: + # Workaround for intermittent test failure, see: + # https://github.com/simonw/datasette/issues/1802 + if "There is no current event loop in thread" in str(rex): + loop = asyncio.new_event_loop() + asyncio.set_event_loop(loop) + self._refresh_schemas_lock = asyncio.Lock() self.crossdb = crossdb self.nolock = nolock if memory or crossdb or not self.files: diff --git a/tests/test_cli.py b/tests/test_cli.py index d0f6e26c..f0d28037 100644 --- a/tests/test_cli.py +++ b/tests/test_cli.py @@ -22,13 +22,6 @@ from unittest import mock import urllib -@pytest.fixture -def ensure_eventloop(): - # Workaround for "Event loop is closed" error - if asyncio.get_event_loop().is_closed(): - asyncio.set_event_loop(asyncio.new_event_loop()) - - def test_inspect_cli(app_client): runner = CliRunner() result = runner.invoke(cli, ["inspect", "fixtures.db"]) @@ -72,7 +65,7 @@ def test_serve_with_inspect_file_prepopulates_table_counts_cache(): ), ) def test_spatialite_error_if_attempt_to_open_spatialite( - ensure_eventloop, spatialite_paths, should_suggest_load_extension + spatialite_paths, should_suggest_load_extension ): with mock.patch("datasette.utils.SPATIALITE_PATHS", spatialite_paths): runner = CliRunner() @@ -199,14 +192,14 @@ def test_version(): @pytest.mark.parametrize("invalid_port", ["-1", "0.5", "dog", "65536"]) -def test_serve_invalid_ports(ensure_eventloop, invalid_port): +def test_serve_invalid_ports(invalid_port): runner = CliRunner(mix_stderr=False) result = runner.invoke(cli, ["--port", invalid_port]) assert result.exit_code == 2 assert "Invalid value for '-p'" in result.stderr -def test_setting(ensure_eventloop): +def test_setting(): runner = CliRunner() result = runner.invoke( cli, ["--setting", "default_page_size", "5", "--get", "/-/settings.json"] @@ -215,14 +208,14 @@ def test_setting(ensure_eventloop): assert json.loads(result.output)["default_page_size"] == 5 -def test_setting_type_validation(ensure_eventloop): +def test_setting_type_validation(): runner = CliRunner(mix_stderr=False) result = runner.invoke(cli, ["--setting", "default_page_size", "dog"]) assert result.exit_code == 2 assert '"default_page_size" should be an integer' in result.stderr -def 
test_config_deprecated(ensure_eventloop): +def test_config_deprecated(): # The --config option should show a deprecation message runner = CliRunner(mix_stderr=False) result = runner.invoke( @@ -233,14 +226,14 @@ def test_config_deprecated(ensure_eventloop): assert "will be deprecated in" in result.stderr -def test_sql_errors_logged_to_stderr(ensure_eventloop): +def test_sql_errors_logged_to_stderr(): runner = CliRunner(mix_stderr=False) result = runner.invoke(cli, ["--get", "/_memory.json?sql=select+blah"]) assert result.exit_code == 1 assert "sql = 'select blah', params = {}: no such column: blah\n" in result.stderr -def test_serve_create(ensure_eventloop, tmpdir): +def test_serve_create(tmpdir): runner = CliRunner() db_path = tmpdir / "does_not_exist_yet.db" assert not db_path.exists() @@ -258,7 +251,7 @@ def test_serve_create(ensure_eventloop, tmpdir): assert db_path.exists() -def test_serve_duplicate_database_names(ensure_eventloop, tmpdir): +def test_serve_duplicate_database_names(tmpdir): "'datasette db.db nested/db.db' should attach two databases, /db and /db_2" runner = CliRunner() db_1_path = str(tmpdir / "db.db") @@ -273,7 +266,7 @@ def test_serve_duplicate_database_names(ensure_eventloop, tmpdir): assert {db["name"] for db in databases} == {"db", "db_2"} -def test_serve_deduplicate_same_database_path(ensure_eventloop, tmpdir): +def test_serve_deduplicate_same_database_path(tmpdir): "'datasette db.db db.db' should only attach one database, /db" runner = CliRunner() db_path = str(tmpdir / "db.db") @@ -287,7 +280,7 @@ def test_serve_deduplicate_same_database_path(ensure_eventloop, tmpdir): @pytest.mark.parametrize( "filename", ["test-database (1).sqlite", "database (1).sqlite"] ) -def test_weird_database_names(ensure_eventloop, tmpdir, filename): +def test_weird_database_names(tmpdir, filename): # https://github.com/simonw/datasette/issues/1181 runner = CliRunner() db_path = str(tmpdir / filename) From c9d1943aede436fa3413fd49bc56335cbda4ad07 Mon Sep 17 00:00:00 2001 From: Daniel Rech Date: Tue, 6 Sep 2022 02:45:41 +0200 Subject: [PATCH 136/952] Fix word break in facets by adding ul.tight-bullets li word-break: break-all (#1794) Thanks, @dmr --- datasette/static/app.css | 1 + 1 file changed, 1 insertion(+) diff --git a/datasette/static/app.css b/datasette/static/app.css index af3e14d5..712b9925 100644 --- a/datasette/static/app.css +++ b/datasette/static/app.css @@ -260,6 +260,7 @@ ul.bullets li { ul.tight-bullets li { list-style-type: disc; margin-bottom: 0; + word-break: break-all; } a.not-underlined { text-decoration: none; From d80775a48d20917633792fdc9525f075d3bc2c7a Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 5 Sep 2022 17:44:44 -0700 Subject: [PATCH 137/952] Raise error if it's not about loops, refs #1802 --- datasette/app.py | 2 ++ 1 file changed, 2 insertions(+) diff --git a/datasette/app.py b/datasette/app.py index c6bbdaf0..aeb81687 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -240,6 +240,8 @@ class Datasette: loop = asyncio.new_event_loop() asyncio.set_event_loop(loop) self._refresh_schemas_lock = asyncio.Lock() + else: + raise self.crossdb = crossdb self.nolock = nolock if memory or crossdb or not self.files: From 8430c3bc7dd22b173c1a8c6cd7180e3b31240cd1 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 6 Sep 2022 08:59:19 -0700 Subject: [PATCH 138/952] table facet_size in metadata, refs #1804 --- datasette/facets.py | 14 +++++++++++--- tests/test_facets.py | 17 +++++++++++++++++ 2 files changed, 28 insertions(+), 3 deletions(-) diff 
--git a/datasette/facets.py b/datasette/facets.py index b15a758c..e70d42df 100644 --- a/datasette/facets.py +++ b/datasette/facets.py @@ -102,11 +102,19 @@ class Facet: def get_facet_size(self): facet_size = self.ds.setting("default_facet_size") max_returned_rows = self.ds.setting("max_returned_rows") + table_facet_size = None + if self.table: + tables_metadata = self.ds.metadata("tables", database=self.database) or {} + table_metadata = tables_metadata.get(self.table) or {} + if table_metadata: + table_facet_size = table_metadata.get("facet_size") custom_facet_size = self.request.args.get("_facet_size") - if custom_facet_size == "max": - facet_size = max_returned_rows - elif custom_facet_size and custom_facet_size.isdigit(): + if custom_facet_size and custom_facet_size.isdigit(): facet_size = int(custom_facet_size) + elif table_facet_size: + facet_size = table_facet_size + if facet_size == "max": + facet_size = max_returned_rows return min(facet_size, max_returned_rows) async def suggest(self): diff --git a/tests/test_facets.py b/tests/test_facets.py index c28dc43c..cbee23b0 100644 --- a/tests/test_facets.py +++ b/tests/test_facets.py @@ -581,6 +581,23 @@ async def test_facet_size(): ) data5 = response5.json() assert len(data5["facet_results"]["city"]["results"]) == 20 + # Now try messing with facet_size in the table metadata + ds._metadata_local = { + "databases": { + "test_facet_size": {"tables": {"neighbourhoods": {"facet_size": 6}}} + } + } + response6 = await ds.client.get("/test_facet_size/neighbourhoods.json?_facet=city") + data6 = response6.json() + assert len(data6["facet_results"]["city"]["results"]) == 6 + # Setting it to max bumps it up to 50 again + ds._metadata_local["databases"]["test_facet_size"]["tables"]["neighbourhoods"][ + "facet_size" + ] = "max" + data7 = ( + await ds.client.get("/test_facet_size/neighbourhoods.json?_facet=city") + ).json() + assert len(data7["facet_results"]["city"]["results"]) == 20 def test_other_types_of_facet_in_metadata(): From 303c6c733d95a6133558ec1b468f5bea5827d0d2 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 6 Sep 2022 11:05:00 -0700 Subject: [PATCH 139/952] Fix for incorrectly handled _facet_size=max, refs #1804 --- datasette/facets.py | 19 +++++++++++++------ 1 file changed, 13 insertions(+), 6 deletions(-) diff --git a/datasette/facets.py b/datasette/facets.py index e70d42df..7fb0c68b 100644 --- a/datasette/facets.py +++ b/datasette/facets.py @@ -109,12 +109,19 @@ class Facet: if table_metadata: table_facet_size = table_metadata.get("facet_size") custom_facet_size = self.request.args.get("_facet_size") - if custom_facet_size and custom_facet_size.isdigit(): - facet_size = int(custom_facet_size) - elif table_facet_size: - facet_size = table_facet_size - if facet_size == "max": - facet_size = max_returned_rows + if custom_facet_size: + if custom_facet_size == "max": + facet_size = max_returned_rows + elif custom_facet_size.isdigit(): + facet_size = int(custom_facet_size) + else: + # Invalid value, ignore it + custom_facet_size = None + if table_facet_size and not custom_facet_size: + if table_facet_size == "max": + facet_size = max_returned_rows + else: + facet_size = table_facet_size return min(facet_size, max_returned_rows) async def suggest(self): From 0a7815d2038255a0834c955066a2a16c01f707b2 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 6 Sep 2022 11:06:49 -0700 Subject: [PATCH 140/952] Documentation for facet_size in metadata, closes #1804 --- docs/facets.rst | 16 ++++++++++++++++ 1 file changed, 16 
insertions(+) diff --git a/docs/facets.rst b/docs/facets.rst index 2a2eb039..6c9d99bd 100644 --- a/docs/facets.rst +++ b/docs/facets.rst @@ -129,6 +129,22 @@ You can specify :ref:`array ` or :ref:`date ] } +You can change the default facet size (the number of results shown for each facet) for a table using ``facet_size``: + +.. code-block:: json + + { + "databases": { + "sf-trees": { + "tables": { + "Street_Tree_List": { + "facets": ["qLegalStatus"], + "facet_size": 10 + } + } + } + } + } Suggested facets ---------------- From d0476897e10249bb4867473722270d02491c2c1f Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 6 Sep 2022 11:24:30 -0700 Subject: [PATCH 141/952] Fixed Sphinx warning about language = None --- docs/conf.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/conf.py b/docs/conf.py index 4ef6b768..8965974a 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -71,7 +71,7 @@ release = "" # # This is also used if you do content translation via gettext catalogs. # Usually you set "language" from the command line for these cases. -language = None +language = "en" # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. From ff9c87197dde8b09f9787ee878804cb6842ea5dc Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 6 Sep 2022 11:26:21 -0700 Subject: [PATCH 142/952] Fixed Sphinx warnings on cli-reference page --- docs/cli-reference.rst | 13 ++++++++++++- 1 file changed, 12 insertions(+), 1 deletion(-) diff --git a/docs/cli-reference.rst b/docs/cli-reference.rst index f8419d58..4a8465cb 100644 --- a/docs/cli-reference.rst +++ b/docs/cli-reference.rst @@ -14,7 +14,7 @@ Running ``datasette`` without specifying a command runs the default command, ``d import textwrap def help(args): title = "datasette " + " ".join(args) - cog.out("::\n\n") + cog.out("\n::\n\n") result = CliRunner().invoke(cli.cli, args) output = result.output.replace("Usage: cli ", "Usage: datasette ") cog.out(textwrap.indent(output, ' ')) @@ -32,6 +32,7 @@ Running ``datasette --help`` shows a list of all of the available commands. .. [[[cog help(["--help"]) .. ]]] + :: Usage: datasette [OPTIONS] COMMAND [ARGS]... @@ -77,6 +78,7 @@ Once started you can access it at ``http://localhost:8001`` .. [[[cog help(["serve", "--help"]) .. ]]] + :: Usage: datasette serve [OPTIONS] [FILES]... @@ -202,6 +204,7 @@ These can be passed to ``datasette serve`` using ``datasette serve --setting nam .. [[[cog help(["--help-settings"]) .. ]]] + :: Settings: @@ -258,6 +261,7 @@ Output JSON showing all currently installed plugins, their versions, whether the .. [[[cog help(["plugins", "--help"]) .. 
]]]
+
 ::

     Usage: datasette plugins [OPTIONS]

@@ -326,6 +330,7 @@ Would install the `datasette-cluster-map
Date: Tue, 6 Sep 2022 16:50:43 -0700
Subject: [PATCH 143/952] truncate_cells_html now affects URLs too, refs #1805

---
 datasette/utils/__init__.py | 10 ++++++++++
 datasette/views/database.py | 11 ++++++++---
 datasette/views/table.py    |  8 ++++++--
 tests/fixtures.py           |  9 +++++----
 tests/test_api.py           |  2 +-
 tests/test_table_api.py     | 11 +++++++----
 tests/test_table_html.py    | 11 +++++++++++
 tests/test_utils.py         | 20 ++++++++++++++++++++
 8 files changed, 68 insertions(+), 14 deletions(-)

diff --git a/datasette/utils/__init__.py b/datasette/utils/__init__.py
index bbaa0510..2bdea673 100644
--- a/datasette/utils/__init__.py
+++ b/datasette/utils/__init__.py
@@ -1167,3 +1167,13 @@ def resolve_routes(routes, path):
         if match is not None:
             return match, view
     return None, None
+
+
+def truncate_url(url, length):
+    if (not length) or (len(url) <= length):
+        return url
+    bits = url.rsplit(".", 1)
+    if len(bits) == 2 and 1 <= len(bits[1]) <= 4 and "/" not in bits[1]:
+        rest, ext = bits
+        return rest[: length - 1 - len(ext)] + "…." + ext
+    return url[: length - 1] + "…"
diff --git a/datasette/views/database.py b/datasette/views/database.py
index 77632b9d..fc344245 100644
--- a/datasette/views/database.py
+++ b/datasette/views/database.py
@@ -20,6 +20,7 @@ from datasette.utils import (
     path_with_format,
     path_with_removed_args,
     sqlite3,
+    truncate_url,
     InvalidSql,
 )
 from datasette.utils.asgi import AsgiFileDownload, NotFound, Response, Forbidden
@@ -371,6 +372,7 @@ class QueryView(DataView):

         async def extra_template():
             display_rows = []
+            truncate_cells = self.ds.setting("truncate_cells_html")
             for row in results.rows if results else []:
                 display_row = []
                 for column, value in zip(results.columns, row):
@@ -396,9 +398,12 @@
                     if value in ("", None):
                         display_value = Markup("&nbsp;")
                     elif is_url(str(display_value).strip()):
-                        display_value = Markup(
-                            '<a href="{url}">{url}</a>'.format(
-                                url=escape(value.strip())
+                        display_value = markupsafe.Markup(
+                            '<a href="{url}">{truncated_url}</a>'.format(
+                                url=markupsafe.escape(value.strip()),
+                                truncated_url=markupsafe.escape(
+                                    truncate_url(value.strip(), truncate_cells)
+                                ),
                             )
                         )
                     elif isinstance(display_value, bytes):
diff --git a/datasette/views/table.py b/datasette/views/table.py
index 49c30c9c..60c092f9 100644
--- a/datasette/views/table.py
+++ b/datasette/views/table.py
@@ -24,6 +24,7 @@ from datasette.utils import (
     path_with_removed_args,
     path_with_replaced_args,
     to_css_class,
+    truncate_url,
     urlsafe_components,
     value_as_boolean,
 )
@@ -966,8 +967,11 @@ async def display_columns_and_rows(
             display_value = markupsafe.Markup("&nbsp;")
         elif is_url(str(value).strip()):
             display_value = markupsafe.Markup(
-                '<a href="{url}">{url}</a>'.format(
-                    url=markupsafe.escape(value.strip())
+                '<a href="{url}">{truncated_url}</a>'.format(
+                    url=markupsafe.escape(value.strip()),
+                    truncated_url=markupsafe.escape(
+                        truncate_url(value.strip(), truncate_cells)
+                    ),
                 )
             )
         elif column in table_metadata.get("units", {}) and value != "":
diff --git a/tests/fixtures.py b/tests/fixtures.py
index c145ac78..82d8452e 100644
--- a/tests/fixtures.py
+++ b/tests/fixtures.py
@@ -598,23 +598,24 @@ CREATE TABLE roadside_attractions (
     pk integer primary key,
     name text,
     address text,
+    url text,
     latitude real,
     longitude real
 );
 INSERT INTO roadside_attractions VALUES (
-    1, "The Mystery Spot", "465 Mystery Spot Road, Santa Cruz, CA 95065",
+    1, "The Mystery Spot", "465 Mystery Spot Road, Santa Cruz, CA 95065", "https://www.mysteryspot.com/",
     37.0167, -122.0024
 );
 INSERT INTO roadside_attractions VALUES (
INSERT INTO roadside_attractions VALUES ( - 2, "Winchester Mystery House", "525 South Winchester Boulevard, San Jose, CA 95128", + 2, "Winchester Mystery House", "525 South Winchester Boulevard, San Jose, CA 95128", "https://winchestermysteryhouse.com/", 37.3184, -121.9511 ); INSERT INTO roadside_attractions VALUES ( - 3, "Burlingame Museum of PEZ Memorabilia", "214 California Drive, Burlingame, CA 94010", + 3, "Burlingame Museum of PEZ Memorabilia", "214 California Drive, Burlingame, CA 94010", null, 37.5793, -122.3442 ); INSERT INTO roadside_attractions VALUES ( - 4, "Bigfoot Discovery Museum", "5497 Highway 9, Felton, CA 95018", + 4, "Bigfoot Discovery Museum", "5497 Highway 9, Felton, CA 95018", "https://www.bigfootdiscoveryproject.com/", 37.0414, -122.0725 ); diff --git a/tests/test_api.py b/tests/test_api.py index f6db2f9d..7a2bf91f 100644 --- a/tests/test_api.py +++ b/tests/test_api.py @@ -339,7 +339,7 @@ def test_database_page(app_client): }, { "name": "roadside_attractions", - "columns": ["pk", "name", "address", "latitude", "longitude"], + "columns": ["pk", "name", "address", "url", "latitude", "longitude"], "primary_keys": ["pk"], "count": 4, "hidden": False, diff --git a/tests/test_table_api.py b/tests/test_table_api.py index e56a72b5..0db04434 100644 --- a/tests/test_table_api.py +++ b/tests/test_table_api.py @@ -615,11 +615,12 @@ def test_table_through(app_client): response = app_client.get( '/fixtures/roadside_attractions.json?_through={"table":"roadside_attraction_characteristics","column":"characteristic_id","value":"1"}' ) - assert [ + assert response.json["rows"] == [ [ 3, "Burlingame Museum of PEZ Memorabilia", "214 California Drive, Burlingame, CA 94010", + None, 37.5793, -122.3442, ], @@ -627,13 +628,15 @@ def test_table_through(app_client): 4, "Bigfoot Discovery Museum", "5497 Highway 9, Felton, CA 95018", + "https://www.bigfootdiscoveryproject.com/", 37.0414, -122.0725, ], - ] == response.json["rows"] + ] + assert ( - 'where roadside_attraction_characteristics.characteristic_id = "1"' - == response.json["human_description_en"] + response.json["human_description_en"] + == 'where roadside_attraction_characteristics.characteristic_id = "1"' ) diff --git a/tests/test_table_html.py b/tests/test_table_html.py index f3808ea3..8e37468f 100644 --- a/tests/test_table_html.py +++ b/tests/test_table_html.py @@ -69,6 +69,17 @@ def test_table_cell_truncation(): td.string for td in table.findAll("td", {"class": "col-neighborhood-b352a7"}) ] + # URLs should be truncated too + response2 = client.get("/fixtures/roadside_attractions") + assert response2.status == 200 + table = Soup(response2.body, "html.parser").find("table") + tds = table.findAll("td", {"class": "col-url"}) + assert [str(td) for td in tds] == [ + '
', + '', + '', + '', + ] def test_add_filter_redirects(app_client): diff --git a/tests/test_utils.py b/tests/test_utils.py index df788767..d71a612d 100644 --- a/tests/test_utils.py +++ b/tests/test_utils.py @@ -626,3 +626,23 @@ def test_tilde_encoding(original, expected): assert actual == expected # And test round-trip assert original == utils.tilde_decode(actual) + + +@pytest.mark.parametrize( + "url,length,expected", + ( + ("https://example.com/", 5, "http…"), + ("https://example.com/foo/bar", 15, "https://exampl…"), + ("https://example.com/foo/bar/baz.jpg", 30, "https://example.com/foo/ba….jpg"), + # Extensions longer than 4 characters are not treated specially: + ("https://example.com/foo/bar/baz.jpeg2", 30, "https://example.com/foo/bar/b…"), + ( + "https://example.com/foo/bar/baz.jpeg2", + None, + "https://example.com/foo/bar/baz.jpeg2", + ), + ), +) +def test_truncate_url(url, length, expected): + actual = utils.truncate_url(url, length) + assert actual == expected From 5aa359b86907d11b3ee601510775a85a90224da8 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 6 Sep 2022 16:58:30 -0700 Subject: [PATCH 144/952] Apply cell truncation on query page too, refs #1805 --- datasette/views/database.py | 7 ++++++- tests/test_html.py | 19 +++++++++++++++++++ 2 files changed, 25 insertions(+), 1 deletion(-) diff --git a/datasette/views/database.py b/datasette/views/database.py index fc344245..affbc540 100644 --- a/datasette/views/database.py +++ b/datasette/views/database.py @@ -428,7 +428,12 @@ class QueryView(DataView): "" if len(value) == 1 else "s", ) ) - + else: + display_value = str(value) + if truncate_cells and len(display_value) > truncate_cells: + display_value = ( + display_value[:truncate_cells] + "\u2026" + ) display_row.append(display_value) display_rows.append(display_row) diff --git a/tests/test_html.py b/tests/test_html.py index d6e969ad..bf915247 100644 --- a/tests/test_html.py +++ b/tests/test_html.py @@ -186,6 +186,25 @@ def test_row_page_does_not_truncate(): ] +def test_query_page_truncates(): + with make_app_client(settings={"truncate_cells_html": 5}) as client: + response = client.get( + "/fixtures?" 
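+            # One long plain string plus one URL in a single query, so both
+            # the plain-text truncation branch and URL truncation are exercised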
+            + urllib.parse.urlencode(
+                {
+                    "sql": "select 'this is longer than 5' as a, 'https://example.com/' as b"
+                }
+            )
+        )
+        assert response.status == 200
+        table = Soup(response.body, "html.parser").find("table")
+        tds = table.findAll("td")
+        assert [str(td) for td in tds] == [
+            "<td>this …</td>",
+            '<td><a href="https://example.com/">http…</a></td>',
+        ]
+
+
 @pytest.mark.parametrize(
     "path,expected_classes",
     [

From bf8d84af5422606597be893cedd375020cb2b369 Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Tue, 6 Sep 2022 20:34:59 -0700
Subject: [PATCH 145/952] word-wrap: anywhere on links in cells, refs #1805

---
 datasette/static/app.css | 1 +
 1 file changed, 1 insertion(+)

diff --git a/datasette/static/app.css b/datasette/static/app.css
index 712b9925..08b724f6 100644
--- a/datasette/static/app.css
+++ b/datasette/static/app.css
@@ -446,6 +446,7 @@ th {
 }
 table a:link {
     text-decoration: none;
+    word-wrap: anywhere;
 }
 .rows-and-columns td:before {
     display: block;

From fd1086c6867f3e3582b1eca456e4ea95f6cecf8b Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Fri, 9 Sep 2022 09:19:20 -0700
Subject: [PATCH 146/952] Database(is_mutable=) now defaults to True, closes #1808

Refs https://github.com/simonw/datasette-upload-dbs/issues/6

---
 datasette/database.py             | 3 +--
 docs/internals.rst                | 9 +++++----
 tests/test_internals_database.py  | 1 +
 tests/test_internals_datasette.py | 2 +-
 4 files changed, 8 insertions(+), 7 deletions(-)

diff --git a/datasette/database.py b/datasette/database.py
index fa558045..44467370 100644
--- a/datasette/database.py
+++ b/datasette/database.py
@@ -28,7 +28,7 @@ AttachedDatabase = namedtuple("AttachedDatabase", ("seq", "name", "file"))

 class Database:
     def __init__(
-        self, ds, path=None, is_mutable=False, is_memory=False, memory_name=None
+        self, ds, path=None, is_mutable=True, is_memory=False, memory_name=None
     ):
         self.name = None
         self.route = None
@@ -39,7 +39,6 @@
         self.memory_name = memory_name
         if memory_name is not None:
             self.is_memory = True
-            self.is_mutable = True
         self.hash = None
         self.cached_size = None
         self._cached_table_counts = None
diff --git a/docs/internals.rst b/docs/internals.rst
index 20797e98..adeec1d8 100644
--- a/docs/internals.rst
+++ b/docs/internals.rst
@@ -426,12 +426,13 @@ The ``db`` parameter should be an instance of the ``datasette.database.Database`
         Database(
             datasette,
             path="path/to/my-new-database.db",
-            is_mutable=True,
         )
     )

 This will add a mutable database and serve it at ``/my-new-database``.

+Use ``is_mutable=False`` to add an immutable database.
+
 ``.add_database()`` returns the Database instance, with its name set as the ``database.name`` attribute. Any time you are working with a newly added database you should use the return value of ``.add_database()``, for example:

 .. code-block:: python
@@ -671,8 +672,8 @@ Instances of the ``Database`` class can be used to execute queries against attac

 .. _database_constructor:

-Database(ds, path=None, is_mutable=False, is_memory=False, memory_name=None)
-----------------------------------------------------------------------------
+Database(ds, path=None, is_mutable=True, is_memory=False, memory_name=None)
+---------------------------------------------------------------------------

 The ``Database()`` constructor can be used by plugins, in conjunction with :ref:`datasette_add_database`, to create and register new databases.

 The arguments are as follows:

 ``path`` - string
     Path to a SQLite database file on disk.
``is_mutable`` - boolean - Set this to ``True`` if it is possible that updates will be made to that database - otherwise Datasette will open it in immutable mode and any changes could cause undesired behavior. + Set this to ``False`` to cause Datasette to open the file in immutable mode. ``is_memory`` - boolean Use this to create non-shared memory connections. diff --git a/tests/test_internals_database.py b/tests/test_internals_database.py index 551f67e1..9e81c1d6 100644 --- a/tests/test_internals_database.py +++ b/tests/test_internals_database.py @@ -499,6 +499,7 @@ def test_mtime_ns_is_none_for_memory(app_client): def test_is_mutable(app_client): + assert Database(app_client.ds, is_memory=True).is_mutable is True assert Database(app_client.ds, is_memory=True, is_mutable=True).is_mutable is True assert Database(app_client.ds, is_memory=True, is_mutable=False).is_mutable is False diff --git a/tests/test_internals_datasette.py b/tests/test_internals_datasette.py index 1dc14cab..249920fe 100644 --- a/tests/test_internals_datasette.py +++ b/tests/test_internals_datasette.py @@ -58,7 +58,7 @@ async def test_datasette_constructor(): "route": "_memory", "path": None, "size": 0, - "is_mutable": False, + "is_mutable": True, "is_memory": True, "hash": None, } From 610425460b519e9c16d386cb81aa081c9d730ef0 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sat, 10 Sep 2022 14:24:26 -0700 Subject: [PATCH 147/952] Add --nolock to the README Chrome demo Refs #1744 --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 1af20129..af95b85e 100644 --- a/README.md +++ b/README.md @@ -48,7 +48,7 @@ This will start a web server on port 8001 - visit http://localhost:8001/ to acce Use Chrome on OS X? You can run datasette against your browser history like so: - datasette ~/Library/Application\ Support/Google/Chrome/Default/History + datasette ~/Library/Application\ Support/Google/Chrome/Default/History --nolock Now visiting http://localhost:8001/History/downloads will show you a web interface to browse your downloads data: From b40872f5e5ae5dad331c58f75451e2d206565196 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Wed, 14 Sep 2022 14:31:54 -0700 Subject: [PATCH 148/952] prepare_jinja2_environment(datasette) argument, refs #1809 --- datasette/app.py | 2 +- datasette/hookspecs.py | 2 +- docs/plugin_hooks.rst | 9 +++++++-- tests/plugins/my_plugin.py | 3 ++- tests/test_plugins.py | 5 +++-- 5 files changed, 14 insertions(+), 7 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index aeb81687..db686670 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -345,7 +345,7 @@ class Datasette: self.jinja_env.filters["escape_sqlite"] = escape_sqlite self.jinja_env.filters["to_css_class"] = to_css_class # pylint: disable=no-member - pm.hook.prepare_jinja2_environment(env=self.jinja_env) + pm.hook.prepare_jinja2_environment(env=self.jinja_env, datasette=self) self._register_renderers() self._permission_checks = collections.deque(maxlen=200) diff --git a/datasette/hookspecs.py b/datasette/hookspecs.py index a5fb536f..34e19664 100644 --- a/datasette/hookspecs.py +++ b/datasette/hookspecs.py @@ -26,7 +26,7 @@ def prepare_connection(conn, database, datasette): @hookspec -def prepare_jinja2_environment(env): +def prepare_jinja2_environment(env, datasette): """Modify Jinja2 template environment e.g. 
register custom template tags""" diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index 30bd75b7..62ec5c90 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -61,12 +61,15 @@ Examples: `datasette-jellyfish `_, for @@ -85,6 +88,8 @@ You can now use this filter in your custom templates like so:: Table name: {{ table|uppercase }} +Examples: `datasette-edit-templates `_ + .. _plugin_hook_extra_template_vars: extra_template_vars(template, database, table, columns, view_name, request, datasette) diff --git a/tests/plugins/my_plugin.py b/tests/plugins/my_plugin.py index 53613b7d..d49a7a34 100644 --- a/tests/plugins/my_plugin.py +++ b/tests/plugins/my_plugin.py @@ -142,8 +142,9 @@ def extra_template_vars( @hookimpl -def prepare_jinja2_environment(env): +def prepare_jinja2_environment(env, datasette): env.filters["format_numeric"] = lambda s: f"{float(s):,.0f}" + env.filters["to_hello"] = lambda s: datasette._HELLO @hookimpl diff --git a/tests/test_plugins.py b/tests/test_plugins.py index 948a40b8..590d88f6 100644 --- a/tests/test_plugins.py +++ b/tests/test_plugins.py @@ -545,11 +545,12 @@ def test_hook_register_output_renderer_can_render(app_client): @pytest.mark.asyncio async def test_hook_prepare_jinja2_environment(app_client): + app_client.ds._HELLO = "HI" template = app_client.ds.jinja_env.from_string( - "Hello there, {{ a|format_numeric }}", {"a": 3412341} + "Hello there, {{ a|format_numeric }}, {{ a|to_hello }}", {"a": 3412341} ) rendered = await app_client.ds.render_template(template) - assert "Hello there, 3,412,341" == rendered + assert "Hello there, 3,412,341, HI" == rendered def test_hook_publish_subcommand(): From 2ebcffe2226ece2a5a86722790d486a480338632 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Fri, 16 Sep 2022 12:50:52 -0700 Subject: [PATCH 149/952] Bump furo from 2022.6.21 to 2022.9.15 (#1812) Bumps [furo](https://github.com/pradyunsg/furo) from 2022.6.21 to 2022.9.15. - [Release notes](https://github.com/pradyunsg/furo/releases) - [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md) - [Commits](https://github.com/pradyunsg/furo/compare/2022.06.21...2022.09.15) --- updated-dependencies: - dependency-name: furo dependency-type: direct:development update-type: version-update:semver-minor ... 
Signed-off-by: dependabot[bot] Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- setup.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/setup.py b/setup.py index 92fa60d0..afcba1f0 100644 --- a/setup.py +++ b/setup.py @@ -65,7 +65,7 @@ setup( setup_requires=["pytest-runner"], extras_require={ "docs": [ - "furo==2022.6.21", + "furo==2022.9.15", "sphinx-autobuild", "codespell", "blacken-docs", From ddc999ad1296e8c69cffede3e367dda059b8adad Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Fri, 16 Sep 2022 20:38:15 -0700 Subject: [PATCH 150/952] Async support for prepare_jinja2_environment, closes #1809 --- datasette/app.py | 22 ++++++++++++++--- datasette/utils/testing.py | 1 + docs/plugin_hooks.rst | 2 ++ docs/testing_plugins.rst | 30 ++++++++++++++++++++++++ tests/fixtures.py | 1 + tests/plugins/my_plugin.py | 10 ++++++-- tests/plugins/my_plugin_2.py | 6 +++++ tests/test_internals_datasette_client.py | 6 +++-- tests/test_plugins.py | 6 +++-- tests/test_routes.py | 1 + 10 files changed, 76 insertions(+), 9 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index db686670..ea3e7b43 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -208,6 +208,7 @@ class Datasette: crossdb=False, nolock=False, ): + self._startup_invoked = False assert config_dir is None or isinstance( config_dir, Path ), "config_dir= should be a pathlib.Path" @@ -344,9 +345,6 @@ class Datasette: self.jinja_env.filters["quote_plus"] = urllib.parse.quote_plus self.jinja_env.filters["escape_sqlite"] = escape_sqlite self.jinja_env.filters["to_css_class"] = to_css_class - # pylint: disable=no-member - pm.hook.prepare_jinja2_environment(env=self.jinja_env, datasette=self) - self._register_renderers() self._permission_checks = collections.deque(maxlen=200) self._root_token = secrets.token_hex(32) @@ -389,8 +387,16 @@ class Datasette: return Urls(self) async def invoke_startup(self): + # This must be called for Datasette to be in a usable state + if self._startup_invoked: + return + for hook in pm.hook.prepare_jinja2_environment( + env=self.jinja_env, datasette=self + ): + await await_me_maybe(hook) for hook in pm.hook.startup(datasette=self): await await_me_maybe(hook) + self._startup_invoked = True def sign(self, value, namespace="default"): return URLSafeSerializer(self._secret, namespace).dumps(value) @@ -933,6 +939,8 @@ class Datasette: async def render_template( self, templates, context=None, request=None, view_name=None ): + if not self._startup_invoked: + raise Exception("render_template() called before await ds.invoke_startup()") context = context or {} if isinstance(templates, Template): template = templates @@ -1495,34 +1503,42 @@ class DatasetteClient: return path async def get(self, path, **kwargs): + await self.ds.invoke_startup() async with httpx.AsyncClient(app=self.app) as client: return await client.get(self._fix(path), **kwargs) async def options(self, path, **kwargs): + await self.ds.invoke_startup() async with httpx.AsyncClient(app=self.app) as client: return await client.options(self._fix(path), **kwargs) async def head(self, path, **kwargs): + await self.ds.invoke_startup() async with httpx.AsyncClient(app=self.app) as client: return await client.head(self._fix(path), **kwargs) async def post(self, path, **kwargs): + await self.ds.invoke_startup() async with httpx.AsyncClient(app=self.app) as client: return await client.post(self._fix(path), **kwargs) async def put(self, path, **kwargs): + await 
self.ds.invoke_startup() async with httpx.AsyncClient(app=self.app) as client: return await client.put(self._fix(path), **kwargs) async def patch(self, path, **kwargs): + await self.ds.invoke_startup() async with httpx.AsyncClient(app=self.app) as client: return await client.patch(self._fix(path), **kwargs) async def delete(self, path, **kwargs): + await self.ds.invoke_startup() async with httpx.AsyncClient(app=self.app) as client: return await client.delete(self._fix(path), **kwargs) async def request(self, method, path, **kwargs): + await self.ds.invoke_startup() avoid_path_rewrites = kwargs.pop("avoid_path_rewrites", None) async with httpx.AsyncClient(app=self.app) as client: return await client.request( diff --git a/datasette/utils/testing.py b/datasette/utils/testing.py index 640c94e6..b28fc575 100644 --- a/datasette/utils/testing.py +++ b/datasette/utils/testing.py @@ -147,6 +147,7 @@ class TestClient: content_type=None, if_none_match=None, ): + await self.ds.invoke_startup() headers = headers or {} if content_type: headers["content-type"] = content_type diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index 62ec5c90..f208e727 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -88,6 +88,8 @@ You can now use this filter in your custom templates like so:: Table name: {{ table|uppercase }} +This function can return an awaitable function if it needs to run any async code. + Examples: `datasette-edit-templates <https://datasette.io/plugins/datasette-edit-templates>`_ .. _plugin_hook_extra_template_vars: diff --git a/docs/testing_plugins.rst b/docs/testing_plugins.rst index 992b4b0e..41f50e56 100644 --- a/docs/testing_plugins.rst +++ b/docs/testing_plugins.rst @@ -52,6 +52,36 @@ Then run the tests using pytest like so:: pytest +.. _testing_plugins_datasette_test_instance: + +Setting up a Datasette test instance +------------------------------------ + +The above example shows the easiest way to start writing tests against a Datasette instance: + +.. code-block:: python + + from datasette.app import Datasette + import pytest + + + @pytest.mark.asyncio + async def test_plugin_is_installed(): + datasette = Datasette(memory=True) + response = await datasette.client.get("/-/plugins.json") + assert response.status_code == 200 + +Creating a ``Datasette()`` instance like this is a useful shortcut in tests, but there is one detail you need to be aware of. It's important to ensure that the async method ``.invoke_startup()`` is called on that instance. You can do that like this: + +.. code-block:: python + + datasette = Datasette(memory=True) + await datasette.invoke_startup() + +This method registers any :ref:`plugin_hook_startup` or :ref:`plugin_hook_prepare_jinja2_environment` plugins that might themselves need to make async calls. + +If you are using ``await datasette.client.get()`` and similar methods then you don't need to worry about this - those method calls ensure that ``.invoke_startup()`` has been called for you. + ..
_testing_plugins_pdb: Using pdb for errors thrown inside Datasette diff --git a/tests/fixtures.py b/tests/fixtures.py index 82d8452e..5a875cd2 100644 --- a/tests/fixtures.py +++ b/tests/fixtures.py @@ -71,6 +71,7 @@ EXPECTED_PLUGINS = [ "handle_exception", "menu_links", "permission_allowed", + "prepare_jinja2_environment", "register_routes", "render_cell", "startup", diff --git a/tests/plugins/my_plugin.py b/tests/plugins/my_plugin.py index d49a7a34..1a41de38 100644 --- a/tests/plugins/my_plugin.py +++ b/tests/plugins/my_plugin.py @@ -143,8 +143,14 @@ def extra_template_vars( @hookimpl def prepare_jinja2_environment(env, datasette): - env.filters["format_numeric"] = lambda s: f"{float(s):,.0f}" - env.filters["to_hello"] = lambda s: datasette._HELLO + async def select_times_three(s): + db = datasette.get_database() + return (await db.execute("select 3 * ?", [int(s)])).first()[0] + + async def inner(): + env.filters["select_times_three"] = select_times_three + + return inner @hookimpl diff --git a/tests/plugins/my_plugin_2.py b/tests/plugins/my_plugin_2.py index 4df02343..cee80703 100644 --- a/tests/plugins/my_plugin_2.py +++ b/tests/plugins/my_plugin_2.py @@ -126,6 +126,12 @@ def permission_allowed(datasette, actor, action): return inner +@hookimpl +def prepare_jinja2_environment(env, datasette): + env.filters["format_numeric"] = lambda s: f"{float(s):,.0f}" + env.filters["to_hello"] = lambda s: datasette._HELLO + + @hookimpl def startup(datasette): async def inner(): diff --git a/tests/test_internals_datasette_client.py b/tests/test_internals_datasette_client.py index 8c5b5bd3..497bf475 100644 --- a/tests/test_internals_datasette_client.py +++ b/tests/test_internals_datasette_client.py @@ -1,10 +1,12 @@ from .fixtures import app_client import httpx import pytest +import pytest_asyncio -@pytest.fixture -def datasette(app_client): +@pytest_asyncio.fixture +async def datasette(app_client): + await app_client.ds.invoke_startup() return app_client.ds diff --git a/tests/test_plugins.py b/tests/test_plugins.py index 590d88f6..0ae3abf3 100644 --- a/tests/test_plugins.py +++ b/tests/test_plugins.py @@ -546,11 +546,13 @@ def test_hook_register_output_renderer_can_render(app_client): @pytest.mark.asyncio async def test_hook_prepare_jinja2_environment(app_client): app_client.ds._HELLO = "HI" + await app_client.ds.invoke_startup() template = app_client.ds.jinja_env.from_string( - "Hello there, {{ a|format_numeric }}, {{ a|to_hello }}", {"a": 3412341} + "Hello there, {{ a|format_numeric }}, {{ a|to_hello }}, {{ b|select_times_three }}", + {"a": 3412341, "b": 5}, ) rendered = await app_client.ds.render_template(template) - assert "Hello there, 3,412,341, HI" == rendered + assert "Hello there, 3,412,341, HI, 15" == rendered def test_hook_publish_subcommand(): diff --git a/tests/test_routes.py b/tests/test_routes.py index 5ae55d21..d467abe1 100644 --- a/tests/test_routes.py +++ b/tests/test_routes.py @@ -59,6 +59,7 @@ def test_routes(routes, path, expected_class, expected_matches): @pytest_asyncio.fixture async def ds_with_route(): ds = Datasette() + await ds.invoke_startup() ds.remove_database("_memory") db = Database(ds, is_memory=True, memory_name="route-name-db") ds.add_database(db, name="original-name", route="custom-route-name") From df851c117db031dec50dd4ef1ca34745920ac77a Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 19 Sep 2022 16:46:39 -0700 Subject: [PATCH 151/952] Validate settings.json keys on startup, closes #1816 Refs #1814 --- datasette/app.py | 4 ++++ tests/test_config_dir.py | 
20 ++++++++++++++++++-- 2 files changed, 22 insertions(+), 2 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index ea3e7b43..8873ce28 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -292,6 +292,10 @@ class Datasette: raise StartupError("config.json should be renamed to settings.json") if config_dir and (config_dir / "settings.json").exists() and not settings: settings = json.loads((config_dir / "settings.json").read_text()) + # Validate those settings + for key in settings: + if key not in DEFAULT_SETTINGS: + raise StartupError("Invalid setting '{key}' in settings.json") self._settings = dict(DEFAULT_SETTINGS, **(settings or {})) self.renderers = {} # File extension -> (renderer, can_render) functions self.version_note = version_note diff --git a/tests/test_config_dir.py b/tests/test_config_dir.py index fe927c42..e365515b 100644 --- a/tests/test_config_dir.py +++ b/tests/test_config_dir.py @@ -5,6 +5,7 @@ import pytest from datasette.app import Datasette from datasette.cli import cli from datasette.utils.sqlite import sqlite3 +from datasette.utils import StartupError from .fixtures import TestClient as _TestClient from click.testing import CliRunner @@ -27,9 +28,8 @@ body { margin-top: 3em} @pytest.fixture(scope="session") -def config_dir_client(tmp_path_factory): +def config_dir(tmp_path_factory): config_dir = tmp_path_factory.mktemp("config-dir") - plugins_dir = config_dir / "plugins" plugins_dir.mkdir() (plugins_dir / "hooray.py").write_text(PLUGIN, "utf-8") @@ -77,7 +77,23 @@ def config_dir_client(tmp_path_factory): ), "utf-8", ) + return config_dir + +def test_invalid_settings(config_dir): + previous = (config_dir / "settings.json").read_text("utf-8") + (config_dir / "settings.json").write_text( + json.dumps({"invalid": "invalid-setting"}), "utf-8" + ) + try: + with pytest.raises(StartupError): + ds = Datasette([], config_dir=config_dir) + finally: + (config_dir / "settings.json").write_text(previous, "utf-8") + + +@pytest.fixture(scope="session") +def config_dir_client(config_dir): ds = Datasette([], config_dir=config_dir) yield _TestClient(ds) From cb1e093fd361b758120aefc1a444df02462389a3 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 19 Sep 2022 18:15:40 -0700 Subject: [PATCH 152/952] Fixed error message, closes #1816 --- datasette/app.py | 4 +++- tests/test_config_dir.py | 3 ++- 2 files changed, 5 insertions(+), 2 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index 8873ce28..03d1dacc 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -295,7 +295,9 @@ class Datasette: # Validate those settings for key in settings: if key not in DEFAULT_SETTINGS: - raise StartupError("Invalid setting '{key}' in settings.json") + raise StartupError( + "Invalid setting '{}' in settings.json".format(key) + ) self._settings = dict(DEFAULT_SETTINGS, **(settings or {})) self.renderers = {} # File extension -> (renderer, can_render) functions self.version_note = version_note diff --git a/tests/test_config_dir.py b/tests/test_config_dir.py index e365515b..f5ecf0d6 100644 --- a/tests/test_config_dir.py +++ b/tests/test_config_dir.py @@ -86,8 +86,9 @@ def test_invalid_settings(config_dir): json.dumps({"invalid": "invalid-setting"}), "utf-8" ) try: - with pytest.raises(StartupError): + with pytest.raises(StartupError) as ex: ds = Datasette([], config_dir=config_dir) + assert ex.value.args[0] == "Invalid setting 'invalid' in settings.json" finally: (config_dir / "settings.json").write_text(previous, "utf-8") From 212137a90b4291db9605e039f198564dae59c5d0 Mon 
Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 26 Sep 2022 14:14:25 -0700 Subject: [PATCH 153/952] Release 0.63a0 Refs #1786, #1787, #1789, #1794, #1800, #1804, #1805, #1808, #1809, #1816 --- datasette/version.py | 2 +- docs/changelog.rst | 17 +++++++++++++++++ 2 files changed, 18 insertions(+), 1 deletion(-) diff --git a/datasette/version.py b/datasette/version.py index 0453346c..e5ad585f 100644 --- a/datasette/version.py +++ b/datasette/version.py @@ -1,2 +1,2 @@ -__version__ = "0.62" +__version__ = "0.63a0" __version_info__ = tuple(__version__.split(".")) diff --git a/docs/changelog.rst b/docs/changelog.rst index f9dcc980..bd93f4cb 100644 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -4,6 +4,23 @@ Changelog ========= +.. _v0_63a0: + +0.63a0 (2022-09-26) +------------------- + +- The :ref:`plugin_hook_prepare_jinja2_environment` plugin hook now accepts an optional ``datasette`` argument. Hook implementations can also now return an ``async`` function which will be awaited automatically. (:issue:`1809`) +- ``--load-extension`` option now supports entrypoints. Thanks, Alex Garcia. (`#1789 <https://github.com/simonw/datasette/pull/1789>`__) +- New tutorial: `Cleaning data with sqlite-utils and Datasette <https://datasette.io/tutorials/clean-data>`__. +- Facet size can now be set per-table with the new ``facet_size`` table metadata option. (:issue:`1804`) +- ``truncate_cells_html`` setting now also affects long URLs in columns. (:issue:`1805`) +- ``Database(is_mutable=)`` now defaults to ``True``. (:issue:`1808`) +- Non-JavaScript textarea now increases height to fit the SQL query. (:issue:`1786`) +- More detailed command descriptions on the :ref:`CLI reference <cli_reference>` page. (:issue:`1787`) +- Datasette no longer enforces upper bounds on its dependencies. (:issue:`1800`) +- Facets are now displayed with better line-breaks in long values. Thanks, Daniel Rech. (`#1794 <https://github.com/simonw/datasette/pull/1794>`__) +- The ``settings.json`` file used in :ref:`config_dir` is now validated on startup. (:issue:`1816`) + .. _v0_62: 0.62 (2022-08-14) From 5f9f567acbc58c9fcd88af440e68034510fb5d2b Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Mon, 26 Sep 2022 16:06:01 -0700 Subject: [PATCH 154/952] Show SQL query when reporting time limit error, closes #1819 --- datasette/database.py | 5 ++++- datasette/views/base.py | 21 +++++++++++++-------- tests/test_api.py | 12 +++++++++++- tests/test_html.py | 10 +++++++--- 4 files changed, 35 insertions(+), 13 deletions(-) diff --git a/datasette/database.py b/datasette/database.py index 44467370..46094bd7 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -476,7 +476,10 @@ class WriteTask: class QueryInterrupted(Exception): - pass + def __init__(self, e, sql, params): + self.e = e + self.sql = sql + self.params = params class MultipleValues(Exception): diff --git a/datasette/views/base.py b/datasette/views/base.py index 221e1882..67aa3a42 100644 --- a/datasette/views/base.py +++ b/datasette/views/base.py @@ -1,10 +1,12 @@ import asyncio import csv import hashlib -import re import sys +import textwrap import time import urllib +from markupsafe import escape + import pint @@ -24,11 +26,9 @@ from datasette.utils import ( path_with_removed_args, path_with_format, sqlite3, - HASH_LENGTH, ) from datasette.utils.asgi import ( AsgiStream, - Forbidden, NotFound, Response, BadRequest, @@ -371,13 +371,18 @@ class DataView(BaseView): ) = response_or_template_contexts else: data, extra_template_data, templates = response_or_template_contexts - except QueryInterrupted: + except QueryInterrupted as ex: raise DatasetteError( - """ - SQL query took too long.
The time limit is controlled by the - <a href="https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms">sql_time_limit_ms</a> - configuration option. - """, + textwrap.dedent( + """ + <p>SQL query took too long. The time limit is controlled by the + <a href="https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms">sql_time_limit_ms</a> + configuration option.</p> + <pre>{}</pre> + """.format( + escape(ex.sql) + ) + ).strip(), title="SQL Interrupted", status=400, message_is_html=True, diff --git a/tests/test_api.py b/tests/test_api.py index 7a2bf91f..ad74d16e 100644 --- a/tests/test_api.py +++ b/tests/test_api.py @@ -656,7 +656,17 @@ def test_custom_sql(app_client): def test_sql_time_limit(app_client_shorter_time_limit): response = app_client_shorter_time_limit.get("/fixtures.json?sql=select+sleep(0.5)") assert 400 == response.status - assert "SQL Interrupted" == response.json["title"] + assert response.json == { + "ok": False, + "error": ( + "<p>SQL query took too long. The time limit is controlled by the\n" + '<a href="https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms">sql_time_limit_ms</a>\n' + "configuration option.</p>\n" + "<pre>select sleep(0.5)</pre>" + ), + "status": 400, + "title": "SQL Interrupted", + } def test_custom_sql_time_limit(app_client): diff --git a/tests/test_html.py b/tests/test_html.py index bf915247..a99b0b6c 100644 --- a/tests/test_html.py +++ b/tests/test_html.py @@ -168,10 +168,14 @@ def test_disallowed_custom_sql_pragma(app_client): def test_sql_time_limit(app_client_shorter_time_limit): response = app_client_shorter_time_limit.get("/fixtures?sql=select+sleep(0.5)") assert 400 == response.status - expected_html_fragment = """ + expected_html_fragments = [ + """ <a href="https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms">sql_time_limit_ms</a> - """.strip() - assert expected_html_fragment in response.text + """.strip(), + "<pre>select sleep(0.5)</pre>
", + ] + for expected_html_fragment in expected_html_fragments: + assert expected_html_fragment in response.text def test_row_page_does_not_truncate(): From 7fb4ea4e39a15e1f7d3202949794d98af1cfa272 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 27 Sep 2022 21:06:40 -0700 Subject: [PATCH 155/952] Update note about render_cell signature, refs #1826 --- docs/plugin_hooks.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index f208e727..c9cab8ab 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -9,7 +9,7 @@ Each plugin can implement one or more hooks using the ``@hookimpl`` decorator ag When you implement a plugin hook you can accept any or all of the parameters that are documented as being passed to that hook. -For example, you can implement the ``render_cell`` plugin hook like this even though the full documented hook signature is ``render_cell(value, column, table, database, datasette)``: +For example, you can implement the ``render_cell`` plugin hook like this even though the full documented hook signature is ``render_cell(row, value, column, table, database, datasette)``: .. code-block:: python From 984b1df12cf19a6731889fc0665bb5f622e07b7c Mon Sep 17 00:00:00 2001 From: Adam Simpson Date: Wed, 28 Sep 2022 00:21:36 -0400 Subject: [PATCH 156/952] Add documentation for serving via OpenRC (#1825) * Add documentation for serving via OpenRC --- docs/deploying.rst | 30 +++++++++++++++++++++--------- 1 file changed, 21 insertions(+), 9 deletions(-) diff --git a/docs/deploying.rst b/docs/deploying.rst index d4ad8836..c8552758 100644 --- a/docs/deploying.rst +++ b/docs/deploying.rst @@ -74,18 +74,30 @@ Once the service has started you can confirm that Datasette is running on port 8 curl 127.0.0.1:8000/-/versions.json # Should output JSON showing the installed version -Datasette will not be accessible from outside the server because it is listening on ``127.0.0.1``. You can expose it by instead listening on ``0.0.0.0``, but a better way is to set up a proxy such as ``nginx``. +Datasette will not be accessible from outside the server because it is listening on ``127.0.0.1``. You can expose it by instead listening on ``0.0.0.0``, but a better way is to set up a proxy such as ``nginx`` - see :ref:`deploying_proxy`. -Ubuntu offer `a tutorial on installing nginx `__. Once it is installed you can add configuration to proxy traffic through to Datasette that looks like this:: +.. _deploying_openrc: - server { - server_name mysubdomain.myhost.net; +Running Datasette using OpenRC +=============================== +OpenRC is the service manager on non-systemd Linux distributions like `Alpine Linux `__ and `Gentoo `__. - location / { - proxy_pass http://127.0.0.1:8000/; - proxy_set_header Host $host; - } - } +Create an init script at ``/etc/init.d/datasette`` with the following contents: + +.. code-block:: sh + + #!/sbin/openrc-run + + name="datasette" + command="datasette" + command_args="serve -h 0.0.0.0 /path/to/db.db" + command_background=true + pidfile="/run/${RC_SVCNAME}.pid" + +You then need to configure the service to run at boot and start it:: + + rc-update add datasette + rc-service datasette start .. 
_deploying_buildpacks: From 34defdc10aa293294ca01cfab70780755447e1d7 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Wed, 28 Sep 2022 17:39:36 -0700 Subject: [PATCH 157/952] Browse the plugins directory --- docs/writing_plugins.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/writing_plugins.rst b/docs/writing_plugins.rst index 01ee8c90..a3fc88ec 100644 --- a/docs/writing_plugins.rst +++ b/docs/writing_plugins.rst @@ -234,7 +234,7 @@ To avoid accidentally conflicting with a database file that may be loaded into D - ``/-/upload-excel`` -Try to avoid registering URLs that clash with other plugins that your users might have installed. There is no central repository of reserved URL paths (yet) but you can review existing plugins by browsing the `datasette-plugin topic `__ on GitHub. +Try to avoid registering URLs that clash with other plugins that your users might have installed. There is no central repository of reserved URL paths (yet) but you can review existing plugins by browsing the `plugins directory `. If your plugin includes functionality that relates to a specific database you could also register a URL route like this: From c92c4318e9892101f75fa158410c0a12c1d80b6e Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Fri, 30 Sep 2022 10:55:40 -0700 Subject: [PATCH 158/952] Bump furo from 2022.9.15 to 2022.9.29 (#1827) Bumps [furo](https://github.com/pradyunsg/furo) from 2022.9.15 to 2022.9.29. - [Release notes](https://github.com/pradyunsg/furo/releases) - [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md) - [Commits](https://github.com/pradyunsg/furo/compare/2022.09.15...2022.09.29) --- updated-dependencies: - dependency-name: furo dependency-type: direct:development update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- setup.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/setup.py b/setup.py index afcba1f0..fe258adb 100644 --- a/setup.py +++ b/setup.py @@ -65,7 +65,7 @@ setup( setup_requires=["pytest-runner"], extras_require={ "docs": [ - "furo==2022.9.15", + "furo==2022.9.29", "sphinx-autobuild", "codespell", "blacken-docs", From 883e326dd6ef95f854f7750ef2d4b0e17082fa96 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 2 Oct 2022 14:26:16 -0700 Subject: [PATCH 159/952] Drop word-wrap: anywhere, refs #1828, #1805 --- datasette/static/app.css | 1 - 1 file changed, 1 deletion(-) diff --git a/datasette/static/app.css b/datasette/static/app.css index 08b724f6..712b9925 100644 --- a/datasette/static/app.css +++ b/datasette/static/app.css @@ -446,7 +446,6 @@ th { } table a:link { text-decoration: none; - word-wrap: anywhere; } .rows-and-columns td:before { display: block; From 4218c9cd742b79b1e3cb80878e42b7e39d16ded2 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 4 Oct 2022 11:45:36 -0700 Subject: [PATCH 160/952] reST markup fix --- docs/plugin_hooks.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index c9cab8ab..832a76b0 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -268,7 +268,7 @@ you have one: def extra_js_urls(): return ["/-/static-plugins/your-plugin/app.js"] -Note that `your-plugin` here should be the hyphenated plugin name - the name that is displayed in the list on the `/-/plugins` debug page. 
+Note that ``your-plugin`` here should be the hyphenated plugin name - the name that is displayed in the list on the ``/-/plugins`` debug page. If your code uses `JavaScript modules <https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules>`__ you should include the ``"module": True`` key. See :ref:`customization_css_and_javascript` for more details. From b6ba117b7978b58b40e3c3c2b723b92c3010ed53 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 4 Oct 2022 18:25:52 -0700 Subject: [PATCH 161/952] Clarify request or None for two hooks --- docs/plugin_hooks.rst | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst index 832a76b0..b61f953a 100644 --- a/docs/plugin_hooks.rst +++ b/docs/plugin_hooks.rst @@ -1281,7 +1281,7 @@ menu_links(datasette, actor, request) ``actor`` - dictionary or None The currently authenticated :ref:`actor <actors>`. -``request`` - :ref:`internals_request` +``request`` - :ref:`internals_request` or None The current HTTP request. This can be ``None`` if the request object is not available. This hook allows additional items to be included in the menu displayed by Datasette's top right menu icon. @@ -1330,7 +1330,7 @@ table_actions(datasette, actor, database, table, request) ``table`` - string The name of the table. -``request`` - :ref:`internals_request` +``request`` - :ref:`internals_request` or None The current HTTP request. This can be ``None`` if the request object is not available. This hook allows table actions to be displayed in a menu accessed via an action icon at the top of the table page. It should return a list of ``{"href": "...", "label": "..."}`` menu items. From bbf33a763537a1d913180b22bd3b5fe4a5e5b252 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 4 Oct 2022 21:32:11 -0700 Subject: [PATCH 162/952] Test for bool(results), closes #1832 --- tests/test_internals_database.py | 8 ++++++++ 1 file changed, 8 insertions(+) diff --git a/tests/test_internals_database.py b/tests/test_internals_database.py index 9e81c1d6..4e33beed 100644 --- a/tests/test_internals_database.py +++ b/tests/test_internals_database.py @@ -30,6 +30,14 @@ async def test_results_first(db): assert isinstance(row, sqlite3.Row) +@pytest.mark.asyncio +@pytest.mark.parametrize("expected", (True, False)) +async def test_results_bool(db, expected): + where = "" if expected else "where pk = 0" + results = await db.execute("select * from facetable {}".format(where)) + assert bool(results) is expected + + @pytest.mark.parametrize( "query,expected", [ From eff112498ecc499323c26612d707908831446d25 Mon Sep 17 00:00:00 2001 From: Forest Gregg Date: Thu, 6 Oct 2022 16:06:06 -0400 Subject: [PATCH 163/952] Use inspect data for hash and file size on startup Thanks, @fgregg Closes #1834 --- datasette/database.py | 10 +++++++--- 1 file changed, 7 insertions(+), 3 deletions(-) diff --git a/datasette/database.py b/datasette/database.py index 46094bd7..d75bd70c 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -48,9 +48,13 @@ class Database: self._read_connection = None self._write_connection = None if not self.is_mutable and not self.is_memory: - p = Path(path) - self.hash = inspect_hash(p) - self.cached_size = p.stat().st_size + if self.ds.inspect_data and self.ds.inspect_data.get(self.name): + self.hash = self.ds.inspect_data[self.name]["hash"] + self.cached_size = self.ds.inspect_data[self.name]["size"] + else: + p = Path(path) + self.hash = inspect_hash(p) + self.cached_size = p.stat().st_size
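The change above relies on an inspect file generated ahead of time. A hedged sketch of that workflow, using a hypothetical database filename (both commands are documented elsewhere in this history, in the ``settings.rst`` notes on configuration directory mode):

.. code-block:: sh

    # Pre-compute the hash and file size for an immutable database
    datasette inspect fixtures.db --inspect-file=inspect-data.json
    # Serve the file in immutable mode, reusing the cached inspect data at startup
    datasette serve -i fixtures.db --inspect-file=inspect-data.json

From b7fec7f9020b79c1fe60cc5a2def86b50eeb5af9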
Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Fri, 7 Oct 2022 16:03:09 -0700 Subject: [PATCH 164/952] .sqlite/.sqlite3 extensions for config directory mode Closes #1646 --- datasette/app.py | 5 ++++- docs/settings.rst | 2 +- tests/test_config_dir.py | 11 +++++------ 3 files changed, 10 insertions(+), 8 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index 03d1dacc..32a911c2 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -217,7 +217,10 @@ class Datasette: self._secret = secret or secrets.token_hex(32) self.files = tuple(files or []) + tuple(immutables or []) if config_dir: - self.files += tuple([str(p) for p in config_dir.glob("*.db")]) + db_files = [] + for ext in ("db", "sqlite", "sqlite3"): + db_files.extend(config_dir.glob("*.{}".format(ext))) + self.files += tuple(str(f) for f in db_files) if ( config_dir and (config_dir / "inspect-data.json").exists() diff --git a/docs/settings.rst b/docs/settings.rst index 8437fb04..a6d50543 100644 --- a/docs/settings.rst +++ b/docs/settings.rst @@ -46,7 +46,7 @@ Datasette will detect the files in that directory and automatically configure it The files that can be included in this directory are as follows. All are optional. -* ``*.db`` - SQLite database files that will be served by Datasette +* ``*.db`` (or ``*.sqlite3`` or ``*.sqlite``) - SQLite database files that will be served by Datasette * ``metadata.json`` - :ref:`metadata` for those databases - ``metadata.yaml`` or ``metadata.yml`` can be used as well * ``inspect-data.json`` - the result of running ``datasette inspect *.db --inspect-file=inspect-data.json`` from the configuration directory - any database files listed here will be treated as immutable, so they should not be changed while Datasette is running * ``settings.json`` - settings that would normally be passed using ``--setting`` - here they should be stored as a JSON object of key/value pairs diff --git a/tests/test_config_dir.py b/tests/test_config_dir.py index f5ecf0d6..c2af3836 100644 --- a/tests/test_config_dir.py +++ b/tests/test_config_dir.py @@ -49,7 +49,7 @@ def config_dir(tmp_path_factory): (config_dir / "metadata.json").write_text(json.dumps(METADATA), "utf-8") (config_dir / "settings.json").write_text(json.dumps(SETTINGS), "utf-8") - for dbname in ("demo.db", "immutable.db"): + for dbname in ("demo.db", "immutable.db", "j.sqlite3", "k.sqlite"): db = sqlite3.connect(str(config_dir / dbname)) db.executescript( """ @@ -151,12 +151,11 @@ def test_databases(config_dir_client): response = config_dir_client.get("/-/databases.json") assert 200 == response.status databases = response.json - assert 2 == len(databases) + assert 4 == len(databases) databases.sort(key=lambda d: d["name"]) - assert "demo" == databases[0]["name"] - assert databases[0]["is_mutable"] - assert "immutable" == databases[1]["name"] - assert not databases[1]["is_mutable"] + for db, expected_name in zip(databases, ("demo", "immutable", "j", "k")): + assert expected_name == db["name"] + assert db["is_mutable"] == (expected_name != "immutable") @pytest.mark.parametrize("filename", ("metadata.yml", "metadata.yaml")) From 1a5e5f2aa951e5bd731067a49819efba68fbe8ef Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 13 Oct 2022 14:42:52 -0700 Subject: [PATCH 165/952] Refactor breadcrumbs to respect permissions, refs #1831 --- datasette/app.py | 40 ++++++++++++++++++++++ datasette/templates/_crumbs.html | 15 ++++++++ datasette/templates/base.html | 4 +-- datasette/templates/database.html | 9 ----- datasette/templates/error.html | 7 ---- 
datasette/templates/logout.html | 7 ---- datasette/templates/permissions_debug.html | 7 ---- datasette/templates/query.html | 8 ++--- datasette/templates/row.html | 9 ++--- datasette/templates/show_json.html | 7 ---- datasette/templates/table.html | 8 ++--- tests/test_permissions.py | 1 + tests/test_plugins.py | 2 +- 13 files changed, 65 insertions(+), 59 deletions(-) create mode 100644 datasette/templates/_crumbs.html diff --git a/datasette/app.py b/datasette/app.py index 32a911c2..5fa4955c 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -631,6 +631,44 @@ class Datasette: else: return [] + async def _crumb_items(self, request, table=None, database=None): + crumbs = [] + # Top-level link + if await self.permission_allowed( + actor=request.actor, action="view-instance", default=True + ): + crumbs.append({"href": self.urls.instance(), "label": "home"}) + # Database link + if database: + if await self.permission_allowed( + actor=request.actor, + action="view-database", + resource=database, + default=True, + ): + crumbs.append( + { + "href": self.urls.database(database), + "label": database, + } + ) + # Table link + if table: + assert database, "table= requires database=" + if await self.permission_allowed( + actor=request.actor, + action="view-table", + resource=(database, table), + default=True, + ): + crumbs.append( + { + "href": self.urls.table(database, table), + "label": table, + } + ) + return crumbs + async def permission_allowed(self, actor, action, resource=None, default=False): """Check permissions using the permissions_allowed plugin hook""" result = None @@ -1009,6 +1047,8 @@ class Datasette: template_context = { **context, **{ + "request": request, + "crumb_items": self._crumb_items, "urls": self.urls, "actor": request.actor if request else None, "menu_links": menu_links, diff --git a/datasette/templates/_crumbs.html b/datasette/templates/_crumbs.html new file mode 100644 index 00000000..bd1ff0da --- /dev/null +++ b/datasette/templates/_crumbs.html @@ -0,0 +1,15 @@ +{% macro nav(request, database=None, table=None) -%} +{% if crumb_items is defined %} + {% set items=crumb_items(request=request, database=database, table=table) %} + {% if items %} +
<p class="crumbs"> + {% for item in items %} + <a href="{{ item.href }}">{{ item.label }}</a> + {% if not loop.last %} + / + {% endif %} + {% endfor %} + </p>
+ {% endif %} +{% endif %} +{%- endmacro %} diff --git a/datasette/templates/base.html b/datasette/templates/base.html index c3a71acb..87c939ac 100644 --- a/datasette/templates/base.html +++ b/datasette/templates/base.html @@ -1,4 +1,4 @@ -<!DOCTYPE html> +{% import "_crumbs.html" as crumbs with context %}<!DOCTYPE html> <html> <head> <title>{% block title %}{% endblock %}</title> @@ -17,7 +17,7 @@
+ Content-Type: application/json + Authorization: Bearer dstok_ + { + "row": { + "column1": "value1", + "column2": "value2" + } + } + +If successful, this will return a ``201`` status code and the newly inserted row, for example: + +.. code-block:: json + + { + "row": { + "id": 1, + "column1": "value1", + "column2": "value2" + } + } From f6ca86987ba9d7d48eccf2cfe0bfc94942003844 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 27 Oct 2022 06:56:11 -0700 Subject: [PATCH 197/952] Delete mirror-master-and-main.yml Closes #1865 --- .github/workflows/mirror-master-and-main.yml | 21 -------------------- 1 file changed, 21 deletions(-) delete mode 100644 .github/workflows/mirror-master-and-main.yml diff --git a/.github/workflows/mirror-master-and-main.yml b/.github/workflows/mirror-master-and-main.yml deleted file mode 100644 index 8418df40..00000000 --- a/.github/workflows/mirror-master-and-main.yml +++ /dev/null @@ -1,21 +0,0 @@ -name: Mirror "master" and "main" branches -on: - push: - branches: - - master - - main - -jobs: - mirror: - runs-on: ubuntu-latest - steps: - - name: Mirror to "master" - uses: zofrex/mirror-branch@ea152f124954fa4eb26eea3fe0dbe313a3a08d94 - with: - target-branch: master - force: false - - name: Mirror to "main" - uses: zofrex/mirror-branch@ea152f124954fa4eb26eea3fe0dbe313a3a08d94 - with: - target-branch: main - force: false From 5f6be3c48b661f74198b8fc85361d3ad6657880e Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 27 Oct 2022 11:47:41 -0700 Subject: [PATCH 198/952] Better comment handling in SQL regex, refs #1860 --- datasette/utils/__init__.py | 9 +++++---- tests/test_utils.py | 1 + 2 files changed, 6 insertions(+), 4 deletions(-) diff --git a/datasette/utils/__init__.py b/datasette/utils/__init__.py index 977a66d6..5acfb8b4 100644 --- a/datasette/utils/__init__.py +++ b/datasette/utils/__init__.py @@ -208,16 +208,16 @@ class InvalidSql(Exception): # Allow SQL to start with a /* */ or -- comment comment_re = ( # Start of string, then any amount of whitespace - r"^(\s*" + r"^\s*(" + # Comment that starts with -- and ends at a newline r"(?:\-\-.*?\n\s*)" + - # Comment that starts with /* and ends with */ - r"|(?:/\*[\s\S]*?\*/)" + # Comment that starts with /* and ends with */ - but does not have */ in it + r"|(?:\/\*((?!\*\/)[\s\S])*\*\/)" + # Whitespace - r")*\s*" + r"\s*)*\s*" ) allowed_sql_res = [ @@ -228,6 +228,7 @@ allowed_sql_res = [ re.compile(comment_re + r"explain\s+with\b"), re.compile(comment_re + r"explain\s+query\s+plan\s+with\b"), ] + allowed_pragmas = ( "database_list", "foreign_key_list", diff --git a/tests/test_utils.py b/tests/test_utils.py index e89f1e6b..c1589107 100644 --- a/tests/test_utils.py +++ b/tests/test_utils.py @@ -142,6 +142,7 @@ def test_custom_json_encoder(obj, expected): "PRAGMA case_sensitive_like = true", "SELECT * FROM pragma_not_on_allow_list('idx52')", "/* This comment is not valid. 
select 1", + "/**/\nupdate foo set bar = 1\n/* test */ select 1", ], ) def test_validate_sql_select_bad(bad_sql): From d2ca13b699d441a201c55cb72ff96919d3cd22bf Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 27 Oct 2022 11:50:54 -0700 Subject: [PATCH 199/952] Add test for /* multi line */ comment, refs #1860 --- tests/test_utils.py | 1 + 1 file changed, 1 insertion(+) diff --git a/tests/test_utils.py b/tests/test_utils.py index c1589107..8b64f865 100644 --- a/tests/test_utils.py +++ b/tests/test_utils.py @@ -174,6 +174,7 @@ def test_validate_sql_select_bad(bad_sql): " /* comment */\nselect 1", " /* comment */select 1", "/* comment */\n -- another\n /* one more */ select 1", + "/* This comment \n has multiple lines */\nselect 1", ], ) def test_validate_sql_select_good(good_sql): From 918f3561208ee58c44773d30e21bace7d7c7cf3b Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 27 Oct 2022 06:56:11 -0700 Subject: [PATCH 200/952] Delete mirror-master-and-main.yml Closes #1865 --- .github/workflows/mirror-master-and-main.yml | 21 -------------------- 1 file changed, 21 deletions(-) delete mode 100644 .github/workflows/mirror-master-and-main.yml diff --git a/.github/workflows/mirror-master-and-main.yml b/.github/workflows/mirror-master-and-main.yml deleted file mode 100644 index 8418df40..00000000 --- a/.github/workflows/mirror-master-and-main.yml +++ /dev/null @@ -1,21 +0,0 @@ -name: Mirror "master" and "main" branches -on: - push: - branches: - - master - - main - -jobs: - mirror: - runs-on: ubuntu-latest - steps: - - name: Mirror to "master" - uses: zofrex/mirror-branch@ea152f124954fa4eb26eea3fe0dbe313a3a08d94 - with: - target-branch: master - force: false - - name: Mirror to "main" - uses: zofrex/mirror-branch@ea152f124954fa4eb26eea3fe0dbe313a3a08d94 - with: - target-branch: main - force: false From b597bb6b3e7c4b449654bbfa5b01ceff3eb3cb33 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 27 Oct 2022 11:47:41 -0700 Subject: [PATCH 201/952] Better comment handling in SQL regex, refs #1860 --- datasette/utils/__init__.py | 9 +++++---- tests/test_utils.py | 1 + 2 files changed, 6 insertions(+), 4 deletions(-) diff --git a/datasette/utils/__init__.py b/datasette/utils/__init__.py index 977a66d6..5acfb8b4 100644 --- a/datasette/utils/__init__.py +++ b/datasette/utils/__init__.py @@ -208,16 +208,16 @@ class InvalidSql(Exception): # Allow SQL to start with a /* */ or -- comment comment_re = ( # Start of string, then any amount of whitespace - r"^(\s*" + r"^\s*(" + # Comment that starts with -- and ends at a newline r"(?:\-\-.*?\n\s*)" + - # Comment that starts with /* and ends with */ - r"|(?:/\*[\s\S]*?\*/)" + # Comment that starts with /* and ends with */ - but does not have */ in it + r"|(?:\/\*((?!\*\/)[\s\S])*\*\/)" + # Whitespace - r")*\s*" + r"\s*)*\s*" ) allowed_sql_res = [ @@ -228,6 +228,7 @@ allowed_sql_res = [ re.compile(comment_re + r"explain\s+with\b"), re.compile(comment_re + r"explain\s+query\s+plan\s+with\b"), ] + allowed_pragmas = ( "database_list", "foreign_key_list", diff --git a/tests/test_utils.py b/tests/test_utils.py index e89f1e6b..c1589107 100644 --- a/tests/test_utils.py +++ b/tests/test_utils.py @@ -142,6 +142,7 @@ def test_custom_json_encoder(obj, expected): "PRAGMA case_sensitive_like = true", "SELECT * FROM pragma_not_on_allow_list('idx52')", "/* This comment is not valid. 
select 1", + "/**/\nupdate foo set bar = 1\n/* test */ select 1", ], ) def test_validate_sql_select_bad(bad_sql): From 6958e21b5c2012adf5655d2512cb4106490d10f2 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 27 Oct 2022 11:50:54 -0700 Subject: [PATCH 202/952] Add test for /* multi line */ comment, refs #1860 --- tests/test_utils.py | 1 + 1 file changed, 1 insertion(+) diff --git a/tests/test_utils.py b/tests/test_utils.py index c1589107..8b64f865 100644 --- a/tests/test_utils.py +++ b/tests/test_utils.py @@ -174,6 +174,7 @@ def test_validate_sql_select_bad(bad_sql): " /* comment */\nselect 1", " /* comment */select 1", "/* comment */\n -- another\n /* one more */ select 1", + "/* This comment \n has multiple lines */\nselect 1", ], ) def test_validate_sql_select_good(good_sql): From a51608090b5ee37593078f71d18b33767ef3af79 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 27 Oct 2022 12:06:18 -0700 Subject: [PATCH 203/952] Slight tweak to insert row API design, refs #1851 https://github.com/simonw/datasette/issues/1851#issuecomment-1292997608 --- datasette/views/table.py | 10 +++++----- docs/json_api.rst | 4 ++-- 2 files changed, 7 insertions(+), 7 deletions(-) diff --git a/datasette/views/table.py b/datasette/views/table.py index 74d1c532..056b7b04 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -131,11 +131,11 @@ class TableView(DataView): # TODO: handle form-encoded data raise BadRequest("Must send JSON data") data = json.loads(await request.post_body()) - if "row" not in data: - raise BadRequest('Must send "row" data') - row = data["row"] + if "insert" not in data: + raise BadRequest('Must send a "insert" key containing a dictionary') + row = data["insert"] if not isinstance(row, dict): - raise BadRequest("row must be a dictionary") + raise BadRequest("insert must be a dictionary") # Verify all columns exist columns = await db.table_columns(table_name) pks = await db.primary_keys(table_name) @@ -165,7 +165,7 @@ class TableView(DataView): ).first() return Response.json( { - "row": dict(new_row), + "inserted_row": dict(new_row), }, status=201, ) diff --git a/docs/json_api.rst b/docs/json_api.rst index b339a738..2ed8a354 100644 --- a/docs/json_api.rst +++ b/docs/json_api.rst @@ -476,7 +476,7 @@ This requires the :ref:`permissions_insert_row` permission. Content-Type: application/json Authorization: Bearer dstok_ { - "row": { + "insert": { "column1": "value1", "column2": "value2" } @@ -487,7 +487,7 @@ If successful, this will return a ``201`` status code and the newly inserted row .. 
code-block:: json { - "row": { + "inserted_row": { "id": 1, "column1": "value1", "column2": "value2" From a2a5dff709c6f1676ac30b5e734c2763002562cf Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 27 Oct 2022 12:08:26 -0700 Subject: [PATCH 204/952] Missing tests for insert row API, refs #1851 --- tests/test_api_write.py | 38 ++++++++++++++++++++++++++++++++++++++ 1 file changed, 38 insertions(+) create mode 100644 tests/test_api_write.py diff --git a/tests/test_api_write.py b/tests/test_api_write.py new file mode 100644 index 00000000..86c221d0 --- /dev/null +++ b/tests/test_api_write.py @@ -0,0 +1,38 @@ +from datasette.app import Datasette +from datasette.utils import sqlite3 +import pytest +import time + + +@pytest.fixture +def ds_write(tmp_path_factory): + db_directory = tmp_path_factory.mktemp("dbs") + db_path = str(db_directory / "data.db") + db = sqlite3.connect(str(db_path)) + db.execute("vacuum") + db.execute("create table docs (id integer primary key, title text, score float)") + ds = Datasette([db_path]) + yield ds + db.close() + + +@pytest.mark.asyncio +async def test_write_row(ds_write): + token = "dstok_{}".format( + ds_write.sign( + {"a": "root", "token": "dstok", "t": int(time.time())}, namespace="token" + ) + ) + response = await ds_write.client.post( + "/data/docs", + json={"insert": {"title": "Test", "score": 1.0}}, + headers={ + "Authorization": "Bearer {}".format(token), + "Content-Type": "application/json", + }, + ) + expected_row = {"id": 1, "title": "Test", "score": 1.0} + assert response.status_code == 201 + assert response.json()["inserted_row"] == expected_row + rows = (await ds_write.get_database("data").execute("select * from docs")).rows + assert dict(rows[0]) == expected_row From 6e788b49edf4f842c0817f006eb9d865778eea5e Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 27 Oct 2022 13:17:18 -0700 Subject: [PATCH 205/952] New URL design /db/table/-/insert, refs #1851 --- datasette/app.py | 6 +++- datasette/views/table.py | 69 +++++++++++++++++++++++++++++++++++++++- docs/json_api.rst | 18 ++++++----- tests/test_api_write.py | 6 ++-- 4 files changed, 86 insertions(+), 13 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index 894d7f0f..8bc5fe36 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -39,7 +39,7 @@ from .views.special import ( PermissionsDebugView, MessagesDebugView, ) -from .views.table import TableView +from .views.table import TableView, TableInsertView from .views.row import RowView from .renderer import json_renderer from .url_builder import Urls @@ -1262,6 +1262,10 @@ class Datasette: RowView.as_view(self), r"/(?P[^\/\.]+)/(?P
<table>[^/]+?)/(?P<pks>[^/]+?)(\.(?P<format>\w+))?$", ) + add_route( + TableInsertView.as_view(self), + r"/(?P<database>[^\/\.]+)/(?P<table>
[^\/\.]+)/-/insert$", + ) return [ # Compile any strings to regular expressions ((re.compile(pattern) if isinstance(pattern, str) else pattern), view) diff --git a/datasette/views/table.py b/datasette/views/table.py index 056b7b04..be3d4f93 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -30,7 +30,7 @@ from datasette.utils import ( ) from datasette.utils.asgi import BadRequest, Forbidden, NotFound, Response from datasette.filters import Filters -from .base import DataView, DatasetteError, ureg +from .base import BaseView, DataView, DatasetteError, ureg from .database import QueryView LINK_WITH_LABEL = ( @@ -1077,3 +1077,70 @@ async def display_columns_and_rows( } columns = [first_column] + columns return columns, cell_rows + + +class TableInsertView(BaseView): + name = "table-insert" + + def __init__(self, datasette): + self.ds = datasette + + async def post(self, request): + database_route = tilde_decode(request.url_vars["database"]) + try: + db = self.ds.get_database(route=database_route) + except KeyError: + raise NotFound("Database not found: {}".format(database_route)) + database_name = db.name + table_name = tilde_decode(request.url_vars["table"]) + # Table must exist (may handle table creation in the future) + db = self.ds.get_database(database_name) + if not await db.table_exists(table_name): + raise NotFound("Table not found: {}".format(table_name)) + # Must have insert-row permission + if not await self.ds.permission_allowed( + request.actor, "insert-row", resource=(database_name, table_name) + ): + raise Forbidden("Permission denied") + if request.headers.get("content-type") != "application/json": + # TODO: handle form-encoded data + raise BadRequest("Must send JSON data") + data = json.loads(await request.post_body()) + if "row" not in data: + raise BadRequest('Must send a "row" key containing a dictionary') + row = data["row"] + if not isinstance(row, dict): + raise BadRequest("row must be a dictionary") + # Verify all columns exist + columns = await db.table_columns(table_name) + pks = await db.primary_keys(table_name) + for key in row: + if key not in columns: + raise BadRequest("Column not found: {}".format(key)) + if key in pks: + raise BadRequest( + "Cannot insert into primary key column: {}".format(key) + ) + # Perform the insert + sql = "INSERT INTO [{table}] ({columns}) VALUES ({values})".format( + table=escape_sqlite(table_name), + columns=", ".join(escape_sqlite(c) for c in row), + values=", ".join("?" for c in row), + ) + cursor = await db.execute_write(sql, list(row.values())) + # Return the new row + rowid = cursor.lastrowid + new_row = ( + await db.execute( + "SELECT * FROM [{table}] WHERE rowid = ?".format( + table=escape_sqlite(table_name) + ), + [rowid], + ) + ).first() + return Response.json( + { + "inserted": [dict(new_row)], + }, + status=201, + ) diff --git a/docs/json_api.rst b/docs/json_api.rst index 2ed8a354..4a7961f2 100644 --- a/docs/json_api.rst +++ b/docs/json_api.rst @@ -463,7 +463,7 @@ The JSON write API Datasette provides a write API for JSON data. This is a POST-only API that requires an authenticated API token, see :ref:`CreateTokenView`. -.. _json_api_write_insert_row: +.. _TableInsertView: Inserting a single row ~~~~~~~~~~~~~~~~~~~~~~ @@ -472,11 +472,11 @@ This requires the :ref:`permissions_insert_row` permission. :: - POST //
+ POST /<database>/<table>
/-/insert Content-Type: application/json Authorization: Bearer dstok_ { - "insert": { + "row": { "column1": "value1", "column2": "value2" } @@ -487,9 +487,11 @@ If successful, this will return a ``201`` status code and the newly inserted row .. code-block:: json { - "inserted_row": { - "id": 1, - "column1": "value1", - "column2": "value2" - } + "inserted": [ + { + "id": 1, + "column1": "value1", + "column2": "value2" + } + ] } diff --git a/tests/test_api_write.py b/tests/test_api_write.py index 86c221d0..e8222e43 100644 --- a/tests/test_api_write.py +++ b/tests/test_api_write.py @@ -24,8 +24,8 @@ async def test_write_row(ds_write): ) ) response = await ds_write.client.post( - "/data/docs", - json={"insert": {"title": "Test", "score": 1.0}}, + "/data/docs/-/insert", + json={"row": {"title": "Test", "score": 1.0}}, headers={ "Authorization": "Bearer {}".format(token), "Content-Type": "application/json", @@ -33,6 +33,6 @@ async def test_write_row(ds_write): ) expected_row = {"id": 1, "title": "Test", "score": 1.0} assert response.status_code == 201 - assert response.json()["inserted_row"] == expected_row + assert response.json()["inserted"] == [expected_row] rows = (await ds_write.get_database("data").execute("select * from docs")).rows assert dict(rows[0]) == expected_row From b912d92b651c4f0b5137da924d135654511f0fe0 Mon Sep 17 00:00:00 2001 From: Forest Gregg Date: Thu, 27 Oct 2022 16:51:20 -0400 Subject: [PATCH 206/952] Make hash and size a lazy property (#1837) * use inspect data for hash and file size * make hash and cached_size lazy properties * move hash property near size --- datasette/database.py | 36 ++++++++++++++++++++++++------------ 1 file changed, 24 insertions(+), 12 deletions(-) diff --git a/datasette/database.py b/datasette/database.py index d75bd70c..af1df0a8 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -39,7 +39,7 @@ class Database: self.memory_name = memory_name if memory_name is not None: self.is_memory = True - self.hash = None + self.cached_hash = None self.cached_size = None self._cached_table_counts = None self._write_thread = None @@ -47,14 +47,6 @@ class Database: # These are used when in non-threaded mode: self._read_connection = None self._write_connection = None - if not self.is_mutable and not self.is_memory: - if self.ds.inspect_data and self.ds.inspect_data.get(self.name): - self.hash = self.ds.inspect_data[self.name]["hash"] - self.cached_size = self.ds.inspect_data[self.name]["size"] - else: - p = Path(path) - self.hash = inspect_hash(p) - self.cached_size = p.stat().st_size @property def cached_table_counts(self): @@ -266,14 +258,34 @@ class Database: results = await self.execute_fn(sql_operation_in_thread) return results + @property + def hash(self): + if self.cached_hash is not None: + return self.cached_hash + elif self.is_mutable or self.is_memory: + return None + elif self.ds.inspect_data and self.ds.inspect_data.get(self.name): + self.cached_hash = self.ds.inspect_data[self.name]["hash"] + return self.cached_hash + else: + p = Path(self.path) + self.cached_hash = inspect_hash(p) + return self.cached_hash + @property def size(self): - if self.is_memory: - return 0 if self.cached_size is not None: return self.cached_size - else: + elif self.is_memory: + return 0 + elif self.is_mutable: return Path(self.path).stat().st_size + elif self.ds.inspect_data and self.ds.inspect_data.get(self.name): + self.cached_size = self.ds.inspect_data[self.name]["size"] + return self.cached_size + else: + self.cached_size = 
Path(self.path).stat().st_size + return self.cached_size async def table_counts(self, limit=10): if not self.is_mutable and self.cached_table_counts is not None: From 2c36e45447494cd7505440943367e29ec57c8e72 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 27 Oct 2022 13:51:45 -0700 Subject: [PATCH 207/952] Bump black from 22.8.0 to 22.10.0 (#1839) Bumps [black](https://github.com/psf/black) from 22.8.0 to 22.10.0. - [Release notes](https://github.com/psf/black/releases) - [Changelog](https://github.com/psf/black/blob/main/CHANGES.md) - [Commits](https://github.com/psf/black/compare/22.8.0...22.10.0) --- updated-dependencies: - dependency-name: black dependency-type: direct:development update-type: version-update:semver-minor ... Signed-off-by: dependabot[bot] Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- setup.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/setup.py b/setup.py index fe258adb..625557ae 100644 --- a/setup.py +++ b/setup.py @@ -76,7 +76,7 @@ setup( "pytest-xdist>=2.2.1", "pytest-asyncio>=0.17", "beautifulsoup4>=4.8.1", - "black==22.8.0", + "black==22.10.0", "blacken-docs==1.12.1", "pytest-timeout>=1.4.2", "trustme>=0.7", From e5e0459a0b60608cb5e9ff83f6b41f59e6cafdfd Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 27 Oct 2022 13:58:00 -0700 Subject: [PATCH 208/952] Release notes for 0.63, refs #1869 --- docs/changelog.rst | 44 +++++++++++++++++++++++++------------------- 1 file changed, 25 insertions(+), 19 deletions(-) diff --git a/docs/changelog.rst b/docs/changelog.rst index 2255dcce..01957e4f 100644 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -4,36 +4,42 @@ Changelog ========= -.. _v0_63a1: +.. _v0_63: -0.63a1 (2022-10-23) -------------------- +0.63 (2022-10-27) +----------------- +Features +~~~~~~~~ + +- Now tested against Python 3.11. Docker containers used by ``datasette publish`` and ``datasette package`` both now use that version of Python. (:issue:`1853`) +- ``--load-extension`` option now supports entrypoints. Thanks, Alex Garcia. (`#1789 `__) +- Facet size can now be set per-table with the new ``facet_size`` table metadata option. (:issue:`1804`) +- The :ref:`setting_truncate_cells_html` setting now also affects long URLs in columns. (:issue:`1805`) +- The non-JavaScript SQL editor textarea now increases height to fit the SQL query. (:issue:`1786`) +- Facets are now displayed with better line-breaks in long values. Thanks, Daniel Rech. (`#1794 `__) +- The ``settings.json`` file used in :ref:`config_dir` is now validated on startup. (:issue:`1816`) +- SQL queries can now include leading SQL comments, using ``/* ... */`` or ``-- ...`` syntax. Thanks, Charles Nepote. (:issue:`1860`) - SQL query is now re-displayed when terminated with a time limit error. (:issue:`1819`) -- New documentation on :ref:`deploying_openrc` - thanks, Adam Simpson. (`#1825 `__) - The :ref:`inspect data ` mechanism is now used to speed up server startup - thanks, Forest Gregg. (:issue:`1834`) - In :ref:`config_dir` databases with filenames ending in ``.sqlite`` or ``.sqlite3`` are now automatically added to the Datasette instance. (:issue:`1646`) - Breadcrumb navigation display now respects the current user's permissions. (:issue:`1831`) -- Screenshots in the documentation are now maintained using `shot-scraper `__, as described in `Automating screenshots for the Datasette documentation using shot-scraper `__. 
(:issue:`1844`) -- The :ref:`datasette.check_visibility() ` method now accepts an optional ``permissions=`` list, allowing it to take multiple permissions into account at once when deciding if something should be shown as public or private. This has been used to correctly display padlock icons in more places in the Datasette interface. (:issue:`1829`) - -.. _v0_63a0: - -0.63a0 (2022-09-26) -------------------- +Plugin hooks and internals +~~~~~~~~~~~~~~~~~~~~~~~~~~ - The :ref:`plugin_hook_prepare_jinja2_environment` plugin hook now accepts an optional ``datasette`` argument. Hook implementations can also now return an ``async`` function which will be awaited automatically. (:issue:`1809`) -- ``--load-extension`` option now supports entrypoints. Thanks, Alex Garcia. (`#1789 `__) -- New tutorial: `Cleaning data with sqlite-utils and Datasette `__. -- Facet size can now be set per-table with the new ``facet_size`` table metadata option. (:issue:`1804`) -- ``truncate_cells_html`` setting now also affects long URLs in columns. (:issue:`1805`) - ``Database(is_mutable=)`` now defaults to ``True``. (:issue:`1808`) -- Non-JavaScript textarea now increases height to fit the SQL query. (:issue:`1786`) -- More detailed command descriptions on the :ref:`CLI reference ` page. (:issue:`1787`) +- The :ref:`datasette.check_visibility() ` method now accepts an optional ``permissions=`` list, allowing it to take multiple permissions into account at once when deciding if something should be shown as public or private. This has been used to correctly display padlock icons in more places in the Datasette interface. (:issue:`1829`) - Datasette no longer enforces upper bounds on its dependencies. (:issue:`1800`) -- Facets are now displayed with better line-breaks in long values. Thanks, Daniel Rech. (`#1794 `__) -- The ``settings.json`` file used in :ref:`config_dir` is now validated on startup. (:issue:`1816`) + +Documentation +~~~~~~~~~~~~~ + +- New tutorial: `Cleaning data with sqlite-utils and Datasette `__. +- Screenshots in the documentation are now maintained using `shot-scraper `__, as described in `Automating screenshots for the Datasette documentation using shot-scraper `__. (:issue:`1844`) +- More detailed command descriptions on the :ref:`CLI reference ` page. (:issue:`1787`) +- New documentation on :ref:`deploying_openrc` - thanks, Adam Simpson. (`#1825 `__) .. _v0_62: From bf00b0b59b6692bdec597ac9db4e0b497c5a47b4 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 27 Oct 2022 15:11:26 -0700 Subject: [PATCH 209/952] Release 0.63 Refs #1646, #1786, #1787, #1789, #1794, #1800, #1804, #1805, #1808, #1809, #1816, #1819, #1825, #1829, #1831, #1834, #1844, #1853, #1860 Closes #1869 --- datasette/version.py | 2 +- docs/changelog.rst | 2 ++ 2 files changed, 3 insertions(+), 1 deletion(-) diff --git a/datasette/version.py b/datasette/version.py index eb36da45..ac012640 100644 --- a/datasette/version.py +++ b/datasette/version.py @@ -1,2 +1,2 @@ -__version__ = "0.63a1" +__version__ = "0.63" __version_info__ = tuple(__version__.split(".")) diff --git a/docs/changelog.rst b/docs/changelog.rst index 01957e4f..f573afb3 100644 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -9,6 +9,8 @@ Changelog 0.63 (2022-10-27) ----------------- +See `Datasette 0.63: The annotated release notes `__ for more background on the changes in this release. 
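Among the 0.63 features listed above is support for leading SQL comments in queries. A quick illustration of what that permits, using only the standard library (the queries here are illustrative); SQLite itself always accepted comment-prefixed statements, 0.63 just stops Datasette's own validation from rejecting them:

.. code-block:: python

    import sqlite3

    conn = sqlite3.connect(":memory:")
    # Leading -- and /* */ comments ahead of a SELECT are valid SQLite
    print(conn.execute("-- a comment\nselect 1 + 1").fetchone())       # (2,)
    print(conn.execute("/* block comment */ select 2 + 2").fetchone())  # (4,)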
+ Features ~~~~~~~~ From 2ea60e12d90b7cec03ebab728854d3ec4d553f54 Mon Sep 17 00:00:00 2001 From: Forest Gregg Date: Thu, 27 Oct 2022 16:51:20 -0400 Subject: [PATCH 210/952] Make hash and size a lazy property (#1837) * use inspect data for hash and file size * make hash and cached_size lazy properties * move hash property near size --- datasette/database.py | 36 ++++++++++++++++++++++++------------ 1 file changed, 24 insertions(+), 12 deletions(-) diff --git a/datasette/database.py b/datasette/database.py index d75bd70c..af1df0a8 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -39,7 +39,7 @@ class Database: self.memory_name = memory_name if memory_name is not None: self.is_memory = True - self.hash = None + self.cached_hash = None self.cached_size = None self._cached_table_counts = None self._write_thread = None @@ -47,14 +47,6 @@ class Database: # These are used when in non-threaded mode: self._read_connection = None self._write_connection = None - if not self.is_mutable and not self.is_memory: - if self.ds.inspect_data and self.ds.inspect_data.get(self.name): - self.hash = self.ds.inspect_data[self.name]["hash"] - self.cached_size = self.ds.inspect_data[self.name]["size"] - else: - p = Path(path) - self.hash = inspect_hash(p) - self.cached_size = p.stat().st_size @property def cached_table_counts(self): @@ -266,14 +258,34 @@ class Database: results = await self.execute_fn(sql_operation_in_thread) return results + @property + def hash(self): + if self.cached_hash is not None: + return self.cached_hash + elif self.is_mutable or self.is_memory: + return None + elif self.ds.inspect_data and self.ds.inspect_data.get(self.name): + self.cached_hash = self.ds.inspect_data[self.name]["hash"] + return self.cached_hash + else: + p = Path(self.path) + self.cached_hash = inspect_hash(p) + return self.cached_hash + @property def size(self): - if self.is_memory: - return 0 if self.cached_size is not None: return self.cached_size - else: + elif self.is_memory: + return 0 + elif self.is_mutable: return Path(self.path).stat().st_size + elif self.ds.inspect_data and self.ds.inspect_data.get(self.name): + self.cached_size = self.ds.inspect_data[self.name]["size"] + return self.cached_size + else: + self.cached_size = Path(self.path).stat().st_size + return self.cached_size async def table_counts(self, limit=10): if not self.is_mutable and self.cached_table_counts is not None: From 641bc4453b5ef1dff0b2fc7dfad0b692be7aa61c Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 27 Oct 2022 13:51:45 -0700 Subject: [PATCH 211/952] Bump black from 22.8.0 to 22.10.0 (#1839) Bumps [black](https://github.com/psf/black) from 22.8.0 to 22.10.0. - [Release notes](https://github.com/psf/black/releases) - [Changelog](https://github.com/psf/black/blob/main/CHANGES.md) - [Commits](https://github.com/psf/black/compare/22.8.0...22.10.0) --- updated-dependencies: - dependency-name: black dependency-type: direct:development update-type: version-update:semver-minor ... 
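Stepping back to the lazy hash/size patch above: it moves the expensive work out of ``Database.__init__`` and into first property access, which is what speeds up server startup. A stripped-down sketch of the same caching pattern (the class and names here are illustrative, not Datasette's actual API):

.. code-block:: python

    from pathlib import Path

    class FileInfo:
        def __init__(self, path):
            self.path = path
            self.cached_size = None  # filled in lazily, mirroring cached_size above

        @property
        def size(self):
            # stat() runs at most once, on first access, not at startup
            if self.cached_size is None:
                self.cached_size = Path(self.path).stat().st_size
            return self.cached_size

    print(FileInfo(__file__).size)  # computed here, then cached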
Signed-off-by: dependabot[bot] Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- setup.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/setup.py b/setup.py index fe258adb..625557ae 100644 --- a/setup.py +++ b/setup.py @@ -76,7 +76,7 @@ setup( "pytest-xdist>=2.2.1", "pytest-asyncio>=0.17", "beautifulsoup4>=4.8.1", - "black==22.8.0", + "black==22.10.0", "blacken-docs==1.12.1", "pytest-timeout>=1.4.2", "trustme>=0.7", From 26af9b9c4a6c62ee15870caa1c7bc455165d3b11 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 27 Oct 2022 13:58:00 -0700 Subject: [PATCH 212/952] Release notes for 0.63, refs #1869 --- docs/changelog.rst | 44 +++++++++++++++++++++++++------------------- 1 file changed, 25 insertions(+), 19 deletions(-) diff --git a/docs/changelog.rst b/docs/changelog.rst index 2255dcce..01957e4f 100644 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -4,36 +4,42 @@ Changelog ========= -.. _v0_63a1: +.. _v0_63: -0.63a1 (2022-10-23) -------------------- +0.63 (2022-10-27) +----------------- +Features +~~~~~~~~ + +- Now tested against Python 3.11. Docker containers used by ``datasette publish`` and ``datasette package`` both now use that version of Python. (:issue:`1853`) +- ``--load-extension`` option now supports entrypoints. Thanks, Alex Garcia. (`#1789 `__) +- Facet size can now be set per-table with the new ``facet_size`` table metadata option. (:issue:`1804`) +- The :ref:`setting_truncate_cells_html` setting now also affects long URLs in columns. (:issue:`1805`) +- The non-JavaScript SQL editor textarea now increases height to fit the SQL query. (:issue:`1786`) +- Facets are now displayed with better line-breaks in long values. Thanks, Daniel Rech. (`#1794 `__) +- The ``settings.json`` file used in :ref:`config_dir` is now validated on startup. (:issue:`1816`) +- SQL queries can now include leading SQL comments, using ``/* ... */`` or ``-- ...`` syntax. Thanks, Charles Nepote. (:issue:`1860`) - SQL query is now re-displayed when terminated with a time limit error. (:issue:`1819`) -- New documentation on :ref:`deploying_openrc` - thanks, Adam Simpson. (`#1825 `__) - The :ref:`inspect data ` mechanism is now used to speed up server startup - thanks, Forest Gregg. (:issue:`1834`) - In :ref:`config_dir` databases with filenames ending in ``.sqlite`` or ``.sqlite3`` are now automatically added to the Datasette instance. (:issue:`1646`) - Breadcrumb navigation display now respects the current user's permissions. (:issue:`1831`) -- Screenshots in the documentation are now maintained using `shot-scraper `__, as described in `Automating screenshots for the Datasette documentation using shot-scraper `__. (:issue:`1844`) -- The :ref:`datasette.check_visibility() ` method now accepts an optional ``permissions=`` list, allowing it to take multiple permissions into account at once when deciding if something should be shown as public or private. This has been used to correctly display padlock icons in more places in the Datasette interface. (:issue:`1829`) - -.. _v0_63a0: - -0.63a0 (2022-09-26) -------------------- +Plugin hooks and internals +~~~~~~~~~~~~~~~~~~~~~~~~~~ - The :ref:`plugin_hook_prepare_jinja2_environment` plugin hook now accepts an optional ``datasette`` argument. Hook implementations can also now return an ``async`` function which will be awaited automatically. (:issue:`1809`) -- ``--load-extension`` option now supports entrypoints. Thanks, Alex Garcia. 
(`#1789 `__) -- New tutorial: `Cleaning data with sqlite-utils and Datasette `__. -- Facet size can now be set per-table with the new ``facet_size`` table metadata option. (:issue:`1804`) -- ``truncate_cells_html`` setting now also affects long URLs in columns. (:issue:`1805`) - ``Database(is_mutable=)`` now defaults to ``True``. (:issue:`1808`) -- Non-JavaScript textarea now increases height to fit the SQL query. (:issue:`1786`) -- More detailed command descriptions on the :ref:`CLI reference ` page. (:issue:`1787`) +- The :ref:`datasette.check_visibility() ` method now accepts an optional ``permissions=`` list, allowing it to take multiple permissions into account at once when deciding if something should be shown as public or private. This has been used to correctly display padlock icons in more places in the Datasette interface. (:issue:`1829`) - Datasette no longer enforces upper bounds on its dependencies. (:issue:`1800`) -- Facets are now displayed with better line-breaks in long values. Thanks, Daniel Rech. (`#1794 `__) -- The ``settings.json`` file used in :ref:`config_dir` is now validated on startup. (:issue:`1816`) + +Documentation +~~~~~~~~~~~~~ + +- New tutorial: `Cleaning data with sqlite-utils and Datasette `__. +- Screenshots in the documentation are now maintained using `shot-scraper `__, as described in `Automating screenshots for the Datasette documentation using shot-scraper `__. (:issue:`1844`) +- More detailed command descriptions on the :ref:`CLI reference ` page. (:issue:`1787`) +- New documentation on :ref:`deploying_openrc` - thanks, Adam Simpson. (`#1825 `__) .. _v0_62: From 61171f01549549e5fb25c72b13280d941d96dbf1 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 27 Oct 2022 15:11:26 -0700 Subject: [PATCH 213/952] Release 0.63 Refs #1646, #1786, #1787, #1789, #1794, #1800, #1804, #1805, #1808, #1809, #1816, #1819, #1825, #1829, #1831, #1834, #1844, #1853, #1860 Closes #1869 --- datasette/version.py | 2 +- docs/changelog.rst | 2 ++ 2 files changed, 3 insertions(+), 1 deletion(-) diff --git a/datasette/version.py b/datasette/version.py index eb36da45..ac012640 100644 --- a/datasette/version.py +++ b/datasette/version.py @@ -1,2 +1,2 @@ -__version__ = "0.63a1" +__version__ = "0.63" __version_info__ = tuple(__version__.split(".")) diff --git a/docs/changelog.rst b/docs/changelog.rst index 01957e4f..f573afb3 100644 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -9,6 +9,8 @@ Changelog 0.63 (2022-10-27) ----------------- +See `Datasette 0.63: The annotated release notes `__ for more background on the changes in this release. + Features ~~~~~~~~ From c9b5f5d598e7f85cd3e1ce020351a27da334408b Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Thu, 27 Oct 2022 17:58:36 -0700 Subject: [PATCH 214/952] Depend on sqlite-utils>=3.30 Decided to use the most recent version in case I decide later to use the flatten() utility function. 
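For context, the ``flatten()`` utility mentioned here turns nested JSON objects into flat, column-friendly keys. A short sketch of how it behaves, assuming sqlite-utils 3.30+ (the sample row is made up):

.. code-block:: python

    from sqlite_utils.utils import flatten

    row = {"id": 1, "author": {"name": "Simon", "url": "https://simonwillison.net/"}}
    print(flatten(row))
    # {'id': 1, 'author_name': 'Simon', 'author_url': 'https://simonwillison.net/'}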
Refs #1850 --- setup.py | 1 + 1 file changed, 1 insertion(+) diff --git a/setup.py b/setup.py index 625557ae..99e2a4ad 100644 --- a/setup.py +++ b/setup.py @@ -57,6 +57,7 @@ setup( "PyYAML>=5.3", "mergedeep>=1.1.1", "itsdangerous>=1.1", + "sqlite-utils>=3.30", ], entry_points=""" [console_scripts] From c35859ae3df163406f1a1895ccf9803e933b2d8e Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sat, 29 Oct 2022 23:03:45 -0700 Subject: [PATCH 215/952] API for bulk inserts, closes #1866 --- datasette/app.py | 5 ++ datasette/views/table.py | 136 +++++++++++++++++++++---------- docs/cli-reference.rst | 2 + docs/json_api.rst | 48 ++++++++++- docs/settings.rst | 11 +++ tests/test_api.py | 1 + tests/test_api_write.py | 168 +++++++++++++++++++++++++++++++++++++-- 7 files changed, 320 insertions(+), 51 deletions(-) diff --git a/datasette/app.py b/datasette/app.py index 8bc5fe36..f80d3792 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -99,6 +99,11 @@ SETTINGS = ( 1000, "Maximum rows that can be returned from a table or custom query", ), + Setting( + "max_insert_rows", + 100, + "Maximum rows that can be inserted at a time using the bulk insert API", + ), Setting( "num_sql_threads", 3, diff --git a/datasette/views/table.py b/datasette/views/table.py index be3d4f93..fd203036 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -30,6 +30,7 @@ from datasette.utils import ( ) from datasette.utils.asgi import BadRequest, Forbidden, NotFound, Response from datasette.filters import Filters +import sqlite_utils from .base import BaseView, DataView, DatasetteError, ureg from .database import QueryView @@ -1085,62 +1086,109 @@ class TableInsertView(BaseView): def __init__(self, datasette): self.ds = datasette + async def _validate_data(self, request, db, table_name): + errors = [] + + def _errors(errors): + return None, errors, {} + + if request.headers.get("content-type") != "application/json": + # TODO: handle form-encoded data + return _errors(["Invalid content-type, must be application/json"]) + body = await request.post_body() + try: + data = json.loads(body) + except json.JSONDecodeError as e: + return _errors(["Invalid JSON: {}".format(e)]) + if not isinstance(data, dict): + return _errors(["JSON must be a dictionary"]) + keys = data.keys() + # keys must contain "row" or "rows" + if "row" not in keys and "rows" not in keys: + return _errors(['JSON must have one or other of "row" or "rows"']) + rows = [] + if "row" in keys: + if "rows" in keys: + return _errors(['Cannot use "row" and "rows" at the same time']) + row = data["row"] + if not isinstance(row, dict): + return _errors(['"row" must be a dictionary']) + rows = [row] + data["return_rows"] = True + else: + rows = data["rows"] + if not isinstance(rows, list): + return _errors(['"rows" must be a list']) + for row in rows: + if not isinstance(row, dict): + return _errors(['"rows" must be a list of dictionaries']) + # Does this exceed max_insert_rows? 
+ max_insert_rows = self.ds.setting("max_insert_rows") + if len(rows) > max_insert_rows: + return _errors( + ["Too many rows, maximum allowed is {}".format(max_insert_rows)] + ) + # Validate columns of each row + columns = await db.table_columns(table_name) + # TODO: There are cases where pks are OK, if not using auto-incrementing pk + pks = await db.primary_keys(table_name) + allowed_columns = set(columns) - set(pks) + for i, row in enumerate(rows): + invalid_columns = set(row.keys()) - allowed_columns + if invalid_columns: + errors.append( + "Row {} has invalid columns: {}".format( + i, ", ".join(sorted(invalid_columns)) + ) + ) + if errors: + return _errors(errors) + extra = {key: data[key] for key in data if key not in ("rows", "row")} + return rows, errors, extra + async def post(self, request): + def _error(messages, status=400): + return Response.json({"ok": False, "errors": messages}, status=status) + database_route = tilde_decode(request.url_vars["database"]) try: db = self.ds.get_database(route=database_route) except KeyError: - raise NotFound("Database not found: {}".format(database_route)) + return _error(["Database not found: {}".format(database_route)], 404) database_name = db.name table_name = tilde_decode(request.url_vars["table"]) + # Table must exist (may handle table creation in the future) db = self.ds.get_database(database_name) if not await db.table_exists(table_name): - raise NotFound("Table not found: {}".format(table_name)) + return _error(["Table not found: {}".format(table_name)], 404) # Must have insert-row permission if not await self.ds.permission_allowed( request.actor, "insert-row", resource=(database_name, table_name) ): - raise Forbidden("Permission denied") - if request.headers.get("content-type") != "application/json": - # TODO: handle form-encoded data - raise BadRequest("Must send JSON data") - data = json.loads(await request.post_body()) - if "row" not in data: - raise BadRequest('Must send a "row" key containing a dictionary') - row = data["row"] - if not isinstance(row, dict): - raise BadRequest("row must be a dictionary") - # Verify all columns exist - columns = await db.table_columns(table_name) - pks = await db.primary_keys(table_name) - for key in row: - if key not in columns: - raise BadRequest("Column not found: {}".format(key)) - if key in pks: - raise BadRequest( - "Cannot insert into primary key column: {}".format(key) + return _error(["Permission denied"], 403) + rows, errors, extra = await self._validate_data(request, db, table_name) + if errors: + return _error(errors, 400) + + should_return = bool(extra.get("return_rows", False)) + # Insert rows + def insert_rows(conn): + table = sqlite_utils.Database(conn)[table_name] + if should_return: + rowids = [] + for row in rows: + rowids.append(table.insert(row).last_rowid) + return list( + table.rows_where( + "rowid in ({})".format(",".join("?" for _ in rowids)), rowids + ) ) - # Perform the insert - sql = "INSERT INTO [{table}] ({columns}) VALUES ({values})".format( - table=escape_sqlite(table_name), - columns=", ".join(escape_sqlite(c) for c in row), - values=", ".join("?" 
for c in row), - ) - cursor = await db.execute_write(sql, list(row.values())) - # Return the new row - rowid = cursor.lastrowid - new_row = ( - await db.execute( - "SELECT * FROM [{table}] WHERE rowid = ?".format( - table=escape_sqlite(table_name) - ), - [rowid], - ) - ).first() - return Response.json( - { - "inserted": [dict(new_row)], - }, - status=201, - ) + else: + table.insert_all(rows) + + rows = await db.execute_write_fn(insert_rows) + result = {"ok": True} + if should_return: + result["inserted"] = rows + return Response.json(result, status=201) diff --git a/docs/cli-reference.rst b/docs/cli-reference.rst index 56156568..649a3dcd 100644 --- a/docs/cli-reference.rst +++ b/docs/cli-reference.rst @@ -213,6 +213,8 @@ These can be passed to ``datasette serve`` using ``datasette serve --setting nam (default=100) max_returned_rows Maximum rows that can be returned from a table or custom query (default=1000) + max_insert_rows Maximum rows that can be inserted at a time using + the bulk insert API (default=1000) num_sql_threads Number of threads in the thread pool for executing SQLite queries (default=3) sql_time_limit_ms Time limit for a SQL query in milliseconds diff --git a/docs/json_api.rst b/docs/json_api.rst index 4a7961f2..01558c23 100644 --- a/docs/json_api.rst +++ b/docs/json_api.rst @@ -465,11 +465,13 @@ Datasette provides a write API for JSON data. This is a POST-only API that requi .. _TableInsertView: -Inserting a single row -~~~~~~~~~~~~~~~~~~~~~~ +Inserting rows +~~~~~~~~~~~~~~ This requires the :ref:`permissions_insert_row` permission. +A single row can be inserted using the ``"row"`` key: + :: POST //
/-/insert @@ -495,3 +497,45 @@ If successful, this will return a ``201`` status code and the newly inserted row } ] } + +To insert multiple rows at a time, use the same API method but send a list of dictionaries as the ``"rows"`` key: + +:: + + POST //
/-/insert + Content-Type: application/json + Authorization: Bearer dstok_ + { + "rows": [ + { + "column1": "value1", + "column2": "value2" + }, + { + "column1": "value3", + "column2": "value4" + } + ] + } + +If successful, this will return a ``201`` status code and an empty ``{}`` response body. + +To return the newly inserted rows, add the ``"return_rows": true`` key to the request body: + +.. code-block:: json + + { + "rows": [ + { + "column1": "value1", + "column2": "value2" + }, + { + "column1": "value3", + "column2": "value4" + } + ], + "return_rows": true + } + +This will return the same ``"inserted"`` key as the single row example above. There is a small performance penalty for using this option. diff --git a/docs/settings.rst b/docs/settings.rst index a990c78c..b86b18bd 100644 --- a/docs/settings.rst +++ b/docs/settings.rst @@ -96,6 +96,17 @@ You can increase or decrease this limit like so:: datasette mydatabase.db --setting max_returned_rows 2000 +.. _setting_max_insert_rows: + +max_insert_rows +~~~~~~~~~~~~~~~ + +Maximum rows that can be inserted at a time using the bulk insert API, see :ref:`TableInsertView`. Defaults to 100. + +You can increase or decrease this limit like so:: + + datasette mydatabase.db --setting max_insert_rows 1000 + .. _setting_num_sql_threads: num_sql_threads diff --git a/tests/test_api.py b/tests/test_api.py index fc171421..ebd675b9 100644 --- a/tests/test_api.py +++ b/tests/test_api.py @@ -804,6 +804,7 @@ def test_settings_json(app_client): "facet_suggest_time_limit_ms": 50, "facet_time_limit_ms": 200, "max_returned_rows": 100, + "max_insert_rows": 100, "sql_time_limit_ms": 200, "allow_download": True, "allow_signed_tokens": True, diff --git a/tests/test_api_write.py b/tests/test_api_write.py index e8222e43..4a5a58aa 100644 --- a/tests/test_api_write.py +++ b/tests/test_api_write.py @@ -18,11 +18,7 @@ def ds_write(tmp_path_factory): @pytest.mark.asyncio async def test_write_row(ds_write): - token = "dstok_{}".format( - ds_write.sign( - {"a": "root", "token": "dstok", "t": int(time.time())}, namespace="token" - ) - ) + token = write_token(ds_write) response = await ds_write.client.post( "/data/docs/-/insert", json={"row": {"title": "Test", "score": 1.0}}, @@ -36,3 +32,165 @@ async def test_write_row(ds_write): assert response.json()["inserted"] == [expected_row] rows = (await ds_write.get_database("data").execute("select * from docs")).rows assert dict(rows[0]) == expected_row + + +@pytest.mark.asyncio +@pytest.mark.parametrize("return_rows", (True, False)) +async def test_write_rows(ds_write, return_rows): + token = write_token(ds_write) + data = {"rows": [{"title": "Test {}".format(i), "score": 1.0} for i in range(20)]} + if return_rows: + data["return_rows"] = True + response = await ds_write.client.post( + "/data/docs/-/insert", + json=data, + headers={ + "Authorization": "Bearer {}".format(token), + "Content-Type": "application/json", + }, + ) + assert response.status_code == 201 + actual_rows = [ + dict(r) + for r in ( + await ds_write.get_database("data").execute("select * from docs") + ).rows + ] + assert len(actual_rows) == 20 + assert actual_rows == [ + {"id": i + 1, "title": "Test {}".format(i), "score": 1.0} for i in range(20) + ] + assert response.json()["ok"] is True + if return_rows: + assert response.json()["inserted"] == actual_rows + + +@pytest.mark.asyncio +@pytest.mark.parametrize( + "path,input,special_case,expected_status,expected_errors", + ( + ( + "/data2/docs/-/insert", + {}, + None, + 404, + ["Database not found: data2"], + ), + 
( + "/data/docs2/-/insert", + {}, + None, + 404, + ["Table not found: docs2"], + ), + ( + "/data/docs/-/insert", + {"rows": [{"title": "Test"} for i in range(10)]}, + "bad_token", + 403, + ["Permission denied"], + ), + ( + "/data/docs/-/insert", + {}, + "invalid_json", + 400, + [ + "Invalid JSON: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)" + ], + ), + ( + "/data/docs/-/insert", + {}, + "invalid_content_type", + 400, + ["Invalid content-type, must be application/json"], + ), + ( + "/data/docs/-/insert", + [], + None, + 400, + ["JSON must be a dictionary"], + ), + ( + "/data/docs/-/insert", + {"row": "blah"}, + None, + 400, + ['"row" must be a dictionary'], + ), + ( + "/data/docs/-/insert", + {"blah": "blah"}, + None, + 400, + ['JSON must have one or other of "row" or "rows"'], + ), + ( + "/data/docs/-/insert", + {"rows": "blah"}, + None, + 400, + ['"rows" must be a list'], + ), + ( + "/data/docs/-/insert", + {"rows": ["blah"]}, + None, + 400, + ['"rows" must be a list of dictionaries'], + ), + ( + "/data/docs/-/insert", + {"rows": [{"title": "Test"} for i in range(101)]}, + None, + 400, + ["Too many rows, maximum allowed is 100"], + ), + # Validate columns of each row + ( + "/data/docs/-/insert", + {"rows": [{"title": "Test", "bad": 1, "worse": 2} for i in range(2)]}, + None, + 400, + [ + "Row 0 has invalid columns: bad, worse", + "Row 1 has invalid columns: bad, worse", + ], + ), + ), +) +async def test_write_row_errors( + ds_write, path, input, special_case, expected_status, expected_errors +): + token = write_token(ds_write) + if special_case == "bad_token": + token += "bad" + kwargs = dict( + json=input, + headers={ + "Authorization": "Bearer {}".format(token), + "Content-Type": "text/plain" + if special_case == "invalid_content_type" + else "application/json", + }, + ) + if special_case == "invalid_json": + del kwargs["json"] + kwargs["content"] = "{bad json" + response = await ds_write.client.post( + path, + **kwargs, + ) + assert response.status_code == expected_status + assert response.json()["ok"] is False + assert response.json()["errors"] == expected_errors + + +def write_token(ds): + return "dstok_{}".format( + ds.sign( + {"a": "root", "token": "dstok", "t": int(time.time())}, namespace="token" + ) + ) From f6bf2d8045cc239fe34357342bff1440561c8909 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sat, 29 Oct 2022 23:20:11 -0700 Subject: [PATCH 216/952] Initial prototype of API explorer at /-/api, refs #1871 --- datasette/app.py | 5 ++ datasette/templates/api_explorer.html | 73 +++++++++++++++++++++++++++ datasette/views/special.py | 8 +++ tests/test_docs.py | 2 +- 4 files changed, 87 insertions(+), 1 deletion(-) create mode 100644 datasette/templates/api_explorer.html diff --git a/datasette/app.py b/datasette/app.py index f80d3792..c3d802a4 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -33,6 +33,7 @@ from .views.special import ( JsonDataView, PatternPortfolioView, AuthTokenView, + ApiExplorerView, CreateTokenView, LogoutView, AllowDebugView, @@ -1235,6 +1236,10 @@ class Datasette: CreateTokenView.as_view(self), r"/-/create-token$", ) + add_route( + ApiExplorerView.as_view(self), + r"/-/api$", + ) add_route( LogoutView.as_view(self), r"/-/logout$", diff --git a/datasette/templates/api_explorer.html b/datasette/templates/api_explorer.html new file mode 100644 index 00000000..034bee60 --- /dev/null +++ b/datasette/templates/api_explorer.html @@ -0,0 +1,73 @@ +{% extends "base.html" %} + +{% block title %}API Explorer{% endblock %} + +{% 
block content %}
+
+<h1>API Explorer</h1>
+
+<p>Use this tool to try out the Datasette write API.</p>
+
+{% if errors %}
+  {% for error in errors %}
+    <p class="message-error">{{ error }}</p>
+  {% endfor %}
+{% endif %}
+
+<form method="post">
+  <input type="text" name="path">
+  <textarea name="json"></textarea>
+  <input type="text" name="token">
+  <input type="submit">
+</form>
+
+<script>
+var form = document.querySelector("form");
+form.addEventListener("submit", (ev) => {
+  ev.preventDefault();
+  var formData = new FormData(form);
+  var json = formData.get('json');
+  var path = formData.get('path');
+  var token = formData.get('token');
+  // Validate JSON
+  try {
+    var data = JSON.parse(json);
+  } catch (e) {
+    alert("Invalid JSON: " + e);
+    return;
+  }
+  fetch(path, {
+    method: 'POST',
+    body: json,
+    headers: {
+      'Content-Type': 'application/json',
+      'Authorization': `Bearer ${token}`
+    }
+  }).then(r => r.json()).then(r => {
+    alert(JSON.stringify(r, null, 2));
+  }).catch(err => {
+    alert("Error: " + err);
+  });
+});
+</script>
+ + + + +{% endblock %} diff --git a/datasette/views/special.py b/datasette/views/special.py index b754a2f0..9922a621 100644 --- a/datasette/views/special.py +++ b/datasette/views/special.py @@ -235,3 +235,11 @@ class CreateTokenView(BaseView): "token_bits": token_bits, }, ) + + +class ApiExplorerView(BaseView): + name = "api_explorer" + has_json_alternate = False + + async def get(self, request): + return await self.render(["api_explorer.html"], request) diff --git a/tests/test_docs.py b/tests/test_docs.py index cd5a6c13..e9b813fe 100644 --- a/tests/test_docs.py +++ b/tests/test_docs.py @@ -62,7 +62,7 @@ def documented_views(): if first_word.endswith("View"): view_labels.add(first_word) # We deliberately don't document these: - view_labels.update(("PatternPortfolioView", "AuthTokenView")) + view_labels.update(("PatternPortfolioView", "AuthTokenView", "ApiExplorerView")) return view_labels From 9eb9ffae3ddd4e8ff0b713bf6fd6a0afed3368d7 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 30 Oct 2022 13:09:55 -0700 Subject: [PATCH 217/952] Drop API token requirement from API explorer, refs #1871 --- datasette/default_permissions.py | 9 +++++++++ datasette/templates/api_explorer.html | 13 ++++--------- 2 files changed, 13 insertions(+), 9 deletions(-) diff --git a/datasette/default_permissions.py b/datasette/default_permissions.py index 87684e2a..151ba2b5 100644 --- a/datasette/default_permissions.py +++ b/datasette/default_permissions.py @@ -131,3 +131,12 @@ def register_commands(cli): if debug: click.echo("\nDecoded:\n") click.echo(json.dumps(ds.unsign(token, namespace="token"), indent=2)) + + +@hookimpl +def skip_csrf(scope): + # Skip CSRF check for requests with content-type: application/json + if scope["type"] == "http": + headers = scope.get("headers") or {} + if dict(headers).get(b"content-type") == b"application/json": + return True diff --git a/datasette/templates/api_explorer.html b/datasette/templates/api_explorer.html index 034bee60..01b182d8 100644 --- a/datasette/templates/api_explorer.html +++ b/datasette/templates/api_explorer.html @@ -15,16 +15,13 @@ {% endif %}
-
- - -
- +
-
- +
+ +

@@ -46,7 +43,6 @@ form.addEventListener("submit", (ev) => { var formData = new FormData(form); var json = formData.get('json'); var path = formData.get('path'); - var token = formData.get('token'); // Validate JSON try { var data = JSON.parse(json); @@ -60,7 +56,6 @@ form.addEventListener("submit", (ev) => { body: json, headers: { 'Content-Type': 'application/json', - 'Authorization': `Bearer ${token}` } }).then(r => r.json()).then(r => { alert(JSON.stringify(r, null, 2)); From fedbfcc36873366143195d8fe124e1859bf88346 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Sun, 30 Oct 2022 14:49:07 -0700 Subject: [PATCH 218/952] Neater display of output and errors in API explorer, refs #1871 --- datasette/templates/api_explorer.html | 22 +++++++++++++++++++++- 1 file changed, 21 insertions(+), 1 deletion(-) diff --git a/datasette/templates/api_explorer.html b/datasette/templates/api_explorer.html index 01b182d8..38fdb7bc 100644 --- a/datasette/templates/api_explorer.html +++ b/datasette/templates/api_explorer.html @@ -26,6 +26,12 @@

+ + """.format( escape(ex.sql) ) diff --git a/tests/test_api.py b/tests/test_api.py index ad74d16e..4027a7a5 100644 --- a/tests/test_api.py +++ b/tests/test_api.py @@ -662,7 +662,11 @@ def test_sql_time_limit(app_client_shorter_time_limit): "

<p>SQL query took too long. The time limit is controlled by the\n"
        '<a href="https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms">sql_time_limit_ms</a>\n'
        "configuration option.</p>\n"
-        "<pre>select sleep(0.5)</pre>"
+        '\n'
+        ""
    ),
    "status": 400,
    "title": "SQL Interrupted",
diff --git a/tests/test_html.py b/tests/test_html.py
index 4b394199..7cfe9d90 100644
--- a/tests/test_html.py
+++ b/tests/test_html.py
@@ -172,7 +172,7 @@ def test_sql_time_limit(app_client_shorter_time_limit):
     """
     <a href="https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms">sql_time_limit_ms</a>
     """.strip(),
-        "<pre>select sleep(0.5)</pre>",
+        '',
    ]
    for expected_html_fragment in expected_html_fragments:
        assert expected_html_fragment in response.text
From 93a02281dad2f23da84210f6ae9c63777ad8af5e Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Tue, 1 Nov 2022 10:22:26 -0700
Subject: [PATCH 223/952] Show interrupted query in resizing textarea, closes
 #1876

---
 datasette/views/base.py | 6 +++++-
 tests/test_api.py       | 6 +++++-
 tests/test_html.py      | 2 +-
 3 files changed, 11 insertions(+), 3 deletions(-)

diff --git a/datasette/views/base.py b/datasette/views/base.py
index 67aa3a42..6b01fdd2 100644
--- a/datasette/views/base.py
+++ b/datasette/views/base.py
@@ -378,7 +378,11 @@ class DataView(BaseView):
                 <p>SQL query took too long. The time limit is controlled by the
                 <a href="https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms">sql_time_limit_ms</a>
                 configuration option.</p>
-                <pre>{}</pre>
+                <textarea style="width: 90%">{}</textarea>
+                <script>
+                let ta = document.querySelector("textarea");
+                ta.style.height = ta.scrollHeight + "px";
+                </script>
 """.format(
                     escape(ex.sql)
                 )
diff --git a/tests/test_api.py b/tests/test_api.py
index ad74d16e..4027a7a5 100644
--- a/tests/test_api.py
+++ b/tests/test_api.py
@@ -662,7 +662,11 @@ def test_sql_time_limit(app_client_shorter_time_limit):
         "<p>SQL query took too long. The time limit is controlled by the\n"
         '<a href="https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms">sql_time_limit_ms</a>\n'
         "configuration option.</p>\n"
-        "<pre>select sleep(0.5)</pre>"
+        '\n'
+        ""
    ),
    "status": 400,
    "title": "SQL Interrupted",
diff --git a/tests/test_html.py b/tests/test_html.py
index 4b394199..7cfe9d90 100644
--- a/tests/test_html.py
+++ b/tests/test_html.py
@@ -172,7 +172,7 @@ def test_sql_time_limit(app_client_shorter_time_limit):
     """
     <a href="https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms">sql_time_limit_ms</a>
     """.strip(),
-        "<pre>select sleep(0.5)</pre>
", + '', ] for expected_html_fragment in expected_html_fragments: assert expected_html_fragment in response.text From 9bec7c38eb93cde5afb16df9bdd96aea2a5b0459 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 1 Nov 2022 11:07:59 -0700 Subject: [PATCH 224/952] ignore and replace options for bulk inserts, refs #1873 Also removed the rule that you cannot include primary keys in the rows you insert. And added validation that catches invalid parameters in the incoming JSON. And renamed "inserted" to "rows" in the returned JSON for return_rows: true --- datasette/views/table.py | 41 ++++++++++++++------ docs/json_api.rst | 4 +- tests/test_api_write.py | 83 ++++++++++++++++++++++++++++++++++++++-- 3 files changed, 111 insertions(+), 17 deletions(-) diff --git a/datasette/views/table.py b/datasette/views/table.py index 1e3d566e..7692a4e3 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -1107,6 +1107,7 @@ class TableInsertView(BaseView): if not isinstance(data, dict): return _errors(["JSON must be a dictionary"]) keys = data.keys() + # keys must contain "row" or "rows" if "row" not in keys and "rows" not in keys: return _errors(['JSON must have one or other of "row" or "rows"']) @@ -1126,19 +1127,31 @@ class TableInsertView(BaseView): for row in rows: if not isinstance(row, dict): return _errors(['"rows" must be a list of dictionaries']) + # Does this exceed max_insert_rows? max_insert_rows = self.ds.setting("max_insert_rows") if len(rows) > max_insert_rows: return _errors( ["Too many rows, maximum allowed is {}".format(max_insert_rows)] ) + + # Validate other parameters + extras = { + key: value for key, value in data.items() if key not in ("row", "rows") + } + valid_extras = {"return_rows", "ignore", "replace"} + invalid_extras = extras.keys() - valid_extras + if invalid_extras: + return _errors( + ['Invalid parameter: "{}"'.format('", "'.join(sorted(invalid_extras)))] + ) + if extras.get("ignore") and extras.get("replace"): + return _errors(['Cannot use "ignore" and "replace" at the same time']) + # Validate columns of each row - columns = await db.table_columns(table_name) - # TODO: There are cases where pks are OK, if not using auto-incrementing pk - pks = await db.primary_keys(table_name) - allowed_columns = set(columns) - set(pks) + columns = set(await db.table_columns(table_name)) for i, row in enumerate(rows): - invalid_columns = set(row.keys()) - allowed_columns + invalid_columns = set(row.keys()) - columns if invalid_columns: errors.append( "Row {} has invalid columns: {}".format( @@ -1147,8 +1160,7 @@ class TableInsertView(BaseView): ) if errors: return _errors(errors) - extra = {key: data[key] for key in data if key not in ("rows", "row")} - return rows, errors, extra + return rows, errors, extras async def post(self, request): database_route = tilde_decode(request.url_vars["database"]) @@ -1168,18 +1180,23 @@ class TableInsertView(BaseView): request.actor, "insert-row", resource=(database_name, table_name) ): return _error(["Permission denied"], 403) - rows, errors, extra = await self._validate_data(request, db, table_name) + rows, errors, extras = await self._validate_data(request, db, table_name) if errors: return _error(errors, 400) - should_return = bool(extra.get("return_rows", False)) + ignore = extras.get("ignore") + replace = extras.get("replace") + + should_return = bool(extras.get("return_rows", False)) # Insert rows def insert_rows(conn): table = sqlite_utils.Database(conn)[table_name] if should_return: rowids = [] for row in rows: - 
rowids.append(table.insert(row).last_rowid) + rowids.append( + table.insert(row, ignore=ignore, replace=replace).last_rowid + ) return list( table.rows_where( "rowid in ({})".format(",".join("?" for _ in rowids)), @@ -1187,12 +1204,12 @@ class TableInsertView(BaseView): ) ) else: - table.insert_all(rows) + table.insert_all(rows, ignore=ignore, replace=replace) rows = await db.execute_write_fn(insert_rows) result = {"ok": True} if should_return: - result["inserted"] = rows + result["rows"] = rows return Response.json(result, status=201) diff --git a/docs/json_api.rst b/docs/json_api.rst index da4500ab..34c13211 100644 --- a/docs/json_api.rst +++ b/docs/json_api.rst @@ -489,7 +489,7 @@ If successful, this will return a ``201`` status code and the newly inserted row .. code-block:: json { - "inserted": [ + "rows": [ { "id": 1, "column1": "value1", @@ -538,7 +538,7 @@ To return the newly inserted rows, add the ``"return_rows": true`` key to the re "return_rows": true } -This will return the same ``"inserted"`` key as the single row example above. There is a small performance penalty for using this option. +This will return the same ``"rows"`` key as the single row example above. There is a small performance penalty for using this option. .. _RowDeleteView: diff --git a/tests/test_api_write.py b/tests/test_api_write.py index 1cfba104..d0b0f324 100644 --- a/tests/test_api_write.py +++ b/tests/test_api_write.py @@ -37,7 +37,7 @@ async def test_write_row(ds_write): ) expected_row = {"id": 1, "title": "Test", "score": 1.0} assert response.status_code == 201 - assert response.json()["inserted"] == [expected_row] + assert response.json()["rows"] == [expected_row] rows = (await ds_write.get_database("data").execute("select * from docs")).rows assert dict(rows[0]) == expected_row @@ -70,7 +70,7 @@ async def test_write_rows(ds_write, return_rows): ] assert response.json()["ok"] is True if return_rows: - assert response.json()["inserted"] == actual_rows + assert response.json()["rows"] == actual_rows @pytest.mark.asyncio @@ -156,6 +156,27 @@ async def test_write_rows(ds_write, return_rows): 400, ["Too many rows, maximum allowed is 100"], ), + ( + "/data/docs/-/insert", + {"rows": [{"title": "Test"}], "ignore": True, "replace": True}, + None, + 400, + ['Cannot use "ignore" and "replace" at the same time'], + ), + ( + "/data/docs/-/insert", + {"rows": [{"title": "Test"}], "invalid_param": True}, + None, + 400, + ['Invalid parameter: "invalid_param"'], + ), + ( + "/data/docs/-/insert", + {"rows": [{"title": "Test"}], "one": True, "two": True}, + None, + 400, + ['Invalid parameter: "one", "two"'], + ), # Validate columns of each row ( "/data/docs/-/insert", @@ -196,6 +217,62 @@ async def test_write_row_errors( assert response.json()["errors"] == expected_errors +@pytest.mark.asyncio +@pytest.mark.parametrize( + "ignore,replace,expected_rows", + ( + ( + True, + False, + [ + {"id": 1, "title": "Exists", "score": None}, + ], + ), + ( + False, + True, + [ + {"id": 1, "title": "One", "score": None}, + ], + ), + ), +) +@pytest.mark.parametrize("should_return", (True, False)) +async def test_insert_ignore_replace( + ds_write, ignore, replace, expected_rows, should_return +): + await ds_write.get_database("data").execute_write( + "insert into docs (id, title) values (1, 'Exists')" + ) + token = write_token(ds_write) + data = {"rows": [{"id": 1, "title": "One"}]} + if ignore: + data["ignore"] = True + if replace: + data["replace"] = True + if should_return: + data["return_rows"] = True + response = await 
ds_write.client.post( + "/data/docs/-/insert", + json=data, + headers={ + "Authorization": "Bearer {}".format(token), + "Content-Type": "application/json", + }, + ) + assert response.status_code == 201 + actual_rows = [ + dict(r) + for r in ( + await ds_write.get_database("data").execute("select * from docs") + ).rows + ] + assert actual_rows == expected_rows + assert response.json()["ok"] is True + if should_return: + assert response.json()["rows"] == expected_rows + + @pytest.mark.asyncio @pytest.mark.parametrize("scenario", ("no_token", "no_perm", "bad_table", "has_perm")) async def test_delete_row(ds_write, scenario): @@ -217,7 +294,7 @@ async def test_delete_row(ds_write, scenario): }, ) assert insert_response.status_code == 201 - pk = insert_response.json()["inserted"][0]["id"] + pk = insert_response.json()["rows"][0]["id"] path = "/data/{}/{}/-/delete".format( "docs" if scenario != "bad_table" else "bad_table", pk From 497290beaf32e6b779f9683ef15f1c5bc142a41a Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 1 Nov 2022 12:59:17 -0700 Subject: [PATCH 225/952] Handle database errors in /-/insert, refs #1866, #1873 Also improved API explorer to show HTTP status of response, refs #1871 --- datasette/templates/api_explorer.html | 14 +++++++++----- datasette/views/table.py | 5 ++++- tests/test_api_write.py | 11 +++++++++++ 3 files changed, 24 insertions(+), 6 deletions(-) diff --git a/datasette/templates/api_explorer.html b/datasette/templates/api_explorer.html index 38fdb7bc..93bacde3 100644 --- a/datasette/templates/api_explorer.html +++ b/datasette/templates/api_explorer.html @@ -27,7 +27,8 @@ @@ -64,12 +65,15 @@ form.addEventListener("submit", (ev) => { headers: { 'Content-Type': 'application/json', } - }).then(r => r.json()).then(r => { + }).then(r => { + document.getElementById('response-status').textContent = r.status; + return r.json(); + }).then(data => { var errorList = output.querySelector('.errors'); - if (r.errors) { + if (data.errors) { errorList.style.display = 'block'; errorList.innerHTML = ''; - r.errors.forEach(error => { + data.errors.forEach(error => { var li = document.createElement('li'); li.textContent = error; errorList.appendChild(li); @@ -77,7 +81,7 @@ form.addEventListener("submit", (ev) => { } else { errorList.style.display = 'none'; } - output.querySelector('pre').innerText = JSON.stringify(r, null, 2); + output.querySelector('pre').innerText = JSON.stringify(data, null, 2); output.style.display = 'block'; }).catch(err => { alert("Error: " + err); diff --git a/datasette/views/table.py b/datasette/views/table.py index 7692a4e3..61227206 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -1206,7 +1206,10 @@ class TableInsertView(BaseView): else: table.insert_all(rows, ignore=ignore, replace=replace) - rows = await db.execute_write_fn(insert_rows) + try: + rows = await db.execute_write_fn(insert_rows) + except Exception as e: + return _error([str(e)]) result = {"ok": True} if should_return: result["rows"] = rows diff --git a/tests/test_api_write.py b/tests/test_api_write.py index d0b0f324..0b567f48 100644 --- a/tests/test_api_write.py +++ b/tests/test_api_write.py @@ -156,6 +156,13 @@ async def test_write_rows(ds_write, return_rows): 400, ["Too many rows, maximum allowed is 100"], ), + ( + "/data/docs/-/insert", + {"rows": [{"id": 1, "title": "Test"}]}, + "duplicate_id", + 400, + ["UNIQUE constraint failed: docs.id"], + ), ( "/data/docs/-/insert", {"rows": [{"title": "Test"}], "ignore": True, "replace": True}, @@ -194,6 +201,10 @@ 
async def test_write_row_errors( ds_write, path, input, special_case, expected_status, expected_errors ): token = write_token(ds_write) + if special_case == "duplicate_id": + await ds_write.get_database("data").execute_write( + "insert into docs (id) values (1)" + ) if special_case == "bad_token": token += "bad" kwargs = dict( From 0b166befc0096fca30d71e19608a928d59c331a4 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Tue, 1 Nov 2022 17:31:22 -0700 Subject: [PATCH 226/952] API explorer can now do GET, has JSON syntax highlighting Refs #1871 --- .../static/json-format-highlight-1.0.1.js | 43 +++++++++++ datasette/templates/api_explorer.html | 77 +++++++++++++++---- 2 files changed, 103 insertions(+), 17 deletions(-) create mode 100644 datasette/static/json-format-highlight-1.0.1.js diff --git a/datasette/static/json-format-highlight-1.0.1.js b/datasette/static/json-format-highlight-1.0.1.js new file mode 100644 index 00000000..e87c76e1 --- /dev/null +++ b/datasette/static/json-format-highlight-1.0.1.js @@ -0,0 +1,43 @@ +/* +https://github.com/luyilin/json-format-highlight +From https://unpkg.com/json-format-highlight@1.0.1/dist/json-format-highlight.js +MIT Licensed +*/ +(function (global, factory) { + typeof exports === 'object' && typeof module !== 'undefined' ? module.exports = factory() : + typeof define === 'function' && define.amd ? define(factory) : + (global.jsonFormatHighlight = factory()); +}(this, (function () { 'use strict'; + +var defaultColors = { + keyColor: 'dimgray', + numberColor: 'lightskyblue', + stringColor: 'lightcoral', + trueColor: 'lightseagreen', + falseColor: '#f66578', + nullColor: 'cornflowerblue' +}; + +function index (json, colorOptions) { + if ( colorOptions === void 0 ) colorOptions = {}; + + if (!json) { return; } + if (typeof json !== 'string') { + json = JSON.stringify(json, null, 2); + } + var colors = Object.assign({}, defaultColors, colorOptions); + json = json.replace(/&/g, '&').replace(//g, '>'); + return json.replace(/("(\\u[a-zA-Z0-9]{4}|\\[^u]|[^\\"])*"(\s*:)?|\b(true|false|null)\b|-?\d+(?:\.\d*)?(?:[eE][+]?\d+)?)/g, function (match) { + var color = colors.numberColor; + if (/^"/.test(match)) { + color = /:$/.test(match) ? colors.keyColor : colors.stringColor; + } else { + color = /true/.test(match) ? colors.trueColor : /false/.test(match) ? colors.falseColor : /null/.test(match) ? colors.nullColor : color; + } + return ("" + match + ""); + }); +} + +return index; + +}))); diff --git a/datasette/templates/api_explorer.html b/datasette/templates/api_explorer.html index 93bacde3..de5337e3 100644 --- a/datasette/templates/api_explorer.html +++ b/datasette/templates/api_explorer.html @@ -2,6 +2,10 @@ {% block title %}API Explorer{% endblock %} +{% block extra_head %} + +{% endblock %} + {% block content %}

<h1>API Explorer</h1>
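The reworked explorer drives ordinary Datasette endpoints with GET and POST, so the same requests can be scripted directly. A sketch using only the standard library; the base URL, database and table names and the token are placeholders, while the endpoint shapes come from the write API patches above:

.. code-block:: python

    import json
    import urllib.request

    BASE = "http://localhost:8001"  # assumed local instance

    # GET: fetch any JSON endpoint, like the explorer's GET form
    with urllib.request.urlopen(BASE + "/data/docs.json") as r:
        print(r.status, json.load(r))

    # POST: drive the write API, like the explorer's POST form
    req = urllib.request.Request(
        BASE + "/data/docs/-/insert",
        data=json.dumps({"row": {"title": "Test", "score": 1.0}}).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer dstok_...",  # a real token from /-/create-token
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as r:
        print(r.status, json.load(r))  # 201 {"ok": true, "rows": [...]}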

@@ -14,17 +18,30 @@ {% endfor %} {% endif %} -
-
- - -
-
- - -
-

- +
+ GET +
+
+ + + +
+ +
+
+ POST +
+
+ + +
+
+ + +
+

+ +
{% else %} - {% if not canned_write and not error %} + {% if not canned_query_write and not error %}

<p>0 results</p>

{% endif %} {% endif %} diff --git a/datasette/views/database.py b/datasette/views/database.py index 0770a380..658c35e6 100644 --- a/datasette/views/database.py +++ b/datasette/views/database.py @@ -1,4 +1,3 @@ -from asyncinject import Registry from dataclasses import dataclass, field from typing import Callable from urllib.parse import parse_qsl, urlencode @@ -33,7 +32,7 @@ from datasette.utils import ( from datasette.utils.asgi import AsgiFileDownload, NotFound, Response, Forbidden from datasette.plugins import pm -from .base import BaseView, DatasetteError, DataView, View, _error, stream_csv +from .base import BaseView, DatasetteError, View, _error, stream_csv class DatabaseView(View): @@ -57,7 +56,7 @@ class DatabaseView(View): sql = (request.args.get("sql") or "").strip() if sql: - return await query_view(request, datasette) + return await QueryView()(request, datasette) if format_ not in ("html", "json"): raise NotFound("Invalid format: {}".format(format_)) @@ -65,10 +64,6 @@ class DatabaseView(View): metadata = (datasette.metadata("databases") or {}).get(database, {}) datasette.update_with_inherited_metadata(metadata) - table_counts = await db.table_counts(5) - hidden_table_names = set(await db.hidden_table_names()) - all_foreign_keys = await db.get_all_foreign_keys() - sql_views = [] for view_name in await db.view_names(): view_visible, view_private = await datasette.check_visibility( @@ -196,8 +191,13 @@ class QueryContext: # urls: dict = field( # metadata={"help": "Object containing URL helpers like `database()`"} # ) - canned_write: bool = field( - metadata={"help": "Boolean indicating if this canned query allows writes"} + canned_query_write: bool = field( + metadata={ + "help": "Boolean indicating if this is a canned query that allows writes" + } + ) + metadata: dict = field( + metadata={"help": "Metadata about the database or the canned query"} ) db_is_immutable: bool = field( metadata={"help": "Boolean indicating if this database is immutable"} @@ -232,7 +232,6 @@ class QueryContext: show_hide_hidden: str = field( metadata={"help": "Hidden input field for the _show_sql parameter"} ) - metadata: dict = field(metadata={"help": "Metadata about the query/database"}) database_color: Callable = field( metadata={"help": "Function that returns a color for a given database name"} ) @@ -242,6 +241,12 @@ class QueryContext: alternate_url_json: str = field( metadata={"help": "URL for alternate JSON version of this page"} ) + # TODO: refactor this to somewhere else, probably ds.render_template() + select_templates: list = field( + metadata={ + "help": "List of templates that were considered for rendering this page" + } + ) async def get_tables(datasette, request, db): @@ -320,287 +325,105 @@ async def database_download(request, datasette): ) -async def query_view( - request, - datasette, - # canned_query=None, - # _size=None, - # named_parameters=None, - # write=False, -): - db = await datasette.resolve_database(request) - database = db.name - # Flattened because of ?sql=&name1=value1&name2=value2 feature - params = {key: request.args.get(key) for key in request.args} - sql = None - if "sql" in params: - sql = params.pop("sql") - if "_shape" in params: - params.pop("_shape") +class QueryView(View): + async def post(self, request, datasette): + from datasette.app import TableNotFound - # extras come from original request.args to avoid being flattened - extras = request.args.getlist("_extra") + db = await datasette.resolve_database(request) - # TODO: Behave differently for canned query 
here: - await datasette.ensure_permissions(request.actor, [("execute-sql", database)]) - - _, private = await datasette.check_visibility( - request.actor, - permissions=[ - ("view-database", database), - "view-instance", - ], - ) - - extra_args = {} - if params.get("_timelimit"): - extra_args["custom_time_limit"] = int(params["_timelimit"]) - - format_ = request.url_vars.get("format") or "html" - query_error = None - try: - validate_sql_select(sql) - results = await datasette.execute( - database, sql, params, truncate=True, **extra_args - ) - columns = results.columns - rows = results.rows - except QueryInterrupted as ex: - raise DatasetteError( - textwrap.dedent( - """ -

<p>SQL query took too long. The time limit is controlled by the
-                <a href="https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms">sql_time_limit_ms</a>
-                configuration option.</p>
- - - """.format( - markupsafe.escape(ex.sql) - ) - ).strip(), - title="SQL Interrupted", - status=400, - message_is_html=True, - ) - except sqlite3.DatabaseError as ex: - query_error = str(ex) - results = None - rows = [] - columns = [] - except (sqlite3.OperationalError, InvalidSql) as ex: - raise DatasetteError(str(ex), title="Invalid SQL", status=400) - except sqlite3.OperationalError as ex: - raise DatasetteError(str(ex)) - except DatasetteError: - raise - - # Handle formats from plugins - if format_ == "csv": - - async def fetch_data_for_csv(request, _next=None): - results = await db.execute(sql, params, truncate=True) - data = {"rows": results.rows, "columns": results.columns} - return data, None, None - - return await stream_csv(datasette, fetch_data_for_csv, request, db.name) - elif format_ in datasette.renderers.keys(): - # Dispatch request to the correct output format renderer - # (CSV is not handled here due to streaming) - result = call_with_supported_arguments( - datasette.renderers[format_][0], - datasette=datasette, - columns=columns, - rows=rows, - sql=sql, - query_name=None, - database=database, - table=None, - request=request, - view_name="table", - truncated=results.truncated if results else False, - error=query_error, - # These will be deprecated in Datasette 1.0: - args=request.args, - data={"rows": rows, "columns": columns}, - ) - if asyncio.iscoroutine(result): - result = await result - if result is None: - raise NotFound("No data") - if isinstance(result, dict): - r = Response( - body=result.get("body"), - status=result.get("status_code") or 200, - content_type=result.get("content_type", "text/plain"), - headers=result.get("headers"), + # We must be a canned query + table_found = False + try: + await datasette.resolve_table(request) + table_found = True + except TableNotFound as table_not_found: + canned_query = await datasette.get_canned_query( + table_not_found.database_name, table_not_found.table, request.actor ) - elif isinstance(result, Response): - r = result - # if status_code is not None: - # # Over-ride the status code - # r.status = status_code - else: - assert False, f"{result} should be dict or Response" - elif format_ == "html": - headers = {} - templates = [f"query-{to_css_class(database)}.html", "query.html"] - template = datasette.jinja_env.select_template(templates) - alternate_url_json = datasette.absolute_url( - request, - datasette.urls.path(path_with_format(request=request, format="json")), - ) - data = {} - headers.update( - { - "Link": '{}; rel="alternate"; type="application/json+datasette"'.format( - alternate_url_json - ) - } - ) - metadata = (datasette.metadata("databases") or {}).get(database, {}) - datasette.update_with_inherited_metadata(metadata) + if canned_query is None: + raise + if table_found: + # That should not have happened + raise DatasetteError("Unexpected table found on POST", status=404) - renderers = {} - for key, (_, can_render) in datasette.renderers.items(): - it_can_render = call_with_supported_arguments( - can_render, - datasette=datasette, - columns=data.get("columns") or [], - rows=data.get("rows") or [], - sql=data.get("query", {}).get("sql", None), - query_name=data.get("query_name"), - database=database, - table=data.get("table"), - request=request, - view_name="database", + # If database is immutable, return an error + if not db.is_mutable: + raise Forbidden("Database is immutable") + + # Process the POST + body = await request.post_body() + body = body.decode("utf-8").strip() + if body.startswith("{") and 
body.endswith("}"): + params = json.loads(body) + # But we want key=value strings + for key, value in params.items(): + params[key] = str(value) + else: + params = dict(parse_qsl(body, keep_blank_values=True)) + # Should we return JSON? + should_return_json = ( + request.headers.get("accept") == "application/json" + or request.args.get("_json") + or params.get("_json") + ) + params_for_query = MagicParameters(params, request, datasette) + ok = None + redirect_url = None + try: + cursor = await db.execute_write(canned_query["sql"], params_for_query) + message = canned_query.get( + "on_success_message" + ) or "Query executed, {} row{} affected".format( + cursor.rowcount, "" if cursor.rowcount == 1 else "s" + ) + message_type = datasette.INFO + redirect_url = canned_query.get("on_success_redirect") + ok = True + except Exception as ex: + message = canned_query.get("on_error_message") or str(ex) + message_type = datasette.ERROR + redirect_url = canned_query.get("on_error_redirect") + ok = False + if should_return_json: + return Response.json( + { + "ok": ok, + "message": message, + "redirect": redirect_url, + } ) - it_can_render = await await_me_maybe(it_can_render) - if it_can_render: - renderers[key] = datasette.urls.path( - path_with_format(request=request, format=key) - ) - - allow_execute_sql = await datasette.permission_allowed( - request.actor, "execute-sql", database - ) - - show_hide_hidden = "" - if metadata.get("hide_sql"): - if bool(params.get("_show_sql")): - show_hide_link = path_with_removed_args(request, {"_show_sql"}) - show_hide_text = "hide" - show_hide_hidden = '' - else: - show_hide_link = path_with_added_args(request, {"_show_sql": 1}) - show_hide_text = "show" else: - if bool(params.get("_hide_sql")): - show_hide_link = path_with_removed_args(request, {"_hide_sql"}) - show_hide_text = "show" - show_hide_hidden = '' - else: - show_hide_link = path_with_added_args(request, {"_hide_sql": 1}) - show_hide_text = "hide" - hide_sql = show_hide_text == "show" + datasette.add_message(request, message, message_type) + return Response.redirect(redirect_url or request.path) - # Extract any :named parameters - named_parameters = await derive_named_parameters( - datasette.get_database(database), sql - ) - named_parameter_values = { - named_parameter: params.get(named_parameter) or "" - for named_parameter in named_parameters - if not named_parameter.startswith("_") - } + async def get(self, request, datasette): + from datasette.app import TableNotFound - # Set to blank string if missing from params - for named_parameter in named_parameters: - if named_parameter not in params and not named_parameter.startswith("_"): - params[named_parameter] = "" - - r = Response.html( - await datasette.render_template( - template, - QueryContext( - database=database, - query={ - "sql": sql, - "params": params, - }, - canned_query=None, - private=private, - canned_write=False, - db_is_immutable=not db.is_mutable, - error=query_error, - hide_sql=hide_sql, - show_hide_link=datasette.urls.path(show_hide_link), - show_hide_text=show_hide_text, - editable=True, # TODO - allow_execute_sql=allow_execute_sql, - tables=await get_tables(datasette, request, db), - named_parameter_values=named_parameter_values, - edit_sql_url="todo", - display_rows=await display_rows( - datasette, database, request, rows, columns - ), - table_columns=await _table_columns(datasette, database) - if allow_execute_sql - else {}, - columns=columns, - renderers=renderers, - url_csv=datasette.urls.path( - path_with_format( - 
request=request, format="csv", extra_qs={"_size": "max"} - ) - ), - show_hide_hidden=markupsafe.Markup(show_hide_hidden), - metadata=metadata, - database_color=lambda _: "#ff0000", - alternate_url_json=alternate_url_json, - ), - request=request, - view_name="database", - ), - headers=headers, - ) - else: - assert False, "Invalid format: {}".format(format_) - if datasette.cors: - add_cors_headers(r.headers) - return r - - -class QueryView(DataView): - async def data( - self, - request, - sql, - editable=True, - canned_query=None, - metadata=None, - _size=None, - named_parameters=None, - write=False, - default_labels=None, - ): - db = await self.ds.resolve_database(request) + db = await datasette.resolve_database(request) database = db.name - params = {key: request.args.get(key) for key in request.args} - if "sql" in params: - params.pop("sql") - if "_shape" in params: - params.pop("_shape") + + # Are we a canned query? + canned_query = None + canned_query_write = False + if "table" in request.url_vars: + try: + await datasette.resolve_table(request) + except TableNotFound as table_not_found: + # Was this actually a canned query? + canned_query = await datasette.get_canned_query( + table_not_found.database_name, table_not_found.table, request.actor + ) + if canned_query is None: + raise + canned_query_write = bool(canned_query.get("write")) private = False if canned_query: # Respect canned query permissions - visible, private = await self.ds.check_visibility( + visible, private = await datasette.check_visibility( request.actor, permissions=[ - ("view-query", (database, canned_query)), + ("view-query", (database, canned_query["name"])), ("view-database", database), "view-instance", ], @@ -609,18 +432,32 @@ class QueryView(DataView): raise Forbidden("You do not have permission to view this query") else: - await self.ds.ensure_permissions(request.actor, [("execute-sql", database)]) + await datasette.ensure_permissions( + request.actor, [("execute-sql", database)] + ) + + # Flattened because of ?sql=&name1=value1&name2=value2 feature + params = {key: request.args.get(key) for key in request.args} + sql = None + + if canned_query: + sql = canned_query["sql"] + elif "sql" in params: + sql = params.pop("sql") # Extract any :named parameters - named_parameters = named_parameters or await derive_named_parameters( - self.ds.get_database(database), sql - ) + named_parameters = [] + if canned_query and canned_query.get("params"): + named_parameters = canned_query["params"] + if not named_parameters: + named_parameters = await derive_named_parameters( + datasette.get_database(database), sql + ) named_parameter_values = { named_parameter: params.get(named_parameter) or "" for named_parameter in named_parameters if not named_parameter.startswith("_") } - # Set to blank string if missing from params for named_parameter in named_parameters: if named_parameter not in params and not named_parameter.startswith("_"): @@ -629,212 +466,159 @@ class QueryView(DataView): extra_args = {} if params.get("_timelimit"): extra_args["custom_time_limit"] = int(params["_timelimit"]) - if _size: - extra_args["page_size"] = _size - templates = [f"query-{to_css_class(database)}.html", "query.html"] - if canned_query: - templates.insert( - 0, - f"query-{to_css_class(database)}-{to_css_class(canned_query)}.html", - ) + format_ = request.url_vars.get("format") or "html" query_error = None + results = None + rows = [] + columns = [] - # Execute query - as write or as read - if write: - if request.method == "POST": - # If database 
is immutable, return an error - if not db.is_mutable: - raise Forbidden("Database is immutable") - body = await request.post_body() - body = body.decode("utf-8").strip() - if body.startswith("{") and body.endswith("}"): - params = json.loads(body) - # But we want key=value strings - for key, value in params.items(): - params[key] = str(value) - else: - params = dict(parse_qsl(body, keep_blank_values=True)) - # Should we return JSON? - should_return_json = ( - request.headers.get("accept") == "application/json" - or request.args.get("_json") - or params.get("_json") - ) - if canned_query: - params_for_query = MagicParameters(params, request, self.ds) - else: - params_for_query = params - ok = None - try: - cursor = await self.ds.databases[database].execute_write( - sql, params_for_query - ) - message = metadata.get( - "on_success_message" - ) or "Query executed, {} row{} affected".format( - cursor.rowcount, "" if cursor.rowcount == 1 else "s" - ) - message_type = self.ds.INFO - redirect_url = metadata.get("on_success_redirect") - ok = True - except Exception as e: - message = metadata.get("on_error_message") or str(e) - message_type = self.ds.ERROR - redirect_url = metadata.get("on_error_redirect") - ok = False - if should_return_json: - return Response.json( - { - "ok": ok, - "message": message, - "redirect": redirect_url, - } - ) - else: - self.ds.add_message(request, message, message_type) - return self.redirect(request, redirect_url or request.path) - else: + params_for_query = params - async def extra_template(): - return { - "request": request, - "db_is_immutable": not db.is_mutable, - "path_with_added_args": path_with_added_args, - "path_with_removed_args": path_with_removed_args, - "named_parameter_values": named_parameter_values, - "canned_query": canned_query, - "success_message": request.args.get("_success") or "", - "canned_write": True, - } - - return ( - { - "database": database, - "rows": [], - "truncated": False, - "columns": [], - "query": {"sql": sql, "params": params}, - "private": private, - }, - extra_template, - templates, - ) - else: # Not a write - if canned_query: - params_for_query = MagicParameters(params, request, self.ds) - else: - params_for_query = params + if not canned_query_write: try: - results = await self.ds.execute( + if not canned_query: + # For regular queries we only allow SELECT, plus other rules + validate_sql_select(sql) + else: + # Canned queries can run magic parameters + params_for_query = MagicParameters(params, request, datasette) + results = await datasette.execute( database, sql, params_for_query, truncate=True, **extra_args ) - columns = [r[0] for r in results.description] - except sqlite3.DatabaseError as e: - query_error = e + columns = results.columns + rows = results.rows + except QueryInterrupted as ex: + raise DatasetteError( + textwrap.dedent( + """ +

SQL query took too long. The time limit is controlled by the
+                    <a href="https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms">sql_time_limit_ms</a>
+                    configuration option.

+ + + """.format( + markupsafe.escape(ex.sql) + ) + ).strip(), + title="SQL Interrupted", + status=400, + message_is_html=True, + ) + except sqlite3.DatabaseError as ex: + query_error = str(ex) results = None + rows = [] columns = [] + except (sqlite3.OperationalError, InvalidSql) as ex: + raise DatasetteError(str(ex), title="Invalid SQL", status=400) + except sqlite3.OperationalError as ex: + raise DatasetteError(str(ex)) + except DatasetteError: + raise - allow_execute_sql = await self.ds.permission_allowed( - request.actor, "execute-sql", database - ) + # Handle formats from plugins + if format_ == "csv": - async def extra_template(): - display_rows = [] - truncate_cells = self.ds.setting("truncate_cells_html") - for row in results.rows if results else []: - display_row = [] - for column, value in zip(results.columns, row): - display_value = value - # Let the plugins have a go - # pylint: disable=no-member - plugin_display_value = None - for candidate in pm.hook.render_cell( - row=row, - value=value, - column=column, - table=None, - database=database, - datasette=self.ds, - request=request, - ): - candidate = await await_me_maybe(candidate) - if candidate is not None: - plugin_display_value = candidate - break - if plugin_display_value is not None: - display_value = plugin_display_value - else: - if value in ("", None): - display_value = markupsafe.Markup(" ") - elif is_url(str(display_value).strip()): - display_value = markupsafe.Markup( - '{truncated_url}'.format( - url=markupsafe.escape(value.strip()), - truncated_url=markupsafe.escape( - truncate_url(value.strip(), truncate_cells) - ), - ) - ) - elif isinstance(display_value, bytes): - blob_url = path_with_format( - request=request, - format="blob", - extra_qs={ - "_blob_column": column, - "_blob_hash": hashlib.sha256( - display_value - ).hexdigest(), - }, - ) - formatted = format_bytes(len(value)) - display_value = markupsafe.Markup( - '<Binary: {:,} byte{}>'.format( - blob_url, - ' title="{}"'.format(formatted) - if "bytes" not in formatted - else "", - len(value), - "" if len(value) == 1 else "s", - ) - ) - else: - display_value = str(value) - if truncate_cells and len(display_value) > truncate_cells: - display_value = ( - display_value[:truncate_cells] + "\u2026" - ) - display_row.append(display_value) - display_rows.append(display_row) + async def fetch_data_for_csv(request, _next=None): + results = await db.execute(sql, params, truncate=True) + data = {"rows": results.rows, "columns": results.columns} + return data, None, None - # Show 'Edit SQL' button only if: - # - User is allowed to execute SQL - # - SQL is an approved SELECT statement - # - No magic parameters, so no :_ in the SQL string - edit_sql_url = None - is_validated_sql = False - try: - validate_sql_select(sql) - is_validated_sql = True - except InvalidSql: - pass - if allow_execute_sql and is_validated_sql and ":_" not in sql: - edit_sql_url = ( - self.ds.urls.database(database) - + "?" 
- + urlencode( - { - **{ - "sql": sql, - }, - **named_parameter_values, - } - ) + return await stream_csv(datasette, fetch_data_for_csv, request, db.name) + elif format_ in datasette.renderers.keys(): + # Dispatch request to the correct output format renderer + # (CSV is not handled here due to streaming) + result = call_with_supported_arguments( + datasette.renderers[format_][0], + datasette=datasette, + columns=columns, + rows=rows, + sql=sql, + query_name=canned_query["name"] if canned_query else None, + database=database, + table=None, + request=request, + view_name="table", + truncated=results.truncated if results else False, + error=query_error, + # These will be deprecated in Datasette 1.0: + args=request.args, + data={"rows": rows, "columns": columns}, + ) + if asyncio.iscoroutine(result): + result = await result + if result is None: + raise NotFound("No data") + if isinstance(result, dict): + r = Response( + body=result.get("body"), + status=result.get("status_code") or 200, + content_type=result.get("content_type", "text/plain"), + headers=result.get("headers"), + ) + elif isinstance(result, Response): + r = result + # if status_code is not None: + # # Over-ride the status code + # r.status = status_code + else: + assert False, f"{result} should be dict or Response" + elif format_ == "html": + headers = {} + templates = [f"query-{to_css_class(database)}.html", "query.html"] + if canned_query: + templates.insert( + 0, + f"query-{to_css_class(database)}-{to_css_class(canned_query['name'])}.html", ) + template = datasette.jinja_env.select_template(templates) + alternate_url_json = datasette.absolute_url( + request, + datasette.urls.path(path_with_format(request=request, format="json")), + ) + data = {} + headers.update( + { + "Link": '{}; rel="alternate"; type="application/json+datasette"'.format( + alternate_url_json + ) + } + ) + metadata = (datasette.metadata("databases") or {}).get(database, {}) + datasette.update_with_inherited_metadata(metadata) + + renderers = {} + for key, (_, can_render) in datasette.renderers.items(): + it_can_render = call_with_supported_arguments( + can_render, + datasette=datasette, + columns=data.get("columns") or [], + rows=data.get("rows") or [], + sql=data.get("query", {}).get("sql", None), + query_name=data.get("query_name"), + database=database, + table=data.get("table"), + request=request, + view_name="database", + ) + it_can_render = await await_me_maybe(it_can_render) + if it_can_render: + renderers[key] = datasette.urls.path( + path_with_format(request=request, format=key) + ) + + allow_execute_sql = await datasette.permission_allowed( + request.actor, "execute-sql", database + ) + show_hide_hidden = "" - if metadata.get("hide_sql"): + if canned_query and canned_query.get("hide_sql"): if bool(params.get("_show_sql")): show_hide_link = path_with_removed_args(request, {"_show_sql"}) show_hide_text = "hide" @@ -855,42 +639,86 @@ class QueryView(DataView): show_hide_link = path_with_added_args(request, {"_hide_sql": 1}) show_hide_text = "hide" hide_sql = show_hide_text == "show" - return { - "display_rows": display_rows, - "custom_sql": True, - "named_parameter_values": named_parameter_values, - "editable": editable, - "canned_query": canned_query, - "edit_sql_url": edit_sql_url, - "metadata": metadata, - "settings": self.ds.settings_dict(), - "request": request, - "show_hide_link": self.ds.urls.path(show_hide_link), - "show_hide_text": show_hide_text, - "show_hide_hidden": markupsafe.Markup(show_hide_hidden), - "hide_sql": hide_sql, - 
"table_columns": await _table_columns(self.ds, database) - if allow_execute_sql - else {}, - } - return ( - { - "ok": not query_error, - "database": database, - "query_name": canned_query, - "rows": results.rows if results else [], - "truncated": results.truncated if results else False, - "columns": columns, - "query": {"sql": sql, "params": params}, - "error": str(query_error) if query_error else None, - "private": private, - "allow_execute_sql": allow_execute_sql, - }, - extra_template, - templates, - 400 if query_error else 200, - ) + # Show 'Edit SQL' button only if: + # - User is allowed to execute SQL + # - SQL is an approved SELECT statement + # - No magic parameters, so no :_ in the SQL string + edit_sql_url = None + is_validated_sql = False + try: + validate_sql_select(sql) + is_validated_sql = True + except InvalidSql: + pass + if allow_execute_sql and is_validated_sql and ":_" not in sql: + edit_sql_url = ( + datasette.urls.database(database) + + "?" + + urlencode( + { + **{ + "sql": sql, + }, + **named_parameter_values, + } + ) + ) + + r = Response.html( + await datasette.render_template( + template, + QueryContext( + database=database, + query={ + "sql": sql, + "params": params, + }, + canned_query=canned_query["name"] if canned_query else None, + private=private, + canned_query_write=canned_query_write, + db_is_immutable=not db.is_mutable, + error=query_error, + hide_sql=hide_sql, + show_hide_link=datasette.urls.path(show_hide_link), + show_hide_text=show_hide_text, + editable=not canned_query, + allow_execute_sql=allow_execute_sql, + tables=await get_tables(datasette, request, db), + named_parameter_values=named_parameter_values, + edit_sql_url=edit_sql_url, + display_rows=await display_rows( + datasette, database, request, rows, columns + ), + table_columns=await _table_columns(datasette, database) + if allow_execute_sql + else {}, + columns=columns, + renderers=renderers, + url_csv=datasette.urls.path( + path_with_format( + request=request, format="csv", extra_qs={"_size": "max"} + ) + ), + show_hide_hidden=markupsafe.Markup(show_hide_hidden), + metadata=canned_query or metadata, + database_color=lambda _: "#ff0000", + alternate_url_json=alternate_url_json, + select_templates=[ + f"{'*' if template_name == template.name else ''}{template_name}" + for template_name in templates + ], + ), + request=request, + view_name="database", + ), + headers=headers, + ) + else: + assert False, "Invalid format: {}".format(format_) + if datasette.cors: + add_cors_headers(r.headers) + return r class MagicParameters(dict): diff --git a/datasette/views/table.py b/datasette/views/table.py index 77acfd95..28264e92 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -9,7 +9,6 @@ import markupsafe from datasette.plugins import pm from datasette.database import QueryInterrupted from datasette import tracer -from datasette.renderer import json_renderer from datasette.utils import ( add_cors_headers, await_me_maybe, @@ -21,7 +20,6 @@ from datasette.utils import ( tilde_encode, escape_sqlite, filters_should_redirect, - format_bytes, is_url, path_from_row_pks, path_with_added_args, @@ -38,7 +36,7 @@ from datasette.utils import ( from datasette.utils.asgi import BadRequest, Forbidden, NotFound, Response from datasette.filters import Filters import sqlite_utils -from .base import BaseView, DataView, DatasetteError, ureg, _error, stream_csv +from .base import BaseView, DatasetteError, ureg, _error, stream_csv from .database import QueryView LINK_WITH_LABEL = ( @@ -698,57 +696,6 @@ 
async def table_view(datasette, request): return response -class CannedQueryView(DataView): - def __init__(self, datasette): - self.ds = datasette - - async def post(self, request): - from datasette.app import TableNotFound - - try: - await self.ds.resolve_table(request) - except TableNotFound as e: - # Was this actually a canned query? - canned_query = await self.ds.get_canned_query( - e.database_name, e.table, request.actor - ) - if canned_query: - # Handle POST to a canned query - return await QueryView(self.ds).data( - request, - canned_query["sql"], - metadata=canned_query, - editable=False, - canned_query=e.table, - named_parameters=canned_query.get("params"), - write=bool(canned_query.get("write")), - ) - - return Response.text("Method not allowed", status=405) - - async def data(self, request, **kwargs): - from datasette.app import TableNotFound - - try: - await self.ds.resolve_table(request) - except TableNotFound as not_found: - canned_query = await self.ds.get_canned_query( - not_found.database_name, not_found.table, request.actor - ) - if canned_query: - return await QueryView(self.ds).data( - request, - canned_query["sql"], - metadata=canned_query, - editable=False, - canned_query=not_found.table, - named_parameters=canned_query.get("params"), - write=bool(canned_query.get("write")), - ) - else: - raise - - async def table_view_traced(datasette, request): from datasette.app import TableNotFound @@ -761,10 +708,7 @@ async def table_view_traced(datasette, request): ) # If this is a canned query, not a table, then dispatch to QueryView instead if canned_query: - if request.method == "POST": - return await CannedQueryView(datasette).post(request) - else: - return await CannedQueryView(datasette).get(request) + return await QueryView()(request, datasette) else: raise diff --git a/tests/test_canned_queries.py b/tests/test_canned_queries.py index d6a88733..e9ad3239 100644 --- a/tests/test_canned_queries.py +++ b/tests/test_canned_queries.py @@ -95,12 +95,12 @@ def test_insert(canned_write_client): csrftoken_from=True, cookies={"foo": "bar"}, ) - assert 302 == response.status - assert "/data/add_name?success" == response.headers["Location"] messages = canned_write_client.ds.unsign( response.cookies["ds_messages"], "messages" ) - assert [["Query executed, 1 row affected", 1]] == messages + assert messages == [["Query executed, 1 row affected", 1]] + assert response.status == 302 + assert response.headers["Location"] == "/data/add_name?success" @pytest.mark.parametrize( @@ -382,11 +382,11 @@ def test_magic_parameters_cannot_be_used_in_arbitrary_queries(magic_parameters_c def test_canned_write_custom_template(canned_write_client): response = canned_write_client.get("/data/update_name") assert response.status == 200 + assert "!!!CUSTOM_UPDATE_NAME_TEMPLATE!!!" in response.text assert ( "" in response.text ) - assert "!!!CUSTOM_UPDATE_NAME_TEMPLATE!!!" in response.text # And test for link rel=alternate while we're here: assert ( '' From 8920d425f4d417cfd998b61016c5ff3530cd34e1 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Wed, 9 Aug 2023 10:20:58 -0700 Subject: [PATCH 468/952] 1.0a3 release notes, smaller changes section - refs #2135 --- docs/changelog.rst | 19 +++++++++++++++++++ 1 file changed, 19 insertions(+) diff --git a/docs/changelog.rst b/docs/changelog.rst index ee48d075..b4416f94 100644 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -4,6 +4,25 @@ Changelog ========= +.. 
_v1_0_a3:
+
+1.0a3 (2023-08-09)
+------------------
+
+This alpha release previews the updated design for Datasette's default JSON API.
+
+Smaller changes
+~~~~~~~~~~~~~~~
+
+- Datasette documentation now shows YAML examples for :ref:`metadata` by default, with a tab interface for switching to JSON. (:issue:`1153`)
+- :ref:`plugin_register_output_renderer` plugins now have access to ``error`` and ``truncated`` arguments, allowing them to display error messages and take into account truncated results. (:issue:`2130`)
+- ``render_cell()`` plugin hook now also supports an optional ``request`` argument. (:issue:`2007`)
+- New ``Justfile`` to support development workflows for Datasette using `Just <https://github.com/casey/just>`__.
+- ``datasette.render_template()`` can now accept a ``datasette.views.Context`` subclass as an alternative to a dictionary. (:issue:`2127`)
+- ``datasette install -e path`` option for editable installations, useful while developing plugins. (:issue:`2106`)
+- When started with the ``--cors`` option Datasette now serves an ``Access-Control-Max-Age: 3600`` header, ensuring CORS OPTIONS requests are repeated no more than once an hour. (:issue:`2079`)
+- Fixed a bug where the ``_internal`` database could display ``None`` instead of ``null`` for in-memory databases. (:issue:`1970`)
+
 .. _v0_64_2:
 
 0.64.2 (2023-03-08)

From e34d09c6ec16ff5e7717e112afdad67f7c05a62a Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Wed, 9 Aug 2023 12:01:59 -0700
Subject: [PATCH 469/952] Don't include columns in query JSON, refs #2136

---
 datasette/renderer.py       |  8 +++++++-
 datasette/views/database.py |  2 +-
 tests/test_api.py           |  1 -
 tests/test_cli_serve_get.py | 11 ++++++-----
 4 files changed, 14 insertions(+), 8 deletions(-)

diff --git a/datasette/renderer.py b/datasette/renderer.py
index 0bd74e81..224031a7 100644
--- a/datasette/renderer.py
+++ b/datasette/renderer.py
@@ -27,7 +27,7 @@ def convert_specific_columns_to_json(rows, columns, json_cols):
     return new_rows
 
 
-def json_renderer(args, data, error, truncated=None):
+def json_renderer(request, args, data, error, truncated=None):
     """Render a response as JSON"""
     status_code = 200
 
@@ -106,6 +106,12 @@ def json_renderer(args, data, error, truncated=None):
             "status": 400,
             "title": None,
         }
+
+    # Don't include "columns" in output
+    # https://github.com/simonw/datasette/issues/2136
+    if isinstance(data, dict) and "columns" not in request.args.getlist("_extra"):
+        data.pop("columns", None)
+
     # Handle _nl option for _shape=array
     nl = args.get("_nl", "")
     if nl and shape == "array":
diff --git a/datasette/views/database.py b/datasette/views/database.py
index 658c35e6..cf76f3c2 100644
--- a/datasette/views/database.py
+++ b/datasette/views/database.py
@@ -548,7 +548,7 @@ class QueryView(View):
             error=query_error,
             # These will be deprecated in Datasette 1.0:
             args=request.args,
-            data={"rows": rows, "columns": columns},
+            data={"ok": True, "rows": rows, "columns": columns},
         )
         if asyncio.iscoroutine(result):
             result = await result
diff --git a/tests/test_api.py b/tests/test_api.py
index 28415a0b..f96f571e 100644
--- a/tests/test_api.py
+++ b/tests/test_api.py
@@ -649,7 +649,6 @@ async def test_custom_sql(ds_client):
             {"content": "RENDER_CELL_DEMO"},
             {"content": "RENDER_CELL_ASYNC"},
         ],
-        "columns": ["content"],
         "ok": True,
         "truncated": False,
     }
diff --git a/tests/test_cli_serve_get.py b/tests/test_cli_serve_get.py
index 2e0390bb..dc7fc1e2 100644
--- a/tests/test_cli_serve_get.py
+++ b/tests/test_cli_serve_get.py
@@ -34,11 +34,12 @@ def test_serve_with_get(tmp_path_factory):
"/_memory.json?sql=select+sqlite_version()", ], ) - assert 0 == result.exit_code, result.output - assert { - "truncated": False, - "columns": ["sqlite_version()"], - }.items() <= json.loads(result.output).items() + assert result.exit_code == 0, result.output + data = json.loads(result.output) + # Should have a single row with a single column + assert len(data["rows"]) == 1 + assert list(data["rows"][0].keys()) == ["sqlite_version()"] + assert set(data.keys()) == {"rows", "ok", "truncated"} # The plugin should have created hello.txt assert (plugins_dir / "hello.txt").read_text() == "hello" From 856ca68d94708c6e94673cb6bc28bf3e3ca17845 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Wed, 9 Aug 2023 12:04:40 -0700 Subject: [PATCH 470/952] Update default JSON representation docs, refs #2135 --- docs/json_api.rst | 24 ++++++++++++++++-------- 1 file changed, 16 insertions(+), 8 deletions(-) diff --git a/docs/json_api.rst b/docs/json_api.rst index c273c2a8..16b997eb 100644 --- a/docs/json_api.rst +++ b/docs/json_api.rst @@ -9,10 +9,10 @@ through the Datasette user interface can also be accessed as JSON via the API. To access the API for a page, either click on the ``.json`` link on that page or edit the URL and add a ``.json`` extension to it. -.. _json_api_shapes: +.. _json_api_default: -Different shapes ----------------- +Default representation +---------------------- The default JSON representation of data from a SQLite table or custom query looks like this: @@ -21,7 +21,6 @@ looks like this: { "ok": true, - "next": null, "rows": [ { "id": 3, @@ -39,13 +38,22 @@ looks like this: "id": 1, "name": "San Francisco" } - ] + ], + "truncated": false } -The ``rows`` key is a list of objects, each one representing a row. ``next`` indicates if -there is another page, and ``ok`` is always ``true`` if an error did not occur. +``"ok"`` is always ``true`` if an error did not occur. -If ``next`` is present then the next page in the pagination set can be retrieved using ``?_next=VALUE``. +The ``"rows"`` key is a list of objects, each one representing a row. + +The ``"truncated"`` key lets you know if the query was truncated. This can happen if a SQL query returns more than 1,000 results (or the :ref:`setting_max_returned_rows` setting). + +For table pages, an additional key ``"next"`` may be present. This indicates that the next page in the pagination set can be retrieved using ``?_next=VALUE``. + +.. _json_api_shapes: + +Different shapes +---------------- The ``_shape`` parameter can be used to access alternative formats for the ``rows`` key which may be more convenient for your application. There are three From 90cb9ca58d910f49e8f117bbdd94df6f0855cf99 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Wed, 9 Aug 2023 12:11:16 -0700 Subject: [PATCH 471/952] JSON changes in release notes, refs #2135 --- docs/changelog.rst | 35 ++++++++++++++++++++++++++++++++++- 1 file changed, 34 insertions(+), 1 deletion(-) diff --git a/docs/changelog.rst b/docs/changelog.rst index b4416f94..4c70855b 100644 --- a/docs/changelog.rst +++ b/docs/changelog.rst @@ -9,7 +9,40 @@ Changelog 1.0a3 (2023-08-09) ------------------ -This alpha release previews the updated design for Datasette's default JSON API. +This alpha release previews the updated design for Datasette's default JSON API. (:issue:`782`) + +The new :ref:`default JSON representation ` for both table pages (``/dbname/table.json``) and arbitrary SQL queries (``/dbname.json?sql=...``) is now shaped like this: + +.. 
code-block:: json + + { + "ok": true, + "rows": [ + { + "id": 3, + "name": "Detroit" + }, + { + "id": 2, + "name": "Los Angeles" + }, + { + "id": 4, + "name": "Memnonia" + }, + { + "id": 1, + "name": "San Francisco" + } + ], + "truncated": false + } + +Tables will include an additional ``"next"`` key for pagination, which can be passed to ``?_next=`` to fetch the next page of results. + +The various ``?_shape=`` options continue to work as before - see :ref:`json_api_shapes` for details. + +A new ``?_extra=`` mechanism is available for tables, but has not yet been stabilized or documented. Details on that are available in :issue:`262`. Smaller changes ~~~~~~~~~~~~~~~ From 19ab4552e212c9845a59461cc73e82d5ae8c278a Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Wed, 9 Aug 2023 12:13:11 -0700 Subject: [PATCH 472/952] Release 1.0a3 Closes #2135 Refs #262, #782, #1153, #1970, #2007, #2079, #2106, #2127, #2130 --- datasette/version.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/datasette/version.py b/datasette/version.py index 3b81ab21..61dee464 100644 --- a/datasette/version.py +++ b/datasette/version.py @@ -1,2 +1,2 @@ -__version__ = "1.0a2" +__version__ = "1.0a3" __version_info__ = tuple(__version__.split(".")) From 4a42476bb7ce4c5ed941f944115dedd9bce34656 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Wed, 9 Aug 2023 15:04:16 -0700 Subject: [PATCH 473/952] datasette plugins --requirements, closes #2133 --- datasette/cli.py | 12 ++++++++++-- docs/cli-reference.rst | 1 + docs/plugins.rst | 32 ++++++++++++++++++++++++++++---- tests/test_cli.py | 3 +++ 4 files changed, 42 insertions(+), 6 deletions(-) diff --git a/datasette/cli.py b/datasette/cli.py index 32266888..21fd25d6 100644 --- a/datasette/cli.py +++ b/datasette/cli.py @@ -223,15 +223,23 @@ pm.hook.publish_subcommand(publish=publish) @cli.command() @click.option("--all", help="Include built-in default plugins", is_flag=True) +@click.option( + "--requirements", help="Output requirements.txt of installed plugins", is_flag=True +) @click.option( "--plugins-dir", type=click.Path(exists=True, file_okay=False, dir_okay=True), help="Path to directory containing custom plugins", ) -def plugins(all, plugins_dir): +def plugins(all, requirements, plugins_dir): """List currently installed plugins""" app = Datasette([], plugins_dir=plugins_dir) - click.echo(json.dumps(app._plugins(all=all), indent=4)) + if requirements: + for plugin in app._plugins(): + if plugin["version"]: + click.echo("{}=={}".format(plugin["name"], plugin["version"])) + else: + click.echo(json.dumps(app._plugins(all=all), indent=4)) @cli.command() diff --git a/docs/cli-reference.rst b/docs/cli-reference.rst index 2177fc9e..7a96d311 100644 --- a/docs/cli-reference.rst +++ b/docs/cli-reference.rst @@ -282,6 +282,7 @@ Output JSON showing all currently installed plugins, their versions, whether the Options: --all Include built-in default plugins + --requirements Output requirements.txt of installed plugins --plugins-dir DIRECTORY Path to directory containing custom plugins --help Show this message and exit. diff --git a/docs/plugins.rst b/docs/plugins.rst index 979f94dd..19bfdd0c 100644 --- a/docs/plugins.rst +++ b/docs/plugins.rst @@ -90,7 +90,12 @@ You can see a list of installed plugins by navigating to the ``/-/plugins`` page You can also use the ``datasette plugins`` command:: - $ datasette plugins + datasette plugins + +Which outputs: + +.. 
code-block:: json + [ { "name": "datasette_json_html", @@ -107,7 +112,8 @@ You can also use the ``datasette plugins`` command:: cog.out("\n") result = CliRunner().invoke(cli.cli, ["plugins", "--all"]) # cog.out() with text containing newlines was unindenting for some reason - cog.outl("If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette::\n") + cog.outl("If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette:\n") + cog.outl(".. code-block:: json\n") plugins = [p for p in json.loads(result.output) if p["name"].startswith("datasette.")] indented = textwrap.indent(json.dumps(plugins, indent=4), " ") for line in indented.split("\n"): @@ -115,7 +121,9 @@ You can also use the ``datasette plugins`` command:: cog.out("\n\n") .. ]]] -If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette:: +If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette: + +.. code-block:: json [ { @@ -236,6 +244,22 @@ If you run ``datasette plugins --all`` it will include default plugins that ship You can add the ``--plugins-dir=`` option to include any plugins found in that directory. +Add ``--requirements`` to output a list of installed plugins that can then be installed in another Datasette instance using ``datasette install -r requirements.txt``:: + + datasette plugins --requirements + +The output will look something like this:: + + datasette-codespaces==0.1.1 + datasette-graphql==2.2 + datasette-json-html==1.0.1 + datasette-pretty-json==0.2.2 + datasette-x-forwarded-host==0.1 + +To write that to a ``requirements.txt`` file, run this:: + + datasette plugins --requirements > requirements.txt + .. _plugins_configuration: Plugin configuration @@ -390,7 +414,7 @@ Any values embedded in ``metadata.yaml`` will be visible to anyone who views the If you are publishing your data using the :ref:`datasette publish ` family of commands, you can use the ``--plugin-secret`` option to set these secrets at publish time. 
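Plugins can read values set with ``--plugin-secret`` back through Datasette's plugin configuration API. The following is a minimal sketch, not part of the patch above: it assumes a plugin named ``datasette-auth-github`` with a ``client_id`` setting, matching the Heroku example that follows.

.. code-block:: python

    from datasette import hookimpl

    @hookimpl
    def startup(datasette):
        # plugin_config() returns this plugin's section of the metadata,
        # which is where --plugin-secret values surface at runtime
        config = datasette.plugin_config("datasette-auth-github") or {}
        client_id = config.get("client_id")

The plugin code never needs to hard-code the secret itself; the value is resolved from the environment variable that ``--plugin-secret`` wires up at publish time.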
For example, using Heroku you might run the following command:: - $ datasette publish heroku my_database.db \ + datasette publish heroku my_database.db \ --name my-heroku-app-demo \ --install=datasette-auth-github \ --plugin-secret datasette-auth-github client_id your_client_id \ diff --git a/tests/test_cli.py b/tests/test_cli.py index 75724f61..056e2821 100644 --- a/tests/test_cli.py +++ b/tests/test_cli.py @@ -108,6 +108,9 @@ def test_plugins_cli(app_client): assert set(names).issuperset({p["name"] for p in EXPECTED_PLUGINS}) # And the following too: assert set(names).issuperset(DEFAULT_PLUGINS) + # --requirements should be empty because there are no installed non-plugins-dir plugins + result3 = runner.invoke(cli, ["plugins", "--requirements"]) + assert result3.output == "" def test_metadata_yaml(): From a3593c901580ea50854c3e0774b0ba0126e8a76f Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Wed, 9 Aug 2023 17:32:07 -0700 Subject: [PATCH 474/952] on_success_message_sql, closes #2138 --- datasette/views/database.py | 29 ++++++++++++++++---- docs/sql_queries.rst | 21 ++++++++++---- tests/test_canned_queries.py | 53 +++++++++++++++++++++++++++++++----- 3 files changed, 85 insertions(+), 18 deletions(-) diff --git a/datasette/views/database.py b/datasette/views/database.py index cf76f3c2..79b3f88d 100644 --- a/datasette/views/database.py +++ b/datasette/views/database.py @@ -360,6 +360,10 @@ class QueryView(View): params[key] = str(value) else: params = dict(parse_qsl(body, keep_blank_values=True)) + + # Don't ever send csrftoken as a SQL parameter + params.pop("csrftoken", None) + # Should we return JSON? should_return_json = ( request.headers.get("accept") == "application/json" @@ -371,12 +375,27 @@ class QueryView(View): redirect_url = None try: cursor = await db.execute_write(canned_query["sql"], params_for_query) - message = canned_query.get( - "on_success_message" - ) or "Query executed, {} row{} affected".format( - cursor.rowcount, "" if cursor.rowcount == 1 else "s" - ) + # success message can come from on_success_message or on_success_message_sql + message = None message_type = datasette.INFO + on_success_message_sql = canned_query.get("on_success_message_sql") + if on_success_message_sql: + try: + message_result = ( + await db.execute(on_success_message_sql, params_for_query) + ).first() + if message_result: + message = message_result[0] + except Exception as ex: + message = "Error running on_success_message_sql: {}".format(ex) + message_type = datasette.ERROR + if not message: + message = canned_query.get( + "on_success_message" + ) or "Query executed, {} row{} affected".format( + cursor.rowcount, "" if cursor.rowcount == 1 else "s" + ) + redirect_url = canned_query.get("on_success_redirect") ok = True except Exception as ex: diff --git a/docs/sql_queries.rst b/docs/sql_queries.rst index 3c2cb228..1ae07e1f 100644 --- a/docs/sql_queries.rst +++ b/docs/sql_queries.rst @@ -392,6 +392,7 @@ This configuration will create a page at ``/mydatabase/add_name`` displaying a f You can customize how Datasette represents success and errors using the following optional properties: - ``on_success_message`` - the message shown when a query is successful +- ``on_success_message_sql`` - alternative to ``on_success_message``: a SQL query that should be executed to generate the message - ``on_success_redirect`` - the path or URL the user is redirected to on success - ``on_error_message`` - the message shown when a query throws an error - ``on_error_redirect`` - the path or URL the user is 
redirected to on error
@@ -405,11 +406,12 @@ For example:
             "queries": {
                 "add_name": {
                     "sql": "INSERT INTO names (name) VALUES (:name)",
+                    "params": ["name"],
                     "write": True,
-                    "on_success_message": "Name inserted",
+                    "on_success_message_sql": "select 'Name inserted: ' || :name",
                     "on_success_redirect": "/mydatabase/names",
                     "on_error_message": "Name insert failed",
-                    "on_error_redirect": "/mydatabase"
+                    "on_error_redirect": "/mydatabase",
                 }
             }
         }
@@ -426,8 +428,10 @@ For example:
         queries:
             add_name:
                 sql: INSERT INTO names (name) VALUES (:name)
+                params:
+                - name
                 write: true
-                on_success_message: Name inserted
+                on_success_message_sql: 'select ''Name inserted: '' || :name'
                 on_success_redirect: /mydatabase/names
                 on_error_message: Name insert failed
                 on_error_redirect: /mydatabase
@@ -443,8 +447,11 @@ For example:
     "queries": {
         "add_name": {
             "sql": "INSERT INTO names (name) VALUES (:name)",
+            "params": [
+                "name"
+            ],
             "write": true,
-            "on_success_message": "Name inserted",
+            "on_success_message_sql": "select 'Name inserted: ' || :name",
             "on_success_redirect": "/mydatabase/names",
             "on_error_message": "Name insert failed",
             "on_error_redirect": "/mydatabase"
         }
     }
 }
 }
 .. [[[end]]]
 
-You can use ``"params"`` to explicitly list the named parameters that should be displayed as form fields - otherwise they will be automatically detected.
+You can use ``"params"`` to explicitly list the named parameters that should be displayed as form fields - otherwise they will be automatically detected. ``"params"`` is not necessary in the above example, since without it ``"name"`` would be automatically detected from the query.
 
 You can pre-populate form fields when the page first loads using a query string, e.g. ``/mydatabase/add_name?name=Prepopulated``. The user will have to submit the form to execute the query.
 
+If you specify a query in ``"on_success_message_sql"``, that query will be executed after the main query. The first column of the first row returned by that query will be displayed as a success message. Named parameters from the main query will be made available to the success message query as well.
+
 .. _canned_queries_magic_parameters:
 
 Magic parameters
@@ -589,7 +598,7 @@ The JSON response will look like this:
         "redirect": "/data/add_name"
     }
 
-The ``"message"`` and ``"redirect"`` values here will take into account ``on_success_message``, ``on_success_redirect``, ``on_error_message`` and ``on_error_redirect``, if they have been set.
+The ``"message"`` and ``"redirect"`` values here will take into account ``on_success_message``, ``on_success_message_sql``, ``on_success_redirect``, ``on_error_message`` and ``on_error_redirect``, if they have been set.
 
 .. 
_pagination: diff --git a/tests/test_canned_queries.py b/tests/test_canned_queries.py index e9ad3239..5256c24c 100644 --- a/tests/test_canned_queries.py +++ b/tests/test_canned_queries.py @@ -31,9 +31,15 @@ def canned_write_client(tmpdir): }, "add_name_specify_id": { "sql": "insert into names (rowid, name) values (:rowid, :name)", + "on_success_message_sql": "select 'Name added: ' || :name || ' with rowid ' || :rowid", "write": True, "on_error_redirect": "/data/add_name_specify_id?error", }, + "add_name_specify_id_with_error_in_on_success_message_sql": { + "sql": "insert into names (rowid, name) values (:rowid, :name)", + "on_success_message_sql": "select this is bad SQL", + "write": True, + }, "delete_name": { "sql": "delete from names where rowid = :rowid", "write": True, @@ -179,6 +185,34 @@ def test_insert_error(canned_write_client): ) +def test_on_success_message_sql(canned_write_client): + response = canned_write_client.post( + "/data/add_name_specify_id", + {"rowid": 5, "name": "Should be OK"}, + csrftoken_from=True, + ) + assert response.status == 302 + assert response.headers["Location"] == "/data/add_name_specify_id" + messages = canned_write_client.ds.unsign( + response.cookies["ds_messages"], "messages" + ) + assert messages == [["Name added: Should be OK with rowid 5", 1]] + + +def test_error_in_on_success_message_sql(canned_write_client): + response = canned_write_client.post( + "/data/add_name_specify_id_with_error_in_on_success_message_sql", + {"rowid": 1, "name": "Should fail"}, + csrftoken_from=True, + ) + messages = canned_write_client.ds.unsign( + response.cookies["ds_messages"], "messages" + ) + assert messages == [ + ["Error running on_success_message_sql: no such column: bad", 3] + ] + + def test_custom_params(canned_write_client): response = canned_write_client.get("/data/update_name?extra=foo") assert '' in response.text @@ -232,21 +266,22 @@ def test_canned_query_permissions_on_database_page(canned_write_client): query_names = { q["name"] for q in canned_write_client.get("/data.json").json["queries"] } - assert { + assert query_names == { + "add_name_specify_id_with_error_in_on_success_message_sql", + "from_hook", + "update_name", + "add_name_specify_id", + "from_async_hook", "canned_read", "add_name", - "add_name_specify_id", - "update_name", - "from_async_hook", - "from_hook", - } == query_names + } # With auth shows four response = canned_write_client.get( "/data.json", cookies={"ds_actor": canned_write_client.actor_cookie({"id": "root"})}, ) - assert 200 == response.status + assert response.status == 200 query_names_and_private = sorted( [ {"name": q["name"], "private": q["private"]} @@ -257,6 +292,10 @@ def test_canned_query_permissions_on_database_page(canned_write_client): assert query_names_and_private == [ {"name": "add_name", "private": False}, {"name": "add_name_specify_id", "private": False}, + { + "name": "add_name_specify_id_with_error_in_on_success_message_sql", + "private": False, + }, {"name": "canned_read", "private": False}, {"name": "delete_name", "private": True}, {"name": "from_async_hook", "private": False}, From 33251d04e78d575cca62bb59069bb43a7d924746 Mon Sep 17 00:00:00 2001 From: Simon Willison Date: Wed, 9 Aug 2023 17:56:27 -0700 Subject: [PATCH 475/952] Canned query write counters demo, refs #2134 --- .github/workflows/deploy-latest.yml | 30 +++++++++++++++++++++++++++++ 1 file changed, 30 insertions(+) diff --git a/.github/workflows/deploy-latest.yml b/.github/workflows/deploy-latest.yml index ed60376c..4746aa07 100644 --- 
a/.github/workflows/deploy-latest.yml +++ b/.github/workflows/deploy-latest.yml @@ -57,6 +57,36 @@ jobs: db.route = "alternative-route" ' > plugins/alternative_route.py cp fixtures.db fixtures2.db + - name: And the counters writable canned query demo + run: | + cat > plugins/counters.py < Date: Thu, 10 Aug 2023 22:16:19 -0700 Subject: [PATCH 476/952] Fixed display of database color Closes #2139, closes #2119 --- datasette/database.py | 7 +++++++ datasette/templates/database.html | 2 +- datasette/templates/query.html | 2 +- datasette/templates/row.html | 2 +- datasette/templates/table.html | 2 +- datasette/views/base.py | 4 ---- datasette/views/database.py | 8 +++----- datasette/views/index.py | 4 +--- datasette/views/row.py | 4 +++- datasette/views/table.py | 2 +- tests/test_html.py | 20 ++++++++++++++++++++ 11 files changed, 39 insertions(+), 18 deletions(-) diff --git a/datasette/database.py b/datasette/database.py index d8043c24..af39ac9e 100644 --- a/datasette/database.py +++ b/datasette/database.py @@ -1,6 +1,7 @@ import asyncio from collections import namedtuple from pathlib import Path +import hashlib import janus import queue import sys @@ -62,6 +63,12 @@ class Database: } return self._cached_table_counts + @property + def color(self): + if self.hash: + return self.hash[:6] + return hashlib.md5(self.name.encode("utf8")).hexdigest()[:6] + def suggest_name(self): if self.path: return Path(self.path).stem diff --git a/datasette/templates/database.html b/datasette/templates/database.html index 7acf0369..3d4dae07 100644 --- a/datasette/templates/database.html +++ b/datasette/templates/database.html @@ -10,7 +10,7 @@ {% block body_class %}db db-{{ database|to_css_class }}{% endblock %} {% block content %} -
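As a closing illustration, here is a minimal sketch of the color derivation this last patch introduces. It mirrors the ``color`` property added to ``datasette/database.py`` above, but it is illustrative rather than an excerpt from Datasette itself.

.. code-block:: python

    import hashlib

    def database_color(name, content_hash=None):
        # Immutable databases have a content hash: reuse its first six
        # hex characters. Other databases get a stable six-character
        # color derived from the database name instead.
        if content_hash:
            return content_hash[:6]
        return hashlib.md5(name.encode("utf8")).hexdigest()[:6]

    # The same name always produces the same six-character hex string:
    assert database_color("fixtures") == database_color("fixtures")

Because the value depends only on the database name (or its content hash), every page for a given database renders with the same color, which is what the template changes in this commit rely on; they presumably replace hard-coded placeholders such as the ``database_color=lambda _: "#ff0000"`` seen earlier in this series.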